
Vladimir Lumelsky, Professor Emeritus at the University of Wisconsin-Madison, will give a guest lecture titled "Human-Robot Interaction and Whole-Body Robot Sensing."
Abstract
The ability of a robot to operate in an uncertain environment, such as near humans or far away under human control, potentially opens a myriad of uses. Examples include robots preparing the Martian surface for human arrival; robots assembling large space telescopes; robot helpers for the elderly; and robots searching for and disposing of war mines. So far, advances in this area have focused on small categories of tasks rather than on the universal ability typical in nature. Challenges appear both on the robotics side and on the human side: robots have a hard time adjusting to unstructured environments, while human cognition has serious limits in adjusting to robots and in grasping complex 2D and 3D motion. As a result, applications where robots operate near humans – or far away under their control – are exceedingly rare. The way out of this impasse is to supply the robot with whole-body sensing: the ability to sense surrounding objects along the robot’s entire body and to utilize these data in real time. This calls for large-area flexible sensor arrays – a sensitive skin covering the whole robot body, akin to the skin covering the human body. Whole-body sensing brings interesting, even unexpected, properties: powerful robots become inherently safe; human operators can move them fast, at “natural” speeds; robot motion strategies exceed human spatial-reasoning skills; and it becomes realistic to exploit the natural synergy of human-robot teams, allowing a mix of supervised and unsupervised robot operation. We will review the mathematical, algorithmic, hardware (materials, electronics, computing), control, and cognitive-science issues involved in realizing such systems.