Human Robotics: Modeling Human Cognition via Wearable Robotics

Byoung-Tak Zhang

Abstract

Behaviorism has focused on measurable stimulus-response relationships of human behavior while ignoring cognition. In contrast, cognitivism has focused on the internal information-processing mechanisms of the mind while ignoring the body and action. Recent studies in cognitive science emphasize embodied cognition and its interaction with the environment within the perception-action cycle. However, many researchers believe that, despite its significance, progress in research on the embodied and situated mind will be slow due to the technical difficulty of sensing and modeling experimental data in the real world. In this talk we argue that emerging wearable technology, such as smart glasses and wearable EEG devices, combined with machine learning, comes to the rescue. Based on this idea, we present a new research paradigm for studying human thoughts and actions in ecologically valid environments using wearable devices and robotics technology. The proposed “Human Robotics” approach to cognitive science views wearable devices as robots that continually sense and track the everyday activity of their wearers (a “Wearable Robotics” problem from the robotics point of view). Using machine learning combined with mobile and cloud computing, these wearable robots can model the human mind and everyday life in the real world, in real time, over an extended period. We take our “Cognome” initiative as an example to illustrate the human robotics paradigm and discuss its experimental setups, applications, prospects, and the challenges it poses for cognitive modeling.