Interfaces and Interaction

For intuitive interaction, autonomous and mechatronic systems must acquire and interpret multimodal sensor data and provide feedback, taking into account the physical abilities of their users. In wearable robotic systems especially, user interface design is of utmost importance because the system should be perceived as part of the user’s own body.

Therefore, our research focuses on improving our understanding of human perception and experience as well as on developing user-oriented interfaces for technical systems in general. To improve the integration of wearable robotic systems into their users’ body experience and the user experience of cognitive systems, we investigate the influence of different sensors, stimulators, and interaction strategies. To this end, we adapt experimental approaches from psychology and cognitive science into human-in-the-loop experiments that probe the underlying mechanisms and their practical implications.


Current projects on this topic

Integrated models of cognitive and physical human-robot interaction

Funded by the Volkswagen Foundation: 9B 007

Combining the perspectives of engineering, biomechanics (Univ. Tübingen), and cognitive science (TU Berlin), we aim to develop unified models of physical and cognitive human-robot interaction. Working towards this goal, we prepare novel concepts for interdisciplinary discourse and teaching to motivate structural changes that support inter-institutional collaboration. We expect the resulting interaction models to cover cognitive, musculoskeletal, and control aspects, and to improve joint cognitive decisions and physical robot assistance through user-centered human-robot interaction design.

Active transfer learning with neural networks through human-robot interaction (TRAIN)

Funded by the DFG: BE 5729/16

To use autonomous robots flexibly in interaction with humans in the future, procedures are needed that enable the learning of various motor and manipulation skills and that can be applied by non-experts as well. We aim to improve the learning of robot skills with neural networks, taking into account human feedback as well as the experience and instructions of the users. To implement this systematically, we evaluate subjective feedback and physiological data from user studies and derive assessment criteria for the development of human-oriented methods for transfer learning and for shared autonomy between humans and robots.
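The idea of shaping robot skill learning with scalar human feedback can be illustrated with a minimal sketch. The linear policy, the update rule, and all names below are illustrative assumptions in the spirit of interactive-learning schemes such as TAMER, not the method developed in TRAIN:

```python
import numpy as np

def feedback_update(weights, states, actions, feedback, lr=0.1):
    """One update of a toy linear policy from scalar human feedback:
    actions rated positively are reinforced, negatively rated ones are
    suppressed. Purely illustrative, not the TRAIN project's algorithm."""
    for s, a, h in zip(states, actions, feedback):
        pred = weights @ s                          # policy's predicted action
        weights += lr * h * np.outer(a - pred, s)   # pull toward / push away
    return weights

# toy usage: 2-D state, 1-D action
w = np.zeros((1, 2))
states = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
actions = [np.array([1.0]), np.array([1.0])]
feedback = [+1.0, -1.0]   # human approves the first action, rejects the second
w = feedback_update(w, states, actions, feedback)
# afterwards w[0, 0] > 0 (reinforced) and w[0, 1] < 0 (suppressed)
```

In a real study, the scalar feedback would come from the subjective ratings and physiological data mentioned above rather than hand-set values.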



EFFENDI – EFficient and Fast text ENtry for persons with motor Disabilities of neuromuscular orIgin

Funded by the DFG: FE 936/6

People with motor impairments are often unable to operate a computer keyboard efficiently and therefore need alternative input methods. For users with neuromuscular diseases, this project develops alternatives that adapt to each person’s individual symptoms through modular, multi-sensory interfaces. The practical usability of the resulting input devices is ensured through the continuous involvement of the target group in a human-centered development process.
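One widely used alternative input technique that such interfaces can build on is dwell-time selection, where a key is triggered once the pointer or gaze rests on it long enough. The sketch below is a generic illustration with made-up parameter values, not EFFENDI’s actual input method:

```python
def dwell_select(samples, key, dwell_time_s=1.0, sample_dt=0.1):
    """Select a key once the pointer/gaze has rested on it for the dwell
    threshold (dwell-time selection). The threshold would be adapted to the
    individual user; all values here are illustrative."""
    held = 0.0
    for sample in samples:
        held = held + sample_dt if sample == key else 0.0  # reset on leaving
        if held >= dwell_time_s:
            return True
    return False

selected = dwell_select(["A"] * 12, "A")      # rests long enough on "A"
skipped = dwell_select(["A", "B"] * 6, "A")   # keeps leaving, timer resets
```

Adapting `dwell_time_s` per user is one simple example of tailoring an interface to individual motor symptoms.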

Completed Projects

Human-oriented methods for intuitive and fault-tolerant control of wearable robotic devices

Supported by the “Athene Young Investigator” program of TU Darmstadt

In this project, control approaches for wearable robotic systems for movement support and augmentation were developed to provide efficient and natural assistance without making users feel “controlled by the robot”. Psychophysical experiments on how users experience device elasticity helped to tune adaptive impedance control for versatile locomotion and fault tolerance. Human-in-the-loop experiments were used to investigate how users integrate wearable robotic systems into their body schema.
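The core of such a controller can be sketched as a joint-level impedance law, where the device renders a virtual spring-damper between desired and measured joint state. The function and gain values below are a minimal illustration, not the controller developed in the project:

```python
def impedance_torque(q_des, q, dq_des, dq, stiffness, damping):
    """Joint-level impedance law: tau = K*(q_des - q) + D*(dq_des - dq).
    Names and gains are illustrative assumptions."""
    return stiffness * (q_des - q) + damping * (dq_des - dq)

# lower stiffness feels softer and more transparent to the wearer;
# in practice, such gains would be tuned from psychophysical experiments
tau_soft = impedance_torque(0.5, 0.4, 0.0, 0.0, stiffness=10.0, damping=1.0)
tau_stiff = impedance_torque(0.5, 0.4, 0.0, 0.0, stiffness=100.0, damping=1.0)
```

Adapting the stiffness and damping online, informed by how users perceive the resulting elasticity, is what makes the impedance control “adaptive” in the sense described above.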


Users’ body experience and human-machine interfaces in (assistive) robotics

Funded by the DFG: BE 5729/3 & 11

The scientific network dealt with the body experience of individuals who use assistive or wearable robots. To better understand the technical possibilities for improving this experience, the participating scientists analyzed measures for assessing body representations and their consideration in novel design methods. This work includes identifying suitable perceptual channels and supports the development of new human-machine interfaces and human-in-the-loop experiments, e.g., robot hand and leg illusions.



Electromyographic (learning) control of robotic legs for the exploration of human body experience

Funded by the DFG: BE 5729/4

This cooperation with Arizona State University investigated the control of a robotic leg to examine human body experience. A combination of electromyographic muscle activity measurement and machine learning was used for motion detection and control, both to improve the imitation of human motion and to extend evaluation methods for the experimental investigation of human body experience.
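A typical myoelectric pipeline extracts simple time-domain features from windowed EMG and classifies them into motion classes. The sketch below uses root-mean-square amplitude and a nearest-centroid classifier as a generic stand-in for the learned models in such studies; the channels, classes, and values are illustrative:

```python
import numpy as np

def rms_features(emg_window):
    """Root-mean-square amplitude per EMG channel, a standard time-domain
    feature for myoelectric motion classification."""
    return np.sqrt(np.mean(np.square(emg_window), axis=0))

def nearest_centroid(features, centroids):
    """Classify a feature vector by its closest class centroid;
    a simple stand-in for a trained classifier."""
    dists = {label: np.linalg.norm(features - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

# toy example: two channels, two motion classes (values made up)
centroids = {"flexion": np.array([0.8, 0.1]), "extension": np.array([0.1, 0.8])}
window = np.array([[0.7, 0.1], [-0.9, -0.1], [0.8, 0.05]])  # mostly channel-1 activity
label = nearest_centroid(rms_features(window), centroids)   # → "flexion"
```

In practice, the classifier output would drive the robotic leg in real time, so that the mapping from muscle activity to motion can be varied experimentally.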


Tactile interfaces to enhance human-machine interaction of wearable robotic systems by mediating affective and social touches

Funded by MERCUR: An-2019-0032

Tactile interfaces are essential for achieving realistic human-robot interaction and the resulting body experience during wearable robot use, and sensory feedback to the user is central here. In addition to purely functional information, feedback should include so-called affective signals, which mediate pleasant sensations during slow touch and are particularly relevant for social contact. We analyzed requirements for the transmission of such signals and developed demonstrators and algorithms for corresponding human-machine interfaces.
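The distinction between affective and functional touch is often operationalized via stroking velocity: slow stroking at roughly 1–10 cm/s is the range commonly reported as pleasant (CT-afferent-optimal touch). The exact bounds below are illustrative assumptions, not the project’s specification:

```python
def is_affective_stroke(velocity_cm_s, low=1.0, high=10.0):
    """True if a stroke falls in the velocity range commonly associated
    with pleasant, 'affective' touch; bounds are illustrative."""
    return low <= velocity_cm_s <= high

print(is_affective_stroke(3.0))   # slow, social stroke → True
print(is_affective_stroke(30.0))  # fast, functional contact → False
```

A tactile interface could use such a criterion to decide whether a mediated touch should be rendered as an affective signal or as plain functional feedback.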