
Human-Machine-Centered Design Methods

Human-machine-centered methods are crucial for the development of human-oriented technologies that interact synergistically with their users. Current research indicates that human factors and technical aspects must be considered equally to design efficient solutions that users accept. For this purpose, we combine user and expert studies with established engineering design frameworks for human-machine-centered design.

Our research is concerned with identifying and modeling human factors and with developing design methods. To this end, we aim at a holistic understanding of how human factors influence the development of robotic systems and at considering these relations methodically.

 

Current projects on this topic

Active transfer learning with neural networks through human-robot interaction (TRAIN)

Funded by the DFG: BE 5729/16

To use autonomous robots flexibly in interaction with humans in the future, procedures are needed that enable the learning of various motor and manipulation skills and that can be applied by non-experts as well. We aim to improve the learning of robot skills with neural networks, taking into account human feedback as well as the experience and instructions of the users. To implement this systematically, we evaluate subjective feedback and physiological data from user studies and develop assessment criteria for the development of human-oriented methods of transfer learning and the shared autonomy of humans and robots.
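As a simplified illustration of the general idea (not the project’s actual method), the following sketch shows a reward-weighted policy search in which human ratings of robot rollouts take the place of a programmed reward function; the toy task, parameter dimensions, and all names are assumptions.

```python
# Minimal sketch (illustrative only): reward-weighted policy search where
# human ratings of rollouts act as the reward signal.
import numpy as np

rng = np.random.default_rng(0)

def rollout_quality(params):
    """Stand-in for a human rating of one robot rollout (0 = poor, 1 = good).
    Here a synthetic rating prefers parameters near an unknown target."""
    target = np.array([0.5, -0.2, 0.8])
    return np.exp(-np.sum((params - target) ** 2))

mean = np.zeros(3)          # current skill parameters (e.g., motion-primitive weights)
std = 0.5                   # exploration noise
for iteration in range(20):
    samples = mean + std * rng.standard_normal((10, 3))        # candidate skills
    ratings = np.array([rollout_quality(s) for s in samples])  # human feedback
    weights = ratings / ratings.sum()
    mean = weights @ samples                                    # rating-weighted update
print("learned parameters:", np.round(mean, 2))
```

In the project itself, such ratings would come from user studies and be complemented by physiological data rather than a synthetic function.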

More information can be found here.

 

EFFENDI – EFficient and Fast text ENtry for persons with motor Disabilities of neuromuscular orIgin

Funded by the DFG: FE 936/6

People with motor impairments are often unable to operate a computer keyboard efficiently and therefore need alternative input methods. For users with neuromuscular diseases, this project will develop alternatives that adapt to each person’s individual symptoms through modular, multi-sensory interfaces. The practical usability of the resulting input devices is ensured through the continuous involvement of the target group as part of a human-centered development process.
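The project description does not prescribe a particular input technique. As one purely hypothetical illustration of an interface that adapts to individual motor abilities, the sketch below shows a single-switch scanning keyboard whose scan interval adjusts to the user’s measured reaction times; all names and values are assumptions.

```python
# Hypothetical sketch: a single-switch scanning keyboard that adapts its
# scan speed to the user's observed reaction times.
from statistics import median

class ScanningKeyboard:
    def __init__(self, keys="ETAOINSRH", scan_interval=1.0):
        self.keys = keys
        self.scan_interval = scan_interval   # seconds each key is highlighted
        self.reaction_times = []

    def register_selection(self, reaction_time):
        """Record how quickly the user pressed the switch after the intended
        key was highlighted, then adapt the scan speed with a safety margin."""
        self.reaction_times.append(reaction_time)
        recent = self.reaction_times[-10:]
        self.scan_interval = max(0.3, 1.5 * median(recent))

kb = ScanningKeyboard()
for rt in [0.9, 0.8, 0.7, 0.7, 0.6]:   # user gets faster with practice
    kb.register_selection(rt)
print(f"adapted scan interval: {kb.scan_interval:.2f} s")
```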

 

Completed projects on this topic

Users’ body experience and human-machine interfaces in (assistive) robotics

Funded by the DFG: BE 5729/3 & 11

The scientific network dealt with the body experience of individuals who use assistive or wearable robots. To better understand the technical possibilities for improving this experience, the participating scientists analyzed measures for assessing body representations and their consideration in novel design methods. This includes identifying suitable perceptual channels and supports the development of new human-machine interfaces and human-in-the-loop experiments, e.g., robot hand/leg illusions.

Further information on the activities of the network can be found here.

Human-oriented methods for intuitive and fault-tolerant control of wearable robotic devices

Supported by the “Athene Young Investigator” program of TU Darmstadt

In this project, control approaches for wearable robotic systems for movement support and augmentation were developed to provide efficient and natural support and to prevent users from feeling “controlled by the robot”. Psychophysical experiments on how users experience device elasticity helped to tune adaptive impedance control for versatile locomotion and fault tolerance. Human-in-the-loop experiments were applied to investigate how users integrate wearable robotic systems into their body scheme.
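A minimal sketch of the kind of control law referred to above: a 1-DoF impedance controller rendering a virtual spring-damper around a desired position. The gains, inertia, and the notion of tuning stiffness from psychophysical ratings are illustrative assumptions, not the project’s identified values.

```python
# Minimal 1-DoF impedance control sketch (illustrative values only).
import numpy as np

def impedance_torque(q, dq, q_des, dq_des, stiffness, damping):
    """Joint torque that renders a virtual spring-damper around q_des."""
    return stiffness * (q_des - q) + damping * (dq_des - dq)

# Simulate a 1-DoF joint (inertia I) tracking a step to 0.5 rad.
I, dt = 0.1, 0.001
q, dq = 0.0, 0.0
stiffness, damping = 20.0, 2.0   # in practice, tuned e.g. from user ratings
for _ in range(2000):
    tau = impedance_torque(q, dq, q_des=0.5, dq_des=0.0,
                           stiffness=stiffness, damping=damping)
    ddq = tau / I
    dq += ddq * dt
    q += dq * dt
print(f"final position: {q:.3f} rad")
```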

Optimized measurement, adjustment, and manufacturing of lower limb prosthetic sockets

Funded by AiF / IGF: 18873 N / 2

In this project, methods for measuring, adapting, and manufacturing lower limb prosthetic socket systems were developed. Based on biomechanical measurements, the know-how of orthopedic specialists, and assessments by people with amputation, models of the interaction between residual limb and socket were developed, and recommendations for their use in technical design processes were derived.

Interfaces and Interaction

For intuitive interaction, autonomous and mechatronic systems have to acquire and interpret multimodal sensor data and provide feedback, taking into account the physical abilities of the users. Especially in wearable robotic systems, user interface design is of utmost importance because the system should be perceived as a part of the user’s own body.

Therefore, our research focuses on improving our understanding of human perception and experience as well as on developing user-oriented interfaces for technical systems in general. To improve the integration of wearable robotic systems into their users’ body experience and the user experience of cognitive systems, we investigate the influence of different sensors, stimulators, and interaction strategies. To this end, we adapt experimental approaches from psychology and cognitive science into human-in-the-loop experiments to investigate the underlying mechanisms and their practical consideration.

 

Current projects on this topic

Integrated models of cognitive and physical human-robot interaction

Funded by the Volkswagen Foundation: 9B 007

Combining the perspectives of engineering, biomechanics (Univ. Tübingen), and cognitive science (TU Berlin), we aim at developing unified models of physical and cognitive human-robot interaction. Working towards that goal, we prepare novel concepts for interdisciplinary discourse and teaching to motivate structural changes supporting inter-institutional collaboration. We expect that the resulting interaction models will cover cognitive, musculoskeletal, and control aspects to improve joint cognitive decisions and physical robot assistance through user-centered human-robot interaction design.

Active transfer learning with neural networks through human-robot interaction (TRAIN)

Funded by the DFG: BE 5729/16

To use autonomous robots flexibly in interaction with humans in the future, procedures are needed that enable the learning of various motor and manipulation skills and that can be applied by non-experts as well. We aim to improve the learning of robot skills with neural networks, taking into account human feedback as well as the experience and instructions of the users. To implement this systematically, we evaluate subjective feedback and physiological data from user studies and develop assessment criteria for the development of human-oriented methods of transfer learning and the shared autonomy of humans and robots.

More information can be found here.

 

EFFENDI – EFficient and Fast text ENtry for persons with motor Disabilities of neuromuscular orIgin

Funded by the DFG: FE 936/6

People with motor impairments are often unable to operate a computer keyboard efficiently and therefore need alternative input methods. For users with neuromuscular diseases, this project will develop alternatives that adapt to each person’s individual symptoms through modular, multi-sensory interfaces. The practical usability of the resulting input devices is ensured through the continuous involvement of the target group as part of a human-centered development process.

Completed projects on this topic

Human-oriented methods for intuitive and fault-tolerant control of wearable robotic devices

Supported by the “Athene Young Investigator” program of TU Darmstadt

In this project, control approaches for wearable robotic systems for movement support and augmentation were developed to provide efficient and natural support and to prevent users from feeling “controlled by the robot”. Psychophysical experiments on how users experience device elasticity helped to tune adaptive impedance control for versatile locomotion and fault tolerance. Human-in-the-loop experiments were applied to investigate how users integrate wearable robotic systems into their body scheme.

 

Users’ body experience and human-machine interfaces in (assistive) robotics

Funded by the DFG: BE 5729/3 & 11

The scientific network dealt with the body experience of individuals who use assistive or wearable robots. To better understand the technical possibilities for improving this experience, the participating scientists analyzed measures for assessing body representations and their consideration in novel design methods. This includes identifying suitable perceptual channels and supports the development of new human-machine interfaces and human-in-the-loop experiments, e.g., robot hand/leg illusions.

Further information on the activities of the network can be found here.

 

Electromyographic (learning) control of robotic legs for the exploration of human body experience

Funded by the DFG: BE 5729/4

This cooperation with Arizona State University included the control of a robot leg to examine human body experience. A combination of electromyographic muscle-activity measurement and machine learning was investigated for motion detection and control, with the goals of improving the imitation of human motion and extending the evaluation methods for experimentally investigating human body experience.
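As a hedged sketch of the general pattern-recognition pipeline mentioned above (window-based EMG features plus a classifier), the example below uses classic time-domain features and a linear discriminant classifier on synthetic data; the feature set, channel count, and data are assumptions for illustration, not the project’s configuration.

```python
# Illustrative EMG pattern-recognition pipeline: time-domain features per
# window, then a linear discriminant classifier (synthetic data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def emg_features(window):
    """Classic time-domain features per channel: mean absolute value,
    zero crossings, and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, zc, wl])

rng = np.random.default_rng(1)
# Synthetic 2-channel EMG windows for two motion classes (rest vs. flexion).
windows = [rng.normal(0, 0.05 + 0.3 * label, size=(200, 2))
           for label in (0, 1) for _ in range(50)]
labels = [label for label in (0, 1) for _ in range(50)]
X = np.array([emg_features(w) for w in windows])

clf = LinearDiscriminantAnalysis().fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```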

 

Tactile interfaces to enhance human-machine interaction of wearable robotic systems by mediating affective and social touches

Funded by MERCUR: An-2019-0032

Tactile interfaces are essential for achieving a realistic human-robot interaction and the resulting body experience during wearable robot use. Sensory feedback to the user plays a central role here. In addition to purely functional information, feedback should include so-called affective signals, which convey positive sensations during slow, gentle touch and are particularly relevant for social contact. We analyze requirements for the transmission of such signals and develop demonstrators and algorithms for corresponding human-machine interfaces.

Research

The overarching research objective of the Chair of Autonomous Systems and Mechatronics is to develop technical systems that functionally support their users and provide them with a positive experience. To specifically promote user acceptance, human factors and technical requirements are equally considered. Our research approach is outlined in the figure and based on a mixture of methods from engineering and human sciences. We demonstrate our findings on wearable systems such as prostheses or exoskeletons, cognitive systems such as collaborative or humanoid robots, and general applications with tight human-robot interaction.

The key research questions of the Chair of Autonomous Systems and Mechatronics are:

  • Which design, control, and interface implementations support human-machine interaction?
  • How do human factors influence human-machine interaction and how can they be considered systematically in design?

Components and Control

Interfaces and Interaction

Wearable Systems

Cognitive Systems


Components and Control

Current designs of autonomous and mechatronic systems are no longer based solely on rigid mechanisms but include specific elastic properties. This enables safe human-machine and human-robot interaction and significantly reduces energy consumption by exploiting the system dynamics. Both aspects help to meet the requirements of systems operating close to humans, but they also raise new technical questions. In addition to increasing system complexity, this particularly concerns the requirements arising from direct interaction between humans and machines.

Our recent research focuses on the design of components, e.g., elastic actuators, aiming at energy efficiency and fault tolerance. In addition to developing mechanisms and actuators, we also tackle control and signal processing challenges. These play an important role in the implementation of fault-tolerant designs with safety management and thus support the practical applicability of elastic actuators. By considering human perception, components and control are designed to ensure intuitive and robust human-machine interaction.

 

Current Projects

Fault diagnosis and tolerance for elastic actuation systems in robotics: physical human-robot interaction

Funded by the DFG: BE 5729/1

Increased system complexity and operation in potentially critical operating states such as (anti)resonances can result in technical faults in elastic robotic actuators. To counteract this, we are examining fault diagnosis methods and fault-tolerant designs. One of our key interests is human-robot interaction, which can be influenced by suitable control algorithms for elastically actuated robots. Based on investigating human perception, we design elastic actuators to provide safe and reliable physical human-robot interaction through fault diagnosis and compensation.
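A minimal, illustrative sketch of one residual-based idea consistent with the description above (not the project’s algorithm): the torque expected from the commanded motor action is compared with the torque implied by the measured spring deflection, and a clear mismatch is flagged as a fault. All constants and thresholds are assumed values.

```python
# Illustrative residual-based fault detection for a series elastic actuator.
def spring_torque(deflection, k_spring=100.0):
    """Torque implied by the measured spring deflection (nominal stiffness)."""
    return k_spring * deflection

def detect_fault(commanded_torque, measured_deflection, threshold=2.0):
    """Flag a fault when the model-based residual exceeds a threshold."""
    residual = commanded_torque - spring_torque(measured_deflection)
    return abs(residual) > threshold, residual

# Healthy sample vs. a sample where the spring has lost stiffness
# (larger deflection for the same commanded torque).
for tau_cmd, defl in [(5.0, 0.05), (5.0, 0.09)]:
    fault, r = detect_fault(tau_cmd, defl)
    print(f"commanded {tau_cmd} Nm, deflection {defl} rad -> "
          f"residual {r:+.1f} Nm, fault: {fault}")
```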

Completed Projects

Human-oriented methods for intuitive and fault-tolerant control of wearable robotic devices

Supported by the “Athene Young Investigator” program of TU Darmstadt

In this project, control approaches for wearable robotic systems for movement support and augmentation were developed to provide efficient and natural support and to prevent users from feeling “controlled by the robot”. Psychophysical experiments on how users experience device elasticity helped to tune adaptive impedance control for versatile locomotion and fault tolerance. Human-in-the-loop experiments were applied to investigate how users integrate wearable robotic systems into their body scheme.

 

Natural dynamics analysis and stiffness control of series- and parallel-elastic robotic actuators

Funded by the DFG: BE 5729/2

In cooperation with Vrije Universiteit Brussel, we investigated the influence of actuator-elasticity configurations on the inherent dynamics of elastic actuators and their power/energy requirements. For this purpose, rigid, series, and parallel elastic configurations were compared in simulations and experiments. The results inform actuator design and stiffness control.
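As an illustrative sketch of this kind of comparison (all numbers are assumptions, not project results), the example below computes the peak motor torque needed to oscillate an inertial load either rigidly or with a parallel spring tuned to the motion frequency; for brevity only the rigid and parallel-elastic cases are shown. The tuned spring cancels the inertial torque, so the required motor torque drops to nearly zero.

```python
# Illustrative comparison: peak motor torque for a sinusoidally oscillating
# inertial load, driven rigidly vs. with a tuned parallel spring.
import numpy as np

J = 0.05            # load inertia [kg m^2]
A, w = 0.4, 8.0     # motion amplitude [rad] and frequency [rad/s]
t = np.linspace(0, 2 * np.pi / w, 1000)
q = A * np.sin(w * t)
qdd = -A * w**2 * np.sin(w * t)

tau_rigid = J * qdd                       # motor supplies the full inertial torque
k_parallel = J * w**2                     # spring tuned to the motion frequency
tau_parallel = J * qdd + k_parallel * q   # spring cancels the inertial torque

print(f"peak motor torque, rigid:           {np.max(np.abs(tau_rigid)):.3f} Nm")
print(f"peak motor torque, parallel spring: {np.max(np.abs(tau_parallel)):.3f} Nm")
```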