Human-Computer/Human-Robot Interaction

In human-computer interaction, the target system can be either a computer itself or an arbitrary technical system mediated by a computer, e.g., a robot. Intuitive interfaces are essential to enable users to exploit the possibilities of computers or to act cooperatively with a robot.

To assess situations, multimodal sensory data needs to be acquired and interpreted by the interface. For intuitive control, it is important to give appropriate feedback to the users, considering that their capabilities might be limited, e.g., by disease or disability. Regarding body-proximal robots such as prostheses or exoskeletons, the interface is a major factor in achieving user satisfaction and in integrating these artifacts into the body schemata of their users.

Research issues concern understanding human perception and human factors in order to design human-oriented interfaces.

Our research deals with human-computer interaction systems that enable people with neuromuscular diseases to control a personal computer, and with human-robot interfaces aiming to improve the body schema integration of body-proximal robots. In both areas, perceptual channels are investigated with regard to their suitability. Regarding personal computer interfaces, the focus is on developing input devices and software that demand little physical effort. To improve human-robot interfaces, fundamental aspects of body schema integration are investigated and considered in the technical implementation.

Current Projects Related to this Key Topic:

Sponsored by the “Athene Young Investigator” program of TU Darmstadt

This project combines methods from engineering and human sciences to tackle the multidisciplinary field of wearable robotic devices for motion support and augmentation.

By considering human factors in control design, the envisioned algorithms provide efficient and natural assistance and prevent users from feeling “controlled by the device”. Psychophysical exploration of how humans experience the stiffness of wearable robots guides the impedance control design. With appropriate adaptation, these algorithms facilitate versatile locomotion types and become fault-tolerant. Additionally, psychometric and human-in-the-loop studies examine how the algorithms affect users' embodiment of the devices. For practical validation, an adaptive shank prosthesis and a powered knee orthosis serve as wearable robotic demonstrators. Finally, all results inform the specification of a human-oriented control design method to improve user acceptance and satisfaction.
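To make the idea concrete, a minimal joint-level impedance law is sketched below. It is illustrative only, not the project's actual controller; the stiffness and damping gains K and D are hypothetical parameters of the kind that psychophysical data on perceived stiffness could inform.

# Minimal joint-level impedance control sketch (illustrative only, not the
# project's controller). The gains K and D are hypothetical and could be
# informed by psychophysical data on how users perceive stiffness.

def impedance_torque(q, qd, q_ref, qd_ref, K=40.0, D=2.0):
    """Return the commanded torque [Nm] for one joint.

    q, qd         -- measured joint angle [rad] and velocity [rad/s]
    q_ref, qd_ref -- reference angle and velocity from the gait pattern
    K, D          -- virtual stiffness [Nm/rad] and damping [Nm*s/rad]
    """
    return K * (q_ref - q) + D * (qd_ref - qd)

# Example: the knee lags slightly behind its reference during stance.
tau = impedance_torque(q=0.30, qd=0.0, q_ref=0.35, qd_ref=0.1)
print(f"commanded torque: {tau:.2f} Nm")  # commanded torque: 2.20 Nm

Lowering K renders the device more compliant, which is exactly the kind of trade-off the psychophysical studies are meant to quantify.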

Contact: Philipp Beckerle

Funded by DFG: FE 936/6

Persons with physical disabilities are often unable to use a regular computer keyboard efficiently (if at all) and therefore rely on fast and efficient alternative methods to enter text. This project concentrates on users whose disability is caused by a neuromuscular disease. Due to the specific symptomatology, some alternative input methods are better suited than others. For example, the voice of a person with a neuromuscular disease is often affected by dysarthria (a motor speech disorder), which makes automatic speech recognition (ASR) almost unusable. The goal of this project is to identify and investigate suitable alternatives. In addition, the usability of newly developed tools is evaluated in tests with members of the target population.
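As one example of a low-effort alternative, the sketch below shows single-switch row/column scanning, a common technique in this domain; it is an illustration only and not necessarily the method pursued in this project. The get_switch callback and the letter layout are hypothetical placeholders for whatever binary input and layout a user works with.

# Sketch of single-switch row/column scanning text entry (a common
# low-effort alternative, shown for illustration; not necessarily this
# project's method). get_switch is a hypothetical callback that samples
# whatever binary input the user can operate, e.g., a muscle sensor.
import itertools
import time

ROWS = ["ETAOI", "NSHRD", "LUCMF", "WYGPB", "VKJXQ", "Z .,?"]  # frequency-ordered

def scan_one_character(get_switch, dwell=0.8):
    """Highlight rows, then letters within the chosen row; one switch
    press selects the highlighted item. Returns the selected character."""
    for row in itertools.cycle(ROWS):            # stage 1: step through rows
        print("row:", row)
        time.sleep(dwell)
        if get_switch():                         # press -> enter this row
            for char in itertools.cycle(row):    # stage 2: step through letters
                print("char:", char)
                time.sleep(dwell)
                if get_switch():                 # press -> select this letter
                    return char

With frequency-ordered rows, common letters are reached in fewer scanning steps, which matters when each switch activation costs the user real effort.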

Contact: Philipp Beckerle


Funded by DFG: BE 5729/3&11

The scientific network deals with the body experience of subjects using assistive robots or other body- or user-proximal robotic devices. Its objective is to explore the technical potential for improving body experience through appropriate human-robot interfaces and robot designs. To this end, the participating researchers jointly analyze and discuss measures for assessing body experience (body image and body schema) and incorporate them into novel design methods. This includes identifying promising perceptual channels as well as preparing foot and hand robots for the experimental investigation of rubber limb illusions and interface designs.



Funded by DFG: BE 5729/4

The cooperation with Arizona State University focuses on the control of a robotic leg that will be used to explore human body experience. For motion detection and control, a combination of electromyographic (EMG) muscle activity measurements and machine learning is examined. This is intended to improve the robotic leg's imitation of human movements. Furthermore, evaluation methods for the experimental investigation of body experience are prepared within the project.
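A common pipeline for this kind of EMG-based intent recognition is sketched below purely as an illustration; the project's actual pipeline may differ. Windowed time-domain features from the EMG channels feed a standard classifier, and the training data here is random placeholder data standing in for recorded, labeled muscle activity.

# Illustrative EMG intent-recognition sketch (a common pipeline, not
# necessarily the project's method): windowed time-domain features feed
# a standard classifier that maps muscle activity to motion classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def features(window):
    """Time-domain features per EMG channel: RMS and waveform length."""
    rms = np.sqrt(np.mean(window**2, axis=0))
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([rms, wl])

# Placeholder data: 200 windows of 256 samples from 4 EMG channels,
# labeled with hypothetical motion classes (0 = stance, 1 = flexion,
# 2 = extension). Real data would come from recorded gait trials.
rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 256, 4))
labels = rng.integers(0, 3, size=200)

X = np.array([features(w) for w in windows])
clf = LinearDiscriminantAnalysis().fit(X, labels)
print("predicted class:", clf.predict(features(windows[0]).reshape(1, -1))[0])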

Contact: Philipp Beckerle


Completed Projects Related to this Key Topic:

Funded by DFG: FE 936/3

This project was about developing and refining a control concept for hands-free computer operation, targeted at persons with very severe physical disabilities. The method was based on detecting intentional contractions of an arbitrary muscle. Thanks to a clever amplification circuit in the piezo-based input sensor, minimal physical effort (e.g., raising an eyebrow) was enough to generate usable control signals. Applications of the muscle-based input method were not limited to computer operation: virtually any technical system could be controlled once its input interface was adapted accordingly. Particular examples, investigated thoroughly in addition to the computer interface, were a hands-free wheelchair control system and an environmental control system.
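The basic detection principle can be illustrated with a simple threshold plus refractory period, as in the sketch below; the threshold value and timing are hypothetical stand-ins, not the project's calibrated parameters.

# Minimal sketch of muscle-contraction detection from a piezo sensor
# (illustrative only; threshold and refractory time are hypothetical).

def detect_activations(samples, fs=1000, threshold=0.2, refractory_s=0.3):
    """Yield sample indices where an intentional contraction starts.

    samples      -- amplified piezo signal, normalized to [-1, 1]
    fs           -- sampling rate [Hz]
    threshold    -- activation level above the resting baseline
    refractory_s -- dead time so one contraction fires only one event
    """
    refractory = int(refractory_s * fs)
    next_allowed = 0
    for i, x in enumerate(samples):
        if i >= next_allowed and abs(x) > threshold:
            next_allowed = i + refractory
            yield i  # emit one control event, e.g., a mouse click

signal = [0.0] * 500 + [0.6] * 50 + [0.0] * 500  # simulated eyebrow raise
print(list(detect_activations(signal)))  # [500]

The refractory period ensures that one sustained contraction produces exactly one control event instead of a burst of them.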


Funded by TU Darmstadt, this project aimed at active lower-limb prostheses that are user-friendly and energy-efficient. Human factors were analyzed psychologically and integrated into engineering methods to develop user-oriented technologies. To increase energy efficiency, elastic actuation systems and appropriate control algorithms were designed based on simulations of human gait with and without a prosthesis.
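One widespread elastic actuation concept is the series elastic actuator, sketched below as an illustration only; the project's actual actuator design may differ, and the stiffness value is hypothetical. A spring between motor and joint determines the joint torque through its deflection and can buffer energy over the gait cycle.

# Series elastic actuation sketch (one widespread elastic actuation
# concept, shown for illustration; the spring stiffness is hypothetical).

def sea_joint_torque(theta_motor, theta_joint, k_spring=300.0):
    """Joint torque [Nm] from the deflection of the series spring.

    theta_motor, theta_joint -- motor-side and joint-side angles [rad]
    k_spring                 -- series spring stiffness [Nm/rad]
    """
    return k_spring * (theta_motor - theta_joint)

# The spring stores energy while it is loaded during stance and releases
# it later, which can lower the motor's peak power demand.
print(f"{sea_joint_torque(theta_motor=0.42, theta_joint=0.40):.1f} Nm")  # 6.0 Nm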