KoBo34 - Intuitive HRI

KoBo34 („Intuitive Interaktion mit kooperativen Assistenzrobotern für das 3. und 4. Lebensalter“, i.e. intuitive interaction with cooperative assistance robots for the third and fourth age) is an R&D project funded by the German Federal Ministry of Education and Research. It aims to improve the social participation and maintain the personal independence of elderly people by developing an assistive humanoid robot.

TU Darmstadt’s Centre for Cognitive Science participates with its research groups Psychology of Information Processing (Constantin Rothkopf) and Intelligent Autonomous Systems (Jan Peters). The project consortium comprises robotics company Franka Emika, Technical University of Munich (TU München), Rosenheim University of Applied Sciences (TH Rosenheim), and Technical University of Darmstadt (TU Darmstadt).

Intention Recognition. One crucial project objective is intuitive communication between elderly people and assistive robots. To develop a robot assistant that is genuinely helpful in everyday activities, effortless communication between human and robot is indispensable.

Therefore, a central aspect of our work in KoBo34 is research on automated intention recognition to support the robot’s “understanding” of human behaviour. Since humans express intentions through different modalities such as hand gestures, gaze direction, or speech, it is reasonable to exploit data from several modalities when recognizing human intentions automatically. In particular, combining modalities can reduce the uncertainty over the intention to be predicted; one goal of the KoBo34 project is therefore to reduce this uncertainty by fusing information from different modalities [3].
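To illustrate the idea behind [3], consider the following minimal sketch: each modality yields a probability distribution over a fixed set of intentions, and, assuming the per-modality classifiers are conditionally independent given the intention, multiplying these distributions and renormalizing gives a fused estimate whose entropy (uncertainty) is typically lower than that of any single modality alone. The intention labels and classifier outputs are purely illustrative and not taken from the project.

```python
# Minimal sketch of multimodal fusion for intention recognition.
# Assumes per-modality classifiers whose outputs are conditionally
# independent given the intention; all numbers are hypothetical.
import numpy as np

INTENTIONS = ["hand over cup", "point at tray", "idle"]

def fuse(posteriors):
    """Multiply per-modality distributions elementwise and renormalize."""
    fused = np.ones(len(INTENTIONS))
    for p in posteriors:
        fused *= np.asarray(p, dtype=float)
    return fused / fused.sum()

def entropy(p):
    """Shannon entropy as a simple measure of remaining uncertainty."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p + 1e-12)))

# Hypothetical outputs of a gesture, a gaze, and a speech classifier.
gesture = [0.55, 0.30, 0.15]
gaze    = [0.60, 0.25, 0.15]
speech  = [0.70, 0.20, 0.10]

fused = fuse([gesture, gaze, speech])
print("fused:", dict(zip(INTENTIONS, fused.round(3))))
print("entropy, gesture only:", round(entropy(gesture), 3))
print("entropy, fused:", round(entropy(fused), 3))  # lower = less uncertain
```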

Human-robot interaction with intention recognition

Interactive Learning. Another core objective of the project is continuous improvement of the interaction by enabling the robot to learn from human input. Here, we work on learning interactive skills by demonstration.

A skill library containing robot movements as components of complex interactive tasks can be taught by kinesthetic teaching. However, many learning algorithms require the number of skills to be fixed beforehand, and adding skills later means retraining the whole library. In KoBo34 we developed a new approach that allows open-ended learning of a collaborative skill library: existing skills can be refined iteratively and new ones can be added without such retraining [1] [5].
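The following sketch illustrates the open-ended aspect only: a library in which each demonstration either refines the most similar existing skill or creates a new one, so the number of skills is never fixed in advance. The plain mean-trajectory representation and the distance threshold are simplifications for illustration; the approach in [1] [5] builds on probabilistic movement primitives.

```python
# Illustrative open-ended skill library (not the KoBo34 implementation):
# a demonstration refines the closest existing skill or starts a new one.
import numpy as np

class SkillLibrary:
    def __init__(self, distance_threshold=0.5):
        self.skills = {}                    # name -> (mean trajectory, #demos)
        self.threshold = distance_threshold

    def _distance(self, traj, mean):
        # Average deviation between two equally shaped trajectories.
        return float(np.linalg.norm(traj - mean)) / traj.shape[0]

    def add_demonstration(self, traj, name_hint="skill"):
        traj = np.asarray(traj, dtype=float)
        best, best_dist = None, np.inf
        for name, (mean, _) in self.skills.items():
            if mean.shape == traj.shape:
                d = self._distance(traj, mean)
                if d < best_dist:
                    best, best_dist = name, d
        if best is not None and best_dist < self.threshold:
            # Refine an existing skill with a running mean over its demos.
            mean, n = self.skills[best]
            self.skills[best] = ((n * mean + traj) / (n + 1), n + 1)
            return best
        # No sufficiently similar skill: add a new one without retraining.
        new_name = f"{name_hint}_{len(self.skills)}"
        self.skills[new_name] = (traj, 1)
        return new_name
```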

While the work discussed above focused on learning new skills, we also worked on methods that adapt the robot’s movements online to human movements, e.g. to avoid collisions. We developed two adaptation modes, spatial deformation of the respective robot trajectories and temporal scaling, and evaluated and compared both in a study with naïve human subjects [4].
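As a rough illustration of the two modes, the sketch below deforms a waypoint trajectory away from a predicted human hand position (spatial deformation) and stretches the execution time step when the human comes close (temporal scaling). The deformation rule, gains, and radii are illustrative assumptions, not the movement-primitive-based formulation used in [4].

```python
# Illustrative online adaptation of a Cartesian waypoint trajectory.
# Gains, radii, and the deformation rule are placeholder assumptions.
import numpy as np

def spatially_deform(traj, human_pos, radius=0.3, gain=0.15):
    """Push waypoints within `radius` of the human radially away."""
    traj = np.asarray(traj, dtype=float).copy()
    diff = traj - np.asarray(human_pos, dtype=float)   # (N, 3) offsets
    dist = np.linalg.norm(diff, axis=1, keepdims=True)
    safe = np.maximum(dist, 1e-9)                      # avoid division by zero
    traj += np.where(dist < radius, gain * (radius - dist) * diff / safe, 0.0)
    return traj

def temporal_scale(dt, min_human_distance, slow_radius=0.4):
    """Stretch the time step (slow down) when the human is near."""
    return dt * max(1.0, slow_radius / max(min_human_distance, 1e-3))

# Example: a straight-line end-effector path passing near a human hand.
path = np.linspace([0.0, 0.0, 0.2], [0.6, 0.0, 0.2], num=20)
hand = np.array([0.3, 0.05, 0.2])
deformed = spatially_deform(path, hand)
slowed_dt = temporal_scale(dt=0.01, min_human_distance=0.15)
```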

By the project’s end, all of these approaches (skill learning by imitation, online adaptation of robot trajectories to human movements, and improvement of the robot’s skills to fit the human’s needs) should be applicable to assistive tasks. One example task, preparing a tray of snacks to be served in a retirement home, is demonstrated here:

Assistant robot Kobo learns how to prepare a tray of snacks (Demo)

Conference, Journal and Magazine Articles:

[1] Koert, D.; Trick, S.; Ewerton, M.; Lutter, M.; Peters, J. (2018). Online Learning of an Open-Ended Skill Library for Collaborative Tasks. Proceedings of the IEEE-RAS 18th International Conference on Humanoid Robots (HUMANOIDS), November 2018, Beijing, China.

[2] Balfanz, D. (2018). „KoBo lernt helfen. Forscher entwickeln Roboter, der ältere Menschen im Alltag unterstützt.“ (KoBo learns to help. Researchers develop a robot that supports elderly people in everyday life.), hoch3, Technische Universität Darmstadt, December 2018.

[3] Trick, S.; Koert, D.; Peters, J.; Rothkopf, C. (2019). Multimodal Uncertainty Reduction for Intention Recognition in Human-Robot Interaction. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), November 2019, Macao, China.

[4] Koert, D.; Pajarinen, J.; Schotschneider, A.; Trick, S.; Rothkopf, C.; Peters, J. (2019). Learning Intention Aware Online Adaptation of Movement Primitives. IEEE Robotics and Automation Letters (RA-L), with presentation at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), November 2019, Macao, China.

[5] Koert, D.; Trick, S.; Ewerton, M.; Lutter, M.; Peters, J. (2020). Incremental Learning of an Open-Ended Collaborative Skill Library. International Journal of Humanoid Robotics, Vol. 17, No. 1, 2050001 (23 pages).

[6] Koert, D.; Kircher, M.; Salikutluk, V.; D'Eramo, C.; Peters, J. (2020). Multi-Channel Interactive Reinforcement Learning for Sequential Tasks. Frontiers in Robotics and AI, section Human-Robot Interaction.

[7] Vorndamme, J.; Carvalho, J.; Laha, R.; Koert, D.; Figueredo, L. F. C.; Peters, J.; Haddadin, S. (2022). Integrated Bi-Manual Motion Generation and Control shaped for Probabilistic Movement Primitives. Proceedings of the International Conference on Humanoid Robots (HUMANOIDS) 2022. Best Interactive Paper Award Finalist.

[8] Trick, S.; Rothkopf, C. A. (2022). Bayesian Classifier Fusion with an Explicit Model of Correlation. Proceedings of the 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:2282-2310.

[9] Trick, S.; Rothkopf, C. A.; Jäkel, F. (2023). Parameter estimation for a bivariate beta distribution with arbitrary beta marginals and positive correlation. METRON, 1-18.

Preprints:

[10] Trick, S.; Jäkel, F.; Rothkopf, C. A. (2021). A Bivariate Beta Distribution with Arbitrary Beta Marginals and its Generalization to a Correlated Dirichlet Distribution. arXiv preprint arXiv:2104.08069.

Theses:

[A1] Trick, S. (2018): Multimodal Uncertainty Reduction for Intention Recognition in a Human-Robot Environment. Master’s Thesis, Advisors: C. A. Rothkopf, J. Peters, D. Koert. Technische Universität Darmstadt, 2018.

[A2] Knaust, M. (2019): Intuitive Imitation Learning for one-handed and bimanual tasks using ProMPs. Master’s Thesis, Advisors: J. Adamy, J. Peters, D. Koert. Technische Universität Darmstadt, 2019.

[A3] Koert, D. (2020): Interactive Machine Learning for Assistive Robots. Doctoral Thesis, Advisors: J. Peters, H. B. Amor. Technische Universität Darmstadt, 2020.

[A4] Schramm, M. (2020): Development of an Interactive Gesture Recognition System for an Assistive Robot. Bachelor’s Thesis, Advisors: J. Peters, D. Koert. Technische Universität Darmstadt, 2020.

[A5] Scherf, L. (2021): Learning to segment human sequential behavior to detect the intention for interaction. Master’s Thesis, Advisors: C. Rothkopf, S. Trick. Technische Universität Darmstadt, 2021.

Project Details

Project: KoBo34 – Intuitive Interaktion mit kooperativen Assistenzrobotern für das 3. und 4. Lebensalter
Project partners: Franka Emika GmbH (Coordinator), Technical University of Darmstadt (TU Darmstadt), Technical University of Munich (TU München), Rosenheim University of Applied Sciences (TH Rosenheim)
Project duration: July 2018 – December 2021
Project funding: EUR 1.94 million (joint project)
Funded by: German Federal Ministry of Education and Research (BMBF)
Grant no.: 16SV7984 (TU Darmstadt)
Website: https://www.technik-zum-menschen-bringen.de/projekte/kobo34