
Daniel Ferris

J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida

Short bio

Daniel Ferris is the Robert W. Adenbaum Professor of Engineering Innovation at the University of Florida. His main appointment is in biomedical engineering and he has faculty affiliations with mechanical and aerospace engineering, and neurology. His research focuses on the neuromechanical control of human movement, with particular regard to human-machine interactions (mechanical and electrical). Specific projects focus on robotic exoskeletons, bionic prostheses, mobile brain imaging, and immersive virtual reality. Dr. Ferris is a fellow of the American Association for the Advancement of Science (AAAS), American Institute of Medical and Biological Engineering (AIMBE), the American Society of Biomechanics (ASB), and the National Academy of Kinesiology (NAK). He has published over 150 journal papers on the biomechanics and neural control of human movement.

Robotic exoskeletons, bionic prostheses, and immersive virtual reality as tools to better understand brain and body connections

Abstract

Recent advances in actuator, sensor, and controller technology have enabled new autonomous robots and new robotic devices for assisting human rehabilitation and augmentation. Perhaps underappreciated, these advances in robotic technology also provide new tools for investigating the link between brain and body in biological organisms. Robotic exoskeletons and bionic prostheses offer innovative perturbations to body dynamics that were not previously possible. Immersive virtual reality and haptic interactions offer innovative perturbations to environment dynamics that were likewise not previously possible. Combining these perturbations with mobile brain-body imaging offers a powerful approach for better understanding how humans and other animals perceive and explore their realities.

Sangbae Kim

Department of Mechanical Engineering, MIT

Short bio

Sangbae Kim is the director of the Biomimetic Robotics Laboratory and a professor of Mechanical Engineering at MIT. His research focuses on bio-inspired robot development achieved by extracting principles from animals. Kim’s achievements include creating the world’s first directional adhesive inspired by gecko lizards and a climbing robot named Stickybot that uses the directional adhesive to climb smooth surfaces. One of Kim’s recent achievements is the development of the MIT Cheetah, a robot capable of stable outdoor running at up to 13 mph and autonomous jumping over obstacles with the efficiency of animals. Kim is a recipient of best paper awards from ICRA (2007), the King-Sun Fu Memorial TRO Best Paper Award (2008), and IEEE/ASME TMECH (2016). Additionally, he received a DARPA Young Faculty Award (2013), an NSF CAREER Award (2014), a Ruth and Joel Spira Award for Distinguished Teaching (2015), and the Nagamori Award (2023). He was a member of the Defense Science Study Group (DSSG) from 2019 to 2024. He has given talks at prestigious venues such as TEDxMIT, the World Economic Forum in Davos, and Amazon MARS, and delivered a keynote at IROS 2019.

Physical intelligence and Cognitive Biases Toward AI

Abstract

When will robots be able to clean my house, wash my dishes, and take care of the laundry? While we source labor primarily from automated machines in factories, the penetration of physical robots into our daily lives has been slow. What are the challenges in realizing intelligent machines capable of human-level skill? Isn’t AI advanced enough to replace many human skills? Unlike conventional robots, which are optimized mainly for position control with almost no adaptability, household tasks require a kind of 'physical intelligence' that involves complex dynamic interactions with overwhelming uncertainties. While advanced language models exemplify AI's prowess in data organization and text generation, a significant divide exists between AI for virtual and physical applications. In this conversation, we'll delve into the cognitive biases that often lead us to underestimate this technological gap.

Helen Huang

UNC/NC State Joint Department of Biomedical Engineering, North Carolina State University & the University of North Carolina at Chapel Hill

Short bio

Dr. Helen Huang is the Jackson Family Distinguished Professor in the Joint Department of Biomedical Engineering at North Carolina State University (NC State) and the University of North Carolina at Chapel Hill (UNC). She is also the Director of the Closed-Loop Engineering for Advanced Rehabilitation (CLEAR) core. Her research focuses on neural-machine interfaces, wearable robotics (prosthetic limbs and exoskeletons), wearer-robot interaction and co-adaptation, and human motor control and biomechanics. Dr. Huang has received numerous awards, including the Delsys Prize for Innovation in Electromyography, the NIDILRR Switzer Fellowship, the NSF CAREER Award, the ASA Statistics in Physical Engineering Sciences Award, and the NC State ALCOA Foundation Distinguished Engineering Research Award. She is a Fellow of the American Institute for Medical and Biological Engineering (AIMBE), a Fellow of the IEEE, and an NC State Faculty Scholar. Additionally, she serves as the Editor-in-Chief of the IEEE Transactions on Neural Systems and Rehabilitation Engineering.

Towards Human-Prosthesis Symbiosis

Abstract

As the number of individuals living with limb loss in the United States reaches into the millions, there is an urgent need for advanced prosthetic technologies that give this large population the best possible restoration of normal function. Although robotic prostheses, such as dexterous prosthetic hands and motorized prosthetic legs, have become commercially available, their adoption and functional capabilities remain limited.

This presentation highlights our research aimed at building a symbiotic relationship between humans and robotic prosthetic legs—where the prosthesis controller and the wearer’s motor control system coordinate and adapt to function as a seamless, unified entity. We investigate human-prosthesis interactions in gait and balance and develop learning-based control that enables the prosthesis to adapt to its amputee users both physically and cognitively. These approaches may further advance the function of modern, intelligent prostheses and significantly improve the quality of life for individuals with limb amputations.

Madhusudhan Venkadesan

Yale University

Short bio

Madhusudhan Venkadesan studies various problems in mechanics and control, including animal locomotion, the geometry of joints, and the statistical mechanics of muscle. His work is motivated mainly by curiosity about everyday observations, with applications in the biomedical sciences, evolutionary biology, and robotics. He is an Associate Professor in the Department of Mechanical Engineering & Materials Science at Yale University, where he has been since 2015.

The human foot

Abstract

In this talk, I will describe how the foot's shape and morphology influence its behavior as a structural member, how anatomical structures within the foot control the flow and delivery of power and work during propulsion, and how foot-ground interactions affect whole-body stability. Across these aspects of foot function, we find a recurring theme: several control challenges associated with locomotion are offloaded to the mechanical response of the foot.

Oliver Brock

Robotics and Biology Laboratory

Science of Intelligence (Cluster of Excellence)

Technische Universität Berlin

Short bio

Oliver Brock is the Alexander-von-Humboldt Professor of Robotics in the School of Electrical Engineering and Computer Science at the Technische Universität Berlin. He received his Ph.D. from Stanford University in 2000 and held postdoctoral positions at Rice University and Stanford University. He was an Assistant and Associate Professor in the Department of Computer Science at the University of Massachusetts Amherst before moving back to Berlin in 2009. Oliver Brock directs the Research Center of Excellence “Science of Intelligence”. He is an IEEE Fellow and was president of the Robotics: Science and Systems Foundation from 2012 until 2019.

Is Dexterous Manipulation in Robots Converging Towards How Humans Do It?

Abstract

Since the beginning of the field of robotics about 60 years ago, dexterous manipulation has seemed like a very complex problem. The initial decades of research brought deep conceptual clarifications, but practical success lagged substantially behind. This seems to have changed over roughly the past seven years, possibly starting with the in-hand manipulation demonstrated at OpenAI in 2018. In this talk I will discuss the conceptualizations that led to progress in robotic manipulation, covering aspects of hardware, motor control, sensing, learning, programming, and planning. And: what is still missing to match human performance?

Katja Mombaur

BioRobotics Lab, Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology (KIT), Germany

CERC Human-Centred Robotics & Machine Intelligence, University of Waterloo, Canada

Short bio

Katja Mombaur joined the Karlsruhe Institute of Technology in Germany in 2023 as Full Professor, Chair for Optimization & Biomechanics for Human-Centred Robotics, and Director of the BioRobotics Lab. In addition, she holds an affiliation with the University of Waterloo in Canada, where she has been Full Professor and Canada Excellence Research Chair (CERC) for Human-Centred Robotics & Machine Intelligence since 2020. Prior to moving to Canada, she was a Full Professor at Heidelberg University, where she directed the Optimization, Robotics & Biomechanics Chair as well as the Heidelberg Center for Motion Research. Her international experience includes research activities at LAAS-CNRS in Toulouse and Seoul National University, as well as in the USA. She studied Aerospace Engineering at the University of Stuttgart and SupAéro and holds a PhD in Mathematics from Heidelberg University. Katja Mombaur currently serves as Vice President for Member Activities of the IEEE Robotics & Automation Society and as Senior Editor of the IEEE Transactions on Robotics. Her research focuses on understanding human movement through a combined approach of model-based optimization and experiments, and on using this knowledge to improve the motions of humanoid robots and the interactions of humans with exoskeletons, prostheses, and external physical devices. Her goal is to endow humanoid and wearable robots with motion intelligence that allows them to operate safely in a complex human world. The development of efficient algorithms for motion generation, control, and learning is a core component of her research.

Endowing humanoid robots with embodied intelligence: the roles of bio-inspiration, optimization and learning

Abstract

Humanoid robots acting in the real world require embodied intelligence, which makes them aware of how they move in and interact with their dynamic environment and with humans. As humanoids need to either physically interact with humans or replace human actions, they must be able to predict human behavior, recognize human intent, and know how to safely interact with and best support people. In addition, they should convey their own motion intent to humans by moving in a human-like way, thus making their behavior more predictable and facilitating interactions for humans.

Our research aims to reveal and describe general principles of human movement, such as stability and efficiency, or optimization principles underlying certain behaviors. We aim not only for a qualitative understanding but also for quantitative knowledge about human movement that can be formulated in terms of mathematical models. These models can serve as bio-inspiration for humanoid motions but also support the understanding of human intent. Building on this understanding, we develop efficient computational methods that combine detailed models of the robotic systems with optimization and learning, allowing us to control and improve humanoid motions adapted to the specific situation.

I will show a range of different motions for our robot, all of which require effective whole-body coordination and stability control. The motions covered include walking on different terrains, balancing, bimanual manipulation of objects, riding personal transporters and skateboards, natural body language during multimodal communication, and finally close-proximity physical-social human-robot interactions, which are important in many applications ranging from dancing to healthcare.

Thorsten Zander

BTU

Jan Veneman

Hocoma

Short bio

Jan Veneman received his doctoral degree in 2007 from the Faculty of Engineering Technology, University of Twente, the Netherlands, based on his contributions to developing the gait rehabilitation robot LOPES.

Since 2018, he has worked at Hocoma, a global manufacturer of robotic technology for functional movement therapy, currently as Head of Development. From 2017 to 2021 he chaired a European COST (Cooperation in Science and Technology) Action on Wearable Robots. He has also been active for over 10 years as a national representative in standardization committees under ISO and IEC on the topic of medical robot safety. He teaches a master's course on rehabilitation robotics at MCI Innsbruck.

Applying Robotics to Rehabilitate and Support Human Mobility

Abstract

Robotic technology has been used in rehabilitation and mobility support for decades. However, despite these advancements, individuals with mobility impairments have not experienced groundbreaking improvements in functional recovery. This talk explores key challenges that have limited progress, including the complexity of neuroplasticity, individual variability in recovery, and barriers to technology adoption, such as medical device regulations. Potential strategies to enhance the effectiveness and integration of robotic rehabilitation will be discussed, highlighting opportunities for interdisciplinary innovation and more personalized approaches.

Koh Hosoda

Adaptive Robotics Laboratory, Kyoto University

Short bio

Koh Hosoda received his Ph.D. in Mechanical Engineering from Kyoto University, Japan, in 1993. He worked at Osaka University from 1993 to 2023 as Assistant, Associate, and Full Professor in the Graduate Schools of Engineering, Information Science and Technology, and Engineering Science. He is an Emeritus Professor of Osaka University.

He was a guest professor in the Artificial Intelligence Laboratory at the University of Zurich (hosted by Prof. Rolf Pfeifer) from April 1998 to March 1999. He is currently a Professor in the Graduate School of Engineering, Kyoto University.

Quadruped Robots driven by Pneumatic Artificial Muscles

Abstract

Adopting pneumatic artificial muscles enables us to build many kinds of soft quadruped robots from which adaptive locomotion emerges. In this presentation, I will introduce PneuHound, the pneumatic GrayHound, and small dog robots driven by pneumatic artificial muscles. Soft actuation allows the legs to comply with the terrain so that the robot can walk and run with only feedforward control.

Poramate Manoonpong

School of Information Science and Technology, Vidyasirimedhi Institute of Science and Technology (VISTEC), Thailand; SDU Biorobotics, University of Southern Denmark (SDU), Denmark

Short bio

Poramate Manoonpong is a Professor at the School of Information Science & Technology, Vidyasirimedhi Institute of Science & Technology (VISTEC) in Rayong, Thailand, as well as a Professor of Biorobotics at the University of Southern Denmark (SDU), Denmark. As author or co-author, he has published over 120 papers in journals (e.g., Nature Physics, Nature Machine Intelligence, IEEE Transactions on Cybernetics, IEEE Transactions on Neural Networks and Learning Systems, IEEE Robotics and Automation Letters) and conferences (e.g., IROS, ICRA). He has been the principal investigator or co-principal investigator on more than 10 funded projects, including projects funded by EU Horizon 2020, the Human Frontier Science Program (HFSP), and the Marie Skłodowska-Curie Actions Doctoral Networks. Currently, he serves as an associate editor of IEEE Robotics and Automation Letters, Robotics Reports, Frontiers in Neuroscience (Neurorobotics), and Adaptive Behavior (SAGE). He also serves on the editorial boards of Scientific Reports and the Journal of the Royal Society Interface. The central goal of his research is to understand how brain-like mechanisms and biomechanics can be realized in robots so that they can become as intelligent as living creatures.

To this end, his team has developed various bio-inspired robots with embodied neural control and learning and has shown that these robots can perform complex behaviors. In addition, his team focuses on transferring the biomechanical and neural developments of these robots to real-world applications such as inspection, healthcare, industry, and service. His group's research results have been featured in news outlets such as IEEE Spectrum (Video Friday), Advanced Science News, and TechXplore, and on the covers of Nature Machine Intelligence, Advanced Intelligent Systems, Advanced Theory and Simulations, and Advanced Science.

Bio-inspired Neural Control with Online Adaptation for Personalized Locomotion Assistance of Interactive Exoskeletons

Abstract

While lower-limb exoskeletons are increasingly used for gait assistance and rehabilitation, most function as purely assistive devices rather than personalized interactive companions, limiting user-exoskeleton collaboration and seamless integration. In this talk, I will present our neural control strategy, which transforms exoskeletons into personalized interactive companions. This strategy generates versatile, personalized gait assistance for symmetrical and asymmetrical walking on a split-belt treadmill at different speeds, as well as for stair ascent and descent. It minimizes joint assistance torque across all exoskeleton joints and empowers users to actively control their gait patterns. This enables the exoskeleton to operate as an interactive, compliant, assist-as-needed device, enhancing user interaction and comfort.

Klaus Gramann

TU Berlin

Short bio

Klaus Gramann received his Ph.D. in psychology from RWTH Aachen, Germany. He was a postdoc at LMU Munich, Germany, and at the Swartz Center for Computational Neuroscience, University of California San Diego. After working as a visiting professor at National Chiao Tung University, Hsinchu, Taiwan, he became the Chair of Biopsychology and Neuroergonomics at the Technical University of Berlin, Germany, in 2012. He has been a Professor at the University of Technology Sydney, Australia, and is an International Scholar at the University of California San Diego. His research covers the neural foundations of cognitive processes, with a focus on the brain dynamics of embodied cognition. He directs the Berlin Mobile Brain/Body Imaging Labs (BeMoBIL), which focus on imaging human brain dynamics in actively behaving participants.

Imaging the Human Brain in Real and Virtual Worlds

Abstract

The human brain is inherently embodied, closely tied to our physical form and utilizing this connection to optimize perception in complex, ever-changing environments. Traditional brain imaging methods have often neglected this embodied aspect of cognition. However, recent advancements have driven a paradigm shift, with established imaging techniques now being adapted to study brain dynamics in individuals actively engaging with their surroundings. In addition, virtual reality (VR) enables controlled experiments beyond standard laboratory protocols while offering immersive and realistic stimulus presentations. In combination with Mobile Brain/Body Imaging (MoBI), VR offers new opportunities in cognitive neuroscience research, introducing hitherto unknown possibilities for mapping human brain function in ecologically valid scenarios. A combination of virtual reality, motion capture, and brain imaging can assess the most important aspects of embodied cognitive processes, and it further provides unprecedented opportunities for systematically manipulating the constituent factors of sensory-motor integration underlying natural cognition, using protocols that would not be possible without VR. Experiments conducted at the Berlin Mobile Brain/Body Imaging Labs reveal striking differences in the brain dynamics underlying active behavior as compared to stationary desktop protocols. The results give new insights into human brain activity during active behaviors. They also offer a critical perspective on problems arising from the combination of new technologies, as well as on the difficulties of comparing results from mobile protocols with established physiological parameters stemming from traditional desktop-based, movement-restricted protocols.

Rebecca Kramer-Bottiglio

Yale University

Short bio

Rebecca Kramer-Bottiglio is the John J. Lee Associate Professor of Mechanical Engineering at Yale University. She has received multiple early career awards, including the NSF CAREER Award, the NASA Early Career Award, the AFOSR Young Investigator Award, and the ONR Young Investigator Award. She was named to the Forbes “30 under 30” list for her work on liquid metal-based stretchable electronics. She received the Presidential Early Career Award for Scientists and Engineers (PECASE), the highest honor bestowed by the U.S. government on outstanding scientists and engineers beginning their independent careers, for her development of robotic skins that turn inanimate objects into multifunctional robots. She received the Alan T. Waterman Award, NSF’s highest honor for early-career scientists and engineers, “for groundbreaking contributions to robotics, particularly in advancing the understanding of how to design and build machines that evolve on demand.” She was named a National Academy of Engineering (NAE) Gilbreth Lecturer in 2022 and a National Academy of Sciences (NAS) Kavli Fellow in 2023. She also serves on the Technology, Innovation & Engineering Committee of the NASA Advisory Council.

Soft robots that adapt to changing tasks and environments

Abstract

Soft robots have the potential to adapt their morphology and behavioral control policy to changing tasks and environments. Inspired by the dynamic plasticity of living organisms and the general adaptability of animals, this talk will discuss several shape-shifting soft robot platforms for multi-task performance and multi-environment locomotion—for example, robotic skins, robotic fabrics, and robots with morphing limbs. The talk will also explore the active material components, such as stretchable electronics and computation, soft actuation, and variable stiffness materials, that enable predictable robot morphology changes. By harnessing these engineered materials and mechanisms, we aim to unlock a wide range of capabilities for increasingly adaptive, evolving robots.