Neural Human-Robot Interfaces for Intuitive Collaboration

Our research focuses on developing advanced interfaces that enable seamless, natural interaction between humans and robots. By integrating neural signals with movement-intent inference, the project aims to enhance collaborative tasks that require physical interaction.

Key scientific contributions include algorithms that infer intended movement direction during human-human physical interaction, enabling more intuitive human-robot collaboration. We also develop control strategies for six-degree-of-freedom (DOF) interaction that accommodate asymmetric cooperation, where the human and robot contribute differently to a shared task.
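To illustrate the flavor of such a control strategy, the sketch below shows a minimal diagonal admittance law for 6-DOF asymmetric cooperation. This is a hypothetical illustration, not the method from the papers: the per-axis authority weights `alpha`, the virtual inertia and damping values, and the function name `admittance_step` are all assumptions made for this example.

```python
import numpy as np

def admittance_step(v, f_human, alpha, mass, damping, dt):
    """One Euler step of a diagonal 6-DOF admittance model.

    v: current Cartesian velocity, shape (6,)
    f_human: measured interaction wrench (forces + torques), shape (6,)
    alpha: per-DOF authority weights in [0, 1] (1 = human-led axis)
    mass, damping: virtual inertia and damping per axis, shape (6,)
    """
    # Each axis obeys m*a + d*v = alpha*f: scaling the human's wrench by
    # alpha makes the robot compliant on human-led axes and stiff elsewhere.
    accel = (alpha * f_human - damping * v) / mass
    return v + accel * dt

# Example role split: human leads translation (first 3 DOF),
# robot retains authority over rotation (last 3 DOF).
alpha = np.array([1.0, 1.0, 1.0, 0.2, 0.2, 0.2])
mass = np.full(6, 2.0)
damping = np.full(6, 8.0)

v = np.zeros(6)
wrench = np.array([4.0, 0.0, 0.0, 1.0, 0.0, 0.0])  # push along x, torque about x
for _ in range(100):
    v = admittance_step(v, wrench, alpha, mass, damping, dt=0.01)
# Translation along x converges toward alpha*f/d = 0.5 m/s, while the
# rotational axis, with low authority weight, barely moves.
```

At steady state each axis settles at `alpha * f / damping`, so the role-allocation weights directly scale how much a given human wrench moves the robot along each DOF.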

These advancements collectively contribute to more effective and user-friendly human-robot collaboration systems.

Three Representative Papers (see Publications for a complete list)

Keivan Mojtahedi, Bryan Whitsell, Panagiotis Artemiadis, and Marco Santello, “Communication and Inference of Intended Movement Direction during Human-Human Physical Interaction,” Frontiers in Neurorobotics, vol. 11, article 21, pp. 1-12, 2017. [link to pdf]

Bryan Whitsell and Panagiotis Artemiadis, “Physical Human–Robot Interaction (pHRI) in 6 DOF With Asymmetric Cooperation,” IEEE Access, vol. 5, pp. 10834-10845, 2017. [link to pdf]

Christopher A. Buneo, Stephen Helms Tillery, Marco Santello, Veronica J. Santos, and Panagiotis Artemiadis, “Effective Neural Representations for Brain-Mediated Human-Robot Interactions,” in Neuro-robotics: From Brain Machine Interfaces to Rehabilitation Robotics (Trends in Augmentation of Human Performance, vol. 2), P. Artemiadis, Ed. Springer, 2014, pp. 207-237. [link to publisher]