Are Machines Becoming Emotionally Aware? Our Design Explores How Robots Could Adapt to Human States

We were thrilled to present our research, BioSyncHRI, a real-time adaptive system that integrates biosignals such as EEG, sEMG, and HRV into VR surgical simulators, at CHI 2025 in Yokohama as part of the Workshop on Envisioning the Future of Interactive Health.

In this research, we examine how robots can adjust to the emotional and cognitive states of human operators—improving training, well-being, and surgical outcomes.
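To give a flavor of what such a closed loop can look like, here is a minimal Python sketch: biosignal features are mapped to an operator-state estimate, which then drives a simple adaptation policy in the simulator. Every name in it (the feature keys, the thresholds, the Simulator class, and the estimate/adapt functions) is an illustrative assumption, not the actual BioSyncHRI implementation described in the paper.

```python
# Hypothetical sketch of a biosignal-driven adaptation loop.
# All names, features, and thresholds are illustrative assumptions,
# not the BioSyncHRI API.
from dataclasses import dataclass


@dataclass
class OperatorState:
    workload: float  # 0..1, e.g. from an EEG theta/alpha band-power ratio
    fatigue: float   # 0..1, e.g. from an sEMG median-frequency downshift
    stress: float    # 0..1, e.g. from a drop in HRV (RMSSD)


class Simulator:
    """Stand-in for a VR surgical simulator with a tunable difficulty level."""

    def __init__(self) -> None:
        self.difficulty = 3

    def set_difficulty(self, level: int) -> None:
        self.difficulty = max(1, min(5, level))


def estimate_state(eeg: dict, semg: dict, hrv: dict) -> OperatorState:
    # Toy heuristics standing in for trained state-estimation models.
    workload = min(1.0, eeg["theta_alpha_ratio"] / 2.0)
    fatigue = max(0.0, 1.0 - semg["median_freq_hz"] / 120.0)
    stress = max(0.0, 1.0 - hrv["rmssd_ms"] / 50.0)
    return OperatorState(workload, fatigue, stress)


def adapt(sim: Simulator, state: OperatorState) -> None:
    # Example policy: back off when the operator is overloaded or stressed,
    # add challenge when there is clear headroom.
    if state.workload > 0.8 or state.stress > 0.8:
        sim.set_difficulty(sim.difficulty - 1)
    elif state.workload < 0.3 and state.fatigue < 0.3:
        sim.set_difficulty(sim.difficulty + 1)


if __name__ == "__main__":
    sim = Simulator()
    # One synthetic sample in place of a live acquisition stream.
    state = estimate_state(
        eeg={"theta_alpha_ratio": 1.9},
        semg={"median_freq_hz": 95.0},
        hrv={"rmssd_ms": 22.0},
    )
    adapt(sim, state)
    print(state, "-> difficulty", sim.difficulty)
```

In a real deployment the synthetic sample would be replaced by a streaming acquisition pipeline, and the heuristics by validated state-estimation models; the sketch only shows the overall sense-estimate-adapt structure.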

Why it matters:
In high-stakes fields like robotic surgery, performance is not just technical—it also involves a complex interplay between the operator’s physiological and emotional states. Teaching robots to interpret human states brings us closer to safer, more responsive, and human-centered technology.

A huge thank you to our co-authors Melanie Baumgartner, Aydin Javadov, Rayna Ney, and Joseph Ollier for their collaboration.

Learn more about the research:
https://lnkd.in/eGbeUgYY