Publication date: 15th December 2025
Thanks to their versatility, rich capabilities, and potential to interact naturally with humans, humanoid robots are emerging as a solution for a wide range of tasks in manufacturing, service robotics, and healthcare.
In this talk, I will provide an overview of my group’s work on developing humanoid robots that are capable of acting autonomously in human environments. Specifically, I will discuss advances in visual recognition, tactile perception, and object manipulation. I will focus on the challenges of building systems that can learn autonomously from interaction with the real world while integrating knowledge acquired both offline and online. I will then illustrate how these methods can be integrated into complete systems and deployed across different applications, with examples from service robotics and human–robot collaboration. The talk will highlight how humanoid embodiment and recent advances in AI support the development of robots capable of flexible and effective interaction in everyday scenarios.
This work received support from the European Union’s Horizon Europe research and innovation programme under G.A. n. 101070227 (CONVINCE), from the National Institute for Insurance against Accidents at Work (INAIL) project ergoCub-2.0, from the PNRR MUR project PE0000013 "Future Artificial Intelligence Research (FAIR)" and RAISE (Robotics and AI for Socio-economic Empowerment), funded by the European Union – NextGenerationEU, and from Fit for Medical Robotics (Fit4MedRob), PNRR MUR Cod. PNC000000.
