

Dialogue Games as Scaffolding for Building Open-ended Embodied AI for the Real World

April 17 @ 11:00 am - 12:00 pm
Welcome to the reading group presentation!

Human-Robot Interaction focuses on developing robots that can collaborate with humans to solve tasks in robust and safe ways. The aim is to foster more intuitive and effective collaboration, enhancing robots' ability to meet complex human needs and adapt to real-world environments. Robots will become increasingly important because they can take on repetitive and hazardous tasks, improve efficiency, and provide support in areas such as healthcare and disaster response. This not only enhances human safety and productivity but also frees people to engage in more creative and strategic endeavours, fostering innovation and improving quality of life.

To contribute to these aims, Alessandro will present his research agenda in Embodied AI, developed around the concept of embodied dialogue games. These activities enable AI agents to leverage multimodal signals and learn from interactions with the world and with other agents. Key capabilities for these agents include embodied perceptual experience, the ability to represent and learn from interleaved multimodal sequences, and a training regime for learning through interaction. The talk will introduce future steps to advance research in Embodied AI and to create a step-change in how AI-powered robots are built, so that they can develop a symbiotic relationship with humans and solve tasks in robust and safe ways. This represents a potential path towards “Smart Machines” and brings us closer to the long-term vision outlined by the UK’s Robotics Growth Partnership.

Coming soon!

View the presentation material as:

📄 Presentation

Alessandro Suglia is a Lecturer in Embodied Natural Language Processing and a Fellow of the Generative AI Laboratory at the University of Edinburgh. He is co-lead of the “Generative AI for Robotics” theme at the National Robotarium and a member of ELLIS. He won the “Italy Made Me Award 2024” from the Italian Embassy in London, which recognises outstanding Italian early-career researchers in the UK. Alessandro’s research focuses on designing artificial agents that learn language by leveraging multimodal sensory information derived from interacting with the world and with other agents. During his PhD, he was one of the main developers of Alana, the Heriot-Watt conversational AI that ranked 3rd in the Amazon Alexa Prize challenge in 2018. As an Assistant Professor at HWU, he led the HWU team “EMMA”, the only non-American university team among the finalists of the Amazon Alexa Prize SimBot Challenge, the first Amazon competition to push the boundaries of Embodied Conversational AI. He was Head of Visual Dialogue at AlanaAI, where he led a team of research engineers working on Multimodal Foundation Models for Embodied Visual Question Answering. Alongside several academic collaborations, he has also completed research collaborations with industry, including Amazon Alexa AI, Meta AI, and the European Space Agency, focused on developing innovative Multimodal Generative AI models for embodied and situated human-robot interaction tasks.
