Welcome to the website of the Interactive Machines Group (IMG)!
IMG is an interdisciplinary research group led by Marynel Vázquez in Yale's Computer Science Department.

Shutter, our robot photographer, being deployed in Becton.
Furhat trying to balance participation in a group interaction (project in collaboration with S. Gillet, M. T. Parreira and Iolanda Leite at KTH).
Our Social Environment for Autonomous Navigation (SEAN). See sean.interactive-machines.com for more details.
June 2020 group picture (captured in 4 different time zones!)
Fall 2019 group picture. The lab keeps growing!
Cozmos in a new study about motivating prosocial human behavior (to appear at HRI'20).
Testing Shutter in the lab.
Kuri being tested in the lab for a new NSF NRI project!
Robots and computers for CPSC-459/559 Building Interactive Machines.
First big qualitative analysis of human-robot interaction data in the lab.
Summer 2019 group picture - blurry but taken by a robot that tried to make us smile!

Our group studies fundamental problems in Human-Robot Interaction, which often leads us to explore research directions that can advance Human-Computer Interaction, Robotics and applied Machine Learning more broadly. In particular, our current research agenda focuses on creating a new generation of robots that can effectively adapt to varied social contexts, engaging in and sustaining interactions with multiple users in dynamic human environments like our university campus or museums. Day to day, we build novel computational tools, prototype interactive systems, and run user experiments both to better understand interactions with technology and to validate our methods. More information about current research directions can be found on the research page.

Want to interact with robots? Information about ongoing studies can be found on the studies page.

News

  • IMG participates in HAI'22 and CoRL'22

    Kate Candon, Debasmita Ghose and Sarah Gillet will be presenting new work in New Zealand! Their presentations will be about how people perceive helping behaviors from an agent, how robots can learn visual object representations tailored to human requirements, and how a robot can learn who to address during group interactions.
  • New NeurIPS 2022 publication

    IMG students developed a novel approach that approximates metrics based on confusion-matrix values (like the F-β score) and uses these approximations to train binary neural network classifiers. The approach works particularly well with imbalanced datasets, as shown in a new NeurIPS paper. A minimal illustrative sketch of the general idea appears after this list.
  • Kayla Matheus receives the Best Student Paper award at RO-MAN 2022

    Kayla Matheus, a third-year PhD student, received the Best Student Paper award at the 31st IEEE International Conference on Robot & Human Interactive Communication (RO-MAN 2022). Her paper presents Ommie, a novel robot that supports deep breathing practices for anxiety reduction.
  • New RA-L paper about SEAN 2.0! Check out the related SEANavBench benchmark

    A new RA-L paper describes the Social Environment for Autonomous Navigation (SEAN) 2.0, which is easier to use and features more complex crowd motion than the original. Also, the related SEANavBench benchmark for social robot navigation, announced at ICRA'22, is now open.
  • IMG Celebrates Recent Achievements

    Kate Candon received an Honorable Mention from the NSF Graduate Research Fellowship Program and Marynel Vázquez was awarded an NSF FRR CAREER Award!
  • IMG Participates in HRI 2022

    Several students will be presenting their latest work at the main HRI conference and its associated workshops.
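
About the NeurIPS work mentioned above: metrics like the F-β score are computed from hard confusion-matrix counts (true positives, false positives, false negatives), which are not differentiable and so cannot be optimized directly with gradient descent. The sketch below is only a minimal illustration of one common way to build a smooth surrogate, replacing hard counts with their expectations under predicted probabilities; it is not the specific method from the paper, and the names soft_fbeta_loss, model, and optimizer are hypothetical (PyTorch assumed).

    import torch

    def soft_fbeta_loss(probs, targets, beta=1.0, eps=1e-7):
        """Differentiable surrogate for the F-beta score.

        probs:   predicted probabilities in [0, 1], shape (N,)
        targets: binary ground-truth labels in {0, 1}, shape (N,)

        Hard confusion-matrix counts (TP, FP, FN) are replaced by their
        "soft" expectations under the predicted probabilities, so the
        resulting F-beta estimate is differentiable and 1 - F-beta can
        be minimized with gradient descent.
        """
        targets = targets.float()
        tp = (probs * targets).sum()          # soft true positives
        fp = (probs * (1.0 - targets)).sum()  # soft false positives
        fn = ((1.0 - probs) * targets).sum()  # soft false negatives

        b2 = beta ** 2
        fbeta = (1.0 + b2) * tp / ((1.0 + b2) * tp + b2 * fn + fp + eps)
        return 1.0 - fbeta

    # Hypothetical usage inside a training loop:
    # probs = torch.sigmoid(model(x)).squeeze(-1)
    # loss = soft_fbeta_loss(probs, y, beta=2.0)
    # loss.backward(); optimizer.step()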

Older news...

Lab Location

Yale University, AKW 400
51 Prospect Street
New Haven, CT 06511