This is a somewhat delayed post on the Human-Robot Interaction (HRI) 2024 conference, held March 11-14 in Boulder, Colorado (#hri2024 on X). The past few weeks have seen an intense pace of news and developments, and it has been hard to catch up.
For those unfamiliar with HRI, it is the prestigious conference on all things related to the ways and means by which humans and robots interact. We have been covering HRI every year; please refer to our notes on HRI 2022 and HRI 2023 for previous conferences. We have also covered a lot of work from faculty and researchers who present at HRI and similar venues; you can find a lot of material in the conferences tab.
Workshops
HRI is typically organized with one day of workshops and three days of single-track sessions on many topics. There were two workshops on related themes concerning how Large Language Models (LLMs) will impact robotics: Scarecrows in Oz and Human-LLM Interaction. There is interesting research on how multi-modal models can help robots understand their surroundings by processing sensor inputs and data. If you are interested in this area, there is a great workshop article from researchers at Rutgers University that explores how multi-modal models can enhance the cognitive capabilities of robots. The theme of LLMs in robots continued in the main conference with several interesting topics, including a study from the University of Wisconsin-Madison that looks at how user interaction with an LLM-powered social robot differs from interaction with a text agent and a voice agent.
Robots in Practice
One of my favorite parts of HRI is learning about studies of robots used in realistic environments where they frequently interact with humans, be it classrooms, homes, cafes, or just on the streets. As an example, researchers from Cornell University studied how three robots (Sony Aibo, Sharp RoBoHoN, and GrooveX LOVOT) are used in Japanese society by visiting stores, cafes, and meetups in which people interact with the robots. This study is very interesting and thought-provoking; I plan to do a post just on this topic. I liked the quote from the authors, who summarize their work by saying:
“It takes a transmediated village to nurture a social robot”
Likewise, in the session on families, researchers from the University of Wisconsin-Madison suggest that the presence of a robot in the household leads to complex relationships with all members of the family, and their work aims to better understand these relationships in the context of social robots. Another interesting study, by researchers at the University of Southern California, explores how students diagnosed with Attention Deficit Hyperactivity Disorder (ADHD) can benefit from having an in-dorm social robot companion to help them with coursework. The study found that 91% of the students preferred to keep the company of the robot (a Blossom robot) beyond the first week after its introduction.
Editor’s Note: National Robotics Week runs April 6-14 in the USA, with numerous events aiming to motivate children to learn about robotics and STEM-related fields. You can find many events near you; I encourage you to pick something interesting and take the initiative to visit.
Our friends at Petoi (the makers of Bittle) also have an ongoing contest where you can post projects you built with your Bittle robot and win some very cool prizes.
Vector Robot as a Detective Agent
For our readers interested in the Vector robot: researchers from the University of Chicago designed an immersive game where users enlist the help of two robots that role-play as detectives to solve a crime. A Misty II robot plays the role of a senior advisor, while our beloved Vector robot plays the role of a peer detective. Depending on the experimental setup, the robots’ role is to ask questions and guide the user toward the goal. The study found that users responded positively when interacting with robots that gave them opportunities to contribute to the experience’s narrative and gameplay. You can read about the study here.
Emotions
One of my favorite sessions at HRI was on emotions. One of the best papers at HRI, from the University of Wisconsin-Madison, explored how users respond to emotions expressed by Sprout, a soft cylindrical robot based on fiber-embedded actuators that stands 10 cm tall and has a diameter of 5 cm. Sprout can deform its body to create expressions and emotions, and it is interesting to see that humans can recognize these emotions. The following video made by the author is entertaining and useful for understanding this technology, which we hope sees some commercial adoption. Sprout would for sure be a cute addition to my desk.
Conclusion
HRI has a lot of material for robot enthusiasts. I have learnt a lot just by looking through slides, skimming papers (the entire proceedings are available for free), and looking at X posts and YouTube videos. In the coming months, I will probably do a deep dive on a couple of papers. If you attended HRI 2024, or are planning to submit your work to HRI 2025 in Melbourne, Australia, feel free to jot down your experiences in the comments.