When humans talk to robots
It's human-robot experiment day, the culmination of a year of hard work by five Ira A. Fulton Schools of Engineering seniors.
The robot looks like a miniature ATV, with two jaunty antennas and bright orange rims on its chunky tires. The human, a student who is guiding the robot through an obstacle course, laughs at the chirping sounds the robot makes when she tells it to turn right and left.
"It sounds so cute!" she exclaims. Just before it reaches the end of the course, she says, "Stop. Stop! STOP!" The robot finally halts just before colliding with a wall. The student is asked to rate her experience, then completes the same course with a hand-held controller. She gives another rating, then the next participant in the experiment steps up to the course.
Robots are becoming a part of everyday life, but human-robot interactions can be complicated. Many robots require operators to complete expensive, complex training that is out of reach for the general public. For their capstone project, a team of five seniors, advised by Assistant Professor of Information Systems Victor Benjamin, sought a simpler solution by exploring spoken-language interfaces for human-robot interaction.
"We think language is a more natural way people can interact with robots without training," Benjamin says.
The team designed the robot to accept commands through both a hand-held controller and a spoken-language interface. They measured the efficacy of each interface by timing participants' runs through the course and counting their errors. While the obstacle course captured those objective metrics, a post-run survey asking participants to rate their satisfaction captured subjective impressions.
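As a rough illustration of the study design described above, the data collected for each run (interface used, completion time, error count, and a satisfaction rating) could be recorded and averaged with something like the following Python sketch. This is a minimal, hypothetical example; the `TrialResult` class, field names, and rating scale are assumptions for illustration, not the team's actual code.

```python
from dataclasses import dataclass

@dataclass
class TrialResult:
    """One participant's run through the obstacle course with one interface."""
    participant_id: int
    interface: str             # "voice" or "controller" (hypothetical labels)
    completion_seconds: float  # objective metric: time to finish the course
    error_count: int           # objective metric: e.g., collisions or wrong turns
    satisfaction: int          # subjective metric: survey rating, assumed 1-5

def summarize(results: list[TrialResult], interface: str) -> dict:
    """Average the objective and subjective metrics for one interface."""
    runs = [r for r in results if r.interface == interface]
    if not runs:
        raise ValueError(f"no runs recorded for interface {interface!r}")
    n = len(runs)
    return {
        "mean_time": sum(r.completion_seconds for r in runs) / n,
        "mean_errors": sum(r.error_count for r in runs) / n,
        "mean_satisfaction": sum(r.satisfaction for r in runs) / n,
    }

# Example usage with made-up numbers:
results = [
    TrialResult(1, "voice", 48.2, 3, 4),
    TrialResult(1, "controller", 35.6, 1, 5),
]
print(summarize(results, "voice"))
```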
Courtney Dozier (BS Computer Science '22), who booted up the robot, says, "Most of the time, remote control tends to take a lot of training for people. We wanted to see if you could reduce that by introducing a new interface. We hope to learn how people like voice and remote control."
Midway through the experiment, Henry O'Scannlain-Miller (BS Computer Science '22), who designed the client code for the robot, observes, "The people who kind of struggle with the joystick also struggle with the voice commands. But people who are very comfortable with the joystick are more comfortable with voice commands as well."
The ease with which participants operated the controller may be due to the popularity of video gaming, Benjamin says. As for the voice interface, lag between the robot receiving a command and reacting to it made the robot a bit harder to control. Benjamin says this will be addressed in future work with the robot. He also notes that students who struggled with the voice interface still said they were satisfied with their experience.
Perhaps people simply enjoy the increasingly conversational relationship between humans and robots.
The team worked hard to make the human-robot interaction as seamless as possible. Tyler Nichols (BS Computer Science '22) designed the software that runs the robot, and Justin Haught (BS Computer Systems Engineering '22) designed the Bluetooth communication framework. Haught says his goal was to make the interfaces feel like second nature to participants.
"I want the voice control and the physical control method to be like an extension of themselves, almost as comfortable as driving a car," he says.
As the students run the experiment, Abraham Lords (BS Computer Science '22), who designed the controller client code, explains how the information they are gathering could be used in real-world settings.
"There are a lot of different applications for various drones in the workplace or military, for example," he says. "People want to have things more under control with robots and it's not always intuitive. We want to make it as easy as possible to have that alternative workforce implemented."
Post-experiment, Benjamin reflects on the unique, interdisciplinary nature of the project, which combined business and social science concepts to design robot interfaces for the general public and engineering skills to implement them. He calls his students' efforts "amazing."
"I give them serious kudos for pushing through the engineering of both the robot itself as well as the experimental design," Benjamin says. "It was a long year for them, but they felt very accomplished seeing people test out their robot at the end."
The robot will be housed in the W. P. Carey Actionable Analytics Lab, where it will support future research projects on human-robot interaction. Benjamin says W. P. Carey pursues robotics research because it is becoming increasingly important to businesses.
"Because we join forces to do interdisciplinary work, we can go farther. The robot the students developed can support future research and help W. P. Carey become one of the first business schools to publish robotics research."