A recent study has demonstrated the blind confidence some people place in a robot, even when common sense tells them otherwise.
In recent years, experts from many fields have detailed the potential risks of relying ever more heavily on robots. Those warnings, however, appear not to have reached the ears of the 30 participants in a study conducted in the United States.
A group of researchers from the Georgia Institute of Technology in Atlanta was stunned by the results of an experiment involving a robot. The premise was simple: 30 people had to evacuate a maze-like building during an emergency. They had two options: follow the clearly visible exit signs on the walls and doors, or follow an autonomous robot named the Emergency Guide Robot.
“We were surprised. We expected that there wouldn’t be enough trust, and that we might have to do something to make the robot seem trustworthy,” said Paul Robinette, the student in charge of the study, who was stunned by the results but could not fully explain them. One likely reason people follow a robot rather than the exit signs is the ubiquity of GPS applications we rely on daily. Whether it is Google Maps, Waze, or something similar, we have indirectly come to expect machines to help us orient ourselves in space. Even when a GPS app gives you the wrong route and leaves you in the middle of a forest, you are still likely to use that same GPS to find your way back.
In the end, 26 of the 30 participants chose to follow the robot during the emergency, abandoning common sense entirely. Two people ignored the signs of an emergency altogether and stayed in the room, and two were excluded from the study for undisclosed reasons.