Are you emotionally attached to your robot?
Two independent studies, from Pierre and Marie Curie University (UPMC) in Paris and the Georgia Institute of Technology (Georgia Tech) in the United States, examined the human connection to "non-humanoid" robots and found that the brain processes emotional responses to robots irrespective of their shape.
On March 1, the team at UPMC published its findings in Social Cognitive & Affective Neuroscience, an Oxford University Press journal, confirming that, "at the behavioural level, emotion shortened reaction times similarly for robotic and human stimuli" and concluding, "results suggest that the early brain processing of emotional expressions is not bounded to human-like arrangements embodying emotion."
Beki Grinter, an associate professor at Georgia Tech, found humans attached to their robots are "... more willing to work with a robot that does have issues because they really, really like it. It sort of begins to address more concerns: If we can design things that are somewhat emotionally engaging, it doesn't have to be as reliable."
Grinter commented, "this sort of notion that someone would dress a vacuum cleaner seemed strange. A lot more was going on." Ja Young Sung, PhD, another Georgia Tech researcher who studies "emotional design," found that people who loved their Roomba, the iRobot vacuum, named and gendered their inanimate friends.
Neither study looked at the healthiness of the relationship between human and robot. John O'Neill, director of addictions services for the Menninger Clinic in Houston, told LiveScience, a science news site, "I believe that technology has benefited us greatly, but my concern is that many of us have taken it too far, and it's become a substitute for those necessary face to face conversations."
On February 1, Current Directions in Psychological Science, a journal of the US-based Association for Psychological Science (APS), published a Harvard University study examining the psychology of anthropomorphism, which attributed the phenomenon to loneliness and to a need to cope with unpredictability.
On March 3, Adam Waytz, PhD and the Harvard study's lead researcher, told LiveScience, "we have this need to belong and to affiliate," and "when people are deprived of connections with other humans, they'll form connections with non-humans through anthropomorphism."
Waytz continued, "there may be nothing like the real thing, but that's a question that we want to test in the future."
O'Neill offered this easy self-check for determining whether you are a "techno-junkie," which may also apply to monitoring a robot relationship:
"The first thing to do is take a long, hard look at how you are using technologies, and then to start to set some limits. You have to take off a couple hours and make those hours important enough that you don't allow yourself to be interrupted. I think we should have certain rules. We don't break up, fire people or break traumatic news to people via e-mail or text message."
Study abstract, "Human brain spots emotion in non-humanoid robots": http://scan.oxfordjournals.org/content/early/2010/02/28/scan.nsq019.abstract
Full study, "The Interplay of Context and Emotion for Non-Anthropomorphic Robots": http://www.cc.gatech.edu/grads/b/bmcgregg/InterplayContextEmotion.pdf
Full study, "Social Cognition Unbound: Insights into Anthropomorphism and Dehumanization": http://www.psychologicalscience.org/journals/cd/19_1_inpress/Waytz_final.pdf