
Machines have feelings too

A new book argues that computers should be given emotions. But, asks Charles Arthur, would you really want a stand-off with a temperamental PC?

Wednesday 05 November 2003 01:00 GMT

Design is a subtle art. Great design doesn't necessarily shout out about itself; often, instead, it's understated and quiet. Don Norman, the design critic and author of the forthcoming book Emotional Design, asked a number of people to name objects that they loved and hated: marvellous and dreadful designs. One named a souvenir mug with a heat-sensitive glazing - "One look and I know when my coffee is no longer drinkable." Another suggested their Volkswagen Passat, which they bought because "the controls in the car were pleasurable to use and look at". Not really what you'd expect to be named as classic design, is it?

And what things didn't people like? "Almost every appliance in my house is so ill-designed," one said. Another said, "Almost nothing about the PC is pleasurable."

Professor Norman's thesis is that emotion - that is, gut reaction - is an essential part of our response to anything we interact with. Don't dismiss emotion, he argues: it's a useful function that evolution has equipped us with so that we don't have to think about everything. We can simply react, on the visceral level, which could save essential seconds in a tight moment. Disgust and fascination are opposite sides of the same emotional coin. So are frustration and absorption: we get frustrated when we try to accomplish a task and are thwarted, and absorbed when we try to accomplish something and are drawn in further.

But Professor Norman goes rather further than this. He doesn't just consider what makes us react to the machines and objects we use: he goes on to ask whether machines - such as robots and computers - should have emotions of their own. He thinks they should, as does Professor Rosalind Picard, an artificial intelligence expert. She told him: "I wasn't sure [machines] had to have emotions until I was writing a paper on how they would respond intelligently to our emotions without having their own. In the course of writing that paper, I realised it would be a heck of a lot easier if we just gave them emotions."

I can see that it would be "easier" in the sense that an "emotional machine" could tell that the human gnashing his or her teeth in front of it was angry and wanted something done - and, if it could feel anger too, it could recall the emotion and realise that it would be good to defuse it in us. Sounds good, doesn't it? We'd like a computer (or other electro-mechanical object) that could see we needed to get something done, and would be eager to help us out for fear of us doing, well, something to it.

But I can see a few problems with this. We already anthropomorphise our machines - we attribute moods to them, because they do things for no reason we can fathom. With humans, we might hope for some dialogue on common ground to reach an understanding; with machines, we get only obscure diagnostic codes (and big "OK" buttons to press when things are definitely not OK).

The problem with giving machines emotions is that it's all or nothing. Professor Norman suggests that we'd want computers or robots to have pride - that is, to feel some satisfaction in achieving a task. But before you can have pride, you need happiness and desire (to achieve a goal). What sort of processing overhead would it demand if every keystroke were weighed not just against the processes running in the heart of the computer, but also against whether the machine was willing to be interrupted? When I'm working to a tight deadline, I really dislike being interrupted. I don't want to be surrounded by machines that might exhibit the same characteristics.

Similarly, I think there's a mental sleight of hand in suggesting that we could somehow build machines that would have pride and that we could interact with. Pride is an advanced emotion, at least where accomplishing a task is concerned. Do dogs feel pride? Do babies? Any computer we built that was capable of emotions would not, initially, be like HAL, the calm-sounding computer of 2001: A Space Odyssey. It would be more like a baby: prone to tantrums, inexplicable refusals and occasional compliance.

Of course, the lesson of the "example" of HAL (since it wasn't a real computer) is that recognising emotion in others doesn't necessarily help you. HAL recognises the anger of Dave Bowman, who comes back into the spaceship to disconnect him. But one could argue that HAL's plan is insufficiently well laid all the way through: he doesn't consider how the two crewmen will react emotionally to his attempt to take control of the ship. A human murderer could have forecast his crewmates' reactions more cleverly, and made a better plan.

But that argues against having emotion in computers too, or having them try to read ours. Far better for them to be disinterested parties in our efforts: referees of the games we play with data and applications. People have problems enough understanding each other's emotions. If we have to deal with machines' emotions as well, it could all get very ugly.

network@independent.co.uk

'Emotional Design: Why we love (or hate) everyday things' by Donald A Norman is published by Basic Books in January 2004
