As an emotionally sensitive man who has formulated a grossly over-inflated estimate of his own emotional sensitivity, I want to assure you that if you're feeling bad, I can tell. It's etched into your face, it's detectable in the tone of your voice, and more than evident in the way you keep shouting "bollocks!" at high volume.
My phenomenal levels of empathy then allow me to adjust my behaviour accordingly by bringing you cups of tea and singing you a soothing collection of ballads by Johnny Mathis. It's just what humans do for each other. Computers, however, struggle to compete in this field. My laptop can't tell when I bristle with indignation. Yes, my phone might be able to recognise the words I'm saying, but it always delivers the same implacable response regardless of whether I'm cooing with delight or screeching with fury.
And some might say that this is how it should be – that devices should pay no heed to our wildly fluctuating emotions, that we rely on them to remain dispassionate and functional while we claw at carpets and chew our fingernails. They can already tell what we like, by the way we click, reply, skip forward and rewind; it's probably better if they don't know how we feel, as well.
The capability of computers to recognise emotion, however, is gathering pace. Over the summer, an Israeli firm called Beyond Verbal raised millions of dollars in funding; its vice president described how its software can "understand a speaker's transient mood and emotional decision-making characteristics in real time", by analysing the modulations of the voice. Applied to one of the 2012 presidential debates between Romney and Obama, it managed to detect practicality, anger, strength, provocation, cynicism and ridicule in Obama's tone – which is either unerringly accurate or completely wrong, depending on your own political allegiances.
Earlier this week, the New York Times reported that a US company, Affectiva, is releasing software to mobile developers early next year that can sense human emotion by analysing the expression on our faces. We already know that Microsoft's newest version of Kinect can track muscle and skeletal movement in fine detail, and that analysing emotional reactions to games is in the pipeline, but Affectiva is doing it already, thanks to the collation and examination of 1.5bn videos of volunteers' faces while they watched online entertainment. The resulting software could potentially distinguish, say, nose wrinkles of mild contempt from nose wrinkles of cute affection, although whether it can spot good acting is another matter entirely.
So what use is voice analysis, facial scanning, sweat detection? How can it benefit us and improve our lives in the long term? Analysts fast-forward to a time when machines can respond to us more tenderly, perhaps by playing the soothing music of the panpipes if we're stuck in roadworks, or cueing up an episode of Parks And Recreation if we're feeling down. In other words, the technology itself is more remarkable than its applications. Its real value, of course, is to companies who are desperate to assess brand perception and brand loyalty. Computers may never be able to emulate human empathy, but they'll certainly be able to point you in the direction of a product that'll make you feel better – if you have the money to spend. Ker-ching.