When Siri is spot on with cultural references – who makes it happen, man or machine?

Rhodri Marsden explains why it's a bit of both

Rhodri Marsden
Friday 17 July 2015 07:49 BST
Say my name: Caitlyn Jenner accepting a courage award this week (Getty)


Do we want our phones, tablets and watches to have a personality? Would we like them to be meek and compliant, bold and outspoken or hovering somewhere in between? This week it was discovered that Apple's voice assistant, Siri, certainly doesn't mess about if anyone dares to ask how tall former American Olympic decathlete Bruce Jenner is.

"Caitlyn Jenner is 6ft 2in tall," it replies, with no reference to the recent transitioning of arguably the world's most famous transgender woman. Siri has since been widely applauded for having no truck with people who insist on using Jenner's old name, but others have railed against the "left-leaning tech elites" who dared to program Siri to respond in such a way.

There was similar disagreement when the Breaking Bad actor Aaron Paul tweeted his delight at the answer Siri gave to the mathematical problem of dividing zero by zero.

Ignoring the centuries of wrangling that led to a general agreement among mathematicians that 0/0 is an "indeterminate form", Siri instead chose to outline a fictitious scenario. "Imagine that you have zero cookies and you split them evenly among zero friends," it replied. "How many cookies does each person get? See? It doesn't make sense. And Cookie Monster is sad that there are no cookies, and you are sad that you have no friends."

Some were thrilled by this human-like response, while others berated Apple for wasting time programming Siri with party tricks, "Easter eggs" (hidden extras) and light whimsy.

Siri is a three-stage process: voice recognition (to work out what we're saying), natural language processing (to work out what we want) and natural language generation (to formulate a reply). Enabling Siri to politely summarise dull statistical information (e.g. the weather) involves great technological skill, but it's almost impossible to make computers likeable or witty.
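Those three stages can be sketched as a simple chain of functions. This is a minimal illustration of the pipeline described above, not Apple's implementation; the function names and the stubbed recognition step are hypothetical.

```python
# A toy sketch of the three-stage assistant pipeline described above.
# Each stage is a hypothetical stand-in: real systems use large
# statistical models at every step.

def recognise_speech(audio: bytes) -> str:
    """Stage 1, voice recognition: turn audio into text (stubbed)."""
    return "what's the weather today"

def parse_intent(text: str) -> dict:
    """Stage 2, natural language processing: work out what the user wants."""
    if "weather" in text:
        return {"intent": "get_weather"}
    return {"intent": "unknown"}

def generate_reply(intent: dict) -> str:
    """Stage 3, natural language generation: formulate a reply."""
    if intent["intent"] == "get_weather":
        return "It's 18C and partly cloudy."
    return "I'm not sure I understand."

def assistant(audio: bytes) -> str:
    # Chain the three stages together.
    return generate_reply(parse_intent(recognise_speech(audio)))
```

Even in this caricature, the hard part is clear: the first two stages are engineering problems, while the third is where any "personality" would have to live.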

Their sense of fun is limited, their jokes are poor, and they require humans to write witty responses for them. In 2013, Apple advertised for someone to "develop and write dialogue to support new Siri capabilities", and that's a clue to how it works: teams of people writing upbeat replies to store in a database in anticipation of certain questions being asked.
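The mechanism described here, human-written replies stored in a database against anticipated questions, can be sketched in a few lines. The dictionary keys and the fallback line are hypothetical examples; only the two quoted replies come from the article.

```python
# A hypothetical sketch of canned "Easter egg" replies: upbeat,
# human-written answers stored against questions the writers
# anticipated, looked up by a normalised form of the question.

CANNED_REPLIES = {
    "what is zero divided by zero": (
        "Imagine that you have zero cookies and you split them "
        "evenly among zero friends. How many cookies does each "
        "person get? See? It doesn't make sense."
    ),
    "what does siri stand for": (
        "It's a riddle wrapped in an enigma, tied with a pretty "
        "ribbon of obfuscation."
    ),
}

def reply(question: str) -> str:
    # Normalise case and trailing punctuation, then fall back to a
    # generic answer when no writer anticipated the question.
    key = question.lower().strip(" ?")
    return CANNED_REPLIES.get(key, "Let me think about that.")
```

The wit, in other words, is precomputed: the machine's only contribution is the lookup.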

Siri's suggestion not to "put Descartes before the horse" is not something a computer could ever have generated by itself. But some of us like to pretend that it did.

Why? Some psychologists believe that the "rule of reciprocation", which compels us to mirror the kind of behaviour people dish out to us, also applies to machines. Apple clearly shares that vision; while an intelligent concierge service such as Google Now prides itself on efficiency and usefulness, it has no inherent charm.

Apple, ever keen on establishing some kind of emotional bond with its customers, wanted Siri to have, as one of the service's co-founders put it, "a light attitude… friendly and humble, with an edge." In other words, there's a belief that the device's pseudo-personality is crucial to us liking it – even if we know, deep down, that the responses have been written months or even years beforehand by copywriters in Cupertino.

If you persist in asking Siri for a bedtime story, it'll eventually read one to you about a personal assistant called Siri; some people adore this, but others find it to be a loathsome extension of something they already hate about Apple – a manifestation of style over substance.

Apple is looking to increase that "substance" with the release of iOS 9, upgrading Siri to become a "proactive assistant" that not only improves speech recognition but handles the information we give it far more intelligently.

But it will always retain that "edge" that causes the occasional viral sensation, coupled with a humble recognition of its own inherent limitations. "It's a riddle wrapped in an enigma," says Siri when asked what "Siri" stands for, "tied with a pretty ribbon of obfuscation".
