I'm writing the first draft of this opening paragraph using my favourite rollerball pen on a fresh sheet of A4 lined paper. As antiquated as that may sound – especially coming from someone who mainly writes about technology – I've always done it this way; the words come more easily, the flow of ideas isn't interrupted by distractions from the internet and, most importantly, I like using pen and paper. Of course, the drag will come later today when I have to laboriously type everything out into a word-processing document. I'd love there to be a way for the arcs and squiggles taking shape now on this page to be converted into editable text later, but as yet the world of technology offers me only workarounds in the form of dedicated digital pens.
Apple's Siri software for the iPhone brought usable voice recognition to millions of people, but this other natural form of human expression – handwriting – is rarely harnessed as a way of getting data into computers. We're slaves to our keyboards; many of us hardly use pens at all, other than to scrawl our wishes into friends' birthday cards.
The shared history of handwriting and technology has seen the same kind of hiccups and false starts as voice recognition. In the 1990s, while people were fruitlessly shouting commands at Macintosh computers equipped with PlainTalk software, Apple's handheld Newton PDA employed a handwriting-recognition system that had similar teething troubles. And while later versions of the software improved immeasurably, our first impressions of handwriting recognition were poor. Palm PDAs had a system named Graffiti, which required us to adapt our handwriting slightly into a series of one-stroke gestures that the device could recognise; less room for error, but harder to learn.
The descendants of these technologies can be found in touchscreen devices today, but there have also been parallel developments in the world of pen and ink that let us write on paper with digital pens and have those strokes captured digitally. There are two competing systems: the first combines a pressure-sensitive pen with a device you clip to the top of the piece of paper that senses the pen's movement (such as the e-pens Mobile Notes; see box); the second is a standalone pen with a small camera in the tip, which you use to write on special paper imprinted with a dot pattern (e.g. the Livescribe Echo, also in box). Both pens generate "digital ink", which is then transferred to a computer (via USB or wirelessly) for analysis.
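For the technically curious: "digital ink" is, in essence, just a sequence of timestamped pen positions grouped into strokes. The sketch below is a minimal illustration of one plausible way such data might be represented and serialised for transfer to a computer; the field names and structure are assumptions for illustration, not the format used by either manufacturer.

```python
from dataclasses import dataclass, field
from typing import List
import json

@dataclass
class InkPoint:
    x: float                # horizontal position on the page (device-specific units)
    y: float                # vertical position on the page
    t_ms: int               # milliseconds since the stroke began
    pressure: float = 1.0   # optional pen pressure, if the hardware reports it

@dataclass
class Stroke:
    points: List[InkPoint] = field(default_factory=list)

@dataclass
class InkPage:
    strokes: List[Stroke] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialise the captured ink so it could be sent over USB or a wireless link."""
        return json.dumps({
            "strokes": [
                [{"x": p.x, "y": p.y, "t_ms": p.t_ms, "pressure": p.pressure}
                 for p in s.points]
                for s in self.strokes
            ]
        })

# A single short stroke, e.g. the downstroke of a letter.
page = InkPage(strokes=[Stroke(points=[InkPoint(10.0, 20.0, 0),
                                       InkPoint(10.2, 26.5, 15),
                                       InkPoint(10.4, 33.1, 30)])])
print(page.to_json())
```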
The most common piece of software bundled with these pens to perform that analysis is MyScript, developed by a company called Vision Objects. The human brain is pretty good at deciphering other people's handwriting, and the company has spent the past 13 years trying to raise MyScript's game to that level. "It can certainly outperform humans now," founder Stefan Knerr says. "We sit people in front of handwriting, pay them to spend hours typing in what they're reading – and then we compare the results to MyScript's. And yes, the average person is outperformed by MyScript."
The software's sophistication lies in its ability to emulate the human skill of assessing the context of a word within a sentence. "For example," Knerr says, "the lower-case letters 'r' and 'v' are very ambiguous, so 'cave' can look identical to 'care'. But in the sentence 'We care for our customers', context allows us to work out very easily which one it is. That's what we've spent time and money on: taking handwriting recognition beyond character recognition and bringing in sentence-level context. Now we're working towards the next level, where we look at an overall feeling across a page of text to improve results."
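To make the idea concrete, here is a toy sketch of how sentence-level context might tip the balance between two visually plausible readings. This is not MyScript's algorithm: the shape scores and word-pair counts are invented for illustration, and a real recogniser would combine shape evidence with a far richer language model.

```python
# Toy example: combine shape-based scores with simple word-pair (bigram) context.
# All numbers below are made up purely to illustrate the idea.

# The shape analysis finds 'cave' and 'care' almost equally likely.
shape_scores = {"cave": 0.51, "care": 0.49}

# A tiny "language model": how often each candidate follows the word 'we'
# in some reference text. Real systems use vastly larger statistics.
bigram_counts = {("we", "care"): 120, ("we", "cave"): 1}

def rescore(previous_word, candidates):
    """Re-rank candidates by multiplying shape evidence with context evidence."""
    total = sum(bigram_counts.get((previous_word, w), 1) for w in candidates)
    rescored = {}
    for word, shape_p in candidates.items():
        context_p = bigram_counts.get((previous_word, word), 1) / total
        rescored[word] = shape_p * context_p
    return max(rescored, key=rescored.get), rescored

best, scores = rescore("we", shape_scores)
print(best, scores)   # context pushes the decision firmly towards 'care'
```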
An individual's handwriting tics are another contextual factor MyScript is working to learn – indeed, this was something that Apple's Newton software attempted many years ago. Nudging accuracy rates up towards 100 per cent is always the aim, in the hope that it'll prompt individuals and companies to revert to the humble pen – which does, after all, have its plus points. One company that has embraced it is the commercial cleaning-services firm Jani-King, whose employees use a Bluetooth-enabled Destiny digital pen to fill in time sheets – information that is immediately beamed to head office for processing. "It's about accuracy and speed," Jani-King director Paul Haworth says. "We've not come across any technology that does it any better than this. The information that comes in daily from the pens helps us manage productivity and efficiency – and it gives us a commercial edge over competitors who don't work this way."
It's unlikely that the pen will ever replace physical or virtual keyboards as the primary digital-input method. The pens are getting smaller, but they're still fairly inelegant. And while keyboards produce countless typographical mistakes, they're generally considered to be faster and more accurate than pens. But the explosion of touchscreen devices brings new opportunities for scribbling characters with a finger or a stylus. "We're particularly excited about Samsung's Galaxy Note," Knerr says. "It's a device you can use to naturally take notes – all the benefits of handwriting recognition without needing additional bits of technology."
The way handwriting and computing will dovetail in the future, according to Knerr, will be less about conversion and more about background understanding. "Your handwriting will stay as handwriting," he says, "but the computer will understand what you're writing. We believe profoundly that handwriting is a more versatile way of expressing yourself than a keyboard. On a page you have all kinds of underlines, circles, exclamation marks in the margins and arrows being drawn across the whole page. And that's your intelligence. That's the stuff that you can't do with a keyboard." The page I've got in front of me now is certainly a mess; hopefully one day a computer will be able to decipher it – but right now, it's down to me.