The last word in geek chic

What will the well-dressed cyborg be wearing next season? The leading researchers into wearable computing gathered in Pittsburgh recently to strut their stuff.

Stephen McLaren
Monday 02 November 1998 00:02 GMT

If, as Martin Amis claims, America is test-driving the future, then it was being hurled around the track at a screaming pace recently in Pittsburgh, Pennsylvania, at the second International Symposium on Wearable Computing.

Wearable computing might sound like a contradiction in terms, but if the would-be cyborgs who attended the symposium are to be believed, it is the cutting edge of computing research - and who could argue when most of the participants were packing more hardware and processing power about their person than most of us have on our desktop?

The notion of building computing power in and around the body is not by itself new, but it is only since computers have become portable, and communication between them fast and reliable enough, that it has become a reality. Hence this symposium, organised by the IEEE (America's Institute of Electrical and Electronics Engineers) Computer Society, was only the second major gathering of those with an interest in the field, and it is a testament to its novelty that wide-eyed enthusiasts, rather than sharp-suited tech salesmen, were in the majority.

In their broadest sense, wearables can encompass any piece of digital kit which a person carries around in their day-to-day life - hence pagers, PalmPilots and mobile phones can be included. However, for those who see wearable computing as a totally new computing paradigm, the definition becomes more ambitious. For true believers, the wearable computer is always on, always acting on behalf of its user and always sensing the surrounding environment so that it can offer a more productive interface to the real world.

Most of the big computer research labs from American universities were well represented at the conference by the pony-tailed and bespectacled men often seen at such hi-tech conferences, but the proceedings opened with a presentation by Lieutenant Colonel Bob Serino of the US Army.

The Land Warrior programme is by far the most heavily financed wearable computing project yet mounted: the aim is to make the 21st-century US infantryman a computer-enhanced killing machine.

"In the digital battlefield, the soldier is part of an integrated assembly, fighting as a collective system. The soldier is the weapons platform," Lt-Col Serino intoned.

The current Land Warrior spec is a Pentium processor attached to a video-capture system and wireless modem, which allows the soldier to transmit shots from the battlefield back to a command unit for evaluation. A GPS satellite locator lets him know exactly where he is at all times and a head-mounted display (HMD) enables maps and text instructions to be overlaid on his vision. A laser range-finder enables the target to be "serviced" and, should he ever be injured, his vital signs can be relayed to the nearest medics. All that in a package weighing just 80lb, the same as is carried by a current foot-soldier.

With $4bn spent so far and with 34,000 of the completed systems on order, it seems that only the tendency for batteries to explode is holding back deployment.

The world-renowned MIT (Massachusetts Institute of Technology) Media Lab has been pioneering wearables research for about a decade now, and several of their students could be seen strutting around the venue, typing with one hand, and laughing at e-mails they had received by cellular modem, which were displayed on their HMDs.

Bradley Rhodes, who was wearing a purple computer ensemble with a fetching monocular display, is one of MIT's leading researchers. "We're very applications-driven at the lab. We build things, see how they work, and think how we might use them in our everyday lives. To do that, you've got to wear them all day long. I rely on this totally: as well as it being my address book and messaging service, I write all my class notes on it and, because I think better when I'm pacing around, I even wrote my master's thesis on it." Even though he no longer suffers the third degree about his electronic protuberances when walking through airport security, Rhodes yearns for the time when he can get rid of the heavy batteries and cables he still has to lug around.

A possible future product from the Media Lab, the Startlecam, was the hot presentation. According to its inventor, Jennifer Healey, it is "an implementation of the cyborg idea of man and computer working in quasi-symbiotic union". The Startlecam works on the principle that human memory is too imperfect for serious jobs like remembering the face of a mugger about to rob us, especially since crucial detail tends to fade over time. A camera that is attached to our clothing and is constantly on standby to record such incidents suggests a better solution. Better still, why not endow it with the ability to transmit automatically its evidence to a network of websites, where it can be safely stored for the purpose of identifying and punishing said bad guy?

What's ingenious about Startlecam is the notion that the camera should automatically record and transmit when the wearer's physiological responses register such fearful moments. In the current experiments, a skin conductivity sensor is being used as the trigger for switching on the camera - when you're scared, the skin generates extra sweat and becomes a better conductor. Cue camera, and record.
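
As a very rough sketch of how such a trigger might be wired up - the sensor and camera calls below are illustrative stand-ins, not Healey's actual code - the idea reduces to watching for a sharp rise in skin conductance over a short window and switching the camera on when one appears:

```python
import random
import time

STARTLE_THRESHOLD = 0.5   # rise in conductance (arbitrary units) treated as a startle
WINDOW_SECONDS = 2.0      # how far back to look for a sudden rise
SAMPLE_RATE_HZ = 10       # how often the sensor is polled

def read_skin_conductance() -> float:
    # Stand-in for a real galvanic skin response sensor; here it just jitters randomly.
    return 1.0 + random.random() * 0.1

def start_recording() -> None:
    # Stand-in for switching the body-worn camera from standby to record-and-transmit.
    print("Startle detected: camera recording")

def monitor(max_iterations: int = 100) -> None:
    window: list[float] = []
    max_samples = int(WINDOW_SECONDS * SAMPLE_RATE_HZ)
    for _ in range(max_iterations):
        window.append(read_skin_conductance())
        if len(window) > max_samples:
            window.pop(0)
        # A startle shows up as a sharp rise in conductance across the window.
        if len(window) == max_samples and window[-1] - window[0] > STARTLE_THRESHOLD:
            start_recording()
            window.clear()  # avoid re-triggering on the same event
        time.sleep(1.0 / SAMPLE_RATE_HZ)

if __name__ == "__main__":
    monitor()
```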

So basically you have a personal CCTV system that creates a "flashbulb" memory archive which is time-stamped and stored on safe servers anywhere in the world. Truly a technology for the nervous Nineties and beyond. And just in case you are doubting whether sweaty skin is the ideal way of booting up Startlecam, its inventor claims other vital signs such as heart rate and blood pressure work just as well.

For those who want to get down to PC World and get spec'd-up, a peek under the bonnet reveals that Startlecam is basically a stripped-down 100MHz 486 computer with a 2Gb hard drive and cellular modem. The processor runs a real-time signal processing algorithm to detect the startle response and, once activated, the camera captures black-and-white images at 15 frames a second, with a five-second buffer to ensure nothing is missed. A tidy haul for a would-be mugger in a mask!
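
That five-second buffer behaves, in essence, like the ring buffer sketched below; the frame handling is an illustrative stand-in, not the Media Lab's implementation. Because the buffer always holds the last few seconds of video, the moments *before* the startle fires are preserved along with those after it:

```python
from collections import deque

FPS = 15            # frames a second, as described above
BUFFER_SECONDS = 5  # length of the pre-trigger buffer

class PreTriggerBuffer:
    """Keeps the most recent five seconds of frames in memory."""

    def __init__(self) -> None:
        # A fixed-size ring buffer: the oldest frame falls off as each new one arrives.
        self.frames = deque(maxlen=FPS * BUFFER_SECONDS)

    def add_frame(self, frame: bytes) -> None:
        self.frames.append(frame)

    def flush(self) -> list:
        """Return and clear the buffered frames, ready to be sent over the modem."""
        saved = list(self.frames)
        self.frames.clear()
        return saved

# Usage: feed every captured frame in; on a startle, flush and transmit.
buffer = PreTriggerBuffer()
buffer.add_frame(b"...frame bytes...")
frames_to_upload = buffer.flush()
```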

Research into devices like Startlecam points to a new direction in computing. Up to now, the holy grail has been to make computers smarter, whereas the wearables community wants to make humans smarter by augmenting them in areas where computers are at an advantage.

"Augmented reality" - as it's known in the business - is where it's at with cutting-edge wearable research. A quick tour around the exhibitor stands showed a number of real-world solutions in areas where humans don't quite match-up. For instance, two companies, Via Inc and Xybernaut, are already offering commercial belt-worn computers which are finding their way on to the bodies of hi-tech maintenance staff in companies like Boeing.

If wearable computing is to escape "Geek-central", then it is imperative that real-world applications make it to market as soon as possible. Jerry Bowskill from BT Labs at Martlesham Heath, Suffolk, presented a paper on virtual conferencing, which he believes will revolutionise the way we collaborate with colleagues in different physical locations.

Instead of video conferencing, which Bowskill believes disorientates users because it is difficult to hold a conversation with more than one person at a time, BT propose a virtual conferencing environment in which users at remote locations are represented by avatars. To simulate distance and point-of-view, speech is spatialised and users can move nearer to or farther away from their colleagues within the environment, depending on the importance of the conversation.
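
As a rough illustration of the principle - the inverse-distance falloff here is a stand-in for whatever model BT actually uses - spatialisation amounts to scaling and panning each voice according to where its avatar stands relative to the listener:

```python
import math

def spatialise(listener_pos, listener_facing, speaker_pos):
    """Return (gain, pan) for one speaker, given 2D positions in metres and the
    listener's facing angle in radians (y-up coordinates)."""
    dx = speaker_pos[0] - listener_pos[0]
    dy = speaker_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    gain = 1.0 / max(distance, 1.0)                # nearer avatars sound louder
    bearing = math.atan2(dy, dx) - listener_facing # direction of the speaker relative to where the listener faces
    pan = math.sin(bearing)                        # +1 = fully to the listener's left, -1 = fully to the right
    return gain, pan

# A colleague standing a metre ahead and a metre to the listener's left:
gain, pan = spatialise((0.0, 0.0), 0.0, (1.0, 1.0))
print(round(gain, 2), round(pan, 2))  # roughly 0.71 gain, panned towards the left ear
```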

"In collaborative virtual environments, spatialised visual and audio cues can combine in natural ways to aid communication," Bowskill says. "The advantage of virtual conferencing is that it is intuitive. If you mix avatars and an information-rich space, you find that it is about 30 per cent more efficient than a conventional desktop display." But Bowskill's research isn't about to make it into Dixons any time soon. Bandwidth problems mean that spatialised audio and 3D graphics currently can only be relayed in real time by the speediest deskbound PCs running on the fastest office networks.

In many respects, those who attended this symposium are indeed literally test-driving the future. Many, it seems, would like to assimilate themselves with a slice of silicon at the earliest opportunity, while others just seem annoyed that they cannot do 10 things at once without a computer. Nevertheless, US Army soldiers and airplane assembly-line workers are already being immersed in this new world of wearable computing, and who's to say that the humble mobile phone, given a bigger chip and a faster network, won't be the first device to bring wearables to the masses?
