Britain's first virtual studio is in daily use at the London-based children's channel Nickelodeon, where a presenter links programmes from the Slime Cave. Without the electronics, the studio is nothing but an all-blue room, but switch the computer on and it becomes ... anything you like. Virtual images have been used for a while in television. The BBC Nine O'Clock News's opening shot of a giant glass globe and curving desk was created on computer. But the Slime Cave goes well beyond this because everything the viewer sees, except the presenter, is computer-generated. Whenever the camera moves, it retains the correct perspective - just as a real, three-dimensional set would.
Virtual studios have the advantage of flexibility - a set can be changed completely at the touch of a button. They can create a Tardis-like impression of size. "Wherever the blue is, we can superimpose our virtual environment so we are not limited by the space that the real room is giving us," says Christian Stark of Virtual Studio Hamburg.
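The superimposition Stark describes is, at its core, chroma keying: every pixel close enough to the studio's blue is swapped for the rendered environment. A minimal sketch in Python (NumPy assumed; the key colour and tolerance are illustrative values, not any broadcaster's actual settings):

```python
import numpy as np

def chroma_key(frame, background, key=(0, 0, 255), tol=60):
    """Replace pixels close to the key colour with the virtual background.

    frame, background: HxWx3 uint8 images; key: RGB of the studio blue;
    tol: per-channel colour-distance threshold (illustrative).
    """
    diff = frame.astype(int) - np.array(key)
    mask = np.abs(diff).sum(axis=-1) < tol   # True wherever the wall shows
    out = frame.copy()
    out[mask] = background[mask]             # paint in the virtual set
    return out
```

In a real virtual studio this mask is computed for every frame at broadcast rate, and the background itself is re-rendered to match the camera's current viewpoint.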
This does not necessarily make it cheaper - the systems cost from £300,000 to £1m - but it does open up new possibilities. Sets can appear to extend to infinity in any direction. If the camera points up into the lighting grid or at the studio audience, the computer simply replaces them with the graphics.
In Germany, the same system is being used for news and sports shows, as well as new programmes which would not be possible otherwise. Virtual Studio Hamburg, which supplied Nickelodeon, helps to produce an interactive game show where viewers can control what happens in the virtual studio via their telephone.
To match camera movements, all virtual studio systems use Silicon Graphics Onyx parallel processing supercomputers. Although about 2,000 times as fast as a typical Pentium PC, they cannot yet produce truly realistic sets or give a convincing outdoors feel. "Indoors lighting can be fairly well simulated on a computer using technology such as radiosity, where the computer calculates the effects of standard lights and the shadows cast on objects," says Patrick Renvoise of Accom, "but simulating outdoors lighting is an extremely complex problem."
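The radiosity calculation Renvoise mentions can be written as a fixed-point problem: each surface patch's brightness B equals its own emission E plus its reflectance times the light arriving from every other patch, B = E + rho * F * B. A toy iterative solver, with the form factors F invented for illustration rather than computed from real scene geometry as a production renderer would:

```python
import numpy as np

def solve_radiosity(emission, reflectance, form_factors, iters=50):
    """Iteratively solve B = E + rho * (F @ B) for patch radiosities.

    emission: light emitted by each patch; reflectance: rho per patch;
    form_factors: F[i, j], fraction of light leaving patch j that
    reaches patch i (toy values in the test; real systems derive F
    from the geometry of the set).
    """
    B = emission.copy()
    for _ in range(iters):
        B = emission + reflectance * (form_factors @ B)
    return B
```

Because every patch both receives and re-emits light, surfaces that no lamp points at are still lit indirectly - which is why radiosity handles the soft look of indoor lighting well.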
To enable the computer to generate the correct perspective as they move, the cameras usually need sensors to record their position, zoom and focus. However, one Israeli company, Orad, uses pattern recognition that allows even a hand-held camera to be used. On a normal blue screen, the technicians use a grid of lighter blue rectangles so the computer can work out where the camera is looking and alter the graphics to match.
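Orad's actual algorithm is proprietary, but the underlying idea - work out the camera's view from where the known grid rectangles appear in the image - can be caricatured very simply. This toy version recovers only a 2D shift between frames as the average displacement of the markers; the real system recovers full position, zoom and focus:

```python
import numpy as np

def estimate_shift(reference_pts, detected_pts):
    """Estimate how far the view has moved relative to the marker grid.

    reference_pts, detected_pts: Nx2 image positions of the same
    lighter-blue rectangles in a reference frame and in the current
    frame. Returns the mean (dx, dy) displacement - a stand-in for
    real camera-pose estimation, which would solve for rotation and
    zoom as well.
    """
    ref = np.asarray(reference_pts, dtype=float)
    det = np.asarray(detected_pts, dtype=float)
    return (det - ref).mean(axis=0)
```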
At Softimage, they have combined the virtual studio with virtual characters who can be manipulated by a real actor or puppeteer, whose voice controls the lip movements. "The actor is also hooked up to a motion capture system for the motion of his arms; when the actor moves his arms, the virtual space alien is also moving his arms," says David Morin, director of special projects at Microsoft/Softimage, Canada. Nickelodeon already has a virtual character, Bert the Fish, who has his own games review show, Fish and Chips, but it has not yet put the two systems together. The system is similar to that used for VActors, or virtual actors.
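What Morin describes is motion retargeting: each joint rotation captured from the actor is copied onto the corresponding joint of the virtual character's skeleton. A minimal sketch, with the joint names, mapping and scaling all invented for illustration - no resemblance to Softimage's actual interfaces is implied:

```python
def retarget(captured_pose, joint_map, scale=None):
    """Copy captured joint rotations onto the virtual character.

    captured_pose: {actor_joint: (rx, ry, rz)} from the mocap system;
    joint_map: {actor_joint: character_joint};
    scale: optional per-character-joint multiplier for characters
    whose proportions differ from the actor's (illustrative).
    """
    scale = scale or {}
    pose = {}
    for actor_joint, angles in captured_pose.items():
        char_joint = joint_map.get(actor_joint)
        if char_joint is None:
            continue  # joints the character does not have are dropped
        s = scale.get(char_joint, 1.0)
        pose[char_joint] = tuple(a * s for a in angles)
    return pose
```

Run once per captured frame, this is what lets the space alien raise its arm the instant the actor does.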
Motion capture technology is used in games and interactive videos, such as Sega's Virtua Fighter. Where it once took months to create graphics, the virtual studio will allow live production. It will also have a big impact on special-effects movies. Previously, there was no way to see the computer special effects and live actors together, which meant cameramen had to imagine the result. "That led to mistakes that had to be corrected afterwards, which was very lengthy and costly," Mr Morin says. "Now, because we can do some of these special effects in real time, we can stick them right in the viewfinder so that the cameraman sees that, okay, my dinosaur is going to be here, so if I want to shoot the head of the dinosaur I can zoom over here and I'm not missing the point."
The live image quality will not be good enough for film until computers get even faster, but this does not matter: the live footage can later be replaced automatically with a more detailed version, each frame of which may take several minutes to generate.
Queen Mary and Westfield College (University of London), which is part of the European Mona Lisa programme to develop virtual studio technology, has a Web page at: http://www.dcs.qmw.ac.uk/monalisa