The origins of the year 2000 problem - also known as the "millennium bug" or the "millennium time bomb" - seem extraordinarily trivial, writes Paul Mungo. In the 1960s, when computing was in its infancy and the cutting edge of IT was represented by vast machines that were probably less powerful than modern-day pocket calculators, computer memory was expensive and precious.

Programmers looked for every possible way of squeezing more data into less space: one trick they came up with was to represent the year with two digits instead of four - as in 96, 97, 98, and so on. The trick became a convention; until very recently software and hardware suppliers routinely used the two-digit code.

The implications of this apparently minor bit of programming shorthand are quite startling. Computers operate in years they know only as 96, 97 or 98; they are unaware of the concept of "century". Come the millennium, most computers will tick over to a year they will see as 00 - not 2000, or even 1900, just 00. As far as a computer is concerned, 00 is a smaller number than 96, 97 or 98, and should therefore come before them. This will create interesting problems for operations that compare dates or need to carry forward from 99 to 00.
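The comparison failure can be sketched in a few lines of Python - a modern illustration of the logic, not code from any actual legacy system:

```python
def later_of(yy_a, yy_b):
    # Naive "which date is later?" check on two-digit years,
    # as legacy code effectively performed it.
    return max(yy_a, yy_b)

# Within a single century the trick works fine:
print(later_of(97, 98))  # 98 - 1998 correctly comes after 1997

# But across the rollover, the year 2000 (stored as 00) loses:
print(later_of(99, 0))   # 99 - the system decides 1999 is "later" than 2000
```

Any logic built on this comparison - sorting records, expiring accounts, carrying balances forward - inherits the same inversion.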

Not all computers will tick over to 1 January 00 on the big night. Some will just get stuck; others will go back to a default date of 4 January 80 (the date that DOS, the most popular computer operating system in the world, was invented). But the implications are the same: the computer is, in a sense, going back in time - by 100 years or 20 years or whatever. And those computers that get stuck on 31 December 1999 will forever be living through the millennium night.

You might want to think of the implications of this as comparable to having a particularly incompetent secretary. Documents created in January 2000 will be filed by whatever date the secretary happens to think it is, and could well be lost forever. Financial records will become gibberish as balances are carried back instead of forward.

The effects of this on businesses - particularly in the financial sector - are potentially enormous. Indeed, any computer operation that requires date comparison could go haywire. As a simple example, think of a man born in 1935. At the age of 65, in the year 2000, he is eagerly awaiting his well-earned pension. Except that the computer in his pension company's office takes a look at his two-digit birth year - 35 - and subtracts that from the current two-digit year - 00 - and comes to the none-too-clever conclusion that the man is minus 35 years old and is therefore not entitled to a pension for another 100 years.
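The pension arithmetic above can be written out as a short Python sketch - again a hypothetical illustration of the two-digit logic, not any pension company's real code:

```python
def age_two_digit(birth_yy, current_yy):
    # Legacy-style age calculation: both years kept as two digits only.
    return current_yy - birth_yy

# In 1998 the shortcut gives the right answer for a man born in 1935:
print(age_two_digit(35, 98))  # 63

# In the year 2000, stored as 00, the same sum turns his age negative:
print(age_two_digit(35, 0))   # -35 - no pension for another 100 years
```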

Because dates are used throughout computer systems - in programs, in software, in operating systems - the only way to upgrade computers for the millennium (to become "millennium compliant" or "year 2000 compliant" in industry jargon) is to go through every line of computer code and update it. This is what companies and businesses should be doing now. If they aren't, you may not get your pension.

It should be noted, though, that the problem is not insoluble. The solutions exist to fix computer systems, and most home PC users should be largely unaffected by the year 2000 bug.

Most equipment and software now on the market is "year 2000 compliant". But even non-compliant computers will function, and most manufacturers offer free "quick and easy" fixes. At least that's what they say.