Goats and axes: the trouble with barter
Before the invention of money, the only way to trade goods was via barter. If you had a spare goat but wanted a big pile of wheat, you had to find someone in the reverse position. This would have been fine in a community where the only possessions were wheat and goats, but once you threw a few more items into the mix the limitations of the barter system became glaringly obvious.
First, there was the problem of matching up desires. What if the person with the wheat was only interested in swapping it for hand axes? It could be a long time before the first man found an axe-rich individual in desperate need of a goat. In the meantime, he might starve.
Second, what if the wheat man did want a goat, but not until his daughter's wedding in three months' time? Had they been able to read and write, the protagonists could have drawn up a contract, but at this stage there were no literate societies. Even if there had been, a system relying on written records of who owed what to whom would have become hopelessly unwieldy once there was a reasonable amount of trade going on.
As it was, the parties to the deal would have had to rely on memory. This was open to abuse ("What are you talking about? I never said I'd give you a whole goat!") and still more liable to become overtaxed ("If Ug owes Stig seven axes, Stig owes Rok four clubs and Rok owes Ug two goats, how many bear hides does Stig owe Yed?"). And there was always the risk that the poor old goat might turn up its toes before the wedding day.
One day, an anonymous Neolithic farmer came up with an ingenious solution. Why not settle on something that everyone agreed was worth having? This could then be used to buy goats, axes, clubs and all the rest of it. Dollar signs flashed in the proto-economist's mind. Eventually his big idea would become known as "money".
After a period of trial and error, it became apparent that candidates for this exciting new function had to have certain characteristics. They needed to be:
Scarce – Ordinary pebbles would be useless as a form of money, at least in most places. They are just too easy to obtain. It wouldn't seem right to trade the goat you had lovingly reared for years for a handful of stones that the buyer could simply pick off the ground.
Difficult to fake – This is related to the scarcity imperative. A monetary system based on pieces of paper with simple crosses drawn onto them would be doomed to failure. If people could literally make money with the minimum of effort, the temptation would be very difficult to resist. The result would be galloping inflation and a total loss of confidence in the currency.
Portable – You can certainly trade items like houses and mountains, but taking them to the local market is not really practical. OK, the Yap islanders did have stone "coins" that weighed several tons, but they also had little ones for everyday transactions.
Durable – A commodity such as salt might seem a potential goer as a currency, but you would be rapidly disabused of this idea if you left your supply out in the rain.
Easily divisible – Imagine a society where the only form of money was the $100 bill. Although this wouldn't be a problem with relatively large transactions, it would be a mighty headache if you just wanted a box of matches. To work efficiently, units of currency need to be divisible into smaller units. For this reason, live goats are not a viable option.
Desirable – Nowadays, we are all delighted to receive pieces of paper adorned with the right symbols, but this is only because of the symbolic value they have acquired through long and complex chains of events. An early farmer would have been distinctly underwhelmed to be handed a wad of £50 notes. To part with his goods, he would have needed to be given items directly exciting in themselves. Shiny, pretty or inherently useful things would have come to mind.
With these criteria in mind, or perhaps more likely operating unconsciously, our ancestors developed the first primitive forms of money. Cowrie shells were a popular early choice, as were precious or semi-precious stones and metals. The latter became increasingly important in the payment of tribute, which the empires that were now beginning to form started to extract from their subject territories. Such forms of what can loosely be described as cash are technically known as "commodity money".
At this stage there were no coins. Instead, the value of metal was judged by its weight. The legacy of this can be seen in words such as the English "spend", which is derived from the Latin verb expendere, meaning "to weigh".
Before coins (and later, bank-notes) could come into existence, the institution of banking had to be invented. This order of events may seem surprising, but if you think about it, notes and coins need pre-existing bureaucratic structures to give them their validity. One of their crucial features until relatively recently was the "promise to pay the bearer" clause. The holder had to have confidence that there were institutions that would exchange these symbolic forms of money for the commodity that underpinned them (often gold), or at least that they would do so in theory.
The first banking institutions were royal palaces, temples and state warehouses in Mesopotamia and Egypt. Because they were well guarded, these places were considered ideal for the storage of grain and other commodities. Those making deposits would be issued with receipts, which could then be used to conduct transactions with other parties. Written on clay tablets or papyrus, they can be regarded as the forerunners of today's transferable cheques and banknotes. As the practice of using such receipts to make purchases caught on, private banking houses began to appear. They are mentioned (and regulated) in the Babylonian Code of Hammurabi (c1760BC).
The scene was almost set for the invention of coins. In the meantime, the Chinese came up with a couple of interesting intermediate stages between commodity money and true coinage. In about 1000BC, after at least two centuries of conducting transactions with the genuine articles, they started using bronze and copper models of cowrie shells. Then, around the eighth century BC, "spade" and "knife" monies came into vogue. These were miniaturised versions of agricultural implements that were too small to be of practical use, but gave nods to items that were. They came in standardised weights and tended to be marked by the issuing authority.
The world's first true coins are widely believed to have been minted in Lydia in Asia Minor around 640BC. They were made of an alloy of gold and silver called electrum and were probably minted to guarantee the purity of the constituent metal.
The design – a roaring lion's head symbolising the ruling Mermnad dynasty – was stamped on one side only. This was a result of the primitive method of manufacture. Blank pieces of electrum were placed over dies and blows were struck against their reverses. At first such "hammer marks" were plain, but when the practice of minting coins spread to other parts of the Greek-speaking world, they began to incorporate the badges of the issuing cities. In time, the Persians and other ancient peoples started to produce coins of their own.
The Rise of Paper Money
The next phase in the evolution of currency was the invention of paper money. The first banknotes were issued in China during the reign of Emperor Hien Tsung (AD806-821), but not as a result of any great financial insight. The sole reason for their introduction was an acute copper shortage that precluded the striking of new coins. Eventually, China got carried away with the ease of producing this new form of cash. Too much of it was printed and this led to inflation. In 1455, the Chinese abandoned the use of paper money and did not return to it for several centuries.
The Chinese experience was repeated when Sweden became the first European nation to experiment with paper money. In 1661, a banker named Johan Palmstruch began to issue credit notes that could be exchanged at his Stockholm bank for stated numbers of silver coins. Unfortunately for Palmstruch, who had consulted the Swedish government before launching the scheme, he got carried away with his licence to print money. He issued more notes than his bank had silver deposits to redeem, and in 1668 was prosecuted for fraud. He was initially sentenced to death, but the penalty was later commuted to imprisonment.
Despite the less than glorious outcomes to these early trials of paper money, the tide of history was firmly on the side of the new form of currency. As economic activity increased in Europe, it became apparent that the money supply needed to be expanded beyond the limits imposed by holdings of precious metals.
This recognition led to the establishment of the first national central banks. People were much more likely to trust notes backed by government reserves than those issued by private institutions. They even proved willing to accept temporary governmental bans on the redemption of banknotes for silver, as happened in Britain during the "Restriction Period" of 1797 to 1821.
The Gold Standard
Entrusting the issue of banknotes to one central authority effectively removed the danger of bankruptcy, but it did raise the spectre of inflation. This would happen if a central bank printed too much money. (To understand this, imagine you were an egg seller and everyone suddenly had twice as much cash; you would feel foolish, not to say cheated, if you continued to sell your eggs at the old price.) It was the risk of inflation, among other factors, that propelled governments into joining the Gold Standard, a measure that harked back to the days when all money really was made of precious metal.
The Gold Standard was a mechanism that fixed the values of the coins and banknotes of participating nations in terms of specified quantities of gold.
The Standard operated both domestically and internationally. On the domestic front, it forestalled inflation by ensuring that the money supply remained relatively constant. In the international sphere, it had the effect of fixing exchange rates between the nations involved. If the US set the price of gold at $20.67 per ounce, for example (as it did from 1834 until 1933), and the UK set it at three pounds 17 shillings and 10.5 pence per ounce, as it did from 1844 until 1931 (apart from a period after the First World War), an exchange rate of $4.867 to the pound necessarily followed.
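The arithmetic behind that parity can be checked directly, with one wrinkle the text glosses over: the British mint price of £3 17s 10.5d applied to a standard ounce of 22-carat gold (11/12 fine), whereas the American $20.67 was the price of an ounce of pure gold. On that assumption, a quick sketch:

```python
# Sketch of the mint-par calculation behind the $4.87-to-the-pound rate.
# Assumption not stated in the text: the UK price of £3 17s 10.5d was per
# standard (22-carat, i.e. 11/12 fine) ounce, while the US price of $20.67
# was per fine ounce, so the UK figure must be scaled up to match.

us_price_per_fine_oz = 20.67                      # dollars per fine ounce

uk_price_standard = 3 + 17/20 + 10.5/240          # £3 17s 10.5d in decimal pounds
uk_price_per_fine_oz = uk_price_standard * 12/11  # adjust 22-carat -> pure gold

rate = us_price_per_fine_oz / uk_price_per_fine_oz
print(f"${rate:.3f} to the pound")                # roughly $4.87
```

Since both national prices were fixed by law, no trader could sustain a rate far from this parity: any gap could be arbitraged away by shipping gold itself.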
The benefits of fixed exchange rates included stability and a balancing of prices between subscribing nations. If the UK made a technological breakthrough that increased economic output, its prices would fall. Assuming US prices stayed the same, this would make UK products more attractive from an American perspective, and American products less attractive to the UK. The upshot would be that gold – that is, the stuff in which payments were made – would flow out of the US and into Britain.
As the money supply/amount of gold in Britain had now increased, its prices would rise. At the same time, US prices would fall in line with the nation's own money supply. Hey presto, everyone ended up more or less where they had been in the first place and price stability was restored.
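The adjustment described in the last two paragraphs can be sketched as a toy model. Treating a country's price level as simply proportional to its gold stock per unit of output is a crude quantity-theory assumption of ours, not something the text specifies, and the numbers are invented for illustration:

```python
# Toy model of the gold-flow adjustment sketched above.
# Crude assumption (ours, not the text's): a country's price level is
# its gold stock divided by its economic output.

def price_level(gold, output):
    return gold / output

gold = {"US": 100.0, "UK": 100.0}     # invented starting gold stocks
output = {"US": 100.0, "UK": 100.0}   # invented output levels

output["UK"] *= 1.25  # a UK breakthrough raises output, so UK prices fall

# Cheaper UK goods draw payments (gold) out of the US and into Britain
# until the two price levels converge again.
while price_level(gold["US"], output["US"]) > price_level(gold["UK"], output["UK"]):
    gold["US"] -= 0.01
    gold["UK"] += 0.01

print(round(gold["UK"], 1))   # gold has flowed into Britain
```

The point of the sketch is the direction of the flow, not the magnitudes: gold leaves the dear country and enters the cheap one until neither side has a price advantage.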
The Gold Standard worked very well so long as everyone played nicely. In the US, for example, inflation between 1880 and 1914 averaged a mere 0.1 per cent per year. The trouble was that many nations were inclined to cheat, particularly when the going got tough. When the First World War broke out in 1914, the countries involved threw their rule books out of the window. They started printing money to finance their war efforts and the Gold Standard broke down. It was reinstated in modified form in 1925, but collapsed again due to instability caused by the Great Depression.
In 1931, Britain left the Gold Standard as a result of massive outflows of gold from the nation's coffers. The successor to the Gold Standard was the Bretton Woods system, named after the New Hampshire resort where the Second World War Allies thrashed out the details in 1944. Once again, exchange rates were fixed (within a margin of 1 per cent), but the key feature was that all participating nations apart from the US were allowed to settle their debts in US dollars. The US promised to redeem the dollar holdings of other countries for gold at a fixed rate of $35 per ounce.
Unfortunately, this offer was taken up to the extent that the US started running out of gold, which placed the entire system in jeopardy. In 1971, President Nixon announced that the US would no longer be paying out gold for dollars. For the Gold Standard, this was the final nail in the coffin.
Since 1971, the world economy has largely run on a system of floating exchange rates, with gold-backed currency replaced by what is called "fiat money". This is money that has no intrinsic value and obtains its worth entirely on the basis of governmental decree. ("This piece of paper can be used to pay debts because we say it can.") The use of fiat money obviously places a greater responsibility on governments than they had in the days when currency had to be backed by precious metals. Print too much of it and you end up in a right mess.
Credit cards, debit cards and cheques
Not so long ago, it was relatively difficult to open an account with one of the clearing banks, and the Mainwarings and Wilsons who ran the institutions long catered principally to the professional classes, discussing their affairs over a glass of Amontillado in the manager's office.
Though bounders could be relied upon to write bouncing cheques, for most of the 20th century the possession of a current account denoted respectability. The manual working classes relied on a little brown envelope of notes and coins at the end of the week, and remained "unbanked" until well into the 1970s. If they wanted to send money away they relied on the postal order, now almost extinct.
Then we all became more prosperous, the banks discovered marketing (black horses running across the landscape) and students found themselves offered railcards and book tokens (another quaint form of money, happily still with us) just so that the banks could enjoy the mixed pleasure of watching them run up enormous overdrafts.
In the 1980s, National Westminster Bank even raised eyebrows by allowing the customers the option of "pictorial" cheque-books, featuring images of badgers and bunny rabbits. Cheque use peaked in 1990, with 2.7 billion passing through the system; it's about one-third of that level now, and falling.
The reason is the debit card, a sort of cheque guarantee card without the cheque. The first Switch card transaction took place in 1988; by 1995, they had overtaken credit cards in popularity, and cheques fell behind them in 1998. The advent of "chip and PIN" greatly reduced the scope for fraud. Last year, each debit-card holder used their cards 166 times on average, acquiring £3,848 in "cashback" and making purchases worth £4,799.
However, British consumers were not to be constrained by such trivial considerations as how much money they had in the bank. The launch of the Barclaycard in 1966 (and its now defunct but long-running rival Access in 1972) was the start of "plastic" – the discovery that a small rectangle of polyvinyl chloride (always measuring 85.60 by 53.98mm) could transform your life. Until, that is, the astronomical APR (annual percentage rate of interest) and overlenient credit limits led to the inevitable personal mini credit crunch.
The modern British addiction to debt can be traced back precisely to the advent of the credit card. By the 1990s, 0 per cent cards were being offered to lure customers, and those who took advantage of the initial free offers and then transferred the balance to the next free offer when the interest became due were known as "rate tarts".
Our "flexible friend" (as the Access card marketing line went) had a nasty habit of landing us in economic trouble; now that he's been allowed to make the acquaintance of the "sub prime" community in the United States and Britain, the extent of the credit card's true perniciousness is becoming apparent. At any rate, we save less than at any time since the 1950s; the teenager of today is far more likely to have plastic than a building society book.
E-money: the future of cash
We may not be that far away from a world where cash follows the chequebook into oblivion and few transactions are conducted face to face. There are in excess of 20 billion payments of less than £10 made every year; they could all go cashless.
E-money comes in three forms, two of them specifically creations of the internet. First, there is the "card not present" phenomenon, where you have sufficient faith in the online retailer – nowadays, anyone from Tesco to Amazon and lastminute.com – that you feel happy to tap your payment card details on to a web page. You and the "shopkeeper" never actually meet, and you never leave your home or office.
Money thus moves from being a physical commodity – a gold coin, a paper banknote or a plastic card – to being a purely virtual commodity (though of course banks themselves have long held your current account in virtual form, as a series of binary codes in a computer file).
Second, we have seen the growth of outfits specifically set up to facilitate payments on the web. Perhaps the most high-profile of these is PayPal, as featured, and trusted, on eBay. Barclays Bank can chart its origins back to 1685, the Royal Bank of Scotland to 1727 and Lloyds to 1765; PayPal dates back only to 2000, yet it now operates in 103 markets, manages more than 133 million accounts and allows customers to send, receive and hold funds in currencies from the US dollar to the Polish zloty.
The real revolution, though, may be the abolition of cash, cheques, credit cards and debit cards and their replacement by one single means of payment which you just wave, possibly nonchalantly, at the shop assistant. This is what the "contactless" card promises, so called because you don't even have to put it into a reader to buy something.
The Barclaycard OnePulse card, for example, was launched only a month ago, with 4,000 guinea-pig customers in London. It will combine the functions of an Oyster card (Transport for London's existing "cashless" method of prepaying for bus and Tube journeys), a Barclaycard, and a "One Touch" contactless technology card.
This is the novel bit. It allows cardholders to make purchases of £10 or under more quickly and conveniently with a single touch of their card against a reader instead of entering a PIN or signing, thus reducing the need to use and carry cash. In a Bourne-style nightmare, your every move and tiniest purchase will then be tracked by your bank and, if legislation allows, officialdom. Thus can "they" know about your purchase of The Independent, a flapjack and day trip to Tate Modern. Very subversive.
Alternatively, the SIM card in your mobile phone could be used to pay for the little things in life (they're trying this out in South Korea). Either way, you will be being monitored. Money is what money does, according to the old adage. And in the future, your money may even spy on you.
The first six sections of this article are from Minted: the story of the world's money by Johnny Acton, published by Think Books on 31 October. To order a copy (free P&P), call Independent Books Direct on 0870 079 8897 or visit www.independentbooksdirect.co.uk
Currency affairs: the history of cash
The first mention of sterling is in 1078 as "sterilensis", and by the 13th century the term was in common usage. England had a uniform national currency 600 years before France, and 900 years before either Germany or Italy.
The pound was established in 1560 by Elizabeth I and its value was one Troy pound (373.2417216 grams) of sterling silver.
Until decimalisation in 1971, a pound was made up of 20 shillings – each shilling worth 12 pence. After decimalisation, the system was simplified, so that 100 "new pence" made up a pound.
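The two systems line up with a little arithmetic: the pre-decimal pound contained 20 × 12 = 240 old pence, so one new penny was worth 2.4 old ones. A quick sketch (the helper names are ours, for illustration only):

```python
# Converting pre-decimal sums (pounds, shillings, pence) to decimal pounds.
# Helper names are illustrative, not from any standard library.

OLD_PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20
OLD_PENCE_PER_POUND = SHILLINGS_PER_POUND * OLD_PENCE_PER_SHILLING  # 240

def to_decimal_pounds(pounds, shillings, pence):
    """£ s d -> decimal pounds (100 new pence to the pound)."""
    return pounds + (shillings * OLD_PENCE_PER_SHILLING + pence) / OLD_PENCE_PER_POUND

# One old shilling becomes 5 new pence; one new penny equals 2.4 old pence.
print(to_decimal_pounds(0, 1, 0))   # 0.05
print(OLD_PENCE_PER_POUND / 100)    # 2.4
```

The same conversion turns the Gold Standard's £3 17s 10.5d into the decimal £3.89375.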
Today, the pound sterling is used only in the United Kingdom of Great Britain and Northern Ireland, the Isle of Man, and the Bailiwicks of Jersey and Guernsey. It is the world's oldest currency still in use.
The idea of a single European currency was first mooted in 1957 in the Treaty of Rome, to increase economic prosperity and promote links between the people of Europe.
Twelve years later, a European summit at The Hague made a single currency an official objective, but it wasn't until 1999 that the euro was introduced to an ambivalent world. Initially, it was used in a "non-physical" form – by banks and for traveller's cheques. In 2002, coins and notes entered circulation.
More than 320 million Europeans, from 13 nations, now use the euro as their sole currency, making it the currency with the highest combined value of cash in circulation in the world.
According to the European Commission, the euro symbol is a combination of the Greek epsilon, an E for Europe and parallel lines to signify the stability of the currency.
The dollar was adopted by the Congress of the Confederation of the United States on 6 July 1785. Prior to that, official British coinage was used, although it was frequently in short supply, resulting in an array of currency substitutes including foreign coins such as Spanish pesos or dollars and certificates for tobacco.
The use of foreign coins was supposed to be prohibited after the US dollar came into circulation, but a shortage of gold and silver meant that Spanish dollars remained legal tender until 1857.
The dollar is made up of 100 cents or 10 dimes, although "dime" is used only to describe a 10-cent coin. Although one-dollar coins are still minted, notes are significantly more common and are dubbed "bucks" or "greenbacks". And although the US declared independence from Britain in 1776, Americans still refer to their one-cent coin as a penny.
The dollar is used as the standard unit of currency for commodities such as gold and petroleum. The United States, the British Virgin Islands, East Timor and El Salvador are among the many countries that use the dollar as their sole currency.