Origins
The gold standard emerged as the dominant monetary system of the 19th century, reaching its classical form after the Bank Charter Act of 1844 in Britain. Under this system, a nation’s currency was defined as a fixed weight of gold, and the central bank stood ready to exchange currency for gold at that fixed rate. The doctrine promised price stability by anchoring money to a physical commodity that could not be created at will.
The intellectual appeal of the gold standard lay in its automaticity. When a country ran a trade deficit, gold would flow out, reducing the domestic money supply, lowering prices, and restoring competitiveness. This “price-specie flow mechanism” theoretically ensured that international imbalances were self-correcting without requiring discretionary policy decisions.
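The adjustment loop described above can be sketched as a toy simulation. This is purely illustrative: the quantity-theory pricing rule, the deficit-response parameter `k`, and all starting values are hypothetical, not historical estimates.

```python
# Toy simulation of the price-specie flow mechanism (illustrative only).
# Assumptions: a crude quantity theory (home price level proportional to
# the gold-backed money stock) and a trade deficit that widens as home
# prices rise above the foreign price level. Parameters are hypothetical.

def simulate(periods=12, home_gold=100.0, foreign_price=1.00, k=0.5):
    """Each period, a trade deficit ships gold abroad, shrinking the
    money stock and pushing the home price level back toward parity."""
    history = []
    for t in range(periods):
        home_price = home_gold / 100.0               # P proportional to M
        deficit = k * (home_price - foreign_price)   # dearer exports -> deficit
        home_gold -= deficit * 10                    # gold flows out in payment
        history.append((t, round(home_price, 4), round(deficit, 4)))
    return history

# Start with an inflated money stock (price level 20% above parity):
rows = simulate(home_gold=120.0)
print(rows[0])   # large initial deficit
print(rows[-1])  # price level converging back toward parity (1.00)
```

Each round of gold outflow shrinks the money stock, so the deviation from parity decays geometrically: the "self-correcting" property the doctrine relied on.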
Structure & Function
The gold standard operated through central banks’ commitment to convertibility: the promise to exchange paper currency for gold at a fixed rate. The Bank of England’s price of £3 17s 10½d per troy ounce became the anchor of the international system. Other major currencies, including the dollar, franc, and mark, were likewise defined in gold terms, which fixed exchange rates between countries.
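For readers unused to pre-decimal British money, the quoted price converts to decimal pounds as follows (20 shillings to the pound, 12 pence to the shilling); the exact-fraction arithmetic is merely a convenience here.

```python
from fractions import Fraction

# Decimalise the Bank of England's gold price of £3 17s 10½d per troy ounce.
PENCE_PER_SHILLING = 12
SHILLINGS_PER_POUND = 20
PENCE_PER_POUND = SHILLINGS_PER_POUND * PENCE_PER_SHILLING  # 240

total_pence = (3 * PENCE_PER_POUND         # £3
               + 17 * PENCE_PER_SHILLING   # 17s
               + Fraction(21, 2))          # 10½d
price = total_pence / PENCE_PER_POUND
print(float(price))  # 3.89375 pounds per troy ounce
```

So the anchor rate was £3.89375 per troy ounce, the figure that pinned sterling's gold value for roughly two centuries.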
The doctrine’s core mechanism was the “rules of the game”: when gold flowed out, central banks were expected to raise interest rates to attract capital and reduce domestic demand; when gold flowed in, they were expected to lower rates. Adhering to these rules disciplined policy, limiting the ability of governments to inflate their way out of problems. The system carried a deflationary bias, however: with the money stock tied to the gold stock, economic growth required either new gold discoveries or falling prices.
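The deflationary bias can be illustrated with the equation of exchange, M·V = P·Y: if the money stock M is pinned down by the gold stock and velocity V is roughly stable, growing real output Y forces the price level P down. The numbers below are hypothetical.

```python
# Illustrative deflation under a fixed gold-backed money stock.
# Equation of exchange: M * V = P * Y, so P = M * V / Y.
M, V = 100.0, 4.0   # money stock and velocity, held constant (hypothetical)
Y = 400.0           # real output, growing 3% a year (hypothetical)

P = M * V / Y
print(f"initial price level: {P:.3f}")

for year in range(10):
    Y *= 1.03       # output grows; the gold stock does not
    P = M * V / Y

print(f"price level after 10 years: {P:.3f}")  # roughly 0.744
```

A decade of ordinary growth forces prices down by about a quarter, which is why sustained expansion under gold depended on fresh discoveries such as those in California, Australia, and South Africa.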
Historical Significance
The gold standard created an era of unprecedented monetary stability and global trade expansion in the late 19th century. Major economies maintained fixed exchange rates for decades, facilitating international investment and commerce. The pound sterling, backed by the Bank of England’s gold reserves and the Royal Navy, became the world’s reserve currency.
However, the doctrine’s rigidity proved catastrophic during economic crises. World War I forced countries to suspend gold convertibility to finance military spending. The interwar attempt to restore the system, championed by Montagu Norman and Benjamin Strong, is widely blamed for deepening and propagating the Great Depression. Britain’s return to gold at the pre-war parity in 1925 brought deflation and unemployment. The final break came in 1971, when President Nixon suspended dollar-gold convertibility and brought the Bretton Woods system to an end.
Key Developments
- 1717: Isaac Newton, as Master of the Mint, establishes the de facto gold standard in Britain.
- 1821: Britain formally adopts the gold standard after the Napoleonic Wars.
- 1844: The Bank Charter Act codifies gold backing for banknotes.
- 1870s: Germany, France, and the United States adopt gold standards.
- 1896: William Jennings Bryan’s “Cross of Gold” speech gives voice to populist opposition.
- 1914: Major powers suspend gold convertibility at the outbreak of World War I.
- 1920s: Benjamin Strong and Montagu Norman coordinate efforts to restore the gold standard.
- 1925: Britain returns to gold at pre-war parity, causing deflation.
- 1931: Britain abandons the gold standard amid economic crisis.
- 1933: The United States goes off gold domestically under Roosevelt.
- 1944: Bretton Woods establishes a modified gold-exchange standard with the dollar at $35 per ounce.
- 1971: President Nixon ends dollar-gold convertibility, definitively ending the gold standard.