
The Keynesian Era

1936–1970

A Theory Born From Catastrophe

The Great Depression was not supposed to happen. According to the prevailing orthodoxy of the early twentieth century — the classical economics refined by Alfred Marshall and extended by Arthur Cecil Pigou — markets cleared. Prices adjusted. Wages fell until everyone who wanted work could find it. If unemployment persisted, the fault lay with workers who refused to accept lower pay, or with governments that interfered with the natural adjustment. The Depression made a mockery of this view. By 1933, a quarter of the American workforce was idle. Industrial output had fallen by nearly half. Banks failed by the thousands. And the self-correcting mechanism that theory promised never arrived.

John Maynard Keynes had been warning for years that something was deeply wrong with orthodox economics. A Cambridge don with a gift for prose, a talent for making money in financial markets, and an unshakable confidence in his own intellectual powers, Keynes had already savaged the Treaty of Versailles in The Economic Consequences of the Peace (1919) and sparred with the Treasury over Britain’s disastrous return to the gold standard in 1925. But his 1936 masterwork, The General Theory of Employment, Interest and Money, was something else entirely: a frontal assault on the theoretical foundations of laissez-faire.

The core argument was deceptively simple. Economies could get stuck. Total spending — what Keynes called effective demand — might be insufficient to employ all available workers and capital, and there was no automatic mechanism to close the gap. Interest rates might fall to a floor (the “liquidity trap”) without stimulating enough investment. Wages might be sticky downward, not because workers were irrational, but because no individual worker could accept a pay cut without losing ground relative to everyone else. In such circumstances, only government spending could break the deadlock. Deficit spending was not fiscal irresponsibility; it was therapy for a sick economy.
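The logic is easiest to see in the simple income-expenditure model that later textbooks distilled from the General Theory; the notation below is that standard classroom sketch, not Keynes’s own. Output Y is the sum of consumption, investment, and government spending, and households consume a fixed amount plus a fraction c of their income:

Y = C + I + G, \quad C = a + cY, \quad 0 < c < 1

Solving for equilibrium income gives

Y = \frac{a + I + G}{1 - c}

so each additional dollar of government spending raises income by the multiplier 1/(1 − c). With an illustrative propensity to consume of c = 0.75, the multiplier is 4: a dollar of deficit spending becomes four dollars of income as the initial outlay is respent in successive rounds. When private investment collapses and interest rates can fall no further, G is the only term in the equation that policy can move directly.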

Building the Postwar Order

The Second World War proved Keynes right in the most brutal way imaginable. Massive government spending on armaments did what a decade of orthodox policy had failed to do: it ended the Depression. Unemployment in the United States fell from 14.6 percent in 1940 to 1.2 percent in 1944. The lesson was not lost on policymakers. If government spending could mobilize an economy for war, it could surely sustain prosperity in peace.

The institutional architecture of the postwar world bore Keynes’s fingerprints. At the Bretton Woods Conference in 1944, delegates from 44 nations hammered out a new international monetary system. Currencies would be pegged to the dollar, and the dollar would be convertible to gold at $35 an ounce. An International Monetary Fund would provide short-term financing to countries with balance-of-payments difficulties, and a World Bank would channel capital to developing nations. Keynes, leading the British delegation despite failing health, had wanted something more ambitious — a global clearing union with an international currency he called the “bancor” — but American power prevailed. The resulting system was a compromise, but it provided the monetary stability that underpinned two decades of extraordinary growth.

In the United States, the Employment Act of 1946 codified the new consensus. The federal government formally accepted responsibility for maintaining “maximum employment, production, and purchasing power.” It was a remarkable philosophical shift. The state was no longer a night watchman, keeping order while markets did their work. It was a macroeconomic manager, obligated to steer the economy toward full employment using the fiscal and monetary tools that Keynesian theory prescribed.

The Golden Age and Its Machinery

The results were spectacular, at least for a time. The period from roughly 1948 to 1973 is sometimes called the “Golden Age of Capitalism” — a quarter-century of rapid growth, rising living standards, declining inequality, and low unemployment across the industrialized world. Real GDP per capita in the United States doubled. Europe and Japan, rebuilt with American aid under the Marshall Plan, grew even faster. The business cycle did not disappear, but recessions were shallow and brief compared to the catastrophes of the prewar era.

Keynesian economists refined their toolkit. The Phillips Curve, introduced by A.W. Phillips in 1958 based on nearly a century of British wage and unemployment data, appeared to offer policymakers a stable menu of choices: they could accept a little more inflation in exchange for a little less unemployment, or vice versa. This tradeoff became the operating manual for macroeconomic management. When the economy slowed, governments could boost spending or cut taxes to push unemployment down, accepting a modest rise in prices. When inflation crept up, they could tighten policy and tolerate a temporary increase in joblessness. It was economics as engineering — precise, technocratic, and supremely confident.
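In its simplest classroom form (a stylized linear relation rather than Phillips’s original nonlinear fit, and with purely illustrative numbers), the menu can be written as

\pi = \alpha - \beta u

where \pi is the inflation rate and u the unemployment rate. If, say, \alpha = 6 and \beta = 1 in percentage points, then unemployment of 4 percent comes with inflation of 2 percent, and pushing unemployment down to 3 percent costs one additional point of inflation. The entire framework of fine-tuning rested on the assumption that \alpha and \beta were stable parameters.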

President Lyndon Johnson’s Great Society programs, launched in 1965, represented the high-water mark of this ambition. The federal government simultaneously waged war on poverty at home and escalated military operations in Vietnam abroad, all while economists in the Council of Economic Advisers believed they could fine-tune the economy to absorb the strain. Walter Heller, who had advised Kennedy, boasted that the 1964 tax cut proved fiscal policy could deliver growth on demand. The age of the business cycle, it seemed, was over.

The Cracks in the Consensus

It was not over. The very success of Keynesian demand management carried the seeds of its unraveling. Johnson’s refusal to raise taxes to pay for both the Great Society and Vietnam fed inflationary pressures that proved far more stubborn than the models predicted. By the late 1960s, prices were rising at rates that made the Phillips Curve tradeoff look increasingly costly. And then came the blow that shattered the framework entirely: inflation and unemployment rose together.

The phenomenon, dubbed “stagflation,” was not supposed to be possible in the Keynesian worldview. If unemployment was high, demand was weak, and prices should have been falling, not rising. Milton Friedman and Edmund Phelps had predicted exactly this outcome, arguing that any attempt to exploit the Phillips Curve tradeoff would eventually shift expectations and cause the curve itself to move. Workers and firms, they argued, would learn to anticipate inflation and demand higher wages and prices in advance, neutralizing the stimulus and leaving only the inflation behind. The “natural rate of unemployment” could not be permanently reduced by demand management.
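Their argument is usually written as the expectations-augmented Phillips curve,

\pi = \pi^e - \beta (u - u_n)

where \pi^e is expected inflation and u_n the natural rate. Unemployment can be held below u_n only while actual inflation runs ahead of what people expect; as \pi^e catches up, the gap closes and unemployment drifts back to u_n at a permanently higher rate of inflation. The short-run tradeoff exists, but the long-run curve is vertical, which is precisely what the stagflation of the 1970s appeared to confirm.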

The Bretton Woods system crumbled in parallel. America’s persistent balance-of-payments deficits and the inflationary financing of the Vietnam War undermined confidence in the dollar’s gold convertibility. In August 1971, Richard Nixon closed the gold window, ending the fixed exchange-rate regime that had anchored the postwar monetary order. The oil shocks of 1973 and 1979 delivered further blows, driving inflation into double digits while economies stagnated.

By the end of the 1970s, the Keynesian consensus was in ruins. Its policy prescriptions appeared to have failed. A new generation of economists — monetarists, rational-expectations theorists, supply-siders — was already offering alternatives. But the Keynesian era left a permanent mark. The idea that governments bear responsibility for macroeconomic outcomes, that mass unemployment is a policy failure rather than a natural condition, and that institutions matter as much as markets — these convictions survived the stagflation crisis and remain central to economic debate. Keynes’s ghost still walks the corridors of every central bank and finance ministry in the world.