Like the clock that loses a second an hour, the American economy has lost
ground so gradually over the past twenty years that we don't realize how far
behind we have fallen. The economic expansion of the first half of the 1990s
has made it even more difficult for Americans to judge how weak our economy has
been over the past two decades compared with the rest of our industrial
history. The main reasons for this decline are not inflation, government budget
deficits, low levels of investment, faltering education, the irresponsibility
of Democrats or Republicans, excessive spending on the military, the aged, or
the poor--or the many other explanations for America's economic dilemma that we
repeatedly hear. Rather, these presumed causes are themselves largely the
consequences of a more persistent problem: a sharp slowdown in economic growth
from our historical average of about 3.4 percent a year, and often higher,
since the Civil War to a little more than 2 percent a year since
1973.1 Meanwhile, our economic expectations have not declined
accordingly. For the most part, we and our government carry on as though our
economy were still growing at its historical rate. The results are repeated
disappointment in our personal lives,
waning confidence in long-standing
institutions, and rising cynicism in our public life that threaten our best
convictions as a nation.
To most of us, the apparently small decline in the annual rate of growth may
not seem like much. We are, after all, a vast and rich economy, and we are
still growing, even if more slowly. But the impact of slow growth, like the
compound interest in a savings account, accumulates rapidly over time, and
eventually makes a huge difference. During the mid-1970s, the loss of a percent
a year in the rate of growth was on average a relatively small $100 billion a
year. By 1993, however, the damage had grown enormously. In that year alone the
gap between what the U.S. economy might have produced had we grown since 1973
at about our historical rate and what we actually did produce amounted to $1.2
trillion after inflation. This translates into approximately $4,600 of lost
production for every American man, woman, and child. Over the twenty years
since 1973 the accumulated losses in goods and services due to slow growth have
come to nearly $12 trillion, or more than $40,000 a person.2
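For readers who want to check the compounding arithmetic, a short calculation reproduces the scale of these figures. The 1973 base GDP used here (roughly $3.6 trillion in constant dollars) is our assumption, not the author's exact data; with it, the twenty-year cumulative shortfall lands very close to the $12 trillion cited, though the single-year 1993 gap comes out somewhat above the $1.2 trillion figure.

```python
# A sketch, not the book's exact calculation: compare two GDP paths,
# one at the historical 3.4% norm and one at the actual 2.3% pace.
def gdp_path(base, rate, years):
    """GDP level in each year, compounding at a fixed annual rate."""
    return [base * (1 + rate) ** t for t in range(years + 1)]

base_gdp = 3.6e12                           # assumed 1973 GDP, constant dollars
historical = gdp_path(base_gdp, 0.034, 20)  # 1973-93 at the old norm
actual = gdp_path(base_gdp, 0.023, 20)      # 1973-93 as it happened

final_gap = historical[-1] - actual[-1]
cumulative_gap = sum(h - a for h, a in zip(historical, actual))
print(f"1993 gap: ${final_gap / 1e12:.1f} trillion")
print(f"cumulative 1973-93 shortfall: ${cumulative_gap / 1e12:.1f} trillion")
```

The point of the exercise is how fast an initially invisible gap compounds: the shortfall is negligible in the mid-1970s and enormous by 1993.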
The stylized graph on the facing page shows how quickly the losses caused by
reduced growth accumulate. Between 1870 and 1973, and despite the many ups and
downs in good times and bad, the U.S. economy had grown, as noted, at an
average rate of 3.4 percent a year, excluding the effects of inflation. But
between 1973 and 1993 (the last year for which we have complete data) the
average rate of growth flattened to 2.3 percent a year after inflation, even
though the workforce was expanding rapidly. In the graph below, the cyclical
ups and downs of the economy are smoothed into straight lines. The graph
represents how the rate of growth of gross domestic product (GDP) has declined
from its historic norm and how the gap between the norm and the actual
performance of the economy since 1973 widens. The shaded area in between shows
how an initially small loss in income and production expands over twenty
years.
The magnitude of the $12 trillion shortfall since 1973 can be envisioned in many
ways. Twelve trillion dollars is more than enough to have bought each of
America's homeowners a new house, or paid off all of our government, mortgage,
and credit-card debt, or replaced all of our nation's factories, including
capital equipment, with new ones.3 As the shaded portion grows over
time, so does the cumulative damage, so that by the year 2013, the total
shortfall, assuming the economy grows at about 1 percent a year less than our
historical norm, will amount to more than $35 trillion of lost production since
1973. If the population grows from 260 million to 310 million as expected, this
will amount to a loss of well over one hundred thousand dollars a person.
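The per-person figure follows directly from the projected totals quoted above (the totals are the author's; the division is ours):

```python
# Spreading the projected cumulative shortfall over the projected population.
shortfall = 35e12          # projected cumulative loss, 1973-2013, from the text
population_2013 = 310e6    # projected U.S. population, from the text
per_person = shortfall / population_2013
print(f"${per_person:,.0f} per person")   # well over $100,000
```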
This graph, then, is the economic expression of why we feel the way we do
today. In the shaded area lie our lost jobs, falling and stagnating wages,
eroding markets, closed factories, rising level of poverty, insecure pensions,
and reduced homeownership. Though we blame it on other things, the widening gap
between what had been our normal rate of growth for a century and the actual
performance of the economy is the main source of the American public's
declining confidence, which has shown up in survey after survey since the early
1970s. It explains why, by the 1992 presidential election, most Americans
believed the nation had turned down the wrong economic road, and why, by the
elections of 1994, despite several years of a moderate economic expansion, they
still felt the same way.
Another useful, if more technical, way of looking at the extent of our decline
since 1973 is to see how sharply the rate of growth itself has fallen from its
historical level. It is illustrated in Figure 1 in the Appendix, which shows
that the rate of growth has dropped by almost a full third. To raise growth
back to its former rate of more than 3 percent a year, the economy would have
to grow over time almost half again as fast as it has actually grown since
1973. In round terms, as economist Herbert Stein says, "The difference between
2 percent and 3 percent growth is not 1 percent, it is 50
percent."4
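Stein's point, and the "almost half again as fast" claim in the preceding sentence, can both be checked with one line of arithmetic on the rates quoted in the text:

```python
# Growth-rate differences are relative, not absolute.
historical, actual = 3.4, 2.3   # percent per year, from the text
shortfall_ratio = (historical - actual) / actual
print(f"to regain the old norm, growth must rise by {shortfall_ratio:.0%}")

# Stein's rounder version: (3 - 2) / 2 = 50%.
stein = (3 - 2) / 2
print(f"Stein's round-number version: {stein:.0%}")
```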
Though economic growth remained subnormal on average between 1973 and 1993,
there were periods when prosperity seemed to return. In fact, we were told that
our economic problems were solved whenever the rate of growth temporarily
quickened. As we can see in Figure 2 in the Appendix, in both the 1970s and
1980s there were stretches of several years in which the economy expanded at an
average rate of nearly 5 percent a year. Yet the average rate of growth since
1973, as represented by the dark line, never rose higher than 2.5 percent a
year. These spurts of growth were neither strong enough nor sustained enough to
compensate sufficiently for the steep recessions that preceded them or the
unusually slow growth over the remaining years of the two-decade period.
Despite the economic expansion of the 1990s, and the claims by some economists
that the economy was now growing too fast, economic growth as of late 1994 was
again not robust enough to raise the long-term rate more than marginally. As of
the end of 1994 the rate of growth since the end of the moderate recession in
the spring of 1991 averaged only about 3.5 percent a year (compared with nearly
5 percent a year at this stage in several previous expansions). As we know, the
rate of growth over the twenty years that ended in 1993 was only 2.3 percent a
year. The annual rate of growth reached 4 percent during 1994, but even so, the
long-term rate of growth since 1973 would rise to only 2.4 percent a
year.5
On the other hand, any serious slowdown or recession in coming years would
again reduce this long-run average performance significantly. Such a slowdown
became increasingly likely when the Federal Reserve raised interest rates
several times in 1994. The Federal Reserve reasoned that given the slow growth
of our productive capacity over the preceding decades the economy could not
grow as quickly as it once did without risking substantially higher inflation.
Too much demand would be chasing too little capacity. Although the Federal
Reserve may have raised rates prematurely, as some critics maintain, financial
markets might well have pushed interest rates to restrictive levels even if the
central bank had not acted.6 For these and other reasons, which we
will discuss later, a significant slowdown or even a recession was likely by
1995 or 1996, keeping our long-term economic performance far below our
historical rate of growth.
How does such a slowdown in the rate of growth affect most of us? Let's look
first at the impact on the federal deficit. Had the economy grown about 1
percent a year faster, federal tax revenues would have been roughly $2.4
trillion greater over twenty years than they actually were, assuming existing
rates of taxation. Had all this income been retained by the government, the
national debt of $4 trillion could have been cut by well over half.
Alternatively, the government could have reduced taxes by some proportion of
this additional revenue. Had the government committed this sum to debt
reduction, however, it would probably have saved more than $1.2 trillion in
interest payments over these years. In most years since 1973, the federal
government would have run only a small annual deficit at worst had growth
remained closer to its historical average, so that by the late 1980s there
would have been no federal deficit at all. By fiscal year 1993 there would have
been a substantial surplus in the budget. Federal tax revenues in 1993 would
have been about $300 billion higher and interest expense much lower, even if
interest rates remained as high as they were, not only eliminating altogether
the 1993 federal deficit of about a quarter of a trillion dollars but creating
a substantial surplus as well. In other words, had we grown about 1 percent a
year faster since 1973 than we did, which would still have left us slightly
below our historical rate of 3.4 percent a year, all other things being equal,
we could easily have afforded the rising cost of government and reduced taxes
as well.
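The $2.4 trillion revenue figure is consistent with applying a federal tax share of roughly 20 percent of GDP (an assumption of ours, not a number from the text) to the cumulative $12 trillion output gap cited earlier:

```python
# Forgone federal revenue as a share of the forgone output.
output_gap = 12e12        # cumulative 1973-93 shortfall, from the text
federal_tax_share = 0.20  # assumed average federal revenue as a share of GDP
extra_revenue = output_gap * federal_tax_share
print(f"${extra_revenue / 1e12:.1f} trillion in forgone federal revenue")
```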
Now, let's consider the impact on jobs and incomes. A reasonable approximation
is that slow growth since 1973 resulted in the loss of several million jobs,
mostly among the already poor, less educated, and first-time job seekers, whose
unemployment added billions to the cost of government. More significant for the
economy as a whole, faster growth would have increased demand for, and the
value of, labor in general, so that wages and salaries on average would have
been higher. For the typical family, annual income would probably have been
about $5,500 a year higher in 1993, and possibly more. Over twenty years, the
typical family would conservatively have earned an additional $50,000 in total,
assuming the same employment pattern as prevailed in the 1980s. The extra
income would, among other things, have allowed many more young couples to buy a
first home, many poor workers to buy health insurance, and the typical
middle-class family to pay for housing and buy more of the goods and services
it needed without requiring a spouse to go to work or the main breadwinner to
take on a second job.
Let's also look at the level of investment in capital equipment, education, and
physical infrastructure, one of our biggest concerns over these years. The
money for such investment comes from business profits, individual savings, and
the budgets of state and local governments. Had we grown at about our historic
rate, a reasonable estimate of the additional capital available for investment
from higher individual savings and corporate profits would have been about $700
billion over twenty years, without any increase in our savings rate or profit
margins. State and local governments, which over these years have reduced
services, failed to repair their roads and mass-transportation systems, and
often cut spending on education, would have collected an additional $900
billion in income and corporate taxes. Local economies of course grow at
various rates. But to take one example, had the economy of the state of Iowa
grown by 1 percent more a year since 1973, it would have had approximately $3
billion more in tax revenues. Instead, it raised sales taxes twice, had to plug
an annual budget deficit that had grown to nearly $500 million by 1993, and put
a spending cap on social programs.
Our health-care expenditures, our fastest-growing major expense, would have
been more affordable if we had continued to grow at the historic rate. About 14
percent of everything we spend now goes for health care, compared with 9
percent or 10 percent in other advanced nations. But had our economy grown at
its historic rate, the same level of health expenditures would have been a more
comfortable 11 percent to 12 percent. Similarly, presuming we borrowed no more,
the burden of household and corporate debt that we took on would have been more
manageable. With a far lower national debt, the federal government in turn
might have been able to borrow enough money prudently to reform welfare and
provide such social goods as day care and health insurance to the millions who
couldn't afford it. It would also have been able to stimulate the economy more
readily in times of recession by raising its level of spending. Because of the
high level of debt, the use of such countercyclical policies to minimize the
duration of recessions is now limited.7
To most of us, it may still defy common sense that so small a decline in the
rate of growth can have such consequences. We generally presume, and are often
told by the experts, that we have lived through worse times before. Some insist
that in the last twenty years we have merely given back some of the unusual
gains made in the twenty-five years after World War II when we easily dominated
world markets and American workers enjoyed rapid gains in income.
But by the early 1990s the record of the prior two decades was clearly unusual
by any standard in American history. Measures of our early economic growth as a
nation are not as reliable as current data are, but they show that over no
other peaceful twenty-year period since the Civil War, and possibly since the
early 1800s, excepting the years that included the Great Depression, did the
economy grow as consistently slowly as it has in the past twenty years. In
fact, the economy grew as fast as it did only because baby boomers born after
World War II entered the workforce in huge numbers, the number of workers
expanding one and a half to two times faster than the total population. Had the
economy been as robust as it was in the past, GDP per capita should have grown
much faster in the 1970s and 1980s than its long-term average because a much
higher proportion of the population was now working. But GDP per capita grew at
only 1.3 percent a year, a full percent slower than it did between 1948 and
1973, and half a percent slower than its average growth rate of 1.8 percent a
year since 1870.8 Had GDP per capita grown at only 1.8 percent a
year since 1973, the combined increase in federal tax revenues and reduced
interest payments would still have wiped out the entire federal deficit by
1993.
What is more, the rate of growth between 1948 and 1973 of nearly 4 percent a
year, which some now maintain was unsustainably fast, was not unique in
America's industrial history. As we have seen, we had been growing at 3.4
percent a year since 1870 after inflation. But if we go back to 1820, when our
economy was first beginning to grow rapidly, the average rate of growth from
this smaller base has been 3.7 percent a year after inflation. Between 1870 and
1910, when our industrialization was fully under way and the economy was
already quite large, the rate of growth averaged 4 percent a year, and rapid
growth persisted far longer than did the similarly rapid rate of growth after
World War II.9 It was not the first two post-World War II decades,
then, that were especially unusual compared with our historical record, it was
the two decades of slow growth that began in 1973.
The foundation of economic growth is productivity, whose rate of growth has
declined even more steeply than our overall economic growth since 1973. Growing
productivity--the economy's output of goods and services per hour of work--is
the reason the average person's standard of living rises. Conversely, without
growing productivity, incomes typically stagnate. An economy would then grow
only as fast as its working population grew.
Since a few years after the Civil War, productivity has grown at an average
rate of about 2 percent a year (even including the Great Depression). In other
words, workers produced an average of 2 percent more each year for every hour
they worked. Beginning in the 1890s, the rate of productivity growth picked up
to about 2.3 percent. At that point America had become the most productive
country in the world, producing more per hour of work than any other country,
and it retained its huge lead over most countries until well after World War
II. In the immediate post-World War II period, the rate of productivity growth
rose to 2.7 percent a year.10
But since 1973 the average annual growth of productivity has fallen to 0.9
percent a year--so far as we can tell, the worst twenty-year showing since the
end of the Civil War (again excepting the first few years of the Great
Depression).11 The widely heralded Reagan economic expansion did
nothing to correct this fundamental problem. The growth of productivity
remained about as slow during Reagan's two administrations as it had become in
Ford's and Carter's. In fact, as we have seen, we would have grown even more
slowly in the 1970s and 1980s had the workforce not expanded so rapidly. This
performance stands in contrast to our past record when we both raised
productivity and absorbed millions of new workers at the same time. For
example, between 1870 and 1910, when the working-age population grew even more
rapidly than it had in the 1970s and 1980s, our economy still produced gains in
productivity of 2 percent a year.
Once adjusted for the ups and downs of the business cycle, productivity so far
in the 1990s is again growing no faster than it did in the 1970s and 1980s. In
fact, overall productivity has been growing at almost the identical rate over
the course of the economic expansion since 1991 as it did over the expansive
phases of the business cycle in the slow-growing seventies and eighties.
Economic data are always subject to interpretation, of course, but claims that
productivity is climbing strongly in the 1990s typically ignore or
underestimate the cyclical nature of its growth. Moreover, revisions in the
data are likely to reduce the productivity growth reported so far in the 1990s
even further (see Chapter 6). "The real mystery of the post-1973 slowdown is
the sharp deceleration of productivity growth in the . . . USA,"
writes the British economist Angus Maddison.12 To this mystery we
shall soon return (see Figure 3 in the Appendix).
Also unprecedented over so long a period was the fall in average wages.
Whatever changes had occurred in the economy in these two decades had clearly
hit the American worker hardest. Slow-growing productivity inevitably dampens
gains in salaries and wages because we don't produce as much in goods and
services per worker, and therefore we don't produce as much income per worker,
either. But most American workers since 1973 fared significantly worse than
even slow productivity growth warranted. The highest proportion of new jobs
over these years was created in low-paying service industries, where
productivity gains were hard to attain, while many higher-paying manufacturing
and related jobs were eliminated or filled by temporary or lower-wage workers,
often in companies abroad or in low-wage regions of the United
States.13 Workers no longer got nearly the wage increases over time
that they had expected as they stayed on the job or rose through the ranks.
Workers in each age group typically made less than those who came before them.
A growing proportion of workers lost ground, their incomes falling below the
levels they had attained when they were younger and less experienced. Overall, discounted
for inflation, the average weekly wages of so-called non-supervisory workers,
about 80 percent of the workforce, fell by 15 percent from 1973 to 1993 (see
Figure 4). Even if we include the growth of pension, health, and other worker
benefits over these years, the compensation of a typical worker today has
fallen compared with what it was for the typical worker twenty years ago, after
discounting for inflation, and it has fallen sharply on average for young,
high-school-educated, and minority workers.14
By about 1987 slow economic growth had begun to put pressure on the salaries of
better-paid white-collar workers as well. These wages fell in that year, and
did so continually throughout the economic recovery that began in 1991. As a
result of these factors, the average real income of families was only a few
percentage points higher in 1993 than in 1973, and that largely because so many
more spouses were working.15 There have been shorter periods when
wages have fallen sharply, but as far as we can tell, there has been no other
twenty-year period since 1820 when average real wages fell, with the possible
exception of the years just before and after the Civil War.16
One result of these gradual, almost unnoticed changes was, of course, that as
our incomes stopped growing, we saved less and borrowed more in the 1980s to
maintain our standard of living. Meanwhile, as tax revenues grew more slowly,
government also borrowed more to meet its own obligations. Having borrowed so
much, we found ourselves without the expected amounts of money to invest in
upgrading our public infrastructure and education. Moreover, with incomes
stagnating for so long, we were less willing than ever to pay higher taxes.
Thus, when President Clinton insisted early in his administration on the need
to reduce our borrowing by spending less and taxing more while putting a little
aside to invest in such areas as job training, business development in cities,
and a youth service program, the public resisted anything more than modest
changes. Even a $3 billion package of aid to victims of the 1993 summer flood
in the Midwest met resistance. Five billion dollars for job retraining was hard
to find. Small tax-hike proposals were fought tooth and nail. By 1994 even the
popular crime bill was hard to pass, mainly for lack of money, whereas in the
past the public had paid large tax increases out of a significantly rising
standard of living or to finance a major war. In early 1995 Congress refused to
pass a $40 billion loan guarantee for Mexico to stem a financial crisis there,
a guarantee that probably would never have been called upon.
If we don't make up for our lost growth since 1973, and overall we continue to
grow only between 2 percent and 2.5 percent a year for another twenty years
rather than at our historical norm, the compounding effects will take a far
bigger toll. In addition, we will no longer have the benefit of a rapidly
growing workforce. Between 1993 and 2013, roughly another $24 trillion, or more
than $75,000 a person in today's dollars, could be lost in addition to what has
already been lost in the past twenty years because of a reduction of 1 percent
a year in our rate of growth. In total, it would be as if everyone in America
were to stop working for two or three years. The reduced tax revenue to the
federal government would amount to more than $4 trillion over the next twenty
years, or about two and a half years' worth of current government expenditures.
In the year 2013 alone, the typical family could earn $11,000 less, about one
third of what it earns annually today.17 Many economists believe we
will not grow by more than 2.5 percent a year for the foreseeable future. Yet
numbers like these change history.
Many factors help explain this decline in America's growth. The list is
familiar. Success bred complacency. Old ways of doing business became encrusted
and corporate bureaucracies discouraged change. Political and policy errors
took their toll, from overexpanding the economy in the inflationary 1970s to
taking on debt in the 1980s, which drove interest rates and therefore the
dollar to debilitating heights. So did the costs of the Cold War, including the
Vietnam war. American consumers saved too little and spent too much. Foreign
nations, having had to rebuild from scratch after World War II, had more modern
capital equipment than we did.
But the extent and abruptness of the slowdown since 1973 demand further
explanation. Twenty years of slow growth is a long time--long enough to produce
significant social and political consequences, and long enough so that we must
now take seriously the possibility that we may be suffering not from a series
of recessions from which we will eventually recover but from a substantial
change in our fortunes that will not correct itself or respond to government
policies. No one can say with confidence whether or not a new prosperous
chapter in America's history will open soon, but it is possible that our slower
economic growth is no longer simply cyclical or temporary but structural and
permanent. We are not prepared for this. Americans are the only people in the
world who take fast growth for granted as a natural consequence of their
country's uniquely prosperous history. Our instinctive response to our
problems, our sense of what is right and good, the means by which we earn our
self-respect, and our view of our role in the world have been formed by a
history of unusual economic advantage. Unlike most of our advanced rivals, we
have had little experience with inherent limits to expansion.
It cannot be said strongly enough that the economy we have come to take for
granted has been remarkable. By the 1880s, the size of the U.S. economy had
surpassed Britain's, whose lead was thought insurmountable. We were more
productive than Britain--we produced more goods and services per hour of
work--by sometime in the 1890s.18 A comparison with Germany's
nineteenth-century economy is especially instructive. Germany's powerful
industrial revolution did not get fully started until 1870. By that time
America's industrial revolution had been well under way. Starting from a much
lower base, therefore, Germany's rate of growth would have been expected to
exceed America's, at least for a while. Yet despite Germany's takeoff (the
fastest of a major European nation over these years), the American economy
continued to grow faster. In 1870 America's per capita GDP was $2,247. In terms
of dollars, Germany's was only $1,300. By 1913 America's GDP per person had
more than doubled, to $4,850, while Germany's had not quite doubled, to $2,606. All this
time, America was providing jobs for tens of millions of
immigrants.19
America's fast growth continued until the early 1970s, though the rate of
growth was tested time and again by severe recessions and financial collapses.
There were nine significant recessions between 1870 and 1913, three of which
were especially severe. The depression of the 1870s, for example, was almost as
lengthy as the Great Depression of the 1930s. In the 1890s production fell by
more than 15 percent and the unemployment rate remained above 15 percent for
four years. The recession of the early 1900s was almost as steep. But after
each setback, the economy recovered and resumed its fast pace of growth.
Because both productivity and the population were growing strongly over this
period, the average rate of annual growth remained around 4 percent a
year.20
During the early years of the Great Depression, production fell by more than 30
percent. The economy crawled back only to plunge again in the second half of
the decade. But so powerful was the underlying strength of the economy that the
lost production was entirely made up soon after our entrance into World War II.
After the war, recessions were milder.
Production rarely dropped by more
than a few percent, partly because of government guarantees, including so-called
automatic stabilizers, such as unemployment insurance, and financial
safeguards, such as insurance on bank deposits, as well as management of the
economy through fiscal and monetary stimulus. Only after 1973 did the economy
expand less vigorously than it had in the past, while recessions themselves had
steepened compared with the early post-World War II period.21
According to the calculations of Angus Maddison, who has compiled the
historical growth rates of the world's leading nations, the American economy
grew on average by 3.7 percent a year between 1820 and 1989, as we have noted,
over which time America's GDP rose by 450 times. No other country came close.
Germany grew by 2.5 percent a year, its GDP rising by only sixty times over the
same period, and Japan grew by 2.8 percent a year, its GDP rising by about one
hundred times. Britain was the notable laggard. Though its lead was enormous in
1820, it grew at only 2 percent a year on average thereafter, so that by 1989
its GDP had risen by only twenty-seven times. Such is the
damage done by consistently slow growth. Since the early 1800s, America's
population grew by more than 2 percent a year while the populations of the
other nations grew far more slowly. Yet even measured in terms of growth per
capita, the American economy outpaced that of every other major nation until
after World War II.
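Maddison's multiples follow from compounding the quoted annual rates over the 169 years from 1820 to 1989. Small rounding in the rates moves the totals noticeably, so the computed multiples here differ slightly from the text's, but the ordering and rough magnitudes line up:

```python
# Compounding each country's quoted average rate over 1820-1989.
years = 1989 - 1820   # 169 years
rates = [("USA", 0.037), ("Japan", 0.028), ("Germany", 0.025), ("Britain", 0.020)]
for country, rate in rates:
    multiple = (1 + rate) ** years
    print(f"{country}: GDP multiplied roughly {multiple:,.0f} times")
```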
Where do we now stand in our economic history? The answer to this question will
occupy the next few chapters. But we should begin by examining just what
sustained America's unusual rate of growth over so long a period and what
actually happened to change it. The most influential nineteenth-century
interpretation of America's economic expansion was made by Frederick Jackson
Turner in 1893. The young historian was trying to make sense of disturbing
changes in the American economy as the agrarian economy gave way to an
industrial one--changes that confused us then as much as current changes do
today.
Turner argued that the American experience was formed largely by our vast and
open frontier, where the ratio of people to land, so high in Europe, was for us
uniquely low. This economic advantage had provided ample opportunity for
Americans to acquire fertile, cheap, often free land, enabling the majority to
become economically independent. But when the Census of 1890 reported that the
frontier had been at last filled up, Turner believed America's distinctive
advantage had been lost. "Never again will such free gifts of land offer
themselves," he said. "The frontier is gone and with its going has closed the
first period of American history."23 He worried that after nearly a
century of economic opportunity America might have reached a turning point, and
believed that we would have to look outside our boundaries to find sources of
new growth. Turner's gloomy thesis was consistent with the imperial longings
that gripped many Americans at the turn of the century.
Turner was wrong, of course. Even as he wrote, industrialization was providing
a second, even more potent frontier of renewed economic opportunity to new
generations of Americans. But despite his oversimplifications, Turner
articulated something essential about what held us together as a nation. His
was the first broad economic interpretation of America's history. He understood
that unusual, even abnormal economic opportunity had been the one long unbroken
strand in American experience, which had created America's distinctive
characteristics, including its optimism. "Since the days when the fleet of
Columbus sailed into the waters of the New World, America had been another name
for opportunity," Turner said. "So long as free land exists, the opportunity
for a competency exists."24 What Turner could not imagine was that
there would be any other basis for this opportunity than access to land, and he
was deeply concerned about what would happen to us without it.
If Turner underestimated the benefits of commerce and industrialization, he was
right about the powerful appeal of the frontier in our early years. From the
beginning, cheap, fertile land provided unusual economic opportunity and
attracted migrants in remarkable numbers. Western New York filled rapidly after
the Revolution, the state's population quadrupling in only two decades. Between
1780 and 1800 the population of Tennessee grew by ten times. Only a generation
after it was settled, in 1820 Ohio had a population of five hundred thousand
people, making it the fifth-largest state in the Union. The acquisition of new
territory kept pace. The Louisiana Purchase of 1803 alone doubled America's
territory.25 Eventually, the United States would nearly double in
size again. Acquiring new territory was one of the most important presidential
priorities in our first half century or so. Thomas Jefferson
assured the public in his first inaugural address that land would be available
"to the hundredth and thousandth generation."26
Even later in the century, during the first stages of industrialization, the
search for economic opportunity at the frontier remained a way of life for
many. The population of Boston, for example, grew by only 387,000 people
between 1830 and 1890, yet well over 3 million people had lived there at one
time or another over these years before moving west.27 "We are a
rapidly--I was about to say fearfully--growing country," said John Calhoun
early in the century, when the U.S. population was young, vigorous, and
expanding by nearly 40 percent a decade. As late as 1820 only 5 percent of
Americans lived in cities, compared with nearly one third of England's
population.28
On the farms early Americans lived with what would be considered today a bare
minimum of necessities. "A majority of free Americans lived in a distinctive
subsistence culture remote from river navigation and the market world," writes
historian Charles Sellers.29 But compared with Europe, the average
standard of living in colonial America was enviable. In England, three fourths
of the land was owned by the gentry, but most of those who worked the land in
America owned their piece of it. Throughout the new nation, poverty was not
onerous and famine not a serious concern, as both were throughout
Europe.30 This is the "best poor man's country in the world," said
one colonial observer, provided of course that you were white.31
Economic historians have concluded that the average standard of living in the
United States was rising only modestly in the colonial and immediate
post-Revolutionary years. But access to land was so ample that it allowed
the overwhelming number of new Americans to acquire a minimal standard of
living and a significant degree of economic independence.32
America's optimistic, individualistic, and self-reliant ideology was bolstered
by the economic success on the frontier. "These free lands promoted
individualism, economic equality, the freedom to rise, democracy," wrote
Turner.33 What was clear was that in the early years, when these
"distinctly American" characteristics, to use Turner's description, were
applied to the task of eking out a living, they seemed to work. One result was
that Americans developed a special intolerance for poverty. Though there were
almshouses before the Revolution, when the common wisdom ascribed poverty to
divine will, by the nineteenth century poverty was regarded as a matter of
individual responsibility. You could always go far enough west to find cheap
land and feed yourself. Even when poverty increased later in the nineteenth
century, Americans did not readily accept it. "This is a country of self-made
men, and the idle, lazy, poor man gets little pity in his poverty," wrote the
Reverend Calvin Colton.34 As industrialization spread, many
Americans refused to believe that hundreds of thousands of workers could be
unemployed through no fault of their own. The Protestant ethic preached that
hard work invariably led to material success and that material well-being was a
sign of spiritual grace, a doctrine later expanded of course by social
Darwinists who claimed that survival of the economically fittest was nothing
less than a law of nature. Those who did not succeed might be pitied but should
not be helped, a principle reemphasized by Herbert Hoover in the first years of
the Great Depression. In a country where most citizens did, in fact, better
themselves, such an ideology easily took hold.35 Mistrust of government and a
stubborn sense of equality, Turner argued, had their roots in the frontier,
where hard work, self-reliance, and optimism paid off.
Turner would cite Daniel Boone as the archetypal frontiersman who could
maintain his independence by always moving farther west, keeping one step
beyond civilization. It was a story that Americans not only relished but also
adopted as part of their folklore. In the 1760s, Boone led his large family and
a community of followers into Kentucky, where they cleared land, farmed for
themselves, and hunted for game. Despite the well-known forays of Native
Americans, Boonesborough, with only three hundred settlers, could support
itself on what it hunted and what it farmed.36 But after the
Revolutionary War, Kentucky was the site of a land rush. From scarcely a soul
when Boone got there, the population of Kentucky Territory grew to 20,000
people by the early 1780s. By 1800 there were more than 220,000 settlers in
Kentucky.37
Boone was never clever enough to profit from the land rush, though recent
research suggests that he tried to do so.38 Disillusioned, he
settled in Missouri Territory, on the other side of the Mississippi. There he
and his family were again able to hunt and farm to support themselves. His
complaints about the encroachment of civilization and government grew
legendary, spread by the newspapers and idealized in a best-selling biography
that was long on lore and soft on facts. His popularity tells us much about how
we want to see ourselves. As his exploits were publicly romanticized, Boone
seemed to isolate himself further. In his eighties, broke from speculating in
land, he was still farming and hunting, and he remained our quintessential
free, independent, self-reliant American man, a model for our literature and
popular culture ever after.39
The economic age that depended on access to land ended long before Turner's
lecture in 1893, however. By the time Boone died in 1820, a market economy, with
a growing volume of trade dominated by rising towns and cities, had begun
contributing to economic growth. By then, with the recent lifting of the
embargo on trade with Britain, the American economy was on its way to dominating
the world.40
Many farmers had become small businessmen themselves in these years, often
selling their surplus crops both domestically and overseas. Agrarian exports
were soaring, leading one observer to state that America was the "granary of
Europe."41 One study found that the distance wheat could be
transported profitably doubled to one hundred miles in the forty years
following the Revolution, so that by the early nineteenth century the
wheat-export belt of America extended from Connecticut to Virginia and inland
to the Shenandoah Valley. Cotton production, made especially competitive by
cheap slave labor, had also moved farther inland.42 The image of the
simple farmer attached forever to his land is a romantically exaggerated one.
Farmers widely speculated in land as the commercial boom sent prices up. They
often settled on their farms for only a few years, sold out at a handsome
profit, and moved farther west in the expectation of making another
killing.43
By 1820 small-scale industrialization had also spread far more widely than was
generally realized. A quarter of the working population of New England, for
example, was employed in small textile and shoe factories by then. Many others
worked at home. Adam Smith's specialization of labor was already raising
productivity. Tasks were efficiently divided among those who made only shoe
"uppers," for example, or others who sewed only the cuffs on garments. Farmers
too took in textile work and had begun to manufacture some items, such as
rudimentary iron tools. Even Jefferson, who once believed America would and
should remain a nation of farmers, eventually admitted that manufactures were
good for the country. Between 1790 and the beginning of Jefferson's trade
embargo in 1807, American agricultural and manufacturing exports rose from
about $20 million a year to more than $108 million. Tariffs were imposed on
imports, and would mostly be kept high for the rest of the century in order to
protect America's infant manufacturing industries.44
In these years the building of roads and canals became a national passion.
Between 1824 and the 1850s trade rose thirteenfold on the Erie Canal and
twelvefold on the Mississippi.45 The
first short railroad lines were put in during the 1830s and 1840s. Overall,
despite several sharp depressions, the economy grew more quickly from the early
1800s on than it ever had before, stimulated not solely by the swelling
population but by something new in America: rising productivity. The best
evidence is that between 1800 and 1850 the economy grew by between 1 percent
and 1.3 percent a year per person, compared with less than 0.5 percent a year
per person before that. Access to land still mattered, but less so; prosperity
was now also being created by rising agricultural and industrial
productivity.46
By the 1850s the size of markets was growing dramatically. "Manifest destiny"
was on everyone's lips and the nation's territory was expanded to the Pacific.
Long rail lines were being laid for the first time that connected large cities
in all the country's regions. Steamship lines grew rapidly as well. In all,
goods that once took months to reach their destination now frequently got there
in less than a week.47 Domestic trade became a key to growth. The
huge American market was an unparalleled free-trade zone, so to speak, where
farmers and businesses could specialize in the production of what they did
best. It was a diverse, thriving marketplace, where Adam Smith's assertions
about the advantages of specialization and the division of labor could come to
their fullest fruition.
The Civil War had interrupted the nation's growth. But once the nation was
united again, the economy was spurred on by an industrial revolution whose
strength no one could have anticipated. Manufacturing replaced trade as the
focus of dozens of fast-growing cities. By 1900, 30 million Americans were in
the workforce, some 10 million of whom worked in manufacturing. Millions more
worked in the transportation, trade, and service businesses that supplied them.
By then, nearly 40 percent of the population lived in the cities. Millions also
worked in the increasingly valuable mines. This was America's second
frontier.48
Modern research shows what Turner probably could not have known at the time of
his lecture. In the 1880s, the nation manufactured as much in dollar volume of
industrial goods as it had produced in wheat, corn, beef, poultry, and all
other agricultural products. Even before Turner's lecture, the United States
produced more goods and services than Great Britain, and several times as much
as the next largest economy, Germany. Driven by the spread of mass production,
American products were now typically cheaper than those manufactured in other
nations. As noted, total output per hour of work exceeded Britain's sometime in
the late 1890s, making the United States the most productive large nation in
the world.49
By the time of Turner's lecture the importance of the geographical frontier
that he had so romanticized had long since begun to decline. By 1900 there were
about 75 million people living in the United States, the large majority of them
making a better living than they ever had before. As a result, America did not
rebel, lose its direction, or renounce its basic ideology. Though Americans
were by no means as independent as they once were, now often working in highly
regimented factories and living in dense cities, they largely believed that the
characteristics that had propelled the economy in its earliest years continued
to do so long after the industrial revolution had begun to make the first
frontier less important. Economic opportunity still
provided "competency" in America, and Americans still believed the true sources
of their unique success were self-reliance, individualism, and hard work.
Industrialization brought with it a set of new problems. Unemployment became
pronounced during industrial depressions. Over the century the distribution of
income became more unequal, and the rising fortunes of the ostentatious robber
barons in the 1880s and 1890s stood in sharp contrast to growing pockets of
poverty and squalor in the cities. Hard as work on the farms had been, it was
less punishing than the sixteen-hour days, six days a week, often worked in the
cities. Labor strikes
brought on by these conditions were thwarted by the courts or put down
violently by employers, apparently without serious protest from the citizenry,
even though many strikes did succeed in raising wages and improving conditions
for their workers.50
However much the American ideology denied it, poverty now existed and it was
palpable. In the slums of New York, Boston, and Chicago, workers often lived
six in a room. A strong strain of pessimism crept into the American culture.
Writers like Frank Norris and Upton Sinclair captured the rising discontent
with a new moneyed culture. Respected intellectuals like Henry and Brooks Adams
saw little hope for America's future. Populism, which flourished especially in
southern and agricultural states, became a powerful political movement that
demanded significant reforms and accused the big-city bankers of nailing
Americans to a cross of gold.51
But time and again, rapid economic growth provided enough opportunity to
appease most of America's rebels and doubters, even during politically
turbulent times. Surging economic growth in the 1830s cemented Andrew Jackson's
democratic reforms, just as it calmed populist discontent once the depression
of the 1890s had ended. For all the political reforms, it was a rising real
wage over time that was the great palliative. Despite "sweated" labor and
occasionally severe depressions, real wages for most Americans rose rapidly
between the 1870s and early 1900s. Overall, real wages, though they fluctuated
widely, rose by about 1 percent a year on average over the entire century. The
typical American was earning roughly three times as much after inflation in
1900 as in 1800.52 Despite the arrival of so many millions of
immigrants, the average American wage was still 50 percent higher at the start
of World War I, measured in terms of purchasing power, than the average wage
earned by a British worker.53
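The century-long gain described above is a matter of simple compounding: a real wage growing about 1 percent a year multiplies roughly 2.7 times over a hundred years, close to the "roughly three times" figure. A minimal check of the arithmetic (an illustrative sketch, not drawn from the historical data itself):

```python
# Compound growth check: does about 1 percent a year over a century
# square with earning "roughly three times as much" in 1900 as in 1800?
def growth_factor(annual_rate: float, years: int) -> float:
    """Total multiple after compounding at a constant annual rate."""
    return (1.0 + annual_rate) ** years

factor = growth_factor(0.01, 100)
print(f"1% a year for 100 years: {factor:.2f}x")  # about 2.70x
```

Growth of 1.1 percent a year would yield nearly a threefold gain over the century, so the "about 1 percent" average and the "roughly three times" total are consistent with each other.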
In the twentieth century, production and wages again rose dramatically. The
second frontier turned out to have been only in its early stages. After a steep
recession following World War I, the American economy again took off in the
1920s, up by 18 percent in 1922 alone. The use of electricity spread widely.
The Model T had come to market a few years before the war and was a great
success. Other new products included radios, gas ovens, and refrigerators, all
selling at prices that made them affordable to a new mass market of American
consumers. Productivity rose on average by 4 percent a year between 1922 and
1928.
The Great Depression was a major challenge to the new industrial economy, and
to our political stability as well. Unemployment soared to about 25 percent of
the labor force. It took the arms buildup before World War II, and ultimately
the war itself, to get America back on its feet. While some prominent
economists believed the economy could stagnate indefinitely, in fact the war
merely demonstrated how powerful America's economic potential was. Production
rose above its 1929 peak by 1940. Incomes rose to pre-Depression levels by
1942. Productivity was again on a fast track. Unemployment virtually
disappeared.54
After World War II most analysts thought a return to recession, or even a
severe depression, was all but inevitable. The sharp recession after World War
I and the Great Depression were still fresh memories. But the second frontier
proved far more durable than its critics supposed. After a brief recession the
economy again expanded rapidly and the forecasts of long unemployment lines
never materialized.
The fast growth after World War II was aided by the destruction of Europe and
Japan during the war; the United States had the world market mostly to itself
well into the 1950s. Returning veterans went to college in great numbers,
financed by the GI Bill of Rights, and the emphasis on education spread
throughout the nation. Wartime technological breakthroughs spilled over to
profitable commercial uses. With government help, for example, Bell Labs
developed the transistor in 1948. Timex produced a cheap watch based on
government research. High military expenditures may also have promoted growth
in the short run, though over time they eroded resources in a way that would
dampen future growth.
The rate of economic growth of nearly 4 percent a year between 1947 and
the early 1970s rivaled in pace, though not duration, the fast growth of the
latter third of the nineteenth century. Family income after inflation doubled
in this period. In the mid-sixties, the unemployment rate was only about 4
percent, yet inflation was inconsequential.
This second frontier was the answer to Turner's understandable concerns. For
all its ups and downs, it produced the fastest, broadest-based economic growth
and rising living standards a major economy had ever seen. There is not one
forecast on record that suggested it might not last. One celebrated forecaster
in the 1960s claimed that productivity would grow at a rate of 4 percent a year
until the turn of the century.55 America had no reason to doubt
itself, or to challenge the validity of its original frontier ideology. Because
of rapid economic growth, its confidence in itself had never been higher.
But the pace of growth was about to slow dramatically. There were several signs
of this as early as the mid-1960s. Corporate profits as a percentage of sales
began to decline rapidly, falling from about 14 percent of sales after taxes in
1965 to only 8 percent by 1970. The growth of productivity had also tapered off
significantly to a rate of 2 percent a year from a rate of about 3 percent a
year, even though capital investment was high.56 Economists believed
that at worst it was a temporary stall.
Only after the oil crisis in 1973, when the OPEC countries raised prices
threefold, did we have the first serious recession of the post-World War II
period. The economy did not begin to recover until mid-1975. Over the next
seven years we experienced an inflationary spiral and the two other recessions
we discussed earlier. The 1982 recession was even more severe than the
OPEC-induced recession in 1974 and 1975. In sum, we suffered three recessions
in the ten years between 1973 and 1982, two of which were the worst in the
post-World War II period.
The Reagan expansion between 1983 and 1988 temporarily muted economic
criticism. When the expansion petered out in 1989, however, the economy was
only slightly ahead of its 1979 peak by most per capita measures, and most
important, as we noted, the growth of productivity continued to lag badly for
the second decade in a row. The Reagan expansion was followed by the four years
of slow economic growth under President Bush, which included the recession in
1990 and part of 1991. The economic expansion that began in 1991 was only a
moderate one, unable to reverse even modestly the damage done over the
preceding twenty years. As of the fall of 1994 the average real wage had been
falling for more than two decades, the rate of growth in productivity was still
historically low, the poverty rate had risen significantly, and America could
no longer invest adequately in its future without a significant sacrifice in
current standards of living. Americans said in survey after survey that they
were beginning to feel that something had changed, but they continued to
underestimate the impact that slow economic growth was having on their lives.
This may have been only natural. Americans had never had to
deal with an indefinite period of slow economic growth before, and most of us
could not figure out exactly what had changed. Here our politicians, recalling
how voters rejected Carter and Mondale for their candor about some of our
economic problems, chose not to repeat their mistake. The media, which had
learned the same lesson, were no better. Doom and gloom, to use the catchword
of the times, did not sell. But President Reagan's optimism did.