Issue 5 - Fall/Winter 1997

Does Technology Create Jobs?

Two leading economists, MIT's Paul Krugman and the Hoover Institution's David R. Henderson, debate whether jobs lost to technology are offset by a net increase in jobs elsewhere in a more productive economy. Krugman, a noted liberal, says maybe in the long run, but for now ordinary workers see their wages falling. Henderson, a conservative, says the problem is not the elimination of jobs through technology but a workforce with inadequate skills.

Not for ordinary folk
By Paul Krugman

Even the early stages of the Industrial Revolution quickly made England the wealthiest society that had ever existed, but it took a long time for the wealth to be reflected in the earnings of ordinary workers. Economic historians still argue about whether real wages rose or fell between 1790 and 1830, but the very fact that there is an argument shows that the laboring classes did not really share in the nation's new prosperity.

It's happening again. As with early-19th-century England, late-20th-century America is a society being transformed by radical new technologies that have failed to produce a dramatic improvement in the lives of ordinary working families--indeed, technologies whose introduction has been associated with stagnant or declining wages for many. The Industrial Revolution was based on iron and steam, while we are living through a revolution based on silicon and information; but in a deep sense the story is probably much the same.

As far back as 1817, the great economist David Ricardo explained how technological progress can raise productivity yet hurt workers; his analysis, suitably reinterpreted, remains valid today.

Here is a modernized version of Ricardo's story: imagine that initially our economy uses a technology requiring that each worker be supplied with $50,000 in capital equipment. And suppose that the current level of savings and investment is just enough both to replace old capital as it wears out and to equip new workers with the same level of capital as those already employed. In such an economy, there will be more or less full employment and a stable distribution of income between capital and labor.

Now suppose a new technology comes along--one that raises the productivity of the average worker dramatically, say by 75 percent. The only drawback is that to use the new technology, a worker must be equipped with much more capital--say $100,000's worth. If wages are a great enough share of costs, companies will find the new technology well worth introducing in spite of the extra cost, but what will it do to the workers?

The answer is that, at least at first, workers will be hurt, because the economy will no longer have enough savings to maintain full employment at the going wage. An investment that would have added two jobs will now add only one, so there will no longer be enough jobs created. The new technology will begin destroying jobs instead of creating them.
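To make the arithmetic concrete, here is a minimal numerical sketch of the modernized Ricardo story. The $50,000 and $100,000 capital requirements and the 75 percent productivity gain come from the example above; the size of the savings pool, the value of a worker's output, and the going wage are hypothetical round numbers chosen only for illustration.

```python
# Sketch of the modernized Ricardo example. The capital requirements and the
# 75 percent productivity gain are from the text; the savings pool, the value
# of a worker's output, and the wage are hypothetical illustrative figures.

savings_this_year = 1_000_000   # new savings available to equip workers (hypothetical)
capital_old = 50_000            # capital needed per worker, old technology
capital_new = 100_000           # capital needed per worker, new technology

output_old = 60_000             # annual value of one worker's output, old tech (hypothetical)
output_new = output_old * 1.75  # the new technology is 75 percent more productive
wage = 40_000                   # the going annual wage (hypothetical)

# The same flow of savings now equips half as many new workers.
jobs_old = savings_this_year // capital_old   # 20 new jobs
jobs_new = savings_this_year // capital_new   # 10 new jobs

# Yet per $100,000 of capital the new technology is more profitable, because
# wages are a large share of costs -- so firms adopt it even though it
# creates fewer jobs.
profit_old = 2 * (output_old - wage)   # two old-tech workers per $100,000 of capital
profit_new = 1 * (output_new - wage)   # one new-tech worker per $100,000 of capital

print(jobs_old, jobs_new)      # 20 10
print(profit_old, profit_new)  # 40000 65000.0
```

With these illustrative numbers, each dollar of investment creates half as many jobs even though every job it does create is 75 percent more productive--exactly the squeeze Ricardo described.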

Now it's true that the law of supply and demand can still work its magic. In a free-market economy, the prospect of unemployment will drive down wages, and at sufficiently lower wages, employers will find it profitable to offer more jobs after all. But the point is that these will be worse, lower-paying jobs even though the economy as a whole is richer.

It's also true that higher profits generated by the new technology will lead to more investment, and this may eventually mean higher wages. But the operative word is eventually. If history is any guide, it may be decades before the fruits of a better technology are fully reflected in higher wages. There are, admittedly, some important differences between the early 19th century and the late 20th, but they are less fundamental than they may seem.

What made the Industrial Revolution bad for wages was that it was not only labor saving but also, to use technical jargon, “capital using,” because the new technology meant replacing small-scale artisan production with capital-intensive factories, creating a shortage of capital and a scarcity of jobs. Information technology, however, is not especially capital using. Indeed, it often seems to economize on capital as much as it economizes on labor.

The characteristic of modern technology, rather, is that it is human-capital using; it greatly increases the demand for highly educated and exceptionally gifted people. Never in human history have so many people become so rich so quickly, and the rewards to skill and talent have never been larger. But for every Bill Gates or Marc Andreessen, there are thousands who find that technology has made it harder, not easier, to earn a living. Just as the physical-capital-using technology of the Industrial Revolution initially favored capital at the expense of labor, the human-capital-using technology of the information revolution favors the exceptional (and lucky) few at the expense of the merely intelligent and hardworking many.

We could not stop the information revolution even if we wanted to. And in the long run, new technology will undoubtedly raise everyone's standard of living. But that is then and this is now, and as John Maynard Keynes famously pointed out, in the long run we are all dead.

Paul Krugman (krugman@mit.edu) is a professor of economics at MIT and winner of the 1991 John Bates Clark Medal. He has served as senior international economist on the staff of the Council of Economic Advisers and is the author of The Age of Diminished Expectations: U.S. Economic Policy in the 1990s (1990) and Pop Internationalism (1996), reviewed in October's Herring (see “Everything You Know Is Wrong”).

Yes, for everyone but the unskilled
By David R. Henderson

Paul Krugman and I agree that as long as wages are flexible--and we agree that in the United States they are--technological change cannot destroy jobs on net. The reason: even if the demand for labor falls, wage rates can and will fall, keeping workers employed. The one exception would be very unskilled workers, some of whom would be priced out of work by the minimum wage. Krugman and I also agree that “capital using” technological change can reduce real wages for workers.

But a theoretical possibility is not the same as a fact. The important question is not whether the information revolution can reduce real wages for workers, but whether it does. This Krugman has failed to establish.

It's true that real hourly wage rates for employees have fallen gradually over the last 23 years. Based on data from the president's Council of Economic Advisers, I compute that the average real wage for production and nonsupervisory workers in the private sector peaked in 1973 at $14 (in 1996 dollars) and is now about $12.13. But these data have two big shortcomings; the effect of both is to understate current real wages.

First, over the last 23 years, an increasing portion of workers' pay has taken the form of benefits--pensions, health insurance, etc.--none of which are counted in hourly wages. Although the Bureau of Labor Statistics reports overall compensation for all employees, not just for production and nonsupervisory workers, the data are illuminating. Since 1980, real benefits, valued at the employer's cost, have risen by 20 percent. Average real employee compensation, including benefits valued at cost, has risen by about 4 percent.

The second problem with the standard data on real wages is that the consumer price index (CPI), used to adjust for inflation, overstates inflation. According to the 1995 Report by the Advisory Commission to Study the Consumer Price Index, the CPI overstated the inflation rate between 1987 and 1995 by 1 to 2.7 percentage points annually. The CPI does not adjust for the fact that people buy more of those goods whose price has fallen and less of those whose price has risen.

It also fails to adjust for quality improvements and to capture the “Wal-Mart phenomenon”--that consumers can now purchase goods at large chains for lower prices than they used to pay at local mom-and-pop stores. These three factors alone, according to a recent study by Northwestern University economist Robert J. Gordon, bias the CPI upward by about 1.2 percent a year. Assuming this same 1.2 percent bias for every year since 1973, real hourly wages have actually increased from $14 to about $16.50, and real employee compensation has increased by about 40 percent. One of the main reasons for quality improvement, incidentally, is the revolution in technology that has improved cars, made movies available on demand at a fraction of the previous cost, and slashed transportation and communication costs.
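As a rough check on the compounding involved, here is a minimal sketch of the CPI-bias correction described above, assuming the measured $12.13 real wage and a flat 1.2 percent upward bias applied in each of the 23 years since the 1973 peak. The exact corrected figure depends on rounding and on precisely which years the bias is assumed to cover, so this back-of-the-envelope calculation lands in the same neighborhood as the roughly $16.50 cited above rather than reproducing it exactly.

```python
# Sketch of the CPI-bias correction. Figures from the text: a measured 1996
# real wage of $12.13 and an assumed 1.2 percent-a-year upward bias in the
# CPI since the 1973 peak of $14.00. How the bias is applied (which years,
# what rounding) is an assumption, so the result is approximate.

measured_real_wage = 12.13   # 1996 dollars, deflated by the official CPI
annual_cpi_bias = 0.012      # Gordon's estimated upward bias per year
years = 23                   # 1973 to 1996

# If the CPI overstates inflation by 1.2 percent every year, the official
# price level has grown too fast by a compounded factor, and the corrected
# real wage is higher by that same factor.
correction = (1 + annual_cpi_bias) ** years
corrected_real_wage = measured_real_wage * correction

print(f"correction factor: {correction:.3f}")                   # about 1.32
print(f"corrected 1996 real wage: ${corrected_real_wage:.2f}")  # roughly $16, vs. $14 in 1973
```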

Of course, fringe benefits should not be valued at employer cost because they are typically worth less. The employer's portion of social security taxes, for example, is mandated by the federal government and is less valuable to employees than the cash that they could have invested in stocks and bonds. Benefits that are not mandated, such as health insurance, are probably worth less than their cost but are provided because they are a form of tax-free income. Therefore, the picture I painted of rising real compensation is rosier than the reality. But let's put the blame where it lies: not on the information revolution, but on actions like the federal and state governments' increase of social security taxes.

Finally, it may well be true that very unskilled workers earn lower real wages than they did 20 years ago. But the reason is that they have fewer skills than their counterparts did two decades ago. A recent study in the Review of Economics and Statistics, by two economists from Harvard and one from MIT, concludes that “a high school senior's mastery of skills taught in American schools no later than the eighth grade is an increasingly important determinant of subsequent wages” (italics theirs). It finds that those who graduated from high school in 1980 are noticeably less skilled than their class-of-1972 counterparts. What are these skills? Not rocket science, but simple computation with decimals, fractions, and percents, and the recognition of geometric figures.

More government spending on schools is not the solution. The government's approach to schools is the problem. What are we to think of a president of the United States proudly stating his ambition for every student to know how to read by the end of the third grade? Only about half of the nation's high school seniors have mastered eighth-grade skills, the study's authors note. When a firm has only a 50 percent success rate on the basics, most of us think the customer should go elsewhere.

David R. Henderson (drhend@mbay.net) is a research fellow at the Hoover Institution and an economics professor at the Naval Postgraduate School in Monterey, California. He was a senior economist with President Reagan's Council of Economic Advisers. He writes regularly for Fortune and the Wall Street Journal and edited The Fortune Encyclopedia of Economics.

 
