The Future of the Welfare State
Is the era of Big Government over? In an interview in New Perspectives Quarterly (Winter 1996), economist John Kenneth Galbraith offers his view.
Certainly, the welfare state doesn’t inspire the enthusiasm and sense of achievement it did 50 years ago, in the days of the New Deal. But there should be no doubt that the welfare state is here to stay. So is some element of government support of the economy in times of high unemployment and depression. These things are not the invention of liberals like Galbraith but part of the thrust of history.
Let us take health care as an example. The struggle over public health care doesn’t result from the fact that some people want it and some don’t. It arises from the fact that surgery and medical care have advanced so far, and at such enormous cost, that the question must now be faced of whether people ought to die for lack of money. This is something that no civilized country can accept. Medical care provided by the state is therefore inevitable.
A big dip in the share of the jobless drawing unemployment checks occurred between 1980 and 1984. The authors cite two key causes: the states tightened eligibility standards, and benefits became partially subject to federal income taxes in 1979 (and fully taxable in 1986).
Why worry? There is a reason beyond the fact that the unemployed are entitled to benefits, the authors say. Unemployment insurance was designed to function as an economic stabilizer, pumping money into the economy when times are tough. (Total outlays in 1993 were $22 billion.) That won’t work if the jobless don’t draw checks.
The Rise of Management Consultants
"The Origins of Modern Management Consulting" by Christopher D. McKenna, in Business and Economic History (Fall 1995), Dept. of Economics, College of William and Mary, Williamsburg, Va. 23187.
Management consultants are to the corporate world what big-name athletes are to professional sports: sometimes loved, sometimes hated, but always very well compensated. In 1993, AT&T paid out more to management consultants than it spent on research and development. While it’s generally assumed that management consulting grew directly out of the "scientific management" movement fathered by Frederick W. Taylor (1856–1915), its origins were quite different, argues McKenna, a historian at Johns Hopkins University.
By the late 19th century, American big business had grown large enough to require outside advice. Most of this advice came from major banks, which enjoyed far more intimate contact with their corporate clients than today’s banks do. They owned stock, lent money, and often took an active role in management—sometimes including a seat on the board of directors. The banks began to draw in specialized consultants: chemical engineers such as Arthur D. Little for engineering advice, accounting firms such as Arthur Andersen and Ernst & Ernst for outside audits and financial advice, and large corporate law firms.
But the Glass-Steagall Act of 1933 and the establishment of the Securities and Exchange Commission in 1934 ended all that, forcing banks to choose between commercial and investment banking and to sever their close ties with their corporate customers.
"The new institutional arrangements in banking opened up a vacuum into which firms of management consultants rushed," McKenna writes. From about 100 independent "management engineering" firms (as they were called then) in 1930, the number grew to 400 a decade later. Firms also expanded in size. In 1926, after a dozen years in business, Edwin Booz employed only one other management engineer; a decade later, Booz-Allen & Hamilton had 11 consultants on staff.
"Since the 1930s," McKenna writes, "management consultants have reorganized the largest and most important organizations in the world." McKinsey and Company, for example, during the 1960s and ’70s, decentralized some 25 of Great Britain’s largest companies. "Whether reorganizing the Bank of England, Royal Dutch Shell, the government of Tanzania, or even the World Bank, management consultants disseminated American management techniques throughout the world."
"Measuring Inflation in a High-Tech Age" by Leonard I. Nakamura, in Business Review (Nov.–Dec. 1995), Federal Reserve Bank of Philadelphia, Dept. of Research and Statistics, 10 Independence Mall, Philadelphia, Pa. 19106–1574.
Assessments of the economic state of the Union almost always revolve around the "fact" that Americans’ wages, corrected for inflation, have declined, falling from an average of more than $8 an hour (in 1982 dollars) in 1975 to less than $7.50 last July. Nakamura, an economic adviser in the research department of the Federal Reserve Bank of Philadelphia, contends that the decline is, in all likelihood, an illusion.
The culprit, he contends, is the Consumer Price Index (CPI), which measures changes in the cost of living by tracking the price of a fixed "basket" of goods. The CPI basket currently holds items selected in the early 1980s. But today’s actual consumer basket is different. Improvements in the quality of goods (e.g., personal computers and cars) and services (e.g., cable television and medical care) increase the standard of living yet are largely missed by the CPI. The result: at least a half-point overestimate of annual inflation.
There are other problems with the measure. If clothes go up in price, for example, while computer supplies go down, the consumer may buy more of the latter and fewer articles of clothing. The consumer is better off, but the CPI, with its fixed basket, takes no notice of the substitution, distorting the index by a further two-tenths of a percentage point.
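The substitution problem can be illustrated with a toy calculation (the goods, prices, and spending weights below are hypothetical, not from the article): a fixed-basket index of the CPI's kind registers no price change when one good rises 10 percent and another falls 10 percent, while a geometric-mean index, which allows for consumers shifting toward the cheaper good, registers a small decline.

```python
import math

# Hypothetical base-period spending: $50 on clothing, $50 on computer supplies.
base_spending = {"clothing": 50.0, "computer supplies": 50.0}
total = sum(base_spending.values())

# Hypothetical price changes: clothing up 10%, computer supplies down 10%.
price_relative = {"clothing": 1.10, "computer supplies": 0.90}

# Fixed-basket index: weight each price change by base-period spending.
# The basket never changes, so the offsetting moves cancel exactly.
fixed_basket = sum(base_spending[g] * price_relative[g] for g in base_spending) / total

# Geometric-mean index: reflects substitution toward the cheaper good.
weights = {g: s / total for g, s in base_spending.items()}
geometric = math.prod(price_relative[g] ** weights[g] for g in price_relative)

print(f"Fixed basket:   {100 * (fixed_basket - 1):+.2f}%")  # +0.00% inflation
print(f"Geometric mean: {100 * (geometric - 1):+.2f}%")     # about -0.50%
```

In this sketch the fixed basket overstates inflation by about half a point relative to the substitution-aware measure, the same direction of bias Nakamura describes.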
New products, such as CD-ROMs, that have come out since the basket’s contents were fixed, are largely ignored. The Bureau of Labor Statistics, which collects the basic data for the CPI, tries to keep abreast of new products by gradually fitting them into an existing category of goods and rotating part of the sample of stores and goods it surveys each year. But that procedure not only fails to capture all the improvements in the standard of living, Nakamura maintains; it itself pushes the inflation index further upward—by an estimated two-tenths to three-tenths of a percentage point a year. The reason: it gives greater weight to goods whose prices are likely to rise after their initial "sale price" introduction on the market.
All in all, Nakamura calculates, the CPI probably has been overstating inflation by more than one percentage point a year. If the index is revised downward by that amount, the post-1975 decline in real wages becomes an increase (to about $9.50, in 1982 dollars). If Nakamura is right about this politically charged subject—which leaves economists, as usual, divided—then other items tied to the CPI, including Social Security payments and personal income tax brackets, have also been distorted.
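The wage arithmetic can be checked with a rough compounding sketch (the exact annual bias is an assumption here; the article says only "more than one" point): if the deflator used to compute real wages grew roughly 1.2 points a year too fast, compounded over the two decades after 1975, the measured $7.50 hourly wage corresponds to a substantially higher true figure.

```python
measured_wage = 7.50   # measured real hourly wage, mid-1995, in 1982 dollars (from the article)
years = 20             # roughly 1975 to 1995
bias = 0.012           # assumed annual CPI overstatement (~1.2 points)

# An overstated deflator shrinks measured real wages by (1 + bias) each year,
# so the true real wage is higher by that compounded factor.
true_wage = measured_wage * (1 + bias) ** years
print(f"Adjusted 1995 real wage: ${true_wage:.2f}")  # roughly $9.50
```

A 1.2-point bias yields about $9.52, consistent with the article's "about $9.50"; a flat 1.0-point bias yields closer to $9.15, which is why the result is sensitive to the exact size of the overstatement.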
Rapid advances in computers and telecommunications are responsible for many of the quality-of-life improvements that go unmeasured by the CPI. But such new technologies may also be part of the solution. The Bureau of Labor Statistics (which, it should be noted, has been taking steps to improve the measurement of inflation) still does its work the old-fashioned way, sending people out to stores to gather data. If the bureau instead could electronically tap into the detailed infor
126 WQ Spring 1996