The Periodical Observer

strengthen the federal government. Duer, working in secret with others, borrowed heavily in an effort to corner the markets in U.S. debt securities as well as the stocks of the national bank and the Bank of New York. "In the ensuing speculation," writes Cowen, a foreign currency trader and director of Deutsche Bank, "securities prices reached their peaks in late January 1792. Prices trended lower in February, [and] fell off sharply in March"—prompting the "Panic of 1792."

The speculator Duer, his credit exhausted, could not meet contracts he had made to buy securities and suspended payments on his obligations on March 9. His failure, a contemporary said that month, was "beyond all description—the sums he owes upon notes is unknown—the least supposition is half a Million dollars. Last night he went to [jail]." Historians have long blamed him for bringing the market down.

An 1833 fire at the Treasury Department destroyed most of the First Bank’s records. But its balance sheets for the 1790s were found by historian James Wettereau in the 1930s in the papers of Hamilton’s successor, and published in 1985. Together with other historical materials, Cowen says, they make it clear that the national bank, headed by Thomas Willing, was responsible for the March crash.

When it opened in December 1791, the bank "flooded the economy with credit." Some loans were for legitimate businesses, but others were made to speculators (apparently including Duer) who used them to buy securities. In February—a full month before Duer ran into trouble—the bank, realizing it had loaned so heavily that its bank notes were not being readily accepted everywhere, reversed course by sharply curtailing credit and calling in outstanding loans. Hamilton, worried about speculation and the state of the financial system, gave the reversal his blessing, Cowen says, and may even have initiated it. Suddenly, Duer and other speculators were called upon to repay their loans. Many dumped stocks to do so, and the market sank.

It was a classic "credit crunch." But no recession or depression followed. Like Federal Reserve Chairman Alan Greenspan after the 1987 market crash, Hamilton moved rapidly to have the central monetary authority act as lender of last resort, helping to avert a meltdown.



Society

The Urban Myth

"Small Towns, Mass Society, and the 21st Century" by James D. Wright, in Society (Nov.–Dec. 2000), Rutgers—The State Univ., 35 Berrue Circle, Piscataway, N.J. 08854.


Over the past half-century it’s become conventional wisdom, reaffirmed at 10-year intervals by the Census Bureau, that the United States is becoming an ever more urban nation. Wright, a sociologist at Tulane University, paints a different picture.

If America is becoming more "urban," he says, isn’t it strange that "most of the really big American cities have been losing population for decades"? Of the 10 largest cities in 1970, seven—New York, Chicago, Philadelphia, Detroit, Baltimore, Washington, and Cleveland—were noticeably smaller two decades later. Of the 100 largest cities, 54—predominantly in the Northeast and Midwest—had fewer people.

Of course, if "urban" simply means "not rural," then, yes, more than three-fourths of the American populace is "urban." (The Census Bureau classifies as rural any place with fewer than 2,500 inhabitants.) But should a tiny burg of 3,000 really be considered "urban"? It’s an archaic definition, Wright says.

"Urban" is also often casually equated with what the Census Bureau calls "metropolitan areas." These have a "large population nucleus" of at least 50,000 people, located in a county of at least 100,000, and include any adjacent counties that seem economically or socially "integrated" with the nucleus. In 1990, nearly four out of five Americans lived in such areas. Does that really make all of them "urban" folk? Many


90 Wilson Quarterly

metropolitan areas, such as SpringfieldHolyoke-Chicopee in Massachusetts, in fact comprise "aggregations of numerous small cities and towns," Wright points out. And 50,000 people hardly make a metropolitan hub. Kokomo, Indiana, 30 miles from his hometown of Logansport, now falls just below that cutoff, but aside from its two large manufacturing facilities, says Wright, it "strikes me as wholly indistinguishable from the hundreds of other small towns that dot the Indiana landscape." Fort Wayne, Indiana (pop. 173,717), in contrast, seems like "a real city." Only about 22 percent of Hoosiers live in the five cities with populations greater than 100,000, but the Census Bureau has 72 percent living in metropolitan areas.

And what about suburbanites? Are they truly part of "urban" America? The term suburb implies "inferiority and dependence," Wright notes, but "the whole point of these communities is to be something other than the cities." People fled to the suburbs to escape the ills of the cities and "to reclaim for themselves and their children some of the still-accessible virtues and insularity of small town American life."

When suburbanites (48 percent of the population in 1990) are added to the 20 percent of the population in non-metropolitan areas, Wright says, it becomes clear that most Americans live in small towns or in places that resemble or seek to emulate small towns.

The small town is much changed, of course. Most of the corner grocery stores have been replaced by supermarkets, and residents now watch cable TV, read national newspapers, and wear clothes made in Taiwan. But over the past half-century, Wright says, "there has been a strong resurgence of traditionalism, of religiosity, of small town ‘American’ and ‘family’ values, and an equally substantial repudiation" of big-city ills. Are these the characteristics, asks Wright, of an urban society?



The Next Welfare Reform

"Reforming Welfare Reform" by Jared Bernstein and Mark Greenberg, in The American Prospect (Jan. 1–15, 2001), 5 Broad St., Boston, Mass. 02109–2901.


When welfare reform turned from buzzword into law in 1996, many liberals feared the worst: that one million children would be pushed into poverty, and 11 million families made worse off than before. So far, those fears haven’t been realized. Yet many of the affected families are not really better off today, contend Bernstein, an economist at the Economic Policy Institute, and Greenberg, a senior staff attorney at the Center for Law and Social Policy.

The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 transformed welfare from a federal entitlement into a program of fixed block grants, with the states given much more discretion over spending. The law (antedated by some state-level reforms) accelerated a decrease in welfare caseloads that had begun in 1994. In that year, the number of American families getting aid was five million; by the end of 1999, it was 2.4 million. Meanwhile, the employment rate for low-income single mothers rose from 39 percent to 55 percent.

While a majority of former welfare recipients are employed at any given moment, "for many the connection to the labor market is quite tenuous," Bernstein and Greenberg say. Only about 40 percent work consistently throughout the year, according to recent studies, and the wages they earn are very low, averaging around $6-8 an hour. Nationwide, about 40 percent of former welfare recipients "are not working and have very high poverty rates." Working or not, many former recipients report having experienced some hardships since leaving welfare.

Yet "state studies consistently find that roughly half of those surveyed report that life is better . . . and that if they could choose to go back on welfare, they would not want to do so," write Bernstein and Greenberg. These mothers seem to have "a sense of hope for the future that was absent in the past." Low-wage workers made significant earnings gains during the 1990s, thanks to the tight labor market, a hike in the minimum wage, and the expansion of the federal Earned Income Tax Credit.

