

after humans appeared on the Australian scene. Though the evidence is circumstantial, Roberts thinks it "definitely" implicates humans. But the lethal blow that humans delivered to frightful 660-pound, claw-footed kangaroos, flightless 220-pound Genyornis birds, and other huge beasts was indirect, he believes. Aborigines habitually set fire to the landscape, perhaps to make hunting and traveling easier, and so reduced the megafauna’s food supply. Hunting and climate change may have pushed the big animals the rest of the way to extinction.

"In North America, by contrast," writes Dayton, "hunters may have been in the thick of the faunicidal fray." Ice Age America had saber-toothed tigers, giant antelopes, woolly bison, and woolly mammoths. But by the end of the Pleistocene era, 11,000 years ago, more than two-thirds of the large mammals had died out—once again, after humans had arrived on the scene. According to the "blitzkrieg" hypothesis put forth in 1967 by geoscientist Paul Martin of the University of Arizona, Tucson, early huntergatherers followed their prey across the top of Asia to North America, then southward. Wiping out animals locally, the hunters ultimately drove populations to extinction.

To test Martin’s theory, John Alroy, an evolutionary biologist at the University of California, Santa Barbara, recently ran computer simulations of such an invasion of human hunters in North America, starting 14,000 years ago, and the impact it would have had on 41 species of large, plant-eating animals. "Alroy found that no matter how he adjusted the variables, mass extinctions ensued," Dayton writes. "Even the slowest, clumsiest hunters unleashed ecological devastation," and the largest animals were hardest hit. Hunting and human population growth could have done in the megafauna even without climate change.

But "not everyone is convinced," notes Dayton. Biologists Ross MacPhee and Alex Greenwood, of the American Museum of Natural History in New York City, say that Alroy’s hunter argument fails to explain why extinctions ceased 10,000 years ago, instead of continuing into the current era, the Holocene. But MacPhee and Greenwood don’t let Homo sapiens completely off the hook. They suspect that the human newcomers brought with them a lethal, highly contagious virus, and that it did in the woolly mammoth and the other behemoths of the Ice Age.


No Hocus-Pocus

"The Truth and the Hype of Hypnosis" by Michael R. Nash, in Scientific American (July 2001), 415 Madison Ave., New York, N.Y. 10017–1111.

It is a scene familiar from countless movies. A pocket watch swings back and forth on a chain while a voice soothingly intones, "You are getting sleepy, very sleepy." But hypnosis is more than Hollywood fantasy. It has important, widely recognized medical uses, reports Nash, a professor of psychology at the University of Tennessee, Knoxville.

A National Institutes of Health panel found in 1996 that hypnosis alleviated pain in patients with cancer and other chronic conditions. It also has reduced pain in burn victims and women in labor. A recent review of various studies found that hypnosis relieved the pain of 75 percent of 933 subjects taking part in 27 different experiments. In a few cases, says Nash, the relief was greater than that provided by morphine.

Another "meta-analysis," of 18 different studies, found that hypnosis, in conjunction with psychotherapy, helped treat anxiety, insomnia, hypertension, and obesity. But certain other conditions such as drug addiction and alcoholism "do not respond well" to hypnosis, says Nash.

Psychologists in the late 1950s developed a series of 12 tests to measure the depth of a subject’s hypnotic state. In one test, for instance, the subject is told that he is holding a very heavy ball. If his arm sags under the imaginary weight, he scores a point. The more tests the individual passes, the more responsive to hypnosis he is. On a scale of zero to 12, most people score between five and seven.

Contrary to what one might suppose, readily hypnotized persons aren’t necessarily prone to "gullibility, hysteria, psychopathology, trust, aggressiveness, imagination, or social compliance," says Nash. Instead, they tend to be people who lose themselves in reading, daydreaming, or listening to music.

Studies show that a person’s capacity to be hypnotized, like an IQ score, remains stable throughout adulthood. Identical twins are more likely to have similar hypnosis scores than same-sex fraternal twins, a finding that indicates a possible hereditary factor.

"Under hypnosis, subjects do not behave as passive automatons," Nash observes. Rather, they actively respond to the hypnotist’s suggestions. Yet they typically perceive the sometimes dramatic changes in thought and behavior that they experience—including hallucinations, delusions, and memory loss—as "something that just happens" to them, without any effort on their part. "My hand became heavy and moved down by itself," a subject might say.

The clinical use of hypnosis, Nash believes, may become a matter of course for some patients with certain conditions. Hypnosis is not yet a part of standard medicine, but it has "come a long way from the swinging pocket watch."


The Darwinian Doctor

"Dr. Darwin’s Rx" by Beth Saulnier, in Cornell Magazine (Mar.–Apr. 2001), Cornell Alumni Federation, 55 Brown Rd., Ithaca, N.Y. 14850–1247.


There seems no end to the frontiers of medicine. The latest: "Darwinian medicine," an emerging field that takes an evolutionary perspective on human health. Advocates, notes Saulnier, an associate editor of Cornell Magazine, look at the symptoms of illnesses or injuries that physicians traditionally treat, and ask whether some of those symptoms might actually be beneficial.

Consider fever, for instance. "A moderate fever, below about 103 degrees, actually can speed the healing process," says Paul Sherman, an evolutionary biologist at Cornell University. "It makes the body’s environment less able to be invaded by the pathogen, and it enables its immune system to work faster."

Morning sickness, in the Darwinian perspective, is another misunderstood protective response, writes Saulnier. Sherman and a student, Sam Flaxman, found that women "who experience moderate morning sickness are less likely to miscarry." Meat, eggs, and certain other foods are likely to contain chemicals or pathogens that could harm the developing fetus, so the mother’s nausea and vomiting protect the baby. Thus, women genetically disposed to morning sickness are "more likely to reproduce and pass on the trait."

"Human biology is designed for Stone Age conditions," wrote researchers Randolph Nesse and George Williams in a 1991 article that gave the nascent field of "Darwinian medicine" its name. That design lag can help explain information age maladies.

The craving for fat, for instance, once was "a distinct evolutionary advantage," Saulnier says, since fat has more calories




