2003: Amy Ellis Nutt, The Star-Ledger, Newark, N.J.
Award for Nondeadline Writing

Article list

The Mind's 'I': What is it that makes you think you're you?

December 1, 2002

NEW YORK -- Maps are the tools of dreamers. A map gives substance to possibility, truth to discovery. In the 16th and 17th centuries, cartographers were called "world describers." In the 21st century, it is neuroscientists who are pushing back the boundaries, attempting to describe that final terra incognita, the human mind.

In 1637 the mind was front and center when Descartes announced, "I think, therefore I am." Having proven his own existence, the French philosopher then asked himself the mother of all follow-up questions: "What is this 'I' that I know?"

Nearly four centuries after Descartes essentially threw in the philosophical towel, Todd Feinberg, a neurologist at Beth Israel Medical Center in New York City, and Julian Keenan, an experimental psychologist at Montclair State University, believe they are close to mapping the place in the brain where the sense of self is formed.

Feinberg, author of "Altered Egos: How the Brain Creates the Self," treats patients who have neurological damage, studying how their injuries have robbed them of the key ingredients of their identity.

For many of his patients, stroke, disease and physical trauma -- especially in the right hemisphere of their brains -- have resulted in a kind of self-alienation. They are people whose brains have lost their way.

Keenan, author of the soon-to-be-released "The Face in the Mirror," is researching those same ingredients through experiments that involve magnetic stimulation of the brains of healthy subjects, testing for the thing that he believes makes us uniquely human: self-recognition.

Among all the species on the face of the earth, human beings alone inquire about who they are. Feinberg and Keenan are among a small band of scientists reaching through the mists of memory and emotion to explain how this could be.


Todd Feinberg hunches in his chair as his theory of the fractured self is played out in front of him in a simple game of cards. An elderly couple sitting across from him are playing war, in which two players simultaneously pick up cards from their own halves of the deck and place the cards next to one another. Whoever has the card with the higher face value wins the round.

Sylvia is moving the game along at a clip, and it's clear why. Every time she picks up a card from her own pile with her left hand, she is compelled to pick up a card from her husband's pile with her right hand.

Feinberg, a psychiatrist as well as a neurologist, is fascinated. Quite literally, Sylvia's right hand doesn't know what the left hand is doing. Only occasionally, and seemingly unconsciously, does Sylvia (not her real name) realize that her right hand is meddling with the game, and when she does, she places the hand between her knees and squeezes it to try to keep it from misbehaving.

The 72-year-old woman, who owns an antique store in New Jersey with her husband, suffers from alien hand syndrome, a rare neurological condition. A stroke several months ago damaged Sylvia's corpus callosum, a broad band of 200 million fibers that bind together the left and right hemispheres of the brain. Signals from the left hemisphere, which would normally inhibit the actions of Sylvia's right hand, are not getting through to the other side of the brain. The result is that her right hand seems to have a life of its own.

When she speaks to her husband on the phone from her room in the Center for Head Injuries at JFK Johnson Rehabilitation Institute in Edison, she cradles the receiver with her left hand, but her right hand frequently reaches out and disconnects the call.

When she eats with her left hand, her right hand will wipe the table with an imaginary cloth.

And when she plays checkers, she moves her own piece with her left hand, and then her opponent's with her right.

Sylvia, for all intents and purposes, is a woman of two minds. Which is why, says Feinberg, she not only has a damaged brain, she has a fractured self.

"Alien hand syndrome tells us a lot about brain unity," says the 51-year-old doctor. "It tells us that there is no consciousness or mind that does not require cerebral integration. If you destroy or damage the corpus callosum, there are times at which the brain can act as though it was possessed of two minds, two consciousnesses, two independent entities."

Only in the 20th century did the brain become the primary focus in the search for the self. The ancient Egyptians thought the self, or the mind, was located in the bowels. The Sumerians and Assyrians thought it was in the liver, Aristotle the heart.

Descartes thought they were all wrong. The mind and the body, he wrote, were two separate entities. The body was a physical organ, a complex machine that walks, eats and sleeps, and the mind was a disembodied spirit, intangible and unobservable but altogether real.

In 1949, British philosopher Gilbert Ryle said Descartes' dualism was preposterous. An independent, invisible secret agent inhabiting the body? That would mean there was a "ghost in the machine." Ryle rejected the idea of two separate entities. There was, he contended, no intangible self, no "homunculus," or miniature man, directing a person's thoughts and actions from the inside. Instead, a person simply was his thoughts and actions, and the world was processed entirely by the gelatinous gray and white matter inside our skulls.

The mystical mind was out, the hard-wired brain was in.


Mapping the brain, of course, has not been an easy task. A dense tapestry threaded by archipelagoes of nerve cells, the brain consists of billions of neurons and trillions of synapses. It is the most complex object on the planet. The heart pumps blood, the lungs ingest oxygen, the stomach absorbs nutrients, but the functions of the brain are manifold. It monitors the body's basic processes, coordinates physical movement, perceives, thinks, acts and feels. It is an executive branch of government that ceaselessly plans, reacts and interacts with the organic world around it.

It takes millions of neurons firing in sequence to create the simplest thought, and in the same way the Greek philosopher Heraclitus believed one never steps into the same river twice, we cannot have the same thought twice. Every sensation, every idea, every action creates a unique firing pattern, and each firing pattern creates a wave of neuronal activity that reacts to the one that came before it. At every moment, the landscape of the brain is being redrawn.

The idea that this puzzle of brain activity could be assembled into a single, subjective consciousness has perplexed Feinberg for most of his life. How does a 3-pound lump of matter become a "me"?

"The first thing that I remember discovering in life was that I had a brain," says Feinberg, founder and chief of the Yarmon Neurobehavior & Alzheimer's Disease Center at New York City's Beth Israel Medical Center. "I couldn't have been more than 6 years old, and one day I said to myself: 'I have thoughts and I have experiences. I have consciousness. But where are they? Where are they located? How come I can't see them? How come they can't be touched and measured and weighed?' And I just could not believe that. Ever since, I've been obsessed with the mind."

In truth, the life of any one mind is irremediably closed, colored by experiences and bounded by the uniqueness of individual perspective. "The mind is its own place," wrote Ryle, "and in his inner life each of us lives the life of a ghostly Robinson Crusoe ... blind and deaf to the workings of one another's minds and inoperative upon them."

"There is no way to find out if your experience of the color red, for instance, is like my experience of the color red," says Martha Farah, director of the University of Pennsylvania's Center for Cognitive Neuroscience. "But if you define consciousness as mental content -- the information contained in thoughts that is reportable by the person, and which they can reflect on and talk about -- then, in that sense, consciousness is a valid subject of scientific study."

It is the content of consciousness that particularly interests Feinberg. The son of two psychologists, he was reading Freud in the second grade and by high school had steeped himself in abnormal psychology. After graduating from the University of Pennsylvania summa cum laude and receiving his medical degree from Mount Sinai School of Medicine in New York, Feinberg took on a dual residency in psychiatry and neurology, becoming an expert on both sides of the Cartesian coin -- the mind and the body.

Like the best scientists, Feinberg is a tightrope walker, searching for purchase on the subtlest threads of evidence, trusting only his sense of imbalance to tell him how much farther he has to go. Every day, he tests the wire.

"When I got out of the shower this morning, I stubbed my toe and it hurt and it hurt," says Feinberg, who lives with his wife and teenage son in Tenafly. (A daughter studies psychology at Syracuse University.) "And I said to myself: 'Boy, if that isn't mysterious. Why and how am I in pain if those neurons that are telling me I'm in pain aren't themselves throbbing in pain?'

"Where is that pain? When I stubbed my toe I didn't grab my head in pain. I grabbed my toe. If I had an 'autocerebroscope' and I could look through it and observe my own brain, I might see neurons firing in patterns, but I'll never find that pain. I can't touch it. I can't see it. I can only experience it.

"And that, in a nutshell, is what is so mysterious about consciousness."


Subjective experience can't be seen or heard or touched. It simply is. Feinberg calls this the "transparency problem."

There is a second aspect to self-awareness that deepens the mystery, the "binding problem," which is: How do billions of different neurons come together to form a single unified self, and if we know where the neurons are located, why can't we find the self?

It's a bit like looking for Beethoven's Fifth Symphony in the sheet music. The score includes all the notes played by the violins, the cellos, the timpani and so on. But where is the music, this thing called "Beethoven's Fifth Symphony"?

For Feinberg, solving the problem of the unity of consciousness is like building a cathedral from a billion blades of grass. "If you're a really obsessional person like I am," he says, "then you can't give up. You don't ever say it can't be understood. So you just don't stop."

Feinberg's office at Beth Israel is a testament to the neurologist's professional and personal fixation. Framed brain scans adorn his wall like family photographs. Feinberg points to each of them in turn: "This is an interesting one because it's abnormal. That's a normal one. That's an abnormal one." Ever the teacher, he asks his visitor about the picture just over his right shoulder: "Can you tell what's wrong with this one?" The entire corpus callosum is missing. "Amazing," he says, "isn't it?"

The secrets of the self, Feinberg believes, lie in the brains of his patients.

"As slicing an apple reveals its core, the neurological lesion or damage opens a door into the inner self," he writes in his book. "It provides an opportunity to examine the physical structure of the self and to see how the self changes and adapts in response to the damaged brain."

There are many ways to define consciousness, including the act of simple perception. Self-awareness and the ability to recognize that other people have self-awareness represent the highest order of consciousness.

"Everything, in the last analysis -- every feeling, every thought, every memory, every state of mind -- has to be represented by a brain state," says Gordon Gallup, psychologist at the State University of New York at Albany and one of the first to study self-recognition in primates. "These things aren't generated in a vacuum."

Many of Feinberg's patients are stroke victims, and many, if not most, have suffered damage to the right side of their brains. Because the right hemisphere affects the left side of the body (and vice versa), these patients have problems with their left limbs, as well as vision and general movement on the left side.

Sylvia is an unusual case. She was examined by Feinberg at the JFK Johnson Rehabilitation Institute's Center for Head Injuries in late September as a guest of Joseph Giacino, the center's associate director of neuropsychology, who is a frequent collaborator with Feinberg. Sylvia is Giacino's patient. Both doctors agree Sylvia has suffered an anterior cerebral artery stroke in the left hemisphere of her brain, which has led to damage on the left side of the corpus callosum, as well as in supplementary motor areas.

Basically, Sylvia's brain was deprived of oxygen and a small bundle of cells between the two hemispheres liquefied, then hardened into a scar the size of a half-dollar.

In many of the alien hand cases Feinberg has seen, the limbs act in violent opposition. (Unlike Sylvia, patients who have damage only to the corpus callosum always develop alien hand syndrome in the left hand.) When one of Feinberg's patients tried to button his shirt with his right hand, his left hand unbuttoned it. When he picked up a forkful of food with his right hand, the left hand knocked it away. Another patient reported that her left hand tried to strangle her while she slept.

"The thing that grabs one's attention here," says Feinberg excitedly, "is the fact that you have two hemispheres in one person with competing and conflicting attentions, and that highlights the incredible unification in normal intact individuals. ... The sense of the self is the sense of a unified self, of personhood."

Some cognitive scientists believe this fact makes it impossible to localize consciousness to one area, or even several areas, of the brain.

"The frontal poles of the brain separate humans from all other living things," says neuropsychologist Mark Wheeler of Temple University. "But that is not going to be the whole story. You don't lose consciousness by losing a bit of brain tissue. ... There are physical correlates to everything. Questions like 'How does a brain state become a mental state' I don't know how to answer. Neuroscience has done an incredible job in the last few years; the philosophy of mind hasn't moved much in 300."


In his filing cabinets, Feinberg has hundreds of scans of patients whose sense of personhood was shattered by stroke or disease. Atop the cabinet are scores of videos of many of those patients going back 18 years. In one, an elderly, hearing-impaired woman who knew sign language and could read lips looked at her reflection in a mirror. Feinberg asked her what she was doing, and she said she was communicating in the mirror with someone else, someone who was very much like her and attended the same grade school but was nonetheless a stranger.

Talking about this person in the mirror, the woman said: "She's not a very good lip reader. I had to talk mostly in sign language for her, to make her understand. ... She's not that bright. I hate to say that. ...

"She's a nice person. But one thing about her ... I see her every day through a mirror, and that's the only place I can see her. When she sees me through the mirror, she looks a little, then she comes over and talks to me, and that's how we began becoming friends."

Feinberg's diagnosis was a delusional misidentification problem known as "Capgras syndrome for her mirror image," caused by atrophy in the right temporoparietal region of the brain. Except for misidentifying her own reflection, the woman was perfectly normal.

Some of Feinberg's patients suffer from asomatognosia, in which they deny or misidentify a part of their own body after it has been paralyzed by stroke. In all of the cases Feinberg has seen, the damage was to the patients' right hemisphere, causing them to attribute ownership of their left arm to another person -- a relative, a stranger -- or even a pet.

Some patients try to throw the disowned arm out of bed. Others, trying to acclimate, create stories about the arm, give it nicknames such as "Toby" or "Silly Billy," or simply refer to it as "a canary claw," "a sack of coal" or "dead wood."

Feinberg's research has shown a peculiarly gender-specific phenomenon associated with asomatognosia. Women frequently will mistake their left arms for their husbands' arms. Men will frequently mistake their left arms for the arms of their mothers-in-law.

There is no cure for most of these patients, but over weeks and sometimes years, their symptoms often diminish and even disappear -- a testament to the resourcefulness, as well as recuperative ability, of the damaged brain.

Sylvia, after just a few months, already has begun to recognize and gain more control over the actions of her right hand.

Feinberg believes that the sense of identity is probably a mixture -- what he calls a "nested hierarchy" -- of coordinated functions arising out of several areas of the brain, but he believes, too, that the right hemisphere is dominant as the source of the self.

Julian Keenan's belief is stronger, and more specific. The right hemisphere isn't simply dominant in the formation of self-awareness, he says, it is essential.

"I think there actually is a center" of the self, says Keenan as he leans back in a chair in his office at Montclair State University. "There are definite neural correlates of higher-order consciousness that, if you mark them out, the person is no longer conscious, no longer capable of self-awareness."

Just a tenth of an inch beneath the furrowed ridges of gray matter that cover the right front side of the brain, he contends, is a layer of tangled cell tissue that makes us uniquely human.

While acknowledging there may be other similarly minuscule areas of the brain that contribute to consciousness, the 32-year-old experimental cognitive psychologist has come to the conclusion that the right prefrontal cortex -- located just above the right eye -- is the primary source of self-awareness.


Two years ago, while conducting postdoctoral research in behavioral neurology at Harvard Medical School, Keenan created an unusual experiment to test for "self-face recognition," which he regards as the hallmark of higher consciousness.

"What we know, as far as self-face recognition is concerned, is that it's reserved for a very few species," says Keenan, who lives with his wife, Ilene, in Jersey City. "Only chimpanzees, orangutans and humans have the ability to recognize an image as their own. So what we wanted to do was see where in the brain that takes place."

Volunteering as test subjects were five people about to undergo brain surgery at Boston's Beth Israel Deaconess Medical Center for severe epilepsy. During the presurgical evaluation of each patient, the two hemispheres of the brain were anesthetized, one at a time, while the patient stayed conscious and alert. After each hemisphere was numbed, Keenan and his colleagues showed the person a photograph with a morphed image blending the patient's face with that of a famous person -- Marilyn Monroe or Princess Diana for the women, Bill Clinton or Albert Einstein for the men. After the testing, each patient was presented with two conventional photos, one of himself or herself and one of the famous person. They were asked which was the one they remembered seeing under anesthesia.

The results were startling. When the right hemisphere was anesthetized, four of the five recollected seeing only the famous person. With the left hemisphere numbed, all five patients remembered the morphed picture as a photo of themselves alone.

"We really saw that the right hemisphere was the big player in self-recognition," says Keenan, "and in particular the right prefrontal cortex." His conclusion: That is where the self resides.

For Keenan, thinking about thinking is a deeply personal preoccupation. It's easy to imagine what a scan of the young psychologist's brain would have looked like: storm clouds of electrical activity roiling through the right hemisphere, firing up neurons as if they were lights on the Rockefeller Center Christmas tree.

"It started when I was 15 when I read 'Gödel, Escher, Bach' by Douglas Hofstadter," says Keenan, referring to the 1979 best seller about mathematics, art and music, "and that book has just always stayed with me -- all those self-referential systems. We all think about our own thoughts. We all think about, 'Am I the only person on this planet and everyone else is just a robot?' We all have these sorts of ideas about our own thinking, about the little voice in our head. ... I guess I've always thought that everyone thinks like that."

Keenan is an abstract painter and a musician and likes to speak in visual terms. In considering the brain, he cites an article by John Updike about baseball Hall of Famer Ted Williams on the eve of Williams' retirement.

"Updike described Williams, who at that point was old and injured, as looking like a Calder mobile with one of its threads cut. I thought that was the most beautiful description. Everything went off balance just a little bit. And as I read that, I thought: What a great description for the brain. There are these separate sorts of units and they're all in balance, and even though they may look independent, damage to one area will affect other, far-reaching areas."

Every year Keenan asks students in his Introduction to Physiological Psychology course to create a mobile with the brain as the governing theme, and he hangs the best ones in his office. As he speaks, a mobile of a neuron, made out of multicolored pipe cleaners, dangles delicately overhead.


While Keenan and Feinberg are traditional materialists, believing that the mind is nothing more than brain functions, others, like Daniel Dennett, a cognitive scientist at Tufts University, argue that the self is not a thing to be found at all -- that mental states don't arise from neural states, they simply are neural states. Dennett once declared about consciousness: "It's like fame. It doesn't exist except in the eye of the beholder."

Colin McGinn, philosopher of mind at Rutgers University, also believes the self is elusive, not because it is nonexistent but because it is fundamentally unknowable. "We're trapped inside ourselves, inside our own language," says McGinn. "For that reason, trying to describe the contents of our consciousness with the same tools, the same words is inadequate."

William James, the pioneering 19th-century philosopher and psychologist (and brother of novelist Henry James), said that trying to describe introspection was like "trying to turn up the gas (light) quickly enough to see how the darkness looks."

Keenan, however, believes the science of consciousness can transcend linguistic limitations.

In a new series of experiments at Montclair State, he is using a device called a transcranial magnetic stimulator to measure how active each hemisphere of the brain is in tasks involving self-recognition. When gently placed against the skull, the stimulator -- which looks oddly like a thick, metal Mardi Gras mask -- creates a magnetic field that painlessly deactivates a specific area of the brain for a moment as brief as a hundred-thousandth of a second. When the device is held over the area of the right prefrontal cortex -- the area Keenan believes is the source of self-recognition -- subjects routinely take a fraction of a second longer than normal to recognize their face on the computer screen. When the stimulator is held over the left frontal region, nothing happens.

"Again and again, what we're seeing is that the processes of self-evaluation are preferentially engaged in the right hemisphere," says Keenan. "And it is that ability to recognize one's own face that appears to be a hallmark of consciousness. To know that our own face is ours inevitably requires knowledge of the self. Without self-knowledge, it would be seemingly impossible to recognize who we are."

Farah, the Penn neuroscientist, whose primary research is in the neural correlates of cognition, believes self-recognition studies are helping to advance the scientific study of the mind. "A lot of the work on sense of self and the brain is pretty flaky," she says, "but Keenan's and Feinberg's work is credible. Keenan has found distinctive patterns of brain activity that correlate with processing one's own face compared to other people's, and Feinberg finds that certain brain lesions disrupt a person's ability to recognize their own face or arms as belonging to them. This tells us that one's sense of physical self is the result of specific brain systems."

Keenan claims to be obsessed by his work, and a sleeping bag wedged atop the bookcase in his office attests to that. "There's always so much more to know," he says. "There's always just another level of understanding. You think you have a clue, and then you find out you have no clue, and it goes on and on and on. It's never-ending. You can never know enough."

Still, Keenan believes that in the next 10 years he will know enough to have a new map of the brain with more precise coordinates of the self. Describing subjective experience may forever be elusive; describing what it is that makes us most human, he says, is not.

That's all Feinberg is looking to do, too, and he believes the search is profoundly important: "You could argue that aside from intelligence, the sense of the self is probably the greatest human achievement. Without that sense of being a being, where would we be?"


Defying the years

December 2, 2002

OAKLAND, Calif. -- Time unravels us. Day by day, it peels away the layers of our lives until nothing is left but the nub of our own mortality.

Human beings are the only animals on the planet capable of contemplating their own demise. We mourn, we memorialize, we philosophize and we pray. And when it happens on that rare occasion that we "cheat" death or "escape" our fate, we believe, just for a moment, in the myth of immortality.

Today scientists are tempting fate in ways never before imagined as they demystify the secrets of longevity. Biochemist Bruce Ames believes that vitamins can repair damaged cells and make them "young" again. Molecular biologist Judith Campisi is studying how to keep cells from aging.

Both believe that while there may be no actual Fountain of Youth, no scientific Dorian Gray in a Bottle, reversal of aging and an extended life span are now on the horizon.

Bruce Ames is a chain-reaction thinker -- one thought always leads to another -- which may explain why the senior scientist at Children's Hospital of Oakland Research Institute is so restless. Ames, 73 and wiry, often starts a conversation sitting down but invariably finishes it standing up, practically sprinting across his office to a blackboard to illustrate something about unattached free radicals or mitochondrial decay.

Chain-reaction thinking leads to big ideas, and Ames is a big idea man. Genes. Cancer. Nutrition. Aging. He has tackled them all, publishing more than 450 articles and becoming one of the most frequently cited scientists on the planet.

"I told a colleague recently that I was doing the best work in my career," says Ames, who is also a professor of biochemistry at the University of California at Berkeley, "and he looked at me and said, 'Bruce, you've been telling me that for 30 years.' I guess that means my enthusiasm genes are undamaged."

Ames should know. Damaged genes have been his business for half a century. Ames grew up in Manhattan as the son of a high school chemistry teacher and a mother who wanted him to be a doctor. Instead he became a researcher, graduating from the Bronx High School of Science before getting his undergraduate degree at Cornell and his Ph.D. in biochemistry at the California Institute of Technology.

In the 1950s Ames was a researcher in a lab at the National Institutes of Health, investigating ways to test for genetic mutation. His petri dish protocol ultimately proved that genes damaged by certain chemical substances often gave rise to cancer. By the 1970s, the "Ames test" was the world's most widely used method for identifying potential carcinogens in everything from clothing to hair dye to pharmaceuticals.

"It's just problem-solving," says Ames about his research methods. "If you have two odd facts in your head and suddenly they fit together, you see some new way of explaining something."

That's what happened nearly a decade ago, when Ames turned his focus from cancer to aging.

How and why we age has been a mystery since humans first contemplated their own mortality. It is one of the most complex of biological processes: The human body contains more than 250 types of cells, and each type has its own peculiar aging characteristics.

There are scores of different theories about aging, but all of them can be broken down into two broad camps: theories that regard aging as the result of normal wear and tear from environmental insults and metabolic processes; and theories that regard aging as the result of a pre-programmed genetic plan, a process that begins at birth, or even at conception, and continues until our "biological clock" runs down.

As a scientist who loves studying process almost as much as its results, Ames falls in the wear-and-tear camp. His years of watching the cellular chaos created by cancer have given him perspective on the degradation of cells that comes with aging.

"In 6 million years of evolution, we've gone from a short-life creature to a long-life creature," says Ames, "and age-specific cancers have gone up. Thinking about that said to me: A lot of cancer is just about getting old. And that got me interested in aging."

Two odd events kept jangling about in Ames' head: the rise in cancer and the increase in free radicals with age. Free radicals are molecular miscreants, unstable molecules with unpaired electrons that create havoc inside cells by stripping other molecules of their electrons. Was there a direct link between free radicals and aging? Was it possible that free radicals actually contributed to aging?


Ames began by looking at mitochondria, where free radicals are produced. Mitochondria are tiny structures inside every cell that act like furnaces, manufacturing most of the energy that is used by the body. Some cells with high metabolic rates, such as those in the heart muscle, contain many thousands of mitochondria. Other types of cells may contain as few as a dozen.

As energy-producing machines go, mitochondria are spectacularly efficient. The mitochondria use 95 percent of the oxygen consumed by an average cell to help turn food -- fats and carbohydrates -- into a chemical fuel known as adenosine triphosphate, or ATP. Every time we breathe, in other words, we're giving an energy boost to our cells.

During that process, mitochondria strip electrons from food molecules and pass them, link by link, to oxygen in order to function smoothly. But therein lies the problem. During those acts of larceny, a mitochondrion sometimes "misplaces" the electrons it is stealing. Like money flying out the back of a Brink's truck careening around a corner, these stray electrons latch onto oxygen prematurely, creating free radicals that scatter around the insides of cells, bonding indiscriminately with other molecules.

This mischief is called oxidation, and it allows free radicals to become chromosomal rototillers, breaking and mangling DNA at will.

Too many free radicals create a kind of cellular pollution that stiffens cell membranes and wears down enzymes. Too much damaged DNA results in cell mutations (which can cause cancer). Both are signs of aging.

If not for these free radicals, Ames realized, mitochondria could be a cellular Fountain of Youth.

In 1990 he and his colleagues at Berkeley announced the findings of their study. They'd discovered twice as much free radical damage in tissues of 2-year-old rats as in those of 2-month-old rats. Ames had found a crucial link among oxidation, DNA mutation and age: Free radical oxidation doesn't just rise with aging, it causes it. The more that mitochondria "leak" free radicals, the more those radicals end up damaging the mitochondria, which in turn leak even more free radicals.

This vicious cycle only gets worse with age. It is the ultimate biological irony: The thing we most need to live -- oxygen -- is the very thing killing us.

Ames estimates that the DNA in each cell of the human body experiences at least 100,000 "hits," or instances, of free radical damage per day.

"Living is like getting irradiated," says Ames. He admits it's a slight oversimplification, but free radicals created by radiation do the same thing as free radicals created by breathing. "With age, despite the mitochondria trying to keep it all in check, the level of free radicals goes up, which means the level of oxidized protein goes up, which means the level of DNA damage goes up."

Most scientists believe that mitochondrial health is only one cog in the aging wheel.

"Aging is complex and will not be explained by one gene or mechanism," says Jerry Shay, who holds the Distinguished Chair in Geriatric Research at the University of Texas Southwestern Medical Center. Shay believes Ames' research is promising but that other biological processes affecting longevity must be taken into account, since "different tissues may have fundamentally different mechanisms underlying their maintenance and repair."

To prove that mitochondrial dysfunction actually causes us to age, Ames decided to work backward. If he could find a way to restore mitochondrial health by lowering free radical damage, he could improve cellular function. In essence, he could turn back the cells' biological clocks.

(Ames is in no hurry to turn back his own biological clock. He likes to joke that he gets his exercise by "running" experiments, "skipping" the controls and "jumping" to conclusions. His wife of 40 years, biochemist Giovanna Ferro-Luzzi, heard the joke for the 50th time recently and exacted her revenge: "She got me a personal trainer."

Ames says he has time for only about an hour a week with the trainer, but his wife insists they walk the two miles to their favorite Italian restaurant, Oliveto, for lunch at least three times a week.)


It was while visiting his wife's native country in the mid-1990s -- they have a house in Tuscany and an apartment in Rome -- that Ames got the idea for how to improve mitochondrial health and perhaps slow, or even reverse, the aging process.

A dietary supplement known as acetyl-L-carnitine, or Alcar, was sweeping Italy. The latest nutritional fad was being marketed as a pick-me-up, and Ames understood why: Alcar is a naturally occurring biochemical involved in the transport of fatty acids into the cell's mitochondria. In other words, Alcar helps cells produce energy.

When Ames got back to his lab, he started feeding Alcar to his old rats.

And the old rats loved the stuff. Within weeks, they appeared re-energized, and their biochemistry was running more smoothly. There was a problem, however. As the Alcar improved mitochondrial health, it also appeared to increase the level of free radicals. Ames decided to add another nutritional supplement to his rats' diets, the antioxidant alpha lipoic acid. Another naturally occurring chemical, lipoic acid, he thought, should mop up the excess free radicals, letting the Alcar keep tuning up mitochondrial function without the added oxidative damage.

The results were staggering. Said Ames earlier this year, after the findings of his research team were published in the Proceedings of the National Academy of Sciences:

"With these two supplements together, these old rats got up and did the Macarena. ... The brain looks better, they are full of energy. Everything we looked at looks more like a young animal."

Some researchers believe the hope offered by maintaining healthy cells or rejuvenating old ones is limited.

"You can achieve immortality at the cellular level, but I don't see how it would be practical in extending life span," says Robert Lanza, the medical and scientific director of Advanced Cell Technology in Worcester, Mass. "There's a wall at 120 years. We can continue to piece things together. But we're like tires; there are just so many times you can be patched up."

Ames acknowledges he has not discovered the Fountain of Youth but lays claim to a Fountain of Middle Age. The evidence, he says, lies not only in the physical rejuvenation he observed in his rats, but in their improved performance on tests of cognition and memory. Says Ames: "It was the equivalent of making a 75- to 80-year-old person look and act middle-aged."

Ames looks every bit the part of an elderly gent, with his white hair, bifocals and quaint bow tie. While he has a penchant for mixing plaids, his mind is relentlessly mixing and matching ideas.

"I was always sort of a B-student in school, but I loved reading enormously. Still do. I was always a pretty creative thinker. I try to be a generalist. I make my living as a big picture guy, always looking for the next big idea."

Ames put his current big idea into a pill. In 1999, he and a colleague, Tory Hagen, founded a company to sell the energy formula as a dietary supplement. The pill, available over the Internet, includes 200 milligrams of alpha lipoic acid and 500 milligrams of acetyl-L-carnitine, but Ames says the two nutrients just as easily can be purchased separately at any health food store.

While Ames and Hagen's company, Juvenon, licenses the supplement, the University of California holds the patent. Juvenon has yet to make a profit. If it does, the university will get a third. Another third will go to the university's department of molecular and cell biology, where Ames is a professor, and the remaining third will be split by Ames and Hagen, now at the Linus Pauling Institute at Oregon State University.

Clinical human trials are ongoing. Ames, for one, is satisfied enough with the animal results that he takes a dose of his own supplement twice a day. He admits he hasn't noticed any significant changes in himself just yet.

"Is it a reversal of aging or just a slowing?" he asks himself out loud. "The rats seem to do better on the IQ test as well as the treadmill test, so that looks like a reversal. ...

"I don't want to over-hype it. If you're an old rat, it looks very good. But we still have to wait for the results from the human trials. There's every reason to think it's going to work in people. I'm very optimistic."


In her basement office in nearby Berkeley, Judith Campisi perches herself on the edge of a chair and speaks with a wide-eyed enthusiasm usually reserved for first-year graduate students. Campisi is a senior molecular biologist at Lawrence Berkeley National Laboratory. An expert in the genetics of aging and a proponent of the "biological clock" theory, the 54-year-old scientist believes that "reprogramming" human genes to extend life span may not be far off.

It is 7:30 in the evening and Campisi is in no hurry to go home. "I have no separation between my work life and the rest of my life," she admits without hesitation, and the evidence is all around her: three empty yogurt cups in the wastebasket, and on a shelf below a side table, a kind of researcher's survivor kit -- a couple of cans of Progresso hearty chicken soup, a container of Cafe Vienna coffee, a makeup mirror and hand lotion.

Piles of papers rise from the floor like unsteady chimneys, forcing pedestrian traffic to take a serpentine route through the room. The stacks are layered with journals bearing such titles as "Trends in Cellular Biology" and "Experimental Gerontology." On a nearby table, "Handbook of the Biology of Aging," a textbook co-written by Campisi, sits atop a tower of paper nearly as tall as the 4-foot-10 biochemist.

Campisi's research focuses on the telomere, a structure containing a repeated DNA sequence that is found on both ends of every chromosome in the human body. In 1990, Calvin Harley, now the chief scientist at Geron, a California biotech company, discovered that as cells divide, the telomeres of the new cells become shorter.

A few years later, it was shown that in some cells telomeres also get shorter with age. When telomeres become too short, they send a signal to the cell to stop dividing and a natural cellular state called senescence ensues. Campisi believes the primary function of senescence is to fight off cancer.

"Senescent cells are not dead," she says, "they're perfectly alive, they metabolize, but what they can never, ever do again is divide. And if you can't divide, you can't form a tumor. ...

"It's only in the last .00001 percent of human evolution that we have had the luxury of living in an environment where the food supply is good, infections are pretty much kept at bay, and there are no lions jumping out of the savanna to kill us. But for the vast majority of evolution, we evolved under very hazardous conditions. The life span was probably only 25, 30, 35 years at most. So think about what happens. If all evolution really does is devise a system to keep an organism -- keep us -- cancer-free for 30 years, well, then it does a pretty good job."

What it doesn't do is keep us young.

Campisi's research has shown that the longer we live, the more senescent cells our bodies accumulate, and it's those senescent cells, she says, that may play a leading role in making us look and feel old. If she can prove this hypothesis, Campisi will have identified one of the main contributors to aging: We age not because our cells die, but because they stop dividing.

"We reasoned several years ago that because the senescence response is an arrest of cell proliferation, but not cell death, after about age 50 we start to see significant numbers of these cells appearing. And we know from our culture studies that these cells don't function properly, and so we're filling up with these dysfunctional senescent cells the longer we live, and so this may be an important reason we age."


Campisi, like Ames, came to aging by way of cancer research. She came to research, however, by way of a Catholic girls high school.

"When I finally got to college," she says, "I decided I wanted to take classes with lots of guys. I was good at science and I liked it, but the best part was all those men majoring in it."

Campisi, who was born in Queens, graduated from the State University of New York at Stony Brook in 1974 and stayed on for her doctorate, which she received five years later. Along the way, she married, divorced and settled into a career in cancer research. In the mid-1980s, during a postdoctoral fellowship at Boston University, a colleague came calling with an offer.

"He was putting together a project grant on aging, which I wasn't even interested in at the time," says Campisi, "and they needed one more scientist. He said, 'Do you think we could get you interested in a problem called cell senescence?' The funny thing is, he didn't think senescence had anything to do with aging."

Campisi came to see that cancer researchers were looking at one aspect of senescence, researchers on aging were looking at a different aspect of it, and "nobody tried to get those two to come back and talk to each other."

Campisi didn't have to. She looked at both aspects herself, and like several other molecular biologists discovered a critical connection among cancer, aging and cellular senescence.

"When the telomere becomes dangerously short but not completely gone, it sends that signal to the cell to stop dividing," she says. If it didn't, the DNA tips on the end of the chromosome would become raggedy, and the chromosome would start seeking out other broken chromosomal pieces -- and "that," says Campisi, "is the hallmark of cancer.

"Now, how does a healthy cell know that it doesn't have a broken piece of DNA? The telomere."

Telomeres allow cells to senesce, and if such cells stop dividing, they can't form a tumor. "The question is," says Campisi, "what happens to an organism that begins to accumulate senescent cells with age?"

Cancer, again, may hold the key.

While normal cells can divide only a limited number of times -- a threshold known as the Hayflick limit -- cancer cells are essentially immortal, and in 90 percent of them an enzyme called telomerase can be found. Telomerase replaces the bit of telomere clipped off after each cancer cell's division.

If telomerase production can be turned on in normal cells, it seems reasonable to assume that normal cells could be immortalized.

"One thing we've learned from the mouse model," says Campisi, "is that you don't want cells to not senesce at all, because if you do that, you have cancer. What would be great would be to have some of those senescent cells die, so that they don't accumulate with age. That's what we're working on. It's not going to be easy to do that, but that's the idea, that's the long-term goal."

Campisi credits her scientific creativity to her wide-ranging education, which includes an undergraduate degree in chemistry, a Ph.D. in biophysics and postdoctoral research in the biology of cancer.

"I kind of learned at an early age not to be bound by field or science or even technique. And so I think when you have that kind of broad training, you move between fields very easily." Moving between fields allows Campisi to keep looking for the next thing she needs to know. "You have to have this fire in the belly to know the answer to something, and then you just go and find out."

Research, says Campisi, is a lot like one of her favorite pastimes, cooking. A little of this, a little of that -- the best of meals are unplanned, the result of intuition and experimentation. "I consider recipes advisory only," says the molecular biologist.

Likewise in her research. Campisi enjoys creating her own path to an answer, pursuing solutions not with a sprinter's speed but at an ambler's pace, taking the time to search out familiar territory for missed clues and overlooked details.

"I have this philosophy of I just start doing this random walk," says Campisi, "and eventually I wind up where I need to go."

Currently, she is walking her way through the complex problem of aging by trying to identify the molecular mechanisms responsible for cellular arrest, studying the defective genes in premature aging diseases, and determining how telomere length is regulated. The payoff from that research, she hopes, will be a postponement of aging.

Some scientists, such as Jay Olshansky, a professor of public health at the University of Illinois at Chicago, express caution when it comes to the promise of research into aging.

The co-author of "The Quest for Immortality" said last year: "When we survive into old age, just as with automobiles and race cars, things start to go wrong, and unless we can change the structure of the body itself or the rate at which aging occurs, then inevitably things will go wrong as we push out the envelope of human survival."

Lanza, of Advanced Cell Technology, believes the problem of wear and tear will soon be overcome.

"I think there's no question that in two or three decades we'll be able to replace every part of the human body," says Lanza. "Whole organs, like blood vessels and bladders, have already been grown in the lab."

Like Ames, Campisi believes the secret to longevity is about maintaining a balance in the biological processes, whether it's mitochondrial function or the stability of the DNA.

"Eventually you run out of cells, which is why immortality is not on the books," she says. "But a reversal of aging -- as long as you define the aging process segmentally -- is within our grasp.

"We really are talking about how to preserve the health of tissues for the maximal period of time, for a very long and healthy extended middle age."


Faith's Place: Belief in a spiritual power is a universal trait. That's because we've been designed for religious experience.

December 3, 2002

SANTA CLARA, Calif. — The human brain, even at its ancient, primitive core, is less an organ of impulse than a machine of reason. We are built to make sense of things. Our brains restlessly scan the world for patterns in chaos and causes in coincidence.

We crave explanation and, when faced with the ineffable, sometimes we create the answer.

For many people, the answer to the most ineffable question of all — "Why do we exist?" — is God.

Neuroscientist Rhawn Joseph has spent years studying history, myth and biology in his quest to understand the universality of spiritual experience and its evolutionary function.

In his studies of the brains of Tibetan monks and Franciscan nuns, radiologist Andrew Newberg seeks out the relationship between neural activity and mystical experience.

Both men believe that the connection between the brain and spirituality suggests that there is a physiological basis for religion — that human beings, in essence, are hard-wired for God.


Rhawn Joseph slowly stirs his third cup of coffee, staring at the whirlpool of milk that spreads across the top and then disappears. Joseph is an oasis of quiet in the lunchtime havoc of the Cozy Restaurant in Santa Clara. Maybe that's because Joseph is thinking about God.

Joseph believes there is a neurological, even genetic, explanation for religious belief and spiritual experience.

Homo sapiens, he theorizes, have evolved the capacity to experience God primarily through the amygdala, a small, almond-shaped structure buried deep in the brain. The amygdala, along with the hippocampus and hypothalamus, makes up the limbic system, the first-formed and most primitive part of the brain, where emotions, sexual pleasure and deeply felt memories arise.

Says Joseph: "These tissues, which become highly activated when we dream, when we pray or when we take drugs such as LSD, enable us to experience those realms of reality normally filtered from consciousness, including the reality of God, the spirit, the soul, and life after death."

Joseph, who has a doctorate in neuropsychology and is the author of a comprehensive textbook called "Neuropsychiatry, Neuropsychology, and Clinical Neuroscience," published by Lippincott, cites his own clinical and historical research, as well as studies of epileptic patients who have experienced religious hallucinations, as evidence that "spiritual experience is not based on superstition but is instead real, biological, and part of our primitive biological drives."

The quest for the truth of spiritual experience started early for the 51-year-old neuroscientist:

"My grandparents were very religious, especially my grandmother," says Joseph, whose heritage is Jewish and Catholic. "She would read to me from the Bible when I was a little kid. I loved listening to those stories, and I still remember the evening when she was reading to me about Sodom and Gomorrah. I was 3 or 4 years old, and I asked her if God had killed the little children, too, and she said yes, and I said, 'What did they do wrong?'

"She didn't have an explanation for it, and it really made me wonder: What kind of God is this?"

That question stayed with Joseph through college at San Jose State University, where he majored in psychology and minored in philosophy, and at University of Health Sciences/The Chicago Medical School in Illinois, where Joseph enrolled in the neuropsychology Ph.D. program. He finished his requirements for the doctorate in two years, half the usual time, and then interned at the Traumatic Brain Injury Center of the VA/Palo Alto Health Care System, which, says Joseph, gave him "a whole different direction in life: the brain."

After a second internship, this time in the neurology department of Yale University Medical School, Joseph was offered academic positions at a number of colleges and universities.

"I turned them all down," says Joseph. "The salaries were too low, my ego was too big, I wanted to be independent, and I didn't like cold weather.

"It was a big mistake. It is the tragedy of my life. I have always wanted to teach; unfortunately, the right job and the right school in the right location has never been offered."

Today, the scientist is the founder of an independent publishing company in Santa Cruz called University Press. Some of the company's nonfiction books concern astrobiology, the science of consciousness and, in a recent collection of essays, neurotheology — the study of the relationship between brain function and spiritual experience.


There is a maverick, even provocative bent to much of Joseph's writing. He has published a half-dozen books of his own at University Press, including "The Transmitter to God: The Limbic System, the Soul, and Spirituality," and he continues to research a number of subjects, many of them in evolutionary biology.

Most of Joseph's investigations begin with a look into cultural contexts.

"In a lot of myth, you can go back and find elements of history," he says. "You can look at the Old Testament as fanciful stories, or as containing the seeds of historical information. ... If you're going to be a scientist, you can't dismiss things and you have to go take a good look and try and sift through it all. It's like panning for gold."

Joseph admits his approach to questions is sometimes unorthodox. Like other scientific seekers, he is creative and intuitive, more comfortable taking his own route to an answer than someone else's. In the 1970s, Joseph was one of the first scientists to demonstrate the hormonal and environmental foundations of gender differences in learning, as well as the neuroplasticity of the brain — recovery of brain cells — in primates. Though he is unaffiliated with any academic institution, Joseph has been invited to speak on all these subjects, at different times, by the University of California at Berkeley, Brigham Young University and the University of Geneva in Switzerland.

For the past 20 years, Joseph has been mining neuroscience, astronomy, history, religion, archeology and anthropology for clues about the meaning of intense religious ecstasy during which a person may see an image of God or hear the voice of an angel. Joseph believes those experiences are the result of hyperstimulation of the amygdala, which releases large quantities of natural opiates. The same opiates are released in response to pain, terror and trauma, as well as social isolation and sensory deprivation.

"Hyperactivation of the amygdala, hippocampus and overlying temporal lobe gives a person the sense that they're floating or flying above their surroundings," says Joseph. "It can trigger memories and hallucinations, create brilliant lights, and at the same time secrete neurotransmitters that induce feelings of euphoria, peace and harmony."

Many religious people might view the cause and effect in reverse — it is the divine inspiration that activates those areas of the brain, instead of the other way around — but to Joseph, the order is irrelevant. For him, the more important question is, "Why?"

"There are creatures living in caves who don't have eyes," he says, "because there's nothing for them to see. But we have a visual cortex and an auditory cortex, because there are things we were made to see and hear. You don't develop a brain structure to help you experience something that doesn't exist." We are hard-wired for God, in other words, because there is a real God to experience.

Matthew Alper, author of "The 'God' Part of the Brain," believes this assumption is flawed. "We're capable of repression, of phantom limb pain — our capacity to believe what isn't there is also sometimes helpful."

Joseph acknowledges this but argues there is an equally possible alternative explanation for spiritual experience: evolution.

"Maybe the ability to experience God and the spiritually sublime is an inherited limbic trait. Maybe we evolved these neurons to better cope with the unknown, to perceive and respond to spiritual messages because they would increase the likelihood of our survival."

We became genetically predisposed to spirituality, says Joseph, because belief in a divine being makes us stronger.

It also makes us less anxious, says Alper, and that is critical for a self-conscious species like our own.

"Consciousness creates so much anxiety that our species had to come up with a cognitive adaptation to deal with the pain of our intelligence — being able to think about our own mortality, for instance," he says. "So it came up with a brain modification that allows us to believe in an alternative reality, that when we die there is a spiritual part of us that will live forever."

Proving the evolution argument, says Massimo Pigliucci, an associate professor of ecology and evolutionary biology at the University of Tennessee-Knoxville, is an entirely different matter.

"It is possible that if there is an advantage — that believing in an afterworld or God reduces anxiety or allows you to better navigate the world — that nature selected for that belief. But there's no evidence for that, and not only do we not have any evidence, there is no way to gather the evidence. It is inconceivable that you could do an experiment on survival of people who believe in an afterlife, because human beings in the past evolved in a totally different environment than any of us live in today."


The lack of opportunity for empirical studies does not deter Joseph. He cites cross-cultural similarities in near-death experiences; in beliefs in ghosts, spirits and demons; and in symbols such as crosses, triangles and circles as further evidence of the neuro-anatomical basis of spirituality.

"If you're a scientist and you find people having the same experience, colored by their own cultural differences, all over the world 4,000 years ago and among both children and adults, you have to say, well, there's something there that's worthy of scientific explanation."

For Joseph, the brain is a magical vessel, "an enchanted loom," as neurologist Charles Sherrington wrote 60 years ago, "where millions of flashing shuttles weave a dissolving pattern, always a meaningful pattern though never an abiding one; a shifting harmony of subpatterns."

The search for those patterns, says Joseph, is what brain research is all about.

"You throw open a door and there is another door and another door. Most people don't even realize they're closed, or they don't care. I want to open all of those doors."

When he's not readying his company's next book for publication, Joseph, who is single, spends most of his time writing and reading, or thinking as he walks in a nearby park. His home, he says, is piled high with journals from every scientific discipline.

There is a restlessness to Joseph's mind that is revealed only in his eyes. They are constantly scanning the landscape wherever he is — in a diner, in a park — looking for some missed opportunity to make a connection, to find an answer, or to uncover another question.

Sometimes on the weekend, Joseph will travel around Santa Cruz, where he has lived for 16 years, and stop in at a religious service. One day it's a gathering of Jehovah's Witnesses, another time it's a synagogue. Or the Church of Christian Science.

"Am I religious? No. Am I spiritual? Yes. I certainly don't believe in an anthropomorphic God. I would say the kingdom of God is inside us all. The brain is the chamber of God. It allows us to realize God and contemplate God, whatever God is."


Huddled inside a shoe box of an office that is buried deep inside the Hospital of the University of Pennsylvania, in Philadelphia, Andrew Newberg is also looking for God. Though he believes the limbic system is important in explaining religious phenomena, he does not think it is solely responsible. The complexity and diversity of those experiences, he says, must involve other higher brain structures, specifically the autonomic system.

The 36-year-old radiologist is a study in intensity. When he speaks, his sentences often spill into one another like excited children on the cafeteria lunch line.

"Going back to when I was young, as a child, I just was always asking a lot of questions, always wondering about why we were here and how we can know something and what it meant to have that kind of knowledge and how we got to it."

Newberg's day job is radiology. Three days a week, he takes pictures of kidneys, lungs and hearts, looking for signs of disease. Two days a week, when he has willing subjects, he takes pictures of the brains of deeply religious people, looking for signs of God.

Newberg is conducting brain-imaging experiments trying to identify those areas where neural activity is linked to religious experience. In so doing, Newberg is taking Joseph's theories about the relationship between the limbic system and spirituality one step further.

A dozen times over five years, Newberg has brought in men and women, Tibetan Buddhists and Franciscan nuns, to peer into their brains as they meditate and pray. In the first experiment, involving a Tibetan monk, Newberg attached an intravenous line to the subject's arm and had him meditate inside a small, darkened laboratory on the third floor of the hospital. When the monk was deep into meditation, Newberg injected a chemical tracer into the IV line.

A minute later, the monk was placed on an inclined table, his head directly beneath three rotating lenses of a massive imaging machine known as a single photon emission computed tomography, or SPECT, camera.

The images from the SPECT scans were filled with pools of neon green and red. The patterns represented increased and decreased blood flows to various parts of the brain, especially the lobes. Newberg found areas of increased blood flow in the frontal lobes, where higher thinking takes place, and decreased blood flow in the back or parietal lobes, where spatial orientation takes place.

Newberg said the frontal lobe activity might be an indication of heightened activity in the amygdala, as Joseph theorizes, although better imaging techniques would be needed to prove it.

"We believe that we were seeing colorful evidence on the SPECT's computer screen of the brain's capacity to make spiritual experience real. We saw evidence of a neurological process that has evolved to allow us humans to transcend material existence and acknowledge and connect with a deeper, more spiritual part of ourselves perceived of as an absolute, universal reality that connects us to all that is," Newberg says in "Why God Won't Go Away: Brain Science and the Biology of Belief," one of the two books he wrote with the late psychiatrist Eugene d'Aquili.

The son of Reform Jewish parents, Newberg practices Judaism but has an affinity for Eastern religions as well. His scientific searching, he says, is just part of what it means to be fully human:

"I read in a book about Taoist teaching that there's this two-way street between you and God. And that in some way, as a human being, you have to kind of go up and reach towards it. And to me it's a little bit like that: that whatever is real out there, you have to let it come to you, but you also have to go towards it.

"And I think that the contemplative part for me is the waiting part and the scientific is the going-after-it part. And so for me personally, the thing is to keep pushing yourself toward the questions and keep asking about the issues, and not being satisfied when the answers don't quite make sense."


Newberg has been pursuing reality and trying to describe it for most of his life. As a chemistry major at Haverford College, just outside Philadelphia, he had nearly enough credits to graduate in 1988 with degrees in astrophysics, philosophy, religion and Russian history. Before going off to get his medical degree at the University of Pennsylvania, Newberg conducted an unusual senior research project at Haverford: trying to create life.

"It was one of these classic experiments where you put together methane and ammonia and water in a test tube and then jolt it with electricity or ultraviolet light or something like that and try to make amino acids. That's the basis for a lot of theories about how early life started. I had this big test tube, and I had these visions of coming in and seeing two little eyes looking out at me. I think we may have made an amino acid at some point. ...

"I just wanted to tackle something like that. I'm always tackling the big questions."

In medical school, Newberg realized a lot of his interests kept leading him back to the brain. Radiology — taking pictures of the inside of a person's body and especially the brain — seemed a good fit. Research in dementia and Alzheimer's led Newberg to reading psychiatry bulletins, which in turn led the young doctor to d'Aquili, then an associate professor of psychiatry at Penn's medical school and a pioneer in the neurological research of religion.

D'Aquili was heavily involved in his own work but finally relented and agreed to meet the persistent medical school student. Newberg bombarded the psychiatrist with questions. By the time he finished medical school in 1993, Newberg had teamed with d'Aquili.

"We came up with a very detailed model about what we thought was going on in the brain" during intense spiritual experiences, Newberg said. "So we started working toward testing those hypotheses by doing imaging studies."

The imaging studies revealed that two specific areas of the brain, the posterior superior parietal lobe and the prefrontal cortex, play a critical role in intense spiritual experiences. In their books, Newberg and d'Aquili refer to these two brain structures as the areas of orientation and attention, respectively.

The "orientation association area" is responsible for creating the mental experience of personal physical boundaries and for providing a kind of spatial, three-dimensional matrix in which the body locates and orients itself.

The "attention association area" is critical in organizing all goal-directed behavior and actions.

The SPECT scans of Newberg's subjects during deep meditation revealed two things: that there was increased activity in the attention association area, and decreased activity in the orientation association area.

"Several studies suggest that the attention area is able to focus the mind upon important tasks through a process neurologists describe as 'redundancy,'" write Newberg and d'Aquili in "Why God Won't Go Away." "Redundancy allows the brain to screen out superfluous sensory input and concentrate upon a goal. It's what allows you to read a book in a noisy restaurant or daydream while walking along a crowded street. ... Victims of damage (to the attention association area) are often unable to complete long sentences or plan a schedule for the day. They also frequently exhibit emotional flatness. ...

"We believe part of the reason the attention association area is activated during spiritual practices such as meditation is because it is heavily involved in emotional responses — and religious experiences are usually highly emotional."

In the scans of one of Newberg's Buddhist subjects, this attention association area was lit up in red at the peak of his meditation. However, the orientation association area in the parietal lobe registered little activity.

Newberg asked himself: What if the orientation area was working as hard as ever but the incoming flow of sensory information had somehow been blocked? With no information flowing in from the senses, the orientation association area wouldn't be able to find any boundaries. What would the brain make of that?

Using the evidence of his meditating monks and praying nuns, Newberg says he now believes the brain has no choice but "to perceive that the self is endless and intimately interwoven with everyone and everything the mind senses," and that this perception, to those in the midst of an intense spiritual experience, feels "utterly and unquestionably real."


The University of Tennessee's Pigliucci believes Newberg's experiments are "well-done and interesting," but he takes exception to Newberg's interpretation of the results:

"Suppose we wanted to investigate some paranormal phenomenon, such as telepathy, and you claim that your brain behaves in a particular way when you do telepathy. So we do a brain scan, and we see that the pattern of neural activity will change because you are trying to concentrate on doing telepathy.

"The scan will obviously be different from your brain at rest, but does it show that telepathy is going on? No. The brain is always working. ... You go to the movies, you eat a piece of chocolate, you dream — your brain patterns will change."

Newberg acknowledges that at some fundamental level, the question of the existence of God will forever remain unanswered. "You can't throw open that veil of the brain and get outside of your own brain and see what's going on in the objective external world."

Even if science can't pry open that door, Newberg remains sanguine.

"Regardless of the perspective you take, the idea of God doesn't go away. I don't think we would ever say we could prove or disprove God just on the basis of our imaging studies."

Like Pigliucci, Anne Harrington, a Harvard professor in the history of science, believes Newberg's interpretations are overreaching, but she thinks his attempt to understand, scientifically, the nature of spiritual experience is worth the pursuit.

"If you really want to understand humans, you can't say anything is off bounds," says Harrington. "Max Weber, an important sociologist, gave a talk ... in which he said science was disenchanting us of our idea of reality, and that if people couldn't cope, then the doors of the churches were still open. They (Newberg and Joseph) aren't trying to re-enchant science. They're just saying that everything is still open to scientific investigation."

Newberg is willing to follow those investigations wherever they lead.

"What we're really talking about is that, regardless of whether God truly exists or not, in some sense it's not even a relevant issue. Human beings are always going to have this sense of connection to God, defining God broadly, whether we create it ourselves or whether there really is a God. ... In either case, to me it's a part of who we are. I've always felt that that which is absolute is everywhere, and it's just a matter of being open to it."

How faith happens — its connection to the thick forest of neurons inside our skulls — may be just one more leap of the imagination, the brain's gymnastic way of exercising its instinct for order.

"I don't think I've found any answers yet," Newberg says. "I think I'm finding ways of understanding the questions better, and I think — or, at least, I hope — I'm heading down a path that will give me more and more tools to be able to really answer those questions ... Just because we can't figure these experiences out doesn't mean we shouldn't talk about them, even if they're intensely inexplicable. How we ultimately describe them, that's what's important."

Genesis: If the universe is flat, the best way to show how it happened is to take its temperature.

December 4, 2002

GREENBELT, Md. -- We are acquainted with the universe through science, but we are intimate with it through ancestry. Mingling in our blood is the breath of creation -- hydrogen molecules born billions of years ago in the nuclear furnace that marked the beginning of time. In some fundamental way, our fascination with the cosmos is nothing more than an attempt to understand our own celestial roots.

For more than two decades, physicist Alan Guth has been fine-tuning his answer to the ultimate cosmic question: How did it all begin?

Late on a winter night in 1979, he believed he'd figured it out when he looked down at the pages of equations he'd just scribbled. Embedded in all those numbers was a theory about the origin and shaping of the universe that would shake the foundations of cosmology.

The theory -- called "inflation" -- explained how in the first fraction of a second of the big bang, space rolled out like a cosmic carpet, flat and infinitely large. And just as the weave of a carpet captures dust, the fabric of space captured bits of matter in its seams until billions of years later there were stars, planets and galaxies.

For the past 15 years, radio astronomer Charles Bennett has been working on experiments that could prove or disprove Guth's theory. Bennett's laboratory tool? The ancient light of the big bang.

Next month, Bennett and his colleagues at NASA's Goddard Space Flight Center here at Greenbelt will announce new findings based on years of studying that light, which is called the cosmic microwave background radiation. The results should help confirm not only the shape of the universe but the origin of all matter.

If Bennett's findings bear out what the preliminary evidence suggests, Guth's concept soon could be elevated to a perch in the pantheon of scientific ideas alongside Einstein's theories of relativity.

The power of Guth's proposal lies in its ability to explain not only our most distant past but our present as well -- how a universe that was infinitely small is now infinitely large.

Bennett and Guth, both New Jersey natives, are peering back nearly to the beginning of time, close to the genesis moment, seeking answers to the why and how of creation. For thousands of years, those were questions only philosophers dared to ask.


A deep-throated murmur leaks out of a brightly lit computer room at the Goddard Space Flight Center and escapes down a labyrinth of beige corridors. The hum, it turns out, is not coming from the bank of computers inside the room, but from a nest of pulsing, 4-inch-wide silicon coils that are keeping the computers cool. Without the air conditioning, the computers' 170 processors, which are analyzing information about the heat of the universe from a probe nearly a million miles from Earth, would turn to toast.

Charles Bennett checks the thermometer. It reads a steady 73 degrees. The 46-year-old astronomer smiles. The principal investigator of NASA's Microwave Anisotropy Probe, Bennett helped build the spacecraft that today is taking the temperature of the universe as it was when it began.

Most scientists believe the universe came into existence in an event known as the big bang. Contrary to popular belief, big-bang theory is not about an initial explosion of matter. There was no primordial firecracker that exploded in the middle of nothing.

There was, rather, a ferocious spasm of infinitely dense, immeasurably hot subatomic particles. This spasm created the fabric of space, which has been unrolling for 14 billion years, carrying along with it the remnants of light and heat from the beginning of time.

Of the three most fundamental methods of measurement -- time, distance and temperature -- it is temperature, which measures the motion of particles that make up the universe, that has the most to tell about how the universe was created and the shape it took. Temperature helped determine the contents of our solar system, the size of our planet and the conditions for human life.

Before there were stars, even before there was light, the universe had a temperature. Minuscule dips in the temperature of the universe's background radiation are seen as evidence of slightly denser points in space where matter began to coalesce into what would become the planets and stars.


The universe's first light -- the background radiation -- has been called the afterglow of creation, and it is streaming all around us, invisibly sifting through our hair, mingling with our breath, even settling in our lungs. Most of the time we are completely unaware of its presence, although 1 percent of the "snow" or static picked up by a TV antenna when no program is being broadcast is big-bang radiation.

"The light comes from a time before there were any stars or galaxies, any carbon or oxygen," says Bennett. "The tiny temperature fluctuations (anisotropies) that we're measuring in the universe hold the key to its shape."

Because light travels at a finite speed (186,282 miles per second), it takes time for it to reach us. The farther away an object, the longer it takes. The light from the sun, for example, takes eight minutes to reach the Earth; the light from the Andromeda Galaxy, 2 million years. Writer Edgar Allan Poe, an amateur astronomer, was the first to suggest some stars were so distant their light had not yet reached us.
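The eight-minute figure is easy to verify; a quick sketch, assuming an average Earth-sun distance of about 93 million miles:

```python
LIGHT_SPEED = 186_282        # miles per second, the figure in the article
SUN_DISTANCE = 93_000_000    # assumed average Earth-sun distance, in miles

# Travel time = distance / speed, converted from seconds to minutes.
minutes = SUN_DISTANCE / LIGHT_SPEED / 60
print(round(minutes, 1))  # about 8.3 minutes
```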

The deeper we look into space, therefore, the deeper we are looking into our past. What astronomers such as Bennett are hoping to find there is a snapshot of the infant universe and evidence of its first features -- the tiny bits of subatomic matter that one day would become the stars, planets and galaxies.

When the universe was 100 million years old, the first stars appeared. At 1 billion years old, the first galaxies. Galaxies gathered into clusters. Clusters congregated into superclusters. The universe cooled to minus 455 degrees, and the primal scream of creation became a sigh.

For decades scientists sifted through the background radiation like cosmic archeologists trying to dig up an artifact -- some piece of evidence that their theories about the big bang and the beginning of time weren't just numbers and equations, but had substance and reality.

If a detailed picture of the universe's background radiation revealed subtle variations in temperature, that would be evidence that the landscape of the early universe had tiny hills and valleys where matter would gather, just as Guth's theory predicted.

Those were the issues facing Bennett when he worked on the Cosmic Background Explorer from 1984 to 1996. COBE was the most ambitious effort to measure the background radiation since its discovery in 1965.

Bennett loves sweating the small stuff. In fact, he was born for it. The son of a scientist, he spent the first two years of his life in New Brunswick, where his father was in graduate school at Rutgers studying solid-state physics. The family moved to Bethesda, Md., when his father landed a job as a research scientist with the National Bureau of Standards (now the National Institute of Standards and Technology).

Bennett grew up a tinkerer, happiest when he was building things -- especially new and better antennas for his ham radio. His world was circumscribed by circuits, transistors and capacitors until he was 14, when his grandmother gave him a telescope.

"Every night I'd take it into the back yard and look at whatever I could see. I loved looking at the moon, the planets and especially the rings of Saturn. Then I learned that there was something called radio astronomy. You get to build circuits to look at stuff in the sky.

"That really just combined the two hobbies I had, and I decided right then that that's what I was going to do. ... It used to drive my friends crazy that I knew exactly what it was I wanted to do. I didn't want to do just physics or astronomy. I wanted to do radio astronomy."

His freshman year at the University of Maryland, Bennett was one of about 120 physics majors. By the time he graduated four years later, there were half a dozen.

"I wasn't the brightest guy in there," he says with a laugh, "and so I had to make up for it with harder work ... but I loved it. I loved being able to solve problems."

Bennett graduated with high honors in physics and astronomy and headed off to the Massachusetts Institute of Technology for his Ph.D.

In the mid-1980s, just as he was finishing his doctorate in radio astronomy, COBE came calling. The COBE team, headquartered at Goddard, was starting to build the instruments that would be used to measure the cosmic microwave background radiation, and the team needed a radio astronomer.

By the time COBE launched on Nov. 18, 1989, Bennett was the deputy principal investigator for the Differential Microwave Radiometer, the instrument that would measure the brightness of the radiation and produce a map of the average anisotropies in cosmic temperature. The measurements of these variations, it was hoped, would bring into clearer focus the origin, shape and size of the universe.


At the beginning of the 20th century, the universe was thought to be finite, bounded by the edges of the Milky Way. By the end of the century, the Milky Way was just one of a hundred billion galaxies, and the sun one of trillions of stars. The limits of space had been pushed into infinity.

The Hubble Space Telescope, launched in 1990, was an attempt to see into that vastness. And its views confirmed what scientists had believed for some time -- that the universe was uniform in all directions, but also that it was a bit lumpy. Instead of matter being spread out evenly through space like butter on bread, it looked like a bowl of cold, clumpy oatmeal someone forgot to stir. Oceans of stars were pooled into galaxies, galaxies were bunched into superclusters, and in between was a latticework of gas and dust and seemingly empty space.

Throughout the summer and fall of 1991, Bennett sat in front of his computer at the Goddard Space Flight Center. Day after day, hour after hour, he studied thermal maps of the sky. When he was not on his computer at Goddard, he was on his laptop at home, often spread out on the floor of the family room after he and his wife, Renée, had put their two young boys to bed.

Finally, that December, Bennett felt satisfied with what he was seeing. When the team announced its findings in the spring of 1992, the reaction by the scientific community was nothing short of astonishment.

COBE had produced a map that showed the background radiation differing in temperature ever so slightly in different directions, sometimes just 30 millionths of a degree hotter or colder than average. "These variations are so small," says Bennett, "they're like height variations of only 4 inches on a mile-high plateau," or the difference in the weight of a cup of sand when one grain is removed. Still, these ripples in the background radiation were just irregular enough to correspond to the slight clumpiness from which all structure in the universe evolved.

"Seeing these fluctuations is like the first peek into a window of the physics of the early universe," says Princeton cosmologist and MAP team member David Spergel. "All these things we couldn't measure before are emerging and keep fitting with the standard model" of the big bang.

Another COBE team member said that seeing that first thermal map of the background radiation was like "seeing the face of God."

COBE's detection of tiny temperature fluctuations was the strongest evidence yet in support of theories that suggest the universe is flat, with just enough matter to keep it glued together while it continues to expand -- instead of too little matter, which would make it fly apart, or too much, which would make it collapse back on itself.

Bennett and his team of astronomers had done something cosmologists usually only dream about: They had verified that all the late-night musings and academic papers of theoreticians, all the mathematical hypotheses and conjectures, were grounded in reality. Big-bang and inflation theories were the best things on the table not because they were the best guesses, but because they fit the evidence.

COBE's limitation, however, was that it could measure the differences in average temperature only for huge swaths of the sky. It couldn't pinpoint the fluctuations. Enter the Microwave Anisotropy Probe, or MAP, launched in June 2001. Next month Bennett and the MAP team will release its first findings, which are expected to answer the question: What did the universe look like shortly after its birth?

"What we have is a bunch of theoretical possibilities of what the temperature patterns mean for what the universe is like," says Bennett. "And each of those specific theories predicts a kind of pattern. MAP is going to measure the pattern that's really on the sky, and then it's like a detective story: matching the fingerprint with the mug book."


Alan Guth believes he made that match more than 20 years ago. In 1979, the young physicist invented inflation theory, which described why the big bang happened and then what happened in the next trillionth of a trillionth of a trillionth of a second.

By the time COBE was launched, Guth had been waiting more than a decade, without much hope, for the empirical evidence that would tell him he was on the right track.

That pessimism was born from historical frustration. Cosmology is unlike most sciences, where theories spring from evidence. Ideas about the universe, such as Guth's, are born in the absence of evidence. For the better part of the 20th century, facts about the origin and shape of the universe had been impossible to come by.

Guth seems an unlikely candidate for a scientific revolutionary. The 54-year-old physicist is boyish in appearance, his slightly graying hair swept over his forehead like a '60s surfer. He sits, hunched over, on the edge of a threadbare armchair in his office at M.I.T. in Cambridge, Mass., and speaks quietly, almost conspiratorially, as if he is letting his visitor in on some secret of the universe.

He is.

"The classical big-bang theory was never really a theory of a bang," says Guth. "It was really a theory about the aftermath of the bang. Inflation answers the question of what happened before that -- what made the universe bang in the first place."

Where Charles Bennett is a cosmic gumshoe, tracking down leads and gathering evidence to prove his case, Guth is a cosmic magician, a lover of numbers and equations who still marvels that something he thought up in the solitude of night might be the key to creation.

Both men believe the cosmic microwave background radiation holds the clues to solving the mysteries of the universe's creation, evolution, shape and fate. For Guth, who created his inflation theory 20 years ago in an empirical wasteland, the thermal maps of the microwave background could either confirm or destroy his idea of creation.

In 1979, Guth was at the Stanford Linear Accelerator Center in Menlo Park, Calif. It was his fourth stop on the postdoctoral "beauty pageant" circuit, during which freshly minted Ph.D.s audition for university professorships. Eight years and stints at Princeton, Columbia and Cornell well behind him, Guth saw his career wilting on the vine.

The son of a grocer, Guth grew up in Highland Park and went to the local public schools, then on to M.I.T. for his undergraduate and graduate degrees. Now, with his tenure at SLAC in its final months, he was facing unemployment and, worse, failure as a scientist.

Everything changed in the space of four hours late on the night of Thursday, Dec. 6, 1979.

Guth was holed up in his small study in a rented one-bedroom house not far from the Stanford campus. While his wife, Susan, and 2-year-old son, Larry, slept in the next room, Guth began writing. He and a colleague were trying to rush a paper into print on a topic in particle physics dealing with the transitional phases of the early universe as it expanded and cooled. Guth's job that night was to check whether one of these phases would affect the expansion rate.

By 1 a.m. Guth had the answer, and it was a surprising, emphatic "yes." What came next can be described only as a "eureka" moment, because Guth realized that this idea of an early transition phase could have profound implications for solving the mystery of why the big bang happened at all.

"In order to make the big-bang theory work," he says, "you have to very carefully fine-tune your assumptions about the initial conditions of the universe, to put the universe just on the borderline of the right mass density to allow eternal expansion. Too much mass density would cause the universe to collapse back on itself. Too little would cause it to fly apart. The mass density at the time of the big bang had to be just right. ... So the big question was, what caused that to happen?"

A few pages of equations later, Guth had the answer: inflation.


Guth's insight was to see that before there was a universe, there was an infinitely small energy field of subatomic particles, and what caused the universe to "bang" into existence was a fluctuation in that energy field. Like shaking a can of Coke and then popping its top, in the first trillionth of a trillionth of a trillionth of a second of the universe's existence, an enormous amount of energy was trapped, creating a negative pressure, which in turn caused a sudden and violent stretching of space. In one brief massive burst, the budding universe -- smaller than the width of a proton -- doubled in size 100 times over.
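To get a feel for the scale of that burst: doubling 100 times over is a growth factor of 2 to the 100th power, a number with 31 digits. One line of arithmetic makes the point:

```python
# Inflation's stretch: "doubled in size 100 times over" means each doubling
# multiplies the size by 2, so the total growth factor is 2**100.
growth = 2 ** 100
print(f"{growth:.2e}")  # about 1.27e+30
```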

This transitional phase left small pockets of subatomic particles scattered through space, like the bubbles left in a boiling pot of water after the heat is suddenly turned down. It was those particles that created the seams in the early universe where matter would gather, eventually growing into galaxies and galaxy clusters.

Everything that exists today -- from the stars and planets to every rock, tree and human being -- can trace its ancestry to less than an ounce of original matter, according to Guth's theory.

Early on the morning of Dec. 7, 1979, on little sleep, Guth bicycled to the center -- somehow having the presence of mind to time his ride, a habit he had picked up in graduate school. The soon-to-be-world-renowned cosmologist would proudly note in his journal that night that he had broken his personal speed record with a time of 9 minutes and 32 seconds.

When he sat down at his desk at the center, his first notation wasn't about his bike ride, but his new theory. With equations stirring restlessly around in his brain, Guth wrote at the top of a page in his notebook, "SPECTACULAR REALIZATION."

Within weeks of announcing his theory, the physicist was inundated with offers from universities. But Guth wanted to return to his alma mater, and took his cue from a fortune cookie he opened at dinner one night: "An exciting opportunity lies just ahead if you are not too timid."

Guth called the physics department at M.I.T. to see whether there was an opening. Twenty-four hours later he was offered an associate professorship.

More than a decade later, Charles Bennett's team on NASA's Cosmic Background Explorer project provided the first empirical evidence validating the equations Guth had puzzled out with pen and paper.

Today, inflation is considered by many scientists to be one of the greatest achievements in cosmology in the 20th century. Last year Guth received the Benjamin Franklin Medal in Physics, which in past years had been given to Albert Einstein, Edwin Hubble and Stephen Hawking. Several months ago he was awarded, along with two other cosmologists, the 2002 Paul Dirac Medal by the Institute of Physics, which honored Hawking in 1987.


Using his IBM ThinkPad, which sits on his desk in his M.I.T. office wedged between discarded computer keyboards and 3-foot stacks of paper, Guth calls up four graphs on the screen, one on top of the other. He does this a few times a week, he says, just because he likes looking at them. The graphs are four different measurements of the cosmic microwave background, and the lines appear nearly identical.

"Back when I came up with inflation, I never believed anybody would ever measure the predictions," says Guth, smiling at the results on the computer screen in front of him. "I just thought it would be fun to calculate it. Now that they have, it's amazing. Looking at the results of COBE, and how the measurements agreed perfectly with the predictions of the inflationary model, was absolutely wonderful. It's really gorgeous to look at."

From its debut as little more than a theoretical blip on the screen, inflation theory has become the golden child of cosmology, the best explanation yet of what happened at the genesis moment.

"When inflation was introduced, there were a lot of disbelievers," says Michael Turner, chairman of the astronomy and astrophysics department at the University of Chicago. "So far it has passed the test. It predicted the universe was flat, and that's what we're seeing. It predicted these 'acoustic' peaks in the cosmic microwave background, and that's what we're seeing and what we're zeroing in on. The real question now in terms of inflation is how much of the truth it has and, of course, what caused it."

If inflation is the dynamite of the universe, says Turner, "then cosmologists are still looking for the match."

Inflation theory has been refined and updated and altered. There are now several variations on the theme, including chaotic inflation, extended inflation, hyperextended inflation, open inflation and two-round inflation. All can trace their ancestry to Guth, the struggling Ph.D.

Other recent findings about the cosmic microwave background radiation offer more direct evidence that what is now the standard model of cosmology -- big bang plus inflation -- is correct. There also is every indication that when the MAP team announces its results next month, the evidence finally may be overwhelming.

"I believe it will be a major success for inflation," says cosmologist Andreas Albrecht of the University of California at Davis. "Whenever I talk with someone from the MAP collaboration, they can barely contain their joy at the success of their experiment. The results should be fantastic."

How will it end?: Will the universe disappear, or does a mysterious force have other plans for it?

December 5, 2002

PASADENA, Calif. -- The unknown is a great seductress. Never is this more obvious than when we look up into the illimitable darkness of night. In its capacity to enchant, the universe has been siren and muse to poets and scientists alike.

Some 70 years after the discovery that there were galaxies beyond the Milky Way too numerous to count, the most elusive questions of cosmology are beginning to be answered. Scientists believe they now know what happened in the first microsecond of creation, how the big bang got started and how seeds of energy gave birth to matter.

What they still don't know is how it will all end.

The answer lies within the fabric of space itself, where a titanic tug of war is being staged between gravity and a mysterious dark energy, a repulsive force that is tantalizing scientists with its tenacity.

Understanding dark energy and how it affects space could help in figuring out the future of the universe, whether it is destined to continue expanding, fall back on itself in a violent implosion, or be consumed by everlasting darkness.

Two forces. Three possible futures.

Astronomer Wendy Freedman has all but determined the speed of the universe's expansion. Physicist Paul Steinhardt is working to understand how dark energy affects that expansion.

Together, these discoveries could tell us what will become of the universe hundreds of billions of years from now.


Like many scientists who have a passion for what they do, Wendy Freedman remembers the exact moment her fascination with astronomy began -- on a summer night, beside a lake, under a trellis of stars ribboned by darkness.

Leaning back in a chair in her second-floor office at the Carnegie Observatories, Freedman smiles, her soft brown eyes squinting slightly as if to better focus the memory.

"I was 7 years old and we were on vacation at Lake Simcoe in Toronto," says the Canadian native. "And I remember that the sky was very, very dark and my father told me that the light from the stars we were looking at had left a long time ago, maybe so long ago that those stars weren't even there anymore. And that just knocked my breath out."

Freedman, 47, is a soft-spoken anomaly in a field historically dominated by testosterone and braggadocio. But she is a no-nonsense astronomer who has spent 25 years reeling in galaxies with a telescope and transforming their starlight to data.

Templing her fingertips together, she talks about her decade-long search for the Hubble constant, a number that would tell her how fast the universe is expanding. That simple, two-digit number had been the quarry of some of the 20th century's greatest astronomers -- until Freedman bagged it in 1999.

"It took many, many years and there were lots of low points," she says. "You keep measuring and remeasuring, and problems come up with calibrations, and things go wrong with the telescope, and it seems like you're never, ever going to finish and find the answer."

Many before Freedman certainly had tried, beginning with Edwin Hubble himself, the American astronomer who first demonstrated the existence of galaxies outside the Milky Way (and for whom the Hubble Space Telescope is named). It was Hubble who, in 1929, was the first to observe galaxies rocketing away from each other. His conclusion, that the universe was expanding, was viewed by many as a second Copernican Revolution, further displacing the notion that the Earth was the cozy, static center of a one-galaxy universe.

The question that plagued Hubble and other astronomers for years afterward was how to measure the speed of that expansion. The answer looked simple. Find distant stars and then measure two things, the speed at which they are receding from Earth, and their distance.

Easier said than done. A star's outward speed, known as redshift velocity, can be calculated by observing the wavelength of the star's light (the longer and redder the wavelength, the faster the star is speeding away), but distance is another matter altogether.
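In code, the redshift arithmetic is a one-liner: the fractional stretch of a known spectral line gives the redshift, and for nearby galaxies the recession velocity is approximately that fraction times the speed of light. A minimal sketch (the wavelengths here are illustrative):

```python
# Recession velocity from redshift: the fractional stretch of a spectral
# line's wavelength gives the redshift z, and for small z the velocity
# is approximately z times the speed of light.
C_KM_S = 299_792.458  # speed of light in km/s

def recession_velocity(observed_nm, rest_nm):
    """Approximate recession velocity (km/s) for small redshifts."""
    z = (observed_nm - rest_nm) / rest_nm  # redshift: fractional shift
    return z * C_KM_S

# A hydrogen line emitted at 656.3 nm but observed stretched to 662.9 nm:
v = recession_velocity(662.9, 656.3)
print(f"redshift velocity of roughly {v:.0f} km/s")
```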

In 1838, the distances to the stars nearest Earth were measured for the first time using a simple geometric principle known as triangulation, or parallax.

To understand how parallax works, hold your index finger in front of your face at arm's length and look at it while quickly covering one eye and then the other. Your finger appears to jump back and forth because you are looking at it from two angles. Now hold your finger directly in front of your nose and do the same thing. Your finger seems to jump more dramatically. The same thing happens when astronomers look at a star from opposite sides of Earth's orbit around the sun, six months apart. The closer the star is to the Earth, the larger the parallax.
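The geometry reduces to a simple reciprocal: by definition, a star showing a parallax of one arcsecond lies at a distance of one parsec, about 3.26 light-years. A sketch, using the modern parallax of 61 Cygni, the star whose distance was first measured in 1838:

```python
# Distance from parallax: by definition, a parallax of one arcsecond
# corresponds to one parsec, so distance (in parsecs) is simply the
# reciprocal of the parallax angle (in arcseconds).
def parallax_distance_parsecs(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds."""
    return 1.0 / parallax_arcsec

LY_PER_PARSEC = 3.2616  # light-years per parsec

# 61 Cygni shows a parallax of about 0.286 arcseconds:
d_pc = parallax_distance_parsecs(0.286)
d_ly = d_pc * LY_PER_PARSEC
print(f"about {d_pc:.1f} parsecs, or {d_ly:.1f} light-years")
```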

For an object far beyond the boundaries of the Milky Way, however, parallax is almost impossible to detect. To measure the distance to extragalactic objects, a different tool is needed: a type of highly luminous star called a standard candle.

Among the best standard candles are a class of stars known as Cepheid variables. Cepheids blink in predictable patterns and with predictable intensity, which allows astronomers to know exactly how bright -- and therefore how distant -- they truly are. Combined with stellar velocity, the distance calculation produces a kind of cosmic speedometer for the expansion of the universe.
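The standard-candle logic is the inverse-square law in disguise: once a Cepheid's blink rate reveals its true brightness, comparing that with how faint it appears gives its distance. Astronomers package this as the "distance modulus." A sketch with illustrative magnitudes (not figures from the article):

```python
# Standard-candle distance via the inverse-square law, written in the
# astronomer's distance-modulus form: m - M = 5 * log10(d / 10 parsecs),
# where m is how bright the star appears and M is how bright it truly is
# (known for a Cepheid from its pulsation period).
def candle_distance_parsecs(apparent_mag, absolute_mag):
    """Distance in parsecs from apparent and true (absolute) magnitude."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Illustrative numbers: a Cepheid whose period implies a true magnitude
# of -5, seen at an apparent magnitude of 26, lies at:
d = candle_distance_parsecs(26, -5)
print(f"about {d / 1e6:.1f} million parsecs")
```

Distances on this order -- tens of millions of parsecs -- are roughly the scale of the Virgo cluster galaxies Freedman's team targeted.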

Freedman has been studying Cepheids for two decades, and her brain is brimming with facts and figures about the stars. When she talks, she wields those statistics like a mathematical seamstress, using numbers to buckle her sentences together and snap thoughts into place:

"Cepheids had been studied really well for seven decades. Our project was to ... look at those Cepheids and also to find more of them at larger distances. We looked at 800 in 24 different galaxies. A typical galaxy is between 50,000 and 100,000 light-years across, and we took 32 different images of each galaxy. ...

"The breakthrough that the Hubble telescope provided was by getting up above the Earth's atmosphere, where we didn't have the blurring that takes place and where we could survey galaxies that were at a distance 10 times as great as we normally could see from the ground."

With the Hubble Space Telescope, launched in 1990, Freedman knew if she could find Cepheid variables far enough away, she'd have a set of candles by which she could measure how fast the universe was expanding.


The daughter of a medical doctor and a concert pianist, Freedman always had an affinity for research. When she arrived at the University of Toronto as an undergraduate, her intention was to study biophysics, but a freshman astronomy course reminded her of those summer nights on the lake with her father -- and her fascination with the stars.

"I wrote term papers first on the solar system and then star formation and later did a senior thesis on galaxy evolution," says Freedman. "I even had a teacher tell me he was sorry I left the solar system. So I guess I've been moving farther and farther out as I've gone along."

Freedman continued on at Toronto for her Ph.D. in astronomy, eventually married her graduate adviser, Barry Madore, and then began a postdoctoral fellowship at Carnegie Observatories. Within three years she had become the first woman staff member in the institution's nearly 100-year history.

Founded in 1903 by George Hale, one of the leaders of modern astrophysics, the Carnegie Observatories is a small two-story building nestled among suburban homes on a tree-lined Pasadena street. The wood-paneled hallways are populated less with the living -- all of them appear to be behind closed office doors -- than the dead. Newton, Einstein, Hubble -- the brightest lights of science -- form a kind of gantlet as they peer down from paintings and photos that line the Observatories' corridors.

As soon as Freedman got to Carnegie, she drew up plans for a project involving the Hubble Space Telescope: to determine the expansion rate of the universe by finding the most distant variable stars ever observed. The pitch worked. Freedman's Extra-Galactic Distance Scale project was named the key, or primary, project of Hubble before its launch.

Freedman's euphoria was fleeting. Hubble's first observations brought bad news: Instead of new vistas, there was a cosmic blur. The telescope had a flawed lens, and it would be three years before shuttle astronauts could fix the problem. When they did, Freedman had new hope for the success of her hunt for the Hubble constant.

Her best chance was to find a Cepheid variable in the Virgo cluster, the nearest big cluster of galaxies to the Milky Way. If she could find a variable star in Virgo, Freedman thought, it would be more than twice as far as the most distant Cepheids then known and would make an ideal standard candle for measuring the Hubble constant.

The telescope was trained on the edge of the Virgo cluster at a spectacular spiral galaxy known as M100. With more than 100 billion stars, M100 can be viewed face-on through a telescope, its majestic swirling arms filled with bright blue clusters of hot, newborn stars and winding avenues of dust and gas.

The 30 members of the key project team homed in even closer on M100 -- on one of its outer arms laced with young massive stars. They pored over nearly three dozen exposures taken over a two-month period, searching for flickering Cepheids as if they were diamonds in a sunlit sea. It took the computers a month just to crunch the data and another month for the key project team to read through it all.

Most of the time Freedman woke at 3 a.m., and though she, her husband and their two young children lived just a mile and a half from Carnegie, she would boot up her computer at home, impatient to study new data.

On May 9, 1994 -- her daughter Rachel's birthday -- Freedman saw stars that seemed to have that certain brightness.

"I remember sitting there and looking at my computer, and it was amazing. Not only were they there, they were beautiful. They were really high-accuracy, low-scatter, unmistakable Cepheid variables."

Like a nearsighted person finally putting on eyeglasses, Freedman had found what she was looking for, and by the time she and the team finished studying all the images from the Virgo cluster, they had 20 Cepheid variables -- 20 different standard candles that would help them nail down the Hubble constant.

"It's like when you're hiking and you're looking at some distant sight, or you're climbing and it just looks like you're never going to get there and you slog on, and you have nice times but you have hard times, too, and then suddenly you're at the top and you did it and it's exhilarating."


By the summer of 1999, Freedman and her team had the answer that had eluded astronomers for 70 years. After more than 400 hours of observation time on the Hubble Space Telescope, after the sampling of 800 known and newly discovered Cepheids in two dozen galaxies over a swath of sky millions of light-years across, the Hubble constant was no longer a mystery. They had a number, and it was 72.

The universe, the team concluded, was expanding at a rate of 72 kilometers per second per megaparsec (3.26 million light-years) of distance. This means that two galaxies 3.26 million light-years apart are moving away from each other at an average of 160,000 miles per hour, and galaxies twice as far apart are moving away from each other at twice that speed.
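The conversion behind that figure is plain unit arithmetic -- 72 kilometers per second works out to roughly 160,000 miles per hour, and the recession speed scales linearly with separation:

```python
# Checking the arithmetic: 72 km/s per megaparsec means two galaxies one
# megaparsec (3.26 million light-years) apart recede from each other at
# 72 km/s -- roughly 160,000 miles per hour.
KM_PER_MILE = 1.609344

v_km_per_s = 72.0                       # the Hubble constant, at 1 Mpc
v_mph = v_km_per_s * 3600 / KM_PER_MILE # km/s -> km/h -> mph
print(f"{v_mph:,.0f} mph")              # on the order of 160,000 mph

# Twice the separation, twice the recession speed:
print(f"{2 * v_mph:,.0f} mph at two megaparsecs")
```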

Hubble constant in hand, Freedman's team was able to rewind the film of creation back nearly to the beginning of time. But when they did, they had a problem. The age of the universe appeared to be only 8 to 10 billion years -- younger than the generally accepted 14 billion, younger even than some of the stars in our own galaxy.
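The back-of-the-envelope version of that rewind: the reciprocal of the Hubble constant gives the "Hubble time," and in a universe slowed only by the pull of its own matter -- the standard assumption before dark energy -- the age works out to two-thirds of that, landing squarely in the troublesome 8-to-10-billion-year range:

```python
# Why a Hubble constant of 72 implied a too-young universe: running the
# expansion backward, the elapsed time is roughly 1/H0 (the Hubble time),
# and a universe decelerated only by its matter is two-thirds that old.
KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

H0 = 72.0                   # km/s per megaparsec
hubble_time_years = KM_PER_MPC / H0 / SECONDS_PER_YEAR
matter_age = (2.0 / 3.0) * hubble_time_years  # matter-dominated universe

print(f"Hubble time of about {hubble_time_years / 1e9:.1f} billion years")
print(f"matter-only age of about {matter_age / 1e9:.1f} billion years")
```

An accelerating expansion resolves the tension: if the universe expanded more slowly in the past, it needed more time to reach its present size.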

Other astronomers, notably a team led by Allan Sandage, who works just down the hall from Freedman at Carnegie Observatories, had been making different kinds of measurements that fit with an appropriately older universe. But Freedman's team seemed to have the more commanding data -- different methods of measurement, over a wider and more distant range of objects. So where was the error?

In the fall of 2001, two independent teams of astronomers, one led by Saul Perlmutter from the Lawrence Berkeley National Laboratory in Northern California, and another led by Brian Schmidt of Australia's Mount Stromlo Observatory, found an explanation.

There wasn't a miscalculation of the Hubble constant, the two teams concluded. Freedman's team simply didn't know that another, completely unexpected factor was affecting the expansion of the universe -- something they simply called "dark energy," an inexplicable force continually working against gravity.

The universe wasn't just expanding, it was expanding at an accelerated rate, not slowing down as was previously thought.

The discovery astonished astronomers, but it helped clarify the age problem that arose with the Hubble constant. The new calculations showed that today's accelerating universe had taken a lot longer than 8 to 10 billion years to expand to its present size -- 14 billion years was back on the map.

There is some bad news, however. If dark energy continues to cause the hyperexpansion of space at its present rate, the fate of the universe appears horrifyingly grim.

Long after the sun has expended all its hydrogen, becoming bigger and hotter and turning the Earth into a cinder, stars and galaxies will speed away from each other. Eventually, vast stretches of black space will push the clusters so far apart that, for all intents and purposes, there will be no more starlit nights.

Galaxies will die, no new stars will be born and, right before the end, only black holes will fill the infinite darkness until even they are consumed, leaving, at the very end of time, pretty much nothing at all.

Said Michael Turner, a world-renowned cosmologist at the University of Chicago, last year: "We live in a preposterous universe. ... Dark energy. Who ordered that?"

The presence of dark energy wasn't a complete surprise. Ninety-five percent of the universe is a mystery. Only 5 percent of space is filled with known matter -- stars, planets and interstellar gas and dust. Astronomers believe an additional 30 percent of the universe contains "dark," or unknown, matter -- perhaps subatomic particles that have yet to be detected. The rest, a whopping 65 percent, is this dark, repulsive energy that works in opposition to gravity and is seemingly accelerating the universe into extinction.


The discovery of dark energy had opened a window onto the future, and the view was staggeringly bleak. But could there be a different theory, a way of understanding the universe that told a different story?

It's exactly the kind of question Paul Steinhardt had been waiting his whole life to answer.

The path to the end of the universe, however, started with a question about the beginning.

The 49-year-old Princeton University physicist was an early contributor to inflation theory. First set forth in 1979 by Alan Guth, inflation says that the big bang was set in motion by a fluctuation in an energy field that suddenly and exponentially stretched the size of the universe in the first microsecond of creation.

Steinhardt provided some crucial refinements to inflation theory, for which he shared the prestigious Paul Dirac Medal in physics with Guth of the Massachusetts Institute of Technology and Neil Turok of Cambridge University in England earlier this year.

Ironically, Steinhardt is now taking dead aim at inflation. He is a rambunctious scientist, a muscular thinker who prefers the discomfort of nagging questions to the boredom of accepted theory.

"There was always a question in my mind that maybe we just haven't been imaginative enough to think of an alternative to inflation," says Steinhardt. "Probably because I was looking at inflation at close range, I also could see its flaws, incompleteness, and so I've always had my eye out for alternatives. I think that's the way as a theorist that you test an idea, to see how difficult it is to come up with an alternative."

Finding an alternative to inflation was no quick fix. It meant coming up with an entirely new, even revolutionary, model for the universe.

Eight months ago he and Turok presented their new theory, dubbed the cyclic model, to the public. For Steinhardt in particular, the alternative had a significant advantage over inflation: It made dark energy the "good guy" -- an infinite but ever-changing force that would endlessly expand and contract the universe. There would be no cosmic armageddon. The universe -- or a series of different universes -- would go on forever.

"Cosmology has this problem that we can't go back in time and actually see how things were. Our information is through a kind of fossil evidence, so it's always an issue whether you're interpreting that fossil evidence correctly. So you develop a good story -- the big-bang inflation is a good story, and everything seems to fit in it. But how do you know it's the only story?"

Steinhardt began his search for a different story in 1998. First he looked at "brane theory," the branch of physics that suggests there are multiple universes, or membranes, existing in multiple dimensions. What if the big bang was simply a collision between two branes? And out there pushing them together was dark energy?

In cyclic theory, instead of inflation and big-bang expansion, universes undergo an endless sequence of cosmic epochs -- bangs and crunches -- that begin when two membranes collide like two weather systems slamming into one another to create two new ones.

"This assumes that time has no beginning," says Steinhardt, "or at least that the big bang is not the beginning; rather that the universe has gone through many stages of expansion and contraction."

The cosmological community is intrigued by the cyclic model but not altogether convinced. "It has a certain aesthetic attraction," says Arthur Kosowsky, a theoretical cosmologist at Rutgers, "in that you don't have to worry about what caused the big bang, which in the inflationary model is likely a question for metaphysics rather than science. Right now the cyclic model is very new. It seems likely that variants and subtleties will continue to be uncovered for a while."

Juan Maldacena, a theoretical physicist at the Institute for Advanced Study in Princeton, is more skeptical. "It's interesting, but it involves some assumptions that are less well motivated than the assumptions of inflationary theory. It is nice to have an alternative, but I would still bet my money on inflation."


For Steinhardt, the cyclic model serves a dual purpose -- it eliminates the question of what existed before the big bang, and it makes dark energy the force that causes branes to collide. "Expanding and contracting, heating and cooling, being highly dense to being highly under-dense: The best way to produce these features in the cyclic model is by using the physics of dark energy," says Steinhardt.

Dark energy made its first appearance as a hypothetical early in the 20th century. When Einstein was refining his general theory of relativity, he was forced to assume the existence of a mysterious force -- what would later be called dark energy -- to keep gravity from causing the universe to contract. At the time, Einstein and nearly everyone else believed the universe to be static, so the father of relativity factored in what he called a "cosmological constant," represented by the Greek letter lambda, to balance gravity and hold the universe in place.

Einstein, however, was never comfortable with his "fudge factor" and later, when Hubble discovered that in fact the universe was expanding, Einstein called the cosmological constant his biggest blunder.

But maybe it wasn't. Perlmutter and Schmidt brought Einstein's fudge factor back into the picture. How else to explain this repulsive, antigravitational force, this acceleration, that is trying to tear the universe apart?

There are two fundamentally different ways of viewing dark energy. Either it is like Einstein's cosmological constant and is woven into the fabric of space, or it is something that inhabits space and is more unpredictable.

Steinhardt believes the latter -- that dark energy is actually an energy field that interacts with matter and can change in intensity. He calls this dark energy "quintessence."

Steinhardt considered different names for his particular explanation of dark energy -- other scientists were independently calling it the "x-factor" and "funny energy." Finally he let his children select "quintessence" from among several of his own suggestions. The word means "fifth element" and traces back to Aristotle, who believed the universe was composed of four elements (earth, air, fire and water) plus a fifth, ephemeral substance that held the planets and stars in place.

The advantage of quintessence in scientists' calculations is that its energy varies. Early in the universe, when matter dominated, quintessence might have been weak, but with expansion it is now exerting a stronger force.

An energy that varies means the density of the universe is not fixed, nor is its fate. A universe propelled by quintessence is constantly changing, endlessly cycling through periods of expansion and contraction.

Most cosmologists feel forced to choose among three very different futures for the universe:

A future where the universe speeds off into nothingness because it is too light -- that is, there is not enough matter exerting a gravitational pull to hold it together.

A future where the universe collapses back onto itself in a big crunch because there is too much matter exerting a gravitational pull.

A future where the universe expands infinitely but ever more slowly, because its weight is perfectly balanced against expansion.

In Steinhardt's cyclic theory, all three scenarios play a part.

The weight of his "multiverse" is always changing, as universes bang, crunch and then bang again.

"I think the quintessence model is economical," says Marcelo Gleiser, an astrophysicist at Dartmouth College. "It gives us a way to show how dark energy can evolve and change. ... Of course, we still need to find the facts that justify it. But so goes theoretical physics. Often ideas precede observations."


The fact that Steinhardt is proposing an entirely new way of viewing the universe -- when no one else is really looking for one -- is just part of his nature, he says. Big questions need big answers, and he wants to be in the middle of the battle to find them.

The son of an Army lawyer who died when Steinhardt was only 9, he spent the first few years of his life on the move. After settling in Miami with his mother and siblings, Steinhardt quickly took up math and science.

"I remember my father used to tell me these very dramatic stories of people like Madame Curie and all that and about moments of discovery," says Steinhardt as he sits drinking yet another cup of coffee in his office at Jadwin Hall at Princeton. "And ever since then, I thought that was just a wonderful thing, to be the first person to know something, which is the greatest fun in doing science -- that moment when you think you know something important that no one else knows."

Steinhardt studied particle physics as an undergraduate at the California Institute of Technology and as a graduate student at Harvard University. It was at Harvard that Steinhardt, as a physics post-doc, happened to attend a weekly visitor's lecture in which Guth was the guest speaker, talking about inflation.

"That's really how I got into cosmology," says Steinhardt. "And I always thought that his talk was the most exciting and the most depressing talk I ever went to. Ninety percent of it was about how inflation solved so many questions about the universe, and then the last 10 percent was how, unfortunately, once inflation takes hold it never ends."

Laughing, Steinhardt says: "I thought, well, I'll spend a few weeks and try to see if I can find a way around this problem."

It was more like a year and a half, and when he did find a solution, he was hooked on cosmology. What better place than the universe to seek out new discoveries and find answers to age-old questions?

"Cyclic theory is an attempt at a better theory. ... My goal is to explain as much data, with as powerful a theory, as possible. ... That's what drives me."

The attraction of quintessence for the nonscientist is its hopefulness. If dark energy is a cosmological constant, then it will never change and the universe will expand forever until all that is left is a vacuum. But if dark energy is quintessence or something like it, then the universe could one day decelerate or contract and be spared its gloomy fate.

"Quintessence is just a much more exciting and appealing interpretation of what the dark energy is," Steinhardt says. "It's something really, really important. It's not just important for the future, it's important for the whole story."

Steinhardt doesn't know yet how to measure quintessence.

More than most, however, he understands that science is a cerebral ballet of leaps between the known and unknown, between truth and possibility, and that it is the capacity for wonder that inspires the dance.

"Thought is a flash between two long nights," wrote 19th-century mathematician Henri Poincaré, "but this flash is everything."

For Steinhardt, it is thought that creates the story of the universe, a story that astronomer Wendy Freedman continues to read in the stars. The numbers at the heart of the story are almost within reach.

Like all scientists, Steinhardt and Freedman are seekers. Curiosity is stitched into their genes, and the search for ultimate answers is fueled by their uncertainty.

"The goal is to get these secrets out of nature," says Steinhardt. "Whatever it takes."

Stories copyright 2002 The Star-Ledger. Reprinted with permission.
