Excerpt: 'Carved in Sand'

April 5, 2007 -- Memory loss: It's a problem that plagues nearly everyone who reaches middle age, yet few people want to talk about it.

Now acclaimed journalist Cathryn Jakobson Ramin is lifting the veil off memory loss with her new book, "Carved in Sand: When Attention Fails and Memory Fades in Midlife." She takes readers through the brain changes that happen in midlife, mixing scientific findings with humorous anecdotes to paint a clearer picture of what's going on inside the middle-aged mind.

For anyone who's ever had a bout of forgetfulness, "Carved in Sand" is an indispensable read.

Read an Excerpt From "Carved in Sand" Below:

Chapter 1

Your Unreliable Brain

Midlife Forgetfulness Is Embarrassing and Frustrating, but What Does It Mean for the Future?

On the drive from the suburbs to the city, we'd experienced a disturbing number of memory lapses. Actually, the first bout with forgetfulness had occurred earlier that afternoon, when our friend Sam, who was three hours away in Reno judging a barbecue contest, forgot that we had dinner plans. After a not-so-gentle phone reminder from his wife, he made the drive home in record time, still carrying a whiff of slow-smoked baby back ribs on his person. That mistake was in the past, but other canyons loomed before us. Where was our restaurant, again? (I had printed out the address, as I'd promised -- and left it on the kitchen counter.) Had my husband made the reservation for seven, or seven-fifteen? Which way did Post Street go? Was the nearest parking lot on the corner, or midway up the block? I made the error of mentioning that the little bistro we'd chosen had a great young chef, fresh out of the kitchen of some hotshot who had a restaurant in the Napa Valley, and another in Los Angeles. Or maybe it was Las Vegas. I'd read about it somewhere.

That's what set my husband off.

"Oh, I know exactly who you mean," he said, ready to educate us. Then, he drew a blank. I watched him become increasingly preoccupied as he explored every shadowy cognitive pathway, searching for the name he was after.

I whispered that he ought to give it a rest -- he'd think of it later.

"But it's driving me crazy," he said.

An hour into this hard-earned evening out with friends, more information was missing than present. Among our peers, this state of affairs was so common I'd started to call it "the content-less conversation." When the words "Ken Frank, La Toque, Fenix -- and that is in L.A." finally tumbled from his lips, we cheered. We could move on to other things, like whether any of us had ever tasted the nice bottle of wine we were ordering, and if we had, whether we'd liked it. Maybe we'd only heard about it. Or read about it. Or seen it on the supermarket shelf. No one could say for sure.

"I guess this is normal," Julia sighed, "but I swear, no one we know can remember a thing."

"It may be normal," Sam said darkly, "but it isn't acceptable. Maybe thirty years ago, when life was slower and thirty years of dedicated service earned you a gold watch and a pension, it would have been okay."

He was right: What was making us nuts hadn't flustered our parents in their forties and fifties. But their lives were different, and so were their expectations. They weren't changing careers or inventing new ones. At the age of fifty-two, they definitely weren't trying to remember to show up for back-to-school night for three kids at two different schools.

Normal -- But Not Acceptable

Nearly every time the subject of forgetfulness arose, people asked me if what they were experiencing was "normal." If they defined that word as the dictionary does -- "conforming with, adhering to, or constituting a norm, standard, pattern, level or type" -- the answer was yes: perfectly.

Everybody asked, but in truth, few people were content with the implications of "normal." What they really wanted to know was whether they were just a little (or a lot) better off than their peers. This was important: If they slipped below the mean, chances were good that they would not be able to keep up.

What was normal had changed considerably over the centuries: Two hundred years ago, if we aged "normally" -- that is, according to our biological destiny -- forgetfulness wouldn't be an issue at forty-five or fifty: Most of us would be in our graves. Medicine constantly redefines what is normal in terms of physiological aging. We get new knees and new hips. We take drugs to control our blood pressure. We don't give up reading when our fading vision demands that we hold the newspaper at arm's length. Instead, we build ourselves an arsenal of reading glasses and scatter them all over the house and office, in case we forget where they are. When the New York Times Magazine began to run a Sunday cartoon series with wording in a font so small that I couldn't manage it even while wearing my reading glasses, I suffered no damage to my self-esteem. And yet, when it comes to what scientists call "age-associated cognitive changes," we take it personally and refuse to do anything about it, mostly because we're not sure what we can do.

This sheeplike complacency occurs because the brain is our most intimidating organ. Your brain, with you from the start, has demanded remarkably little attention. Like me, you've probably spent more time worrying about the condition of your abdominal muscles. We assume that what is going on in there is as mysterious as the universe, involving such concepts as consciousness, being and soul, surely best left to the philosophers and the clerics. Here's the news: From a purely biochemical perspective, your brain -- a three-pound bolus of fat, with the texture of lightly scrambled eggs -- is essentially the same organ a rat is carrying around on its shoulders. As a result of your genetic inheritance, certain aspects of how your brain will age are already inscribed in the Book of You, but it's written in pencil, and you have an eraser. Recent studies of pairs of elderly identical twins, only one of whom developed Alzheimer's disease, show that genetics, although influential, aren't everything. As you will see, you can indeed influence the way your brain ages: through diet, through physical and mental exercise and -- assuming you've done all you can in those departments -- through the increasing availability of pharmaceuticals intended to enhance cognition.

Say it to yourself: "Normal, but not acceptable." And, from what the senior scientists tell me, definitely mutable, subject to the quality of your resolve.

I'll wager that when presbyopia set in, sometime in your forties, and you could no longer read the small print, you didn't tell yourself that if you only tried harder, you could conquer your farsightedness, nor did you ruminate about hiding your deficiency from your friends, family and employers.

Not so when your memory began to fail, when you suddenly had the attention span of a flea and felt as if you were moving in slo-mo through cognitive Jell-O. In that, there was a portentousness: Maybe the foundation of your existence was crumbling and this was the beginning of the end. For people who have always been very competent, with a talent for thinking on their feet, forgetting brings a disturbing sense of the loss of control and mastery.

"Memory loss is a stealth killer," said Peggy, a corporate consultant. For years, her quick mind and her sharp wit were her calling cards. Companies hired her because she was a very fast study, able to march into a corporate boardroom, absorb quantities of unfamiliar data, assess the holes in a strategy and produce a new, improved one, fast. "It's getting a lot harder," she said. "There are a million ways I know my brain is different. It's managed, in a few short years, to rob me of my pride and self-esteem. I notice that the 'ah-ha' moments of insight, on which I rely, are fewer, and frankly, they yield less fruit."

People who glided through their school days and early professional years blessed with nearly encyclopedic recall and the ability to keep multiple plates spinning in the air, who've never relied on calendars or kept meticulous notes, suffer the most. "I used to be able to do it all almost flawlessly," said Rudy, who runs a manufacturing company. "It was all up here," he said, pointing to his bald pate, "but recently I've started to cross wires."

Predictably, he blew a circuit. He sent off a new manual to be printed and bound -- 100,000 copies -- failing to indicate the location of a critical switch. The new equipment, fresh from the factory, piled up in the warehouse, while the corrected manual was reprinted, at considerable expense. No one got paid for a long time, and he acknowledged that his error could have tanked the company. "It could not have happened before," he said, "and I find it terrifying."

It may seem as if midlife forgetfulness arrives overnight. "One day I was fine," acknowledged Laura, the marketing manager of a retail company. "Better than fine, really. I was a whiz. And the next day, I was struggling to remember the names of the people who work for me." In fact, the decay of memory and attention is a slow process that begins in our twenties, when we start to lose processing speed. (Mice, rats and primates experience the same decline.) Because you're endowed with redundant systems -- enough spare neurons to get you through -- you don't feel it right away. By the time you reach your early forties, however, there are significant differences from the early-to-mid-twenties peak -- and it's essentially downhill from there.

"It's not the fact of the memory deficit that's the problem," avowed clinical psychologist Harriet Lerner, author of "The Dance of Anger," when I asked her what she thought lay beneath the surface. "It's the anxiety that comes with it. Forgetting becomes globalized. It's no longer about the file you can't find on your desk. Now it's about the prospect of rapid mental deterioration. You interpret your latest error to mean that you're on a fast, deep slope to aging, perhaps faster than your best friend or your colleague, and that you will shortly be revealed as inadequate, unworthy, unintelligent and undesirable company for the people who pay your salary, as well as the ones you love and respect. Not surprisingly, that's when the fear and shame kick in."

The more anxious you become, Lerner explained, the more likely you are to ruminate, an activity that is guaranteed to make matters worse. "People become overfocused, in an obsessive, nonproductive way," she said. "You start waking up at three in the morning, catastrophizing about what you've forgotten. You begin to avoid circumstances where your weakness will be revealed -- staying out of conversations and backing away from work challenges, for instance. But avoidance doesn't work. It only makes the shame grow."

When you start that slide into shame and fear, it's difficult to put on the brakes. You lose access to higher-level reasoning, the aspect of thinking that allows you to say, "Hey, maybe you're making a mountain out of a molehill." Instead, the primitive brain takes over and you plummet into black-and-white thinking, where subtle differences don't exist. When shame and fear arise -- even if it's because you can't remember your ATM password -- the primitive brain takes you straight into survival mode. It says, "Hey, buddy, you could be in deep trouble here." Suddenly, this is not about your password. It's not even about feeling foolish as you hold up the Friday afternoon line at the cash machine. It's about your ability to do your job, to manage your life, to remain safe, to feed yourself and your family and to keep a roof over your head.

What Does the Future Hold?

Our forgetfulness is fraught with implications about who we will become as we age. It's easy to allow feelings of embarrassment, frustration and anger to plunge us into fear -- cold, implacable anxiety emerging from the suspicion that we might have decades of dependence ahead of us, with a diminished mind trapped in a still vigorous body.

Without question, the people who worry about it the most are the ones who have had the misfortune of watching a loved one decline into Alzheimer's disease. Today, that experience is common: Alzheimer's occurs in 35 percent of people eighty and older. Of the three hundred individuals who replied to my survey, more than one third had watched a close relative -- parent, aunt or uncle -- fall into Alzheimer's grip. Understandably, they began to observe themselves closely. "After watching my mother lose her memory to Alzheimer's, I am hypersensitive to every little memory slip," said Evelyn, a singer and songwriter. "The other day, I couldn't remember the name of a business acquaintance I wanted to introduce to my husband at the supermarket. She wound up introducing herself, very pointedly, and I felt as if I'd not only insulted her, but that I'd failed in some profound way."

Washington Post columnist George Will explained what it was like to watch the disease take hold of his mother. "Dementia is an ever-deepening advance of wintry whiteness," he wrote, "a protracted paring away of personality. It inflicts on victims the terror of attenuated personhood. ... No one has come back from deep in that foreign country to report on life there. However, it must be unbearably frightening to feel one's self become as light as a feather, with inner gales rising."

I met Phyllis in Weymouth, Massachusetts, at an Alzheimer's clinic. She'd brought her seventy-two-year-old mother, Margaret, to be evaluated. I stood in a niche between the door and the exam table, a fly on the wall as Margaret's new physician asked her questions. She didn't know what floor of the building she was on, but she knew which county she stood in. He offered her three words -- "ball, flag and tree" -- and then asked her to repeat them a few minutes later. Even with coaching, she could not. He asked her to draw a clock, with the hands placed at ten after eleven, but she could not manage it.

During the exam, I watched Phyllis's face. She wore a slight smile, meant, I expect, to comfort her mother, who had no idea why her daughter insisted on this visit. When the doctor offered his assessment -- Margaret was about to enter stage two of Alzheimer's, when the disease usually took off like wildfire -- the patient continued to sit placidly on the table, swinging her legs, but Phyllis started to cry. Margaret would need to stop driving right away, the doctor said, and it would be a good idea to switch out the gas stove in her kitchen for an electric one.

"You're taking away her independence," Phyllis said. "She'll be horrified." I could see the fear clutching at her. She was grieving, of course, for her mother. But there was more to it: Phyllis was considering her own prospects. Nothing scares us more than the prospect of becoming someone else's burden.

In the waiting room, Phyllis confided in me. It wasn't the right thing to be thinking about, she told me, not when her mother was in such trouble. But she wondered if I could tell her something. In the last few years, she'd noticed changes in her memory -- nothing big, for sure -- but enough to set her on her heels. Did her own lapses mean that in twenty-five years, or even less, she'd wind up like her mother?

I knew very well what she wanted me to say: that what she was experiencing was normal, that midlife memory failures, as irritating as they are, signify nothing, that these small incidents of forgetfulness are not evidence of a trickling stream of damage that would eventually grow into a torrent and devastate memory, language and reason.

In good faith, I couldn't tell Phyllis what she wanted to hear. Every day, a new study rolls out of a university lab confirming that Alzheimer's isn't a disease that suddenly rears its head in old age. Current research shows that decades before clinical symptoms arise -- in middle age or even before -- the seeds of Alzheimer's disease are already planted. To insist otherwise is to indulge in the most unhealthy sort of denial. Her symptoms could be portentous -- or might mean that her middle-aged brain was simply overcome with the responsibilities of working, caring for her children and taking care of her declining mother.

Today, 4.5 million Americans have Alzheimer's disease. It's the eighth leading cause of death in the United States, and nearly as prevalent in Japan and Europe. In 2005, Alzheimer's cost the federal government $91 billion in Medicare spending. By 2010, that number will be closer to $160 billion. If the disease proceeds unimpeded, in twenty years the number of people with Alzheimer's is projected to increase to fourteen million. When you consider that by 2030, almost one in five Americans will be over sixty-five, it is apparent that we face a public health burden that will swamp us.

"By the time even the earliest symptoms of Alzheimer's are detectable by clinicians," explained John C. Morris, director of the Center for Aging at Washington University, "there is already substantial brain damage, actual cell loss, in the critical areas of learning and attention. There is little or nothing that science can do to restore those neurons to their original state. We need to start tracking forty-year-olds. You don't want to wait until you have to say to somebody, 'Your PET scan just lit up like a Christmas tree.'"

The goal of nearly every Alzheimer's researcher is to find a way to identify the beginning of the disease process. "If we can meet this disease in the earliest stage, and counteract it with drugs that reverse the damage, science would no longer be in the position of trying to build new neurons," said John Q. Trojanowski, director of the Penn Institute of Aging and Alzheimer's Disease at the University of Pennsylvania. "It is in your forties, or maybe even younger, that normal memory loss begins to diverge from pathological memory loss. In my lab, we're spending a lot of time trying to define that fork in the road, where you either continue to lose a little bit of your memory capacity each year, and remain essentially normal, or you take the other fork, where you are on a downward trajectory, culminating in dementia."

What sparks Alzheimer's disease? Attend an international Alzheimer's meeting, as I did in Philadelphia, and you will quickly realize, as you mill around with five thousand scientists, that for every hypothesis, there's a contradictory one. Read the peer-reviewed journals, however, and you'll see that consensus is building fast, suggesting that for all of us in midlife, a slow aggregation of proteins begins to block communication between the cells, resulting in mild forgetfulness. That's annoying, but it isn't pathological. In some people, for reasons that I'll go into later, these same tiny proteins get out of control, triggering the development of dense plaques and tangles that surround neurons and eventually strangle them. Today, scientists are working hard to develop a test that will show -- at the earliest possible moment -- who is taking the wrong fork in the road.

That the seeds were already flourishing in some of us made for a grim prospect, but David Bennett, the director of the Alzheimer's Disease Center at Rush University Medical Center in Chicago, refused to look at it that way. "What's grim," he said, "is to call it 'normal aging,' to tuck your tail between your legs and refuse to put up a fight, both individually and on the political front." For the first time in history, he explained, the federal government has reduced the dollars it will spend on Alzheimer's research. If you live to the age of fifty, there's an excellent chance that you'll make it into your late eighties.

"If you survive that long, your lifetime risk of developing Alzheimer's is very high," Bennett said. "If you don't want to end your life this way, you need to let Washington know about it. How fast we can get this disease under control depends largely on how much money we can throw at it. One primary prevention trial costs more than thirty million dollars and takes five to ten years. So write to your representative in Congress and tell them you want a treatment by the time you're seventy."

Here's what is clear: No matter how innocent or malevolent your proteins are, how you treat your brain and the rest of your body in middle age will definitely make a difference. In response to considerable pressure from those of us in midlife, scientists have now turned their headlights on cognitive aging. A clear picture has emerged: Except for a very few individuals, who carry a specific genetic mutation, Alzheimer's is not the inevitability that we once imagined. Recent studies demonstrate that midlife is the time to act: to consider your diet, control your weight and blood sugar, amend your sleeping habits, increase your aerobic fitness, attend to your stress level and -- most crucially -- ensure that your brain gets the right kind of exercise. These practices will go a long way toward making sure you take the right fork in the road, reaching old age with the bulk of your marbles intact.