Rhesus monkeys do not often appear on the front page of the New York Times, but on July 10, 2009, there were two, pictured side by side: Canto, age 27, and Owen, age 29. In monkey terms, this made them the equivalent of senior citizens, but the striking thing was that Owen looked like he could have been Canto’s beer-drinking, dissipated dad. His hair was patchy, his face sagged, and his body was draped in rolls of fat. Canto, on the other hand, sported a thick (if graying) mane, a slender frame, and an alert, lively mien.
What made the difference? Diet. Since early adulthood, Canto had been fed 30 percent less food than Owen. The two monkeys were part of a long-running study of dietary restriction and aging, conducted at the Wisconsin National Primate Research Center in Madison. Beginning in the late 1980s, the researchers had been deliberately underfeeding Canto and some of his unfortunate colleagues. By late 2008, enough animals had died that the scientists could report meaningful results in Science.
The differences were as striking as the side-by-side photos: The calorie-restricted monkeys were far healthier by basic measures such as blood pressure, and had a far lower incidence of age-related diseases such as diabetes and cancer. And they seemed to be living longer: While 37 percent of the control monkeys had died of age-related causes at the time of the report, only 13 percent of the restricted monkeys had done so.
The results seemed to confirm one of the longest-held beliefs about aging: That eating less—a lot less—will help you live longer. Since the 1930s, scientists have learned that restricting diet in many animals, from fruit flies to trout to mice, will extend lifespan, both the average and the maximum. The phenomenon has been known for so long, and observed so often, that it’s been accorded the status of near-dogma in some circles. A devoted group of believers who think the principle should extend to humans has practiced caloric restriction, sometimes eating as little as 1,200 calories per day.
Now a new paper has come out in Nature, reporting a parallel monkey study conducted by the National Institute on Aging. The NIA study began around the same time as the Wisconsin study, with similar experimental conditions. But the Nature authors found no increase in lifespan; the calorically restricted animals lived no longer, statistically, than their well-fed cousins. Even stranger, the NIA control monkeys, the ones who ate a lot, actually lived just as long as the calorie-restricted Wisconsin primates. What gives?
Many of us simply roll our eyes and click away when yet another medical study contradicts the last study—so what else is new? Coffee’s bad for you, until it’s good for you—and so is red wine. Antioxidants are essential, or they’re useless. And so on. Contradictory studies are an essential part of the science-news stream—and, in fact, an important part of science itself. But that doesn’t make it any less frustrating.
The parallel monkey studies are some of the most important and closely watched experiments on aging to be conducted in our lifetimes. It was expected, even assumed, that the NIA results would show that caloric restriction extended longevity—the holy grail of aging research.
The fact that it didn’t, and that the two studies conflict, has unintentionally revealed a different truth about diet and aging. In both studies, the monkeys that ate less were healthier by a number of measures—and suffered far less from age-related disease. Even better, when taken together, both studies reveal a different path toward living a healthier life—one that doesn’t require self-starvation. To understand the new findings, let’s begin with a taster’s tour of the strange, fascinating world of caloric restriction.
The concept goes back to the 1930s, when a young professor of nutrition named Clive McCay noticed that hatchery trout seemed to live longer when they were fed less. At the time, he was looking for more economical ways to raise the fish (it was the Great Depression, after all), and the long-lived, underfed fish were too small to interest anyone. But the phenomenon puzzled him enough that he set up an experiment in his Cornell lab, in which he fed one group of rats about one-third less than another group of rats. In his much-cited 1935 paper, he showed that the restricted rats lived more than 60 percent longer than the normally fed animals—more than 800 days, versus an average of 500 days.
This astonishing result was the equivalent of humans living to age 125 or beyond. Even more amazing was that the experiment was repeatable, not just in rats and mice. Over the years, various researchers have shown that caloric restriction can extend life in bats, dogs, and even spiders, and on down to nematode worms and single-celled organisms like yeast. After decades of work, it remains the only way known to increase maximum lifespan. So a lot is riding on the concept, scientifically speaking.
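For a sense of where that figure comes from, here is a minimal back-of-envelope sketch in Python. It applies the rats' roughly 60 percent extension to an assumed 78-year human lifespan (an illustrative baseline, not a number from McCay's paper) and lands at about 125 years.

```python
# Back-of-envelope scaling of McCay's rat result to human years.
# The 78-year human baseline is an assumption for illustration only.
rat_control_days = 500
rat_restricted_days = 800

extension = rat_restricted_days / rat_control_days - 1  # 0.6, i.e. 60 percent longer
human_baseline_years = 78                               # assumed typical human lifespan

print(f"Lifespan extension in rats: {extension:.0%}")
print(f"Rough human equivalent: {human_baseline_years * (1 + extension):.0f} years")  # ~125
```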
The idea made its way into pop culture, too. In On the Road, Jack Kerouac writes:
I stumbled out of Harrisburg—cursed city! The ride I proceeded to get was with a skinny haggard man who believed in controlled starvation for the sake of health. When I told him I was starving to death as we rolled East, he said, “Fine, fine. There’s nothing better for you. I myself haven’t eaten for three days. I’m going to live to be 150 years old.”
In the 1990s, Leonard Guarente of MIT discovered a class of longevity genes in yeast called sirtuins that appear to be activated by a lack of food. Sirtuins appeared to be “conserved” in evolution, meaning that they appear in nearly all species, on up to humans. Sirtuins are thought to have evolved as a way to enable animals to survive periods of famine. They seem to work by regulating certain metabolic pathways and reducing the amount of damage cells endure.
Caloric restriction, then, appeared to activate some sort of deep survival mechanism common to nearly all life forms. If researchers could somehow identify and isolate that mechanism, they’d be that much closer to some kind of longevity pill. Except for one inconvenient fact: Caloric restriction itself does not always work.
The study published today is itself a specimen of scientific longevity, dating all the way back to the late 1980s. One of the first major long-term projects of the National Institute on Aging, part of the National Institutes of Health, was to test the effects of caloric restriction (CR) in monkeys, the lab animal closest to humans. Such studies in humans are problematic, as one might imagine, because it’s not easy to convince people to spend decades starving themselves—and even if you could, you’d have to wait a lifetime for results (actually longer, if it worked as advertised). Monkeys can’t cheat on their diet or complain about it, and they live only 30-odd years.
The initial group of 60 monkeys was split into two groups. Half were allowed to eat a full ration of food, while the rest were given about 25 percent less. The monkeys were soon joined by another 60 animals; some were young, between 0 and 8 years old, while the rest were older, between 16 and 23 when the experiment started.
The data started coming out in dribs and drabs, in mundane descriptive studies at first. Then in 2003, the NIA team reported hopefully that “preliminary evidence suggests that CR will have beneficial effects on morbidity and mortality.” While 80 percent of the monkeys were still alive, the restricted animals had better measures of cardiovascular health, hormone levels, and blood-sugar management, an early indicator of diabetes risk. So it came as a bit of a surprise, eight years later, to find that the hungry monkeys were not actually living longer.
This was a surprise, and yet at the same time not surprising. The history of calorie restriction research is strewn with odd results that have been left unexplained (at best) or outright ignored (at worst). When Steven Austad of the University of Texas–San Antonio tested wild-caught mice, for instance, he found no caloric-restriction-induced increase in lifespan. In another study, researchers created 42 different cross-bred mouse strains and found that in a third of the strains, caloric restriction actually seemed to shorten lifespan. And even Clive McCay, the father of caloric restriction, found weird results: In his 1935 experiment, caloric restriction worked only in the males.
In fact, caloric restriction really seemed to work best in standard laboratory mice. This may be because they are predisposed to eat a lot, gain weight, and reproduce early—and thus are more sensitive to reduced food intake.
But in a long-awaited, well-funded monkey study like this, an “odd” result could not be ignored. Still stranger was the fact that even though the underfed monkeys were healthier than the others, they still didn’t live longer. They had lower incidence of cardiovascular disease, as well as diabetes and cancer—and when these diseases did appear, they did so later. “To me I think it’s one of our very interesting findings,” says lead author Rafael de Cabo. “We can have a dramatic effect on healthspan [the length of healthy life] without improving survival.”
Even odder was the fact that the NIA’s control monkeys seemed to be doing much better than the Wisconsin controls. In fact, the NIA controls seemed to be on track to live as long, or longer, than the Wisconsin calorie-restricted monkeys. Some of them were approaching 40 years old, previously the highest recorded age for Rhesus monkeys. (Four of the NIA monkeys have actually surpassed 40 at this writing.) What was that about?
At first, it seemed like a scientist’s nightmare: The control group is indistinguishable from the test group. In clinical trials, a result like this would kill any drug candidate. Then de Cabo took a closer look at a seemingly minor difference between the Wisconsin and NIA studies: the animals’ diets.
De Cabo is attuned to food. A native Spaniard who’s reputed to make some of the best paella this side of Cadiz, he would seem an unlikely advocate for caloric restriction. “I love to cook,” he says. “Would I like to practice caloric restriction? I don’t think so.”
It didn’t take him long to realize that the animals’ food was more important than anyone had thought. The NIA monkeys were fed a natural-ingredient diet, made from ground wheat, ground corn, and other whole foods; the Wisconsin animals ate a “purified” diet, a heavily refined type of food that allowed the researchers to control the nutritional content more precisely. Because the NIA monkeys were eating more natural ingredients, de Cabo realized, they were taking in more polyphenols, micronutrients, flavonoids, and other compounds that may have health-promoting effects.
Furthermore, the NIA diet consisted of 4 percent sucrose—while in the Wisconsin diet, sucrose accounted for some 28 percent of the total calories. High sugar consumption is thought to be a primary driver of obesity, diabetes, and possibly some cancers. “In physics, a calorie is a calorie,” says de Cabo. “In nutrition and animal physiology, there is more and more data coming out that says that the state of the animal is going to depend more on where the calories are coming from.”
In other words, it matters whether you eat at Whole Foods, like the suburban-Maryland NIA monkeys—or at the ballpark, like the Wisconsin monkeys. Guess which works out better in the end?
But what does all this really mean for humans? Is it really “healthier” to starve oneself, as some people believe? Or will this latest monkey study finally let us off the hook?
Calorie-restriction data in humans has been pretty spotty, for good reason. You try cutting back 30 percent of your food intake, and see how it goes. Most studies have been short-term, and none have measured longevity, for obvious reasons. One of the few true long-term trials in humans came out of the drama-ridden Biosphere project of the early 1990s.
Funded by billionaire Ed Bass, Biosphere was a 3-acre sealed environment built in the Arizona desert that was supposed to simulate life in a space station. The facility would be completely self-sufficient, with greenhouses producing all the crew’s food, as well as the oxygen they’d need to breathe. The expedition scientist, Roy Walford, happened to be a strong proponent and key early researcher of caloric restriction. (He also bore a striking resemblance to Mr. Clean.)
When it became clear early in the Biosphere project that the earthbound “space station” could not grow enough food to feed the crew of eight, Walford took the opportunity to place his fellow crew members on a restricted diet—30 percent lower in calories, but dense in nutrients. In his study based on the two-year experience, Walford reported that the main effect of caloric restriction was to drastically lower his fellow crew members’ cholesterol levels, to 140 and below—well below the average for people in the industrialized world. Walford concluded that a calorie-restricted diet would have the same beneficial effects that he and other scientists had observed in mice.
Walford would never get to test his hypothesis fully; he died of Lou Gehrig’s disease in 2004, and in his later years he blamed his poor health on the Biosphere experience. Other, larger trials of calorie restriction in humans have pointed to similar results: In the short term, it’s been shown to benefit overweight or obese people, which is no surprise. And a longer-term study of voluntary calorie-restrictors has pointed to improved arterial health, as well as better blood glucose management and other markers of aging.
But so far, there’s no evidence that humans gain any longevity benefit from calorie restriction. “That data will not emerge until about 2040,” says Brian Delaney, president of the Calorie Restriction Society.
And when it does, chances are any effect of calorie restriction may vary from person to person, depending on genetics. “It’s complicated,” says Nir Barzilai of the Albert Einstein College of Medicine in New York. “To some of us it might work, and for some of us it might be dangerous.”
Several studies have shown that excessive leanness—seen often in calorie-restricting humans—can be as risky as obesity. Taken together, these studies suggest that the optimal body-mass index is about 25, which is on the verge of being overweight.
But if it’s OK to be almost overweight, it might not pay to go beyond that. Another key difference between the two monkey studies has to do with the definition of “ad libitum.” While the Wisconsin control-group monkeys were allowed to stuff themselves, with the equivalent of an all-you-can-eat buffet for several hours at feeding times, the NIA monkeys were given a fixed amount of food. “You could view it as the Wisconsin monkeys were overindulging, like the rest of the American population,” says Rozalyn Anderson, a member of the Wisconsin team. Compared with their Wisconsin brothers, then, the NIA monkeys in the non-calorie-restricted control group were arguably practicing a mild form of calorie restriction—and that, Anderson suggests, might have made a difference.
For decades, ever since McCay, the holy grail of aging research has been to extend maximum lifespan—to push out the frontiers of human longevity, past 100, 120, or more. But while in theory those limits may be malleable, a careful look at these major primate studies shows that they might not be, in practice. Even so, calorie restriction does seem to reduce—drastically in some cases—one’s risk of developing age-related diseases like cancer and diabetes. So while calorie restriction may or may not make you live longer, overeating and obesity will certainly make you die sooner. And if eating less doesn’t always increase lifespan, it does improve “healthspan,” our allotment of healthy years.
In the next few years, we’re going to learn a lot about how different genetic types respond to medicine, diet, and other things. And while we might not (yet) know how to live forever, more of us will be able to avoid a long, sad decline and will live longer, healthy lives. If we can get there simply by eating the right foods, but not too much, and avoiding obesity—then just knowing that is a pretty good start.
Individuals with defensive or low self-esteem typically focus on trying to prove themselves or impress others. They tend to use others for their own gain. Some act with arrogance and contempt towards others. They generally lack confidence in themselves, often have doubts about their worth and acceptability, and hence are reluctant to take risks or expose themselves to failure. They frequently blame others for their shortcomings rather than take responsibility for their actions.
To gain a better understanding of your self-esteem, journal your answers to the following:
1.) What do you do when you make a mistake?
2.) What do you see when you look at yourself in the mirror?
3.) Do you like what you see when you look at yourself?
4.) When you are dealing with the issue of being overweight, what do you do? What do you tell yourself? What do you tell others?
5.) When you make a commitment to yourself what happens to that commitment?
Rob hated to run. But he hated to stop even more.
That’s when his disparaging inner voice, the one that had belittled him since seventh grade, would emerge. If he didn’t keep going, it said, he was going to get fat. He would never have the shredded abs that taunted him from every fitness magazine. He would be just a regular guy — not the superman he felt driven to become.
So on he ran. And when even six hours a day of exercise weren’t enough to quiet the voice, he started skipping meals too.
While anorexia, bulimia and other eating disorders are potentially lethal — up to 5 percent of those suffering from them die from suicide, substance abuse or medical issues, according to a study published in the American Journal of Psychiatry — they have traditionally been viewed as women’s problems. Researchers say only 10 percent of those who are treated for the conditions are male.
But a growing body of evidence suggests that number is misleading. A study published last year estimated that males actually make up 40 percent of teens who have eating disorders. An earlier Harvard survey found that men account for 25 percent of adults with anorexia and bulimia.
Some Chicago-area therapists say more men and boys are seeking help. Niquie Dworkin, who practices on the North Side, said males have been tormented by the same kind of unattainable body images that have long plagued women and girls.
“Action figures used to look normal,” she said. “Now they’re superhuman with really cut abs and really big shoulders. Even little boys are being exposed to images of men that are not realistic.”
While eating disorders in men and women appear to have similar roots in genetics, media messages, perfectionism and low self-esteem, the symptoms are often different. Experts say one big contrast is that men usually focus on muscularity, not thinness, and they tend to manage their weight by working out to incredible extremes.
That’s what happened with Rob, 24, a young man from Elgin who asked that his last name not be used. Experts said his case was typical of men with eating disorders.
His trouble began at age 14, not long after bullying schoolmates mocked him for supposedly being fat. Vowing to gain the same kind of lean, athletic physique one of his tormentors had, he started doing 100 pushups a night. He then moved to the weight room, and when he entered high school, the cross-country team.
His parents were delighted. The other runners were laid-back, friendly and supportive, and Rob’s grades improved after he joined the team. He cut junk food from his diet and worked out with a vengeance. Not even a downpour could keep him from his training.
“All the way around, it seemed like a really good thing,” Rob’s mother recalled. “We didn’t think anything of it.”
Almost imperceptibly, though, his routines grew longer. A coach at a summer running camp preached maximum effort — “When you’re not running, another guy is, and he’s going to beat you” — and Rob took it to heart. By the time he was a senior, he made excuses to leave practice early so he could work out even harder alone.
“I wanted to make a name for myself, be something,” he said recently. “Working harder than anyone else in the group made me better. That’s what I thought.”
Strange thing, though: Rob didn’t care that much about winning races or setting records. He didn’t really even like running. Thinking about the hours of exercise that awaited him after school filled him with dread.
But it was far worse to skip a workout or ease up on its intensity, even when he was sprinting at a 4-minute-mile pace on a treadmill set to a 12 percent incline. If he backed down, his inner voice told him, something indefinably bad would happen.
So he absorbed the pain, and after noticing an odd relief in hunger, he began skipping meals too. Mastering his body allowed him to feel as though he could manage a life that had become lonely and socially awkward.
Daniel Le Grange, director of the eating disorders program at the University of Chicago Medical Center, said it’s common for people who suffer from the disorders to express a desire for control and self-affirmation. But any contentment that emerges from starvation and hellish exercise doesn’t last long, he said.
“We have patients who are bleeding because they’re on the carpet doing a thousand pushups and situps a day,” he said. “It never gives you that feeling that you’re yearning for, that you feel good about yourself.”
Rob’s intense exercise led to stress fractures, and he decided not to join the cross-country team when he went to college in fall 2006. But he didn’t let up on his body.
Instead he rose at 6 a.m. for a quick breakfast before heading to the gym for a four-hour workout, including 90 minutes on an elliptical machine and an hour of weights. In the afternoon, after skipping lunch, he walked for two hours before doing repeats on the library steps. He picked at his dinner before rewarding himself for his suffering with a giant piece of pie.
When Rob healed enough to run, his routines grew ever more punishing, his body ever lighter — sometimes dipping below 100 pounds on his 5-foot-7 frame. A photograph taken of him at a swimming pool in July 2009 shows deep hollows beneath his cheekbones. Striated ropes of muscle press through his skin. His arms and legs appear as thin and brittle as sticks.
Rob’s family, long in denial, knew he was in trouble. He knew it too. But even though he had begun to see a therapist, it was easier to follow his compulsions than resist.
“We would have these breakthrough moments where he would say, ‘I know I have a problem, but I’m not ready to give it up yet,’” his sister said. “I always knew when he stopped calling that he was getting worse. Because then he wasn’t ready to hear it.”
In November 2011, Rob sustained another leg fracture, the result of what doctors said was a lack of calcium in his bones. Though he was ordered to rest for a month, he became so frenzied from inactivity that he grabbed his crutches and did hobbled laps around his parents’ kitchen table.
It turned out to be his moment of clarity. He called the eating disorders recovery center at Alexian Brothers Behavioral Health Hospital in Hoffman Estates and had himself admitted.
Staffers there found that his pulse was a dangerously low 32 beats per minute, said Michelle Gebhardt, the center’s clinical coordinator. Their No. 1 job was to stabilize Rob — one of the few males to enter the program — by controlling his exercise and encouraging him to eat.
That turned meals into high drama. On one of his first days, he was presented with a modest portion of scrambled eggs he dubbed “Mount Eggerest.” He could swallow only half. Another time he refused to eat until he was given a ham sandwich; he then declined to finish it.
Finally, Rob recalled, one of his fellow patients had had enough, telling him when he arrived for dinner: “If you sit here you better eat all your food because you are really triggering us with all your crap.”
Therapy and reflection eventually convinced Rob, who was diagnosed with a condition known as eating disorder not otherwise specified, that he needed to change. He yielded to the program and spent a few weeks putting on weight before transferring to Rogers Memorial Hospital near Milwaukee, home to a rare males-only eating disorders program.
His task there was to excavate the psychological turmoil that lay beneath his behavior — the desire for control, the need to feel special, even the fear of becoming an adult — and reset his mind and body to healthy habits.
It wasn’t easy. To remind himself of happier days, he hung a photograph taken a few months earlier at his sister’s wedding rehearsal dinner. The image showed him standing behind his parents and smiling, his skin stretched tightly over the bones of his face.
His roommate asked what he thought of the picture. Rob said he thought he looked pretty good.
“When I see that, I don’t see ‘good’ at all,” his roommate said. “I see death.”
In his three months at Rogers, Rob said, he learned to take a more realistic view of himself and gain more control over his eating and exercise habits. He put on about 45 pounds in treatment and now follows a diet worked up by a nutritionist, dining at appointed times even if he isn’t hungry (his long periods of starvation scrambled the neural circuitry that governs hunger — a common side effect of an eating disorder).
He works out cautiously, lifting weights with his father lest he get carried away. On a recent Sunday morning he went for a slow walk around the block, the only form of cardiovascular exercise he allows himself.
“Sometimes there’s the urge to hurry up,” he said, strolling past well-watered lawns and vibrant flower beds. “It’s a little battle. I usually win.”
Le Grange, the University of Chicago expert, said males are so scarce in eating disorder studies that there is no good data about their chances for long-term recovery. Indeed, while Rob today looks fit and healthy, he says he’ll have to be wary of backsliding for the rest of his life.
For now, though, he has managed to quiet the voice inside him with the mantra he took away from treatment: He is more than his body.
“There are so many other things that set me aside,” he said. “I have my goals and aspirations, like wanting to be a counselor. What I do physically will not be the defining characteristic for me.”
Talk about a disparity between theory and practice. The American Academy of Pediatrics tells parents that children’s total entertainment media time should not exceed two hours daily. According to the Kaiser Family Foundation, kids on average watch at least twice that much television. They also spend more than an hour per day online and another hour on video games. These activities, collectively called “screen time,” are widely blamed for the tripling of obesity rates in children since the 1980s.
Zoning out in front of a television or video game monitor for hours doesn’t seem healthy, but no one yet has found any causal link between time spent lolling on a couch and childhood obesity. In February, for example, researchers in Texas reported their findings on whether it would help kids lose weight to have their regular video games replaced with a more active alternative such as the Nintendo Wii console. In their study, 84 children received Wii consoles and one half of those got a collection of exercise-oriented games like Wii Sports and EA Active, which ask players to move their arms and legs or jump around to control the action. The other half of the kids got “inactive” games like Madden NFL, which can be played from a seated position with minimal full-body movements. The results of the comparison were disappointing. After three months, “there was no evidence that children receiving the active video games were more active in general or at any time,” the authors wrote. (The year before, a similar study in New Zealand had shown only minor improvement with active games; kids weighed just a pound less after six months of “exergaming”.)
Such studies are complicated by the fact that even regular video games—the ones so often blamed for the present rates of childhood obesity—may not be as passive as you think. A decade ago, a physiologist named Arlette Perry at the University of Miami worried that her 10-year-old son Thomas was spending a lot of time with a controller in his hand. To measure the effects of chronic gaming, she studied her son and 20 other children as they played Tekken 3 on a Sony PlayStation in her lab. She found that the fighting game increased the kids’ heart rates and blood pressure to the same extent as walking at 3 miles per hour. Children burned roughly twice as many calories playing Tekken 3 as they did sitting in one place, which translates to an extra 40 to 80 calories burned every hour. In other words, this traditional, “passive” video game was itself providing children with a form of exercise.
If video games aren’t the problem, then what about television? We’ve known for a long time that attempts to reduce television-watching among children have a limited effect on their body weight. For a 1999 paper in the Journal of the American Medical Association, researchers gave a group of third- and fourth-graders in California regular lessons on the dangers of excessive television. Their parents were asked to enforce time budgets (using a device to limit total screen time) and participate in television turnoffs lasting 10 days, among other projects. This very involved, two-month intervention halved television watching among participants. Eight months later, researchers measured the children’s heights and weights, and compared them to those taken from children at a school without a similar program. The drastic reduction in television-watching made for only a very modest difference: Weight gains in the experimental group were reduced by an average of only one pound.
Reducing time spent watching television or playing video games may have some benefits—more time for creative play or academic work, for example—but slimmer bodies don’t seem to be among them. It’s also not necessarily the case that increasing screen time will lead a child to gain weight: Between 1999 and 2010, screen time among kids jumped by more than two hours per day, according to the Kaiser Family Foundation. Yet childhood obesity rates remained relatively stable over the same period.
Taken together, the data above suggest that public health efforts to cut or reallocate screen time won’t have a huge impact on childhood obesity. There is indeed a well-known correlation between obesity and hours spent in front of a video screen, but the fact of that linkage doesn’t tell us anything about causality. Does watching television make kids fat, or do fat kids just happen to watch a lot of television? The accumulating work in this area suggests the latter.
In short, whatever calories a kid might burn off playing Dance Dance Revolution or turning off the TV to go for a walk are small potatoes. Even an adult will only burn off a few hundred calories by working out intensely for half an hour—a benefit that’s wiped out by a single bag of chips or a scoop of ice cream. That might be why taking up dozens of classroom hours in an effort to reduce screen time, or paying to outfit homes with active video games, yields such a small return on investment. That doesn’t mean someone can’t exercise his or her way from obesity to thinness, but the bar is very high. A more efficient way to reduce pediatric obesity would fixate less on the number of calories going out, and more on the number going in.
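As a rough illustration of that energy math, here is a small sketch using ballpark calorie figures; every number in it is an illustrative assumption, not data from the studies cited above.

```python
# Crude energy-balance comparison with ballpark, assumed figures.
workout_burn = 300        # rough calories burned in a vigorous half-hour workout (assumption)
bag_of_chips = 300        # approximate calories in a single-serve bag (assumption)
scoop_of_ice_cream = 250  # approximate calories in one scoop (assumption)

print(f"Workout minus chips: {workout_burn - bag_of_chips} kcal")            # about zero
print(f"Workout minus ice cream: {workout_burn - scoop_of_ice_cream} kcal")  # barely positive
```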
In the end, schoolchildren mostly don’t get obese from watching television. They pack on weight because they eat too much. Consider the results of another major clinical study, in which the families of obese 8- and 12-year-old kids received nutritional counseling, either with or without additional information about exercise. Teaching the parents and children about food habits made almost all the difference: Kids in the study lost around 15 pounds in 6 months, on average, with the extra lessons on the importance of exercise having only a minor impact—a pound or two of extra weight loss. After two years, the effects of dietary changes persisted, at least to some extent; the effects of the exercise teaching remained small by comparison. That’s also why there are no major differences in weight loss among the major dieting programs, like Weight Watchers, the Zone diet, the Atkins diet, and the Ornish diet. They all work to about the same degree, because they all reduce the total of what goes in.
It’s likely that the kinds of households in which kids watch hours of television per day happen to be the same ones where healthy food options are hard to come by, or ones in which the parents are not well educated in how to make good dietary choices. Though it seems like a convenient cure, just disconnecting the cable service isn’t going to fix those problems.
In the United States, 90% of adults consume caffeine on a regular basis — most often by drinking coffee. A study suggests that coffee not only wakes people up, but also may offer some protection against depression. What’s less clear is why this might be.
Researchers at the Harvard School of Public Health and Brigham and Women’s Hospital analyzed data collected from nearly 51,000 women participating in the Nurses’ Health Study, all free of depression in 1996. The researchers then determined how many of the women had developed depression a decade later and compared their caffeine intake to determine whether it affected risk. (They also controlled for other health and lifestyle factors such as weight, cigarette smoking, and exercise.)
By 2006, 2,607 women were diagnosed with depression or had started taking antidepressants. The researchers found an inverse dose-response relationship between caffeine intake and mood: the more caffeine a woman ingested per day, the lower the likelihood that she developed depression during the study period. Women who drank the most caffeinated coffee per day were 20% less likely to develop depression than women who drank the least. Other sources of caffeine — such as tea, soda, and chocolate — did not have an impact on risk of depression. The researchers speculate that this was because these other sources provide such minimal amounts of caffeine in comparison to coffee.
In this study, women in the highest quartile of caffeine consumption — who were least likely to develop depression — were ingesting 550 mg of the stimulant per day. The caffeine content of coffee varies greatly, depending on the beans, how they’re roasted, and other factors. The average for an 8-ounce cup is about 100 mg. That means in this study, women at the highest level of caffeine consumption were drinking about five and a half cups of coffee per day.
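Spelling out that arithmetic, here is a tiny sketch using the study's 550 mg figure and the roughly 100 mg-per-cup average mentioned above; actual caffeine content per cup varies widely.

```python
# Convert the study's top-quartile caffeine intake into cups of coffee,
# using the ~100 mg per 8-ounce cup average cited above.
top_quartile_mg_per_day = 550
avg_mg_per_cup = 100

cups_per_day = top_quartile_mg_per_day / avg_mg_per_cup
print(f"About {cups_per_day:.1f} cups of coffee per day")  # about 5.5
```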
This study was not designed to prove that drinking coffee prevents depression — but it does raise interesting questions about how one variable might influence the other.
Caffeine gets absorbed in the stomach and small intestine and is then distributed throughout the body, including the brain. The amount circulating in the blood peaks 30 to 90 minutes after it’s ingested, and only trace amounts remain eight to 10 hours later. In between, the amount circulating declines as caffeine gets metabolized and broken down by the liver.
Once it reaches the brain, caffeine probably has multiple targets, but the main one seems to be adenosine receptors. Adenosine is a brain chemical that dampens brain activity. By hogging its receptors, caffeine sets off a chain of events that affects the activity of dopamine, an important neurotransmitter involved in mood. Caffeine also indirectly affects two other neurotransmitters, serotonin and acetylcholine. In addition, caffeine acts on areas of the brain involved in sleep, arousal, pleasure, and thinking.
Although the exact way that caffeine acts in the brain remains a matter of conjecture, the behavioral effects are well known. Caffeine perks people up because it increases vigilance and arousal, boosts energy, and counters sleepiness.
One study, no matter how well conducted, is not enough for clinicians to make a recommendation about caffeine intake. And it’s important to keep in mind the limitations of this particular study.
First and most important, the Nurses’ Health Study is a prospective epidemiological study — meaning that the researchers followed participants over time. This kind of study cannot prove cause and effect. It can only suggest an association between caffeine and mood. It’s possible, as the authors point out, that women with mild symptoms of depression or those prone to depression may avoid drinking caffeinated beverages — and not that coffee prevents depression.
Second, while caffeine might boost mood as well as energy, it is also a potent stimulant. In excess, caffeine may cause jitteriness and worsen anxiety, especially in people who are already stressed or sensitive to caffeine’s effects.
In an epidemiological study of men in Finland, where rates of suicide and coffee consumption are both relatively high, the risk of suicide was lower than average for men consuming as many as seven cups of coffee per day. But risk increased for those who drank eight or more cups of coffee per day. Men who drank 10 or more cups of coffee per day were nearly twice as likely to kill themselves as men who drank no coffee.
There are other health issues to keep in mind. Especially in the short term, caffeine also has negative effects, which include raising blood pressure, making arteries stiffer, and increasing levels of insulin and possibly cholesterol. (Habitual use may cause some of these effects to wear off.)
So it’s too soon to advise people to have a cup of morning Joe (or several) to help reduce their risk of depression (or to drink seven cups to reduce suicide risk). But people who do drink coffee regularly should take heart that this is one habit that isn’t a vice — and may even be healthy.
For some, aging may bring on — or rekindle — an eating disorder.
Most people who develop eating disorders — an estimated 90% — are female. Typically associated with adolescents and young women, eating disorders also affect middle-aged or elderly women — although, until fairly recently, not much was known about prevalence in this older age group.
Secrecy and shame are part of the disorder, and women may not seek help. This is particularly true if they fear being forced to gain unwanted weight or stigmatized as an older woman with a “teenager’s disease.”
Despite underdiagnosis of eating disorders in older people, clinicians at treatment centers specializing in such issues report that they’ve seen an upswing in requests for help from older women. Some of these women have struggled with disordered eating for decades, while for others the problem is new. The limited amount of research on this topic suggests that such anecdotal reports may reflect a trend.
In community surveys conducted in 1995 and again in 2005, for example, Australian researchers found that while younger women reported eating disorder behaviors more often than older women did, the rate of these disorders in older women increased dramatically between the two surveys, while it remained stable for young women. In women ages 65 and over, strict dieting, fasting, and binge eating all tripled, while purging quadrupled. In the same surveys, rates of strict dieting or fasting and purging also increased dramatically in women ages 45 to 64. A study of Canadian women surveyed in the general population likewise found that women ages 45 to 64 were more likely to binge on food, feel guilty about eating, and be preoccupied with food compared with younger women.
In the most severe cases, patients develop life-threatening complications, such as cardiac arrhythmias, kidney failure, and liver failure. This is one reason that anorexia nervosa is one of the most deadly psychiatric disorders, killing 5.6% of patients for every decade that they remain ill. Treatment is challenging because starvation not only severely damages the body, but also harms the brain — causing changes in thinking, emotions, and behaviors that may be difficult to reverse.
The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) describes two subtypes of anorexia nervosa. In the restricting subtype, patients drastically reduce food consumption. They may also exercise excessively in an effort to lose weight. In the binge-eating/purging subtype, patients lose weight by forcing themselves to vomit or by using laxatives, diuretics, or enemas.
Once weight decreases to the threshold required for a diagnosis of anorexia nervosa, patients may experience changes in thinking processes, such as difficulty concentrating. They may develop odd food rituals, such as cutting food into tiny pieces, eating only at certain times, and weighing food. Weight gain may eventually improve these psychological problems, but it seldom eliminates them completely — which is why maintenance treatment is so important.
Bulimia nervosa. Bulimia nervosa is characterized by a cycle of binge eating followed by some type of compensatory action to avoid weight gain. Researchers estimate that one to three women out of 100 will develop bulimia nervosa at some point in their lives. In men, the rate of diagnosis is only about one-tenth the rate in women.
Although many Americans overeat by consuming too many calories per day (which helps explain why more than one in three are obese), binge eating involves consuming extreme amounts of food within a restricted time frame — usually within two hours. While on a binge, a patient may eat an entire cake rather than one or two slices, or a full gallon of ice cream rather than a bowl.
The DSM-IV describes two subtypes of bulimia nervosa, based on the strategy a patient uses to rid herself of excess calories. Patients diagnosed with the purging subtype, the most common form, may make themselves vomit or use laxatives or diuretics. This diagnosis overlaps with the binge/purge subtype of anorexia, but people with bulimia do not have the same preoccupation with maintaining a low body weight. In the nonpurging subtype, patients may exercise excessively or stop eating for a day or longer.
If a vicious cycle of overeating and deprivation takes over, patients may eat to the point of physical pain, then compensate so dramatically that they feel ravenously hungry. When the binge-and-compensation cycle occurs at least twice a week for three months, patients meet DSM-IV diagnostic criteria for bulimia nervosa.
Binge-eating disorder. Binge eaters regularly binge, usually in secret and accompanied by feelings of guilt or shame. Unlike bulimics, they don’t follow a binge with a purge, so they may be overweight or obese, and their eating disorder may remain unrecognized. In the DSM-IV, binge-eating disorder is categorized as an “eating disorder not otherwise specified,” but it is proposed for inclusion as a freestanding diagnosis in the next edition of the diagnostic manual. Many older women do not fit the strict definitions for eating disorders, yet they deserve treatment.
Grief. With age, people are increasingly likely to lose people they care about. Mourning can take away your appetite, and restricting food or purging can be a way to deal with distressing feelings. For example, the comedian Joan Rivers has written about the sudden onset of bulimia in her 50s after her husband’s death by suicide.
Divorce. In addition to grief and loss, the breakup of a marriage can spur a woman to view her body unfavorably in comparison with other singles or an ex-spouse’s new girlfriend.
Heightened awareness of aging. This can be particularly acute when women return to school or work or need to keep working past the traditional retirement age, especially in appearance-related fields.
Medical illness. If a short-term illness results in weight loss, a woman may receive compliments on her slender appearance and continue to restrict food after she has recovered to avoid regaining weight.
On the other hand, some older women decide to get professional help after years of disordered eating. This decision may emerge for any of several reasons.
For example, eating disorders take a physical toll on the body, and the impact is more apparent with age. Dental problems, arrhythmias (irregular heartbeats), or osteoporosis (a common complication of eating disorders) may prompt a woman to seek treatment. In an older body, forceful vomiting may result in a medical emergency, such as a stomach rupture or tear in the esophagus, which can bring a woman to professional attention.
A woman’s priorities may also shift over time. Disordered eating and attempts to hide it take a great deal of time and effort. Sometimes an unrelated health scare, death of a loved one, or other event sparks a realization of the sheer amount of psychic and physical energy required to maintain these behaviors, and a woman may finally decide that enough is enough — and seek treatment.
The goal of treating an eating disorder is to help a patient achieve a healthy weight, exercise level, and eating pattern; to eliminate binge eating and purging; and to address any contributing emotional problems or distorted thinking. This usually requires the help of a mental health professional, a nutritionist, and other clinicians.
Psychotherapy. This is the cornerstone of treatment for eating disorders. Various kinds of psychotherapy can help. Cognitive behavioral therapy (CBT) challenges unrealistic thoughts about food and appearance and helps people develop more productive thought patterns.
Other types of psychotherapy may also be useful in particular circumstances. For example, interpersonal and psychodynamic therapy can help people gain insight into issues such as role transitions, loss, and unresolved relationships that may underlie disordered eating and an excessive focus on body image.
Nutritional rehabilitation. A dietitian or nutritional counselor can help a woman recovering from an eating disorder learn (or relearn) the components of a healthy diet and can help motivate her to make the needed changes. At different stages in recovery, a nutrition professional will help plan how and when the patient should eat in a way that keeps the digestive system working well and avoids dangerous changes in electrolyte and fluid balances that can occur when a person begins eating again after a period of semi-starvation.
Medication. Fluoxetine (Prozac) is the only medication approved for the treatment of an eating disorder. At high doses (about 60 mg per day), it reduces binge eating and vomiting by up to 70% in the first eight weeks, though results are much poorer if patients aren’t also receiving psychotherapy. Other antidepressants and the seizure medication topiramate (Topamax) may be prescribed for bulimia or binge-eating disorder, but fewer controlled trials have studied their effectiveness.
No medications are approved specifically for treating anorexia. Although antidepressants, seizure medications, and certain antipsychotic medications are sometimes used in treating the condition, food is considered the primary medication. No drug works well until some weight is restored. However, if depression or anxiety is also involved, medications may be prescribed to address these problems.
Hospitalization. Eating disorders are usually treated on an outpatient basis. But hospitalization may be recommended if a woman is dangerously underweight, unable to eat or stop vomiting, seriously depressed or suicidal, medically unstable (for example, because of heart arrhythmias, low pulse or blood pressure, or electrolyte imbalances), or has other medical complications that require hospital treatment.
Academy for Eating Disorders
International Association of Eating Disorder Professionals
There is much truth behind the phrase “stress eating.” Stress, the hormones it unleashes, and the effects of high-fat, sugary “comfort foods” push people toward overeating. Researchers have linked weight gain to stress, and according to an American Psychological Association survey, about one-fourth of Americans rate their stress level as 8 or more on a 10-point scale.
In the short term, stress can shut down appetite. A structure in the brain called the hypothalamus produces corticotropin-releasing hormone, which suppresses appetite. The brain also sends messages to the adrenal glands atop the kidneys to pump out the hormone epinephrine (also known as adrenaline). Epinephrine helps trigger the body’s fight-or-flight response, a revved-up physiological state that temporarily puts eating on hold.
But if stress persists, it’s a different story. The adrenal glands release another hormone called cortisol, and cortisol increases appetite and may also ramp up motivation in general, including the motivation to eat. Once a stressful episode is over, cortisol levels should fall, but if the stress doesn’t go away — or if a person’s stress response gets stuck in the “on” position — cortisol may stay elevated.
Stress also seems to affect food preferences. Numerous studies — granted, many of them in animals — have shown that physical or emotional distress increases the intake of food high in fat, sugar, or both. High cortisol levels, in combination with high insulin levels, may be responsible. Other research suggests that ghrelin, a “hunger hormone,” may have a role.
Once ingested, fat- and sugar-filled foods seem to have a feedback effect that inhibits activity in the parts of the brain that produce and process stress and related emotions. These foods really are “comfort” foods in that they seem to counteract stress — and this may contribute to people’s stress-induced craving for those foods.
Of course, overeating isn’t the only stress-related behavior that can add pounds. Stressed people also lose sleep, exercise less, and drink more alcohol, all of which can contribute to excess weight.
Some research suggests a gender difference in stress-coping behavior, with women being more likely to turn to food and men to alcohol or smoking. And a Finnish study that included over 5,000 men and women showed that obesity was associated with stress-related eating in women but not in men.
Harvard researchers have reported that stress from work and other sorts of problems correlates with weight gain, but only in those who were overweight at the beginning of the study period. One theory is that overweight people have elevated insulin levels, and stress-related weight gain is more likely to occur in the presence of high insulin.
How much cortisol people produce in response to stress may also factor into the stress–weight gain equation. In 2007, British researchers designed an ingenious study that showed that people who responded to stress with high cortisol levels in an experimental setting were more likely to snack in response to daily hassles in their regular lives than low-cortisol responders.
When stress affects someone’s appetite and waistline, the individual can forestall further weight gain by ridding the refrigerator and cupboards of high-fat, sugary foods. Keeping those “comfort foods” handy is just inviting trouble.
Here are some other suggestions for countering stress:
Meditation. Countless studies show that meditation reduces stress, although much of the research has focused on high blood pressure and heart disease. Meditation may also help people become more mindful of food choices. With practice, a person may be able to pay better attention to the impulse to grab a fat- and sugar-loaded comfort food and inhibit the impulse.
Exercise. Intense exercise increases cortisol levels temporarily, but low-intensity exercise seems to reduce them. University of California researchers reported that exercise — and this was vigorous exercise — may blunt some of the negative effects of stress. Some activities, such as yoga and tai chi, have elements of both exercise and meditation.
Social support. Friends, family, and other sources of social support seem to have a buffering effect on the stress that people experience. For example, research suggests that people working in stressful situations, like hospital emergency departments, have better mental health if they have adequate social support. But even people who live and work in situations where the stakes aren’t as high need help from time to time from friends and family.
Q. A friend jokes she is a “chocoholic.” Can you really become addicted to chocolate or other foods?
A. With so many Americans overweight or obese, public health experts are also asking this question. Since it is so difficult to follow the simple advice to “eat a healthy diet” and “exercise more,” they realize that food might have addictive properties. Yet common sense demands that food — even chocolate — be placed in a different category from substances like heroin or alcohol.
Addiction involves three essential components: intense craving, loss of control over the object of that craving, and continuing involvement despite bad consequences. Preliminary evidence demonstrates that people can exhibit all three elements when it comes to food.
Consider the phenomenon of craving, for example. For individuals addicted to substances, environmental cues — such as a neighborhood bar or the smoking area at work — can trigger craving. Similarly, many people find that seeing or smelling food can trigger their appetite, even if they have just eaten a satisfying meal.
But your friend’s joke about being a chocoholic is a reminder that not all foods induce cravings. The midnight run for a pint of ice cream is familiar, but I’ve never heard of anyone trolling for celery at that hour. This observation is consistent with research that demonstrates what we might expect if food were addicting — high-fat and high-carbohydrate foods trigger reward pathways in animal brains, and restricting these foods can induce a stress-like response in some of the animals studied.
Chocolate, which contains both sugar and fat, is often used in studies of food addiction. In a study published in Archives of General Psychiatry, for example, researchers at Yale University asked volunteers to fill out questionnaires to assess addictive behavior. The volunteers then underwent brain imaging while being able to see and smell, and then finally drink, a chocolate milkshake. Participants who scored higher on the food addiction scale experienced a surge of activity in the part of the brain that regulates cravings and rewards when presented with the chocolate milkshake. Once they started drinking it, they showed markedly reduced activity in areas of the brain that we use to control the impulse to seek rewards. A similar pattern of brain activity is found in people addicted to drugs.
In another study, this one involving candy, researchers at Drexel University concluded that people experienced psychological reactions while eating chocolate — such as intense pleasure and craving for more — that were similar to those experienced on drugs.
The epidemic of obesity may at once epitomize all of the key components of addiction. Many people who are overweight crave food, lose control over eating, and experience negative health effects that should, but don’t, serve as a deterrent. And the influence of stress on eating provides another link between food and addictive behavior. Those who have broken free of an addiction tend to relapse when they are under stress — partly because they begin craving the comfort they experienced while using alcohol, cigarettes, or drugs. In the same way, stress is often what prompts people to go off a diet.
Despite intriguing parallels, however, there are also significant differences between drugs of addiction and food. The most obvious one is that food is necessary for survival, while addictive drugs are not. And this makes treatment more of a challenge too. It’s not possible to go off food, as it were, cold turkey.
Moreover, researchers have yet to identify the addictive component of foods. We know that nicotine in cigarettes causes addiction, for example. But what exactly is it about chocolate that causes addiction? The studies showing how the animal brain responds to sugar and fat are interesting, but far from conclusive.
Whether “chocoholism” exists or not, most of us are stuck with the simple, if frustrating, advice to moderate our consumption. Health depends less on what we call our behavior than on paying attention to the hundreds of small but important choices we have to make every day.
Strategies for decreasing a child’s risk for obesity often focus on improving eating habits and maintaining a high level of physical activity. While this is one way to address the issue, another way to reduce the risk of childhood obesity could simply come down to positive parenting, according to a Temple University study published in the November issue of Child Abuse & Neglect.
“This is the first study to show the association between neglect in childhood and childhood obesity. Previous studies looked at maltreatment in childhood and how it affected these individuals in adulthood,” said Dr. Robert Whitaker, the study’s lead author and a pediatrician and professor of public health at Temple University.
Examples of neglect include a parent not showing enough affection to the child due to preoccupation with his/her own problems, not taking a child to the doctor when he/she needed it, and leaving a child at home without the proper supervision.
Data was obtained from the Fragile Families and Child Wellbeing Study, a birth cohort study of 4,898 children born between 1998 and 2000 in 20 large U.S. cities. At age 3, 2,412 of these children had their height and weight measured, and mothers answered items on the Parent-Child Conflict Tactics Scales about three types of child maltreatment in the prior year: neglect (such as not providing proper supervision for the child), corporal punishment (such as spanking the child on the bottom with a bare hand) and psychological aggression (such as threatening to spank the child but not actually doing it).
Eighteen percent of the children were obese, and the prevalence of any episode of neglect, corporal punishment or psychological aggression was 11 percent, 84 percent and 93 percent, respectively.
The odds of obesity were 50 percent greater in children who had experienced neglect, after controlling for the income and number of children in the household, the mothers’ race/ethnicity, education, marital status, body mass index, prenatal smoking and age, and the children’s sex and birth weight. Neither the frequency of corporal punishment nor psychological aggression was associated with an increased risk of obesity.
“Corporal punishment and psychological aggression are common discipline techniques resulting from a child’s misbehavior, and the child may come to anticipate them as consequences of their misbehavior,” Whitaker said.
“In contrast, the child may not understand the cause of the neglect and the child might mistakenly feel at fault,” he added.
“These experiences of neglect could translate into a great deal of stress for the child, which might, in turn, influence mood, anxiety, diet and activity. As we know, adults eat in response to stress; the same could be true for children,” Whitaker said.
“You can’t make a child’s life stress free, but parents can strive to be more of a buffer against stress, rather than one of the causes of stress,” he said.
Temple University (2007, November 16). Higher Risk Of Obesity For Children Neglected By Parents.
People with personality traits of high neuroticism and low conscientiousness are likely to go through cycles of gaining and losing weight throughout their lives, according to an examination of 50 years of data in a study published by the American Psychological Association.
Impulsivity was the strongest predictor of who would be overweight, the researchers found. Study participants who scored in the top 10 percent on impulsivity weighed an average of 22 lbs. more than those in the bottom 10 percent, according to the study.
“Individuals with this constellation of traits tend to give in to temptation and lack the discipline to stay on track amid difficulties or frustration,” the researchers wrote. “To maintain a healthy weight, it is typically necessary to have a healthy diet and a sustained program of physical activity, both of which require commitment and restraint. Such control may be difficult for highly impulsive individuals.”
The researchers, from the National Institute on Aging, looked at data from a longitudinal study of 1,988 people to determine how personality traits are associated with weight and body mass index. Their conclusions were published online in the APA’s Journal of Personality and Social Psychology.
“To the best of our knowledge, we are the first to examine whether personality is associated with fluctuations in weight over time,” they wrote. “Interestingly, our pattern of associations fits nicely with the characteristics of these traits.”
Participants were drawn from the Baltimore Longitudinal Study of Aging, an ongoing multidisciplinary study of normal aging administered by the National Institute on Aging. Subjects were generally healthy and highly educated, with an average of 16.53 years of education. The sample was 71 percent white, 22 percent black, 7 percent other ethnicity; 50 percent were women. All were assessed on what’s known as the “Big Five” personality traits — openness, conscientiousness, extraversion, agreeableness and neuroticism — as well as on 30 subcategories of these personality traits. Subjects were weighed and measured over time. This resulted in a total of 14,531 assessments across the 50 years of the study.
Although weight tends to increase gradually as people age, the researchers, led by Angelina R. Sutin, PhD, found greater weight gain among impulsive people; those who enjoy taking risks; and those who are antagonistic — especially those who are cynical, competitive and aggressive.
“Previous research has found that impulsive individuals are prone to binge eating and alcohol consumption,” Sutin said. “These behavioral patterns may contribute to weight gain over time.”
Among their other findings: Conscientious participants tended to be leaner, and weight did not contribute to changes in personality across adulthood.
“The pathway from personality traits to weight gain is complex and probably includes physiological mechanisms, in addition to behavioral ones,” Sutin said. “We hope that by more clearly identifying the association between personality and obesity, more tailored treatments will be developed. For example, lifestyle and exercise interventions that are done in a group setting may be more effective for extroverts than for introverts.”
Angelina R. Sutin, Luigi Ferrucci, Alan B. Zonderman, Antonio Terracciano. Personality and obesity across the adult life span. Journal of Personality and Social Psychology, 2011.