Dozens of studies are publicized every week. But those studies hardly slake people’s thirst for answers to questions about how to eat or how much to exercise. Does exercise help you maintain your memory? What kind? Walking? Intense exercise? Does eating carbohydrates make you fat? Can you prevent breast cancer by exercising when you are young? Do vegetables protect you from heart disease?
The problem is one of signal to noise. You can’t discern the signal — a lower risk of dementia, or a longer life, or less obesity, or less cancer — because the noise is overwhelming: the enormous uncertainty in measuring such things as how much you exercise or what exactly you eat. And the signal is often weak; if lifestyle has an effect at all, it is minuscule, nothing like the link between smoking and lung cancer.
And there is no gold standard of measurement, nothing that everyone agrees on and uses to measure aspects of lifestyle.
The result is a large body of studies whose conclusions are not reproducible. “We don’t know how to measure diet or exercise,” said Dr. Barnett Kramer, director of the National Cancer Institute’s Division of Cancer Prevention.
His division is working on ways to sort out inconsistencies in research used to generate health advice, hoping to improve what has become a real mess: “You can ask people how many times a week or how many times a month they eat bread or berries or ask them to keep a diary of what they ate in the last 24 hours.” But, he said, it should be no surprise that people misremember or give researchers an answer they think makes them sound good.
“I can’t remember what meals I ate a week ago,” Dr. Kramer said. “Now ask me what meals I had as an adolescent, or how much I exercised.”
David Allison, director of the Nutrition Obesity Research Center at the University of Alabama at Birmingham, said the same problems plague obesity research, in which only two things are known with certainty. All other things being equal, if you eat more calories, you will gain weight. And all other things being equal, if you exercise enough, you will lose a small amount of weight.
Adding to the confusion is a cacophony of poorly designed research, the tendency for different researchers studying the same effect to use different measurements and report outcomes differently, and researchers’ tendency to selectively report positive or “interesting” results.
The result is what Dr. Kramer calls whipsaw literature. “One week drinking coffee is good for you, and the next week it is lethal,” he said.
The situation is so bad that what gets published tends to be what the scientists believe ahead of time, said Dr. John Ioannidis, a professor of medicine and of health research and policy at Stanford University’s medical school. “There are so many nutrients and so many diets,” he said. “So many outcomes — heart disease, cancer, stroke. What kind of data do you collect? A follow-up at two months, six months, two years, 10 years? You end up having millions of choices.”
And the scientists get to pick the one they want. “I can get you any result you want in any observational data set,” he said.
There have been rigorous lifestyle studies, but they are few and far between. A large diet study in Spain found that a Mediterranean diet, with fruits, vegetables, fish and olive oil or nuts, decreased the risk of heart attacks and strokes. Two large federal studies looked at a high-fiber diet but failed to find evidence it protects against colon cancer.
Then there are the seemingly contradictory but well-done studies. One large federal study found that — contrary to all assumptions — diet and weight loss did not prevent heart attacks and strokes in people with Type 2 diabetes. Another large federal study found that people at risk for Type 2 diabetes could stave it off by losing a modest amount of weight and exercising.
A few years ago, two researchers decided to ask just how crazy the cancer and diet literature was. They began with a cookbook, “The Boston Cooking-School Cookbook,” and randomly selected recipes, listing the ingredients, until they had 50 distinct ingredients. Then they did a literature search asking if those ingredients were associated with cancer.
Four out of five were linked to cancer, the researchers reported, either increasing or decreasing the risk. Often the same ingredient that increased risk in one study decreased it in another. Those ingredients not associated with cancer risk tended to be odd, like terrapin, and had not been studied by nutrition researchers.
But when the authors, Dr. Jonathan Schoenfeld, a radiation oncologist at the Dana-Farber Cancer Institute, and Dr. Ioannidis, looked at meta-analyses of the ingredients, which combined data from all the studies, the effects generally went away.
They titled their paper, “Is everything we eat associated with cancer?”
That study is no surprise to a group that puts together an authoritative guide, the Physician Data Query, for the National Cancer Institute. The group’s screening and prevention board wants to make some sort of statement about whether diet affects cancer risk. But the studies are so unreliable that it is hard to draw conclusions. The board’s feelings about whether diet has any link to cancer “are pretty consistently negative,” said Dr. Donald Berry, a biostatistician at M. D. Anderson Cancer Center in Houston, who is a member of the board.
“Were I to write a paper on the subject, I might use this variant of their title: ‘Is anything we eat associated with cancer?’” Dr. Berry said. “And my answer would be ‘No. The preponderance of the evidence is either negative or unreliable and subject to false-positive conclusions.’”
Some medical experts say the problems with lifestyle studies are so overwhelming — and the chance of finding anything reproducible and meaningful so small — that it might be best to just give up on those questions altogether.
“They may not be worth studying,” said Dr. Vinay Prasad, a cancer researcher at Oregon Health and Science University. “People want certainty, but, boy, we have no good answers.”
As for Dr. Kramer, he has not given up on rigorous research. What is needed at this point, he said, is a little more humility among researchers in interpreting and reporting the implications of their own evidence.