9 out of 10 That Die From it Never Knew They Even Had This Preventable Disease

Diverticula are out-pouchings of our intestine. Doctors like using a tire analogy: high pressures within the gut can force the intestines to balloon out through weak spots in the intestinal wall, like an inner tube poking out through a worn tire tread. You can see what they actually look like in my video, Diverticulosis: When Our Most Common Gut Disorder Hardly Existed. These pockets can become inflamed and infected and, to carry the tire analogy further, can blow out, spilling fecal matter into the abdomen and potentially leading to death. Symptoms can range from no symptoms at all, to a little cramping and bloating, to "incapacitating pain that is a medical emergency." Nine out of ten people who die from the disease never even knew they had it.

The good news is there may be a way to prevent the disease. Diverticular disease is the most common intestinal disorder, affecting up to 70% of people by age 60. If it's that common, though, is it just an inevitable consequence of aging? No, it's a new disease. In 1907, 25 cases had been reported in the medical literature. Not cases in 25% of people, but 25 cases period. And diverticular disease is kind of hard to miss on autopsy. A hundred years ago, in 1916, it didn't even merit mention in medical and surgical textbooks. The mystery wasn't solved until 1971.

How did a disease that was almost unknown become the most common affliction of the colon in the Western world within one lifespan? Surgeons Painter and Burkitt suggested diverticulosis was a deficiency disease--i.e., a disease caused by a deficiency of fiber. In the late 1800s, roller milling was introduced, further removing fiber from grain, and we started to fill up on other fiber-deficient foods like meat and sugar. A few decades of this and diverticulosis was rampant.

This is what Painter and Burkitt thought was going on: Just as it would be easy to squeeze a lump of butter through a bicycle tube, it's easy to move large, soft, and moist intestinal contents through the gut. In contrast, try squeezing a lump of tar through. When we eat fiber-deficient diets, our feces can become small and firm, and our intestines have to squeeze down really hard to move them along. This buildup of pressure may force out those bulges. Eventually, a low-fiber diet can even lead to the colon literally rupturing itself.

If this theory is true, then populations eating high-fiber diets should have low rates of diverticulosis. That's exactly what's been found. More than 50% of African Americans in their 50s were found to have diverticulosis, compared to less than 1% of Africans eating traditional plant-based diets. By less than 1%, we're talking zero out of a series of 2,000 autopsies in South Africa and two out of 4,000 in Uganda. That's about a thousand times lower prevalence.
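
For readers who want to check that "thousand times" figure, here's a quick back-of-the-envelope calculation using only the numbers quoted above. (Pooling the two autopsy series into one denominator is my own simplification for illustration; it comes out to roughly a 1,500-fold difference, the same order of magnitude as the text's estimate.)

```python
# Back-of-the-envelope check of the prevalence gap described above.
# All figures come straight from the text; pooling the two autopsy
# series into one denominator is a simplification for illustration.

us_prevalence = 0.50                    # >50% of African Americans in their 50s
african_cases = 0 + 2                   # 0 of 2,000 (South Africa) + 2 of 4,000 (Uganda)
african_autopsies = 2000 + 4000
african_prevalence = african_cases / african_autopsies

ratio = us_prevalence / african_prevalence
print(f"Prevalence on traditional diets: {african_prevalence:.2%}")  # 0.03%
print(f"Difference: roughly {ratio:,.0f}-fold")                      # roughly 1,500-fold
```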

What, then, do we make of a new study concluding that a low-fiber diet was not associated with diverticulosis? I cover that in my video Does Fiber Really Prevent Diverticulosis?

For more on bowel health, see:

What if your doctor says you shouldn't eat healthy foods like nuts and popcorn because of your diverticulosis? Share with them my Diverticulosis & Nuts video.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sean T Evans / Flickr. This image has been modified.

The 3 Vitamins that Prevent Brain Loss

By our seventies, one in five of us will suffer from cognitive impairment. Within five years, half of those cognitively impaired will progress to dementia and death. The earlier we can slow or stop this process, the better.

Although an effective treatment for Alzheimer's disease is unavailable, interventions just to control risk factors could prevent millions of cases. An immense effort has been spent on identifying such risk factors for Alzheimer's and developing treatments to reduce them.

In 1990, a small study of 22 Alzheimer's patients reported high concentrations of homocysteine in their blood. The homocysteine story goes back to 1969 when a Harvard pathologist reported two cases of children, one dating back to 1933, whose brains had turned to mush. They both suffered from extremely rare genetic mutations that led to abnormally high levels of homocysteine in their bodies. Is it possible, he asked, that homocysteine could cause brain damage even in people without genetic defects?

Here we are in the 21st century, and homocysteine is considered "a strong, independent risk factor for the development of dementia and Alzheimer's disease." Having a blood level over 14 (µmol/L) may double our risk. In the Framingham Study, researchers estimate that as many as one in six Alzheimer's cases may be attributable to elevated homocysteine in the blood, which is now thought to play a role in brain damage and cognitive and memory decline. Our body can detoxify homocysteine, though, using three vitamins: folate, vitamin B12, and vitamin B6. So why don't we put them to the test? No matter how many studies find an association between high homocysteine and cognitive decline, dementia, or Alzheimer's disease, a cause-and-effect role can only be confirmed by interventional studies.

Initially, the results were disappointing. Vitamin supplementation did not seem to work, but those studies were tracking neuropsychological assessments, which are more subjective than structural neuroimaging--that is, actually seeing what's happening to the brain. A double-blind randomized controlled trial found that homocysteine-lowering with B vitamins can slow the rate of accelerated brain atrophy in people with mild cognitive impairment. As we age, our brains slowly atrophy, but the shrinkage is much accelerated in patients suffering from Alzheimer's disease. An intermediate rate of shrinkage is found in people with mild cognitive impairment. The thinking is that if we could slow the rate of brain loss, we may be able to slow the conversion to Alzheimer's disease. Researchers tried giving people B vitamins for two years and found that it markedly slowed the rate of brain shrinkage. The rate of atrophy in those with high homocysteine levels was cut in half. A simple, safe treatment can slow the accelerated rate of brain loss.

A follow-up study went further by demonstrating that B-vitamin treatment reduces, by as much as seven-fold, the brain atrophy in the regions specifically vulnerable to the Alzheimer's disease process. You can see the amount of brain atrophy over a two-year period in the placebo group versus the B-vitamin group in my Preventing Brain Loss with B Vitamins? video.

The beneficial effect of B vitamins was confined to those with high homocysteine, indicating a relative deficiency in one of those three vitamins. Wouldn't it be better to not become deficient in the first place? Most people get enough B12 and B6. The reason these folks were stuck at a homocysteine level of 11 µmoles per liter is that they probably weren't getting enough folate, which is concentrated in beans and greens. Ninety-six percent of Americans don't even eat the minimum recommended amount of dark green leafy vegetables--the same pitiful number who don't eat the minimum recommendation for beans.

If we put people on a healthy diet--a plant-based diet--we can drop their homocysteine levels by about 20% in just one week, from around 11 µmoles per liter down to 9 µmoles per liter. The fact that they showed rapid and significant homocysteine lowering without any pills or supplements implies that multiple mechanisms may have been at work. The researchers suggest it may be because of the fiber. Every gram of daily fiber consumption may increase folate levels in the blood by nearly 2%, perhaps by boosting vitamin production in the colon by all our friendly gut bacteria. It could also be from the decreased methionine intake.
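
The simple arithmetic behind that one-week drop, using the approximate levels quoted above (my own calculation, not from the study), works out to about an 18% relative decline, in line with the roughly 20% cited:

```python
# Simple check of the one-week homocysteine drop described above
# (levels are the approximate values quoted in the text).

before = 11.0   # µmoles per liter, typical starting level
after = 9.0     # µmoles per liter after one week on a plant-based diet
drop = (before - after) / before
print(f"Relative drop: {drop:.0%}")  # 18%, i.e., roughly the 20% cited
```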

Methionine is where homocysteine comes from. Homocysteine is a breakdown product of methionine, which comes mostly from animal protein. If we give someone bacon and eggs for breakfast and a steak for dinner, we can get spikes of homocysteine levels in the blood. Thus, decreased methionine intake on a plant-based diet may be another factor contributing to lower, safer homocysteine levels.

The irony is that those who eat plant-based diets long-term, not just at a health spa for a week, have terrible homocysteine levels. Meat-eaters are up at 11 µmoles per liter, but vegetarians at nearly 14 µmoles per liter and vegans at 16 µmoles per liter. Why? The vegetarians and vegans were getting more fiber and folate, but not enough vitamin B12. Most vegans were at risk for suffering from hyperhomocysteinaemia (too much homocysteine in the blood) because most vegans in the study were not supplementing with vitamin B12 or eating vitamin B12-fortified foods, which is critical for anyone eating a plant-based diet. If you take vegans and give them B12, their homocysteine levels can drop down below 5. Why not down to just 11? The reason meat-eaters were stuck up at 11 is presumably because they weren't getting enough folate. Once vegans got enough B12, they could finally fully exploit the benefits of their plant-based diets and come out with the lowest levels of all.

This is very similar to the findings in my video Vitamin B12 Necessary for Arterial Health.

For more details on ensuring a regular reliable source of vitamin B12:

There are more benefits to lowering your methionine intake. Check out Methionine Restriction as a Life Extension Strategy and Starving Cancer with Methionine Restriction.

For more on brain health in general, see these videos:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Thomas Hawk / Flickr. This image has been modified.

Comparing Pollutant Levels Between Different Diets

The results of the CHAMACOS (Center for the Health Assessment of Mothers and Children of Salinas) study were published recently. This study of a California birth cohort investigated the relationship between exposure to flame retardant chemical pollutants in pregnancy and childhood, and subsequent neurobehavioral development. Why California? Because California children's exposures to these endocrine disruptors and neurotoxins are among the highest in the world.

What did they find? The researchers concluded that both prenatal and childhood exposures to these chemicals "were associated with poorer attention, fine motor coordination, and cognition" (particularly verbal comprehension) by the time the children reached school age. "This study, the largest to date, contributes to growing evidence suggesting that PBDEs [polybrominated diphenyl ethers, flame retardant chemicals] have adverse impacts on child neurobehavioral development." The effects may extend into adolescence, again affecting motor function as well as thyroid gland function. The effect on our thyroid glands may even extend into adulthood.

These chemicals get into moms, then into the amniotic fluid, and then into the breast milk. The more that's in the milk, the worse the infants' mental development may be. Breast milk is still best, but how did these women get exposed in the first place?

The question has been: Are we exposed mostly from diet or dust? Researchers in Boston collected breast milk samples from 46 first-time moms, vacuumed up samples of dust from their homes, and questioned them about their diets. The researchers found that both were likely to blame. Diet-wise, a number of animal products were implicated. This is consistent with what's been found worldwide. For example, in Europe, these flame retardant chemical pollutants are found mostly in meat, including fish, and other animal products. It's similar to what we see with dioxins--they are mostly found in fish and other fatty foods, with a plant-based diet offering the lowest exposure.

If that's the case, do vegetarians have lower levels of flame retardant chemical pollutants circulating in their bloodstreams? Yes. Vegetarians may have about 25% lower levels. Poultry appears to be the largest contributor of PBDEs. USDA researchers compared the levels in different meats, and the highest levels of these pollutants were found in chicken and turkey, with less in pork and even less in beef. California poultry had the highest levels, consistent with the state's strict furniture flammability codes. But it's not as though chickens are pecking at the sofa. Chickens and turkeys may be exposed indirectly through the application of sewer sludge to fields where feed crops are raised, contamination of water supplies, the use of flame-retarded materials in poultry housing, or the inadvertent incorporation of fire-retardant material into the birds' bedding or feed ingredients.

Fish have been shown to have the highest levels overall, but Americans don't eat a lot of fish, so it doesn't contribute as much to the total body burden in the United States. Researchers have compared the levels of PBDEs found in meat-eaters and vegetarians. The amount found in the bloodstream of vegetarians is noticeably lower, as you can see in my video Flame Retardant Pollutants and Child Development. Just to give you a sense of the contribution of chicken: higher-than-average poultry eaters have higher levels than omnivores as a whole, and lower-than-average poultry eaters have levels lower than omnivores.

What are the PBDE levels in vegans? We know the intake of many other classes of pollutants is almost exclusively from the ingestion of animal fats in the diet. What if we take them all out of the diet? It works for dioxins. Vegan dioxin levels appear markedly lower than the general population. What about for the flame retardant chemicals? Vegans have levels lower than vegetarians, with those who've been vegan around 20 years having even lower concentrations. This tendency for chemical levels to decline the longer one eats plant-based suggests that food of animal origin contributes substantially. But note that levels never get down to zero, so diet is not the only source.

The USDA researchers note that there are currently no regulatory limits on the amount of flame retardant chemical contamination in U.S. foods, "but reducing the levels of unnecessary, persistent, toxic compounds in our diet is certainly desirable."

I've previously talked about this class of chemicals in Food Sources of Flame Retardant Chemicals. The same foods seem to accumulate a variety of pollutants:

Many of these chemicals have hormone- or endocrine-disrupting effects. See, for example:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Mitchell Haindfield / Flickr. This image has been modified.

Plant-Based Diets as the Nutritional Equivalent of Quitting Smoking

Despite the most widely accepted and well-established chronic disease practice guidelines uniformly calling for lifestyle change as the first line of therapy, doctors often don't follow these recommendations. As seen in my video, The Best Kept Secret in Medicine, lifestyle interventions are not only safer and cheaper but often more effective than nearly any other medical intervention in reducing heart disease, heart failure, hypertension, stroke, cancer, diabetes, and deaths from all causes.

"Some useful lessons may come from the war on tobacco," Dr. Neal Barnard wrote in the American Medical Association's ethics journal. When he stopped smoking himself in the 1980s, the lung cancer death rate was peaking in the United States. As the prevalence of smoking dropped, so have lung cancer rates. No longer were doctors telling patients to "[g]ive your throat a vacation" by smoking a fresh cigarette. Doctors realized they were "more effective at counseling patients to quit smoking if they no longer had tobacco stains on their own fingers." "In other words, doctors went from being bystanders--or even enablers--to leading the fight against smoking." And today, says Dr. Barnard, "Plant-based diets are the nutritional equivalent of quitting smoking."

From an editorial in the journal Alternative Therapies in Health and Medicine: "If we were to gather the world's top nutrition scientists and experts (free from food industry influence), there would be very little debate about the essential properties of good nutrition. Unfortunately, most doctors are nutritionally illiterate. And worse, they don't know how to use the most powerful medicine available to them: food."

Physician advice matters. When doctors told patients to improve their diets by cutting down on meat, dairy, and fried foods, patients were more likely to make dietary changes. It may work even better if doctors practice what they preach. Researchers at Emory University randomized patients to watch one of two videos. In one video, a physician briefly mentioned her personal dietary and exercise practices, and visible on her desk were both a bike helmet and an apple. In the other video, she did not discuss her personal healthy practices, and the helmet and apple were missing. In both videos, the doctor advised the patients to cut down on meat, to generally skip meat at breakfast, and to have no meat for lunch or dinner at least half the time. In the disclosure video, the physician related that she herself had successfully cut down on meat. Perhaps not surprisingly, patients rated that physician as more believable and motivating. Physicians who walk the walk--literally--and have healthier eating habits not only tend to counsel more about exercise and diet, but have been found to seem more credible and motivating when they do so.

It may also make them better doctors. A randomized controlled intervention to clean up doctors' diets, called the Promoting Health by Self Experience (PHASE) trial, found that healthcare providers' personal lifestyles were correlated directly with their clinical performance. Healthcare providers' improved wellbeing and lifestyle cascaded to the patients and clinics, suggesting an additional strategy to achieve successful health promotion.

Are you ready for the best kept secret in medicine? Given the right conditions, the body can heal itself. For example, treating cardiovascular disease with appropriate dietary changes is good medicine, reducing mortality without any adverse effects. We should keep doing research, certainly, but educating physicians and patients alike about the existing knowledge regarding the power of nutrition as medicine may be the best investment we can make.

Of course, to advise patients about nutrition, physicians first have to educate themselves, as it is unlikely they received formal nutrition education during their medical training:

For more on the power of healthy living, see:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank. This image has been modified.

Original Link

What Causes Diabetes?

What Causes Diabetes?.jpeg

After about age 20, we may have all the insulin-producing beta cells we're ever going to get. So if we lose them, we may lose them for good. Autopsy studies show that by the time type 2 diabetes is diagnosed, we may have already killed off half of our beta cells.

You can kill pancreatic cells right in a petri dish. If you expose the insulin-producing beta cells in our pancreas to fat, they suck it up and then start dying off. Fat breakdown products can interfere with the function of these cells and ultimately lead to their death. A chronic increase in blood fat levels can be harmful to our pancreas.

It's not just any fat; it's saturated fat. As you can see in my video, What Causes Diabetes?, the predominant fat in olives, nuts, and avocados gives only a tiny bump in death protein 5, but saturated fat really elevates this contributor to beta cell death. Saturated fats, therefore, are harmful to beta cells. Cholesterol is, too. The uptake of bad cholesterol (LDL) can cause beta cell death as a result of free radical formation.

Diets rich in saturated fats not only cause obesity and insulin resistance, but the increased levels of circulating free fats in the blood (non-esterified fatty acids, or NEFAs) may also cause beta cell death and may thus contribute to the progressive beta cell loss we see in type 2 diabetes. These findings aren't just based on test tube studies. When researchers infused fat into people's bloodstreams, they showed it directly impairs pancreatic beta cell function. The same occurs when we ingest it.

Type 2 diabetes is characterized by "defects in both insulin secretion and insulin action," and saturated fat appears to impair both. Researchers showed saturated fat ingestion reduces insulin sensitivity within hours. The subjects were non-diabetics, so their pancreases should have been able to boost insulin secretion to match the drop in sensitivity. But no, "insulin secretion failed to compensate for insulin resistance in subjects who ingested [the saturated fat]." This implies saturated fat impaired beta cell function as well, again within hours of ingestion. "[I]ncreased consumption of [saturated fats] has a powerful short- and long-term effect on insulin action," contributing to the dysfunction and death of pancreatic beta cells in diabetes.

Saturated fat isn't just toxic to the pancreas. The fats found predominantly in meat and dairy--chicken and cheese are the two main sources in the American diet--are considered nearly "universally toxic." In contrast, the fats found in olives, nuts, and avocados are not. Saturated fat has been found to be particularly toxic to liver cells, contributing to the formation of fatty liver disease. If you expose human liver cells to plant fat, though, nothing happens. If you expose our liver cells to animal fat, a third of them die. This may explain why higher intake of saturated fat and cholesterol are associated with non-alcoholic fatty liver disease.

By cutting down on saturated fat consumption, we may be able to help interrupt these processes. Decreasing saturated fat intake can help bring down the need for all that excess insulin. Being fat and eating saturated fat can each cause excess insulin in the blood. The effect of reducing dietary saturated fat intake on insulin levels is substantial, regardless of how much belly fat we have. It's not just that by eating fat we may be more likely to store it as fat. Saturated fats, independently of any role they play in making us fat, "may contribute to the development of insulin resistance and its clinical consequences." After controlling for weight, alcohol, smoking, exercise, and family history, diabetes incidence was significantly associated with the proportion of saturated fat in the blood.

So what causes diabetes? The consumption of too many calories rich in saturated fats. Just as not everyone who smokes develops lung cancer, not everyone who eats a lot of saturated fat develops diabetes--there is a genetic component. But just as smoking can be said to cause lung cancer, high-calorie diets rich in saturated fats are currently considered the cause of type 2 diabetes.

I have a lot of videos on diabetes, including:

Preventing the disease:

And treating it:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank. This image has been modified.

Original Link
