Comparing Pollutant Levels Between Different Diets

The results of the CHAMACOS (Center for the Health Assessment of Mothers and Children of Salinas) study were published recently. This study of a California birth cohort investigated the relationship between exposure to flame retardant chemical pollutants in pregnancy and childhood, and subsequent neurobehavioral development. Why California? Because California children's exposures to these endocrine disruptors and neurotoxins are among the highest in the world.

What did they find? The researchers concluded that both prenatal and childhood exposures to these chemicals "were associated with poorer attention, fine motor coordination, and cognition" (particularly verbal comprehension) by the time the children reached school age. "This study, the largest to date, contributes to growing evidence suggesting that PBDEs [polybrominated diphenyl ethers, flame retardant chemicals] have adverse impacts on child neurobehavioral development." The effects may extend into adolescence, again affecting motor function as well as thyroid gland function. The effect on our thyroid glands may even extend into adulthood.

These chemicals get into moms, then into the amniotic fluid, and then into the breast milk. The more that's in the milk, the worse the infants' mental development may be. Breast milk is still best, but how did these women get exposed in the first place?

The question has been: Are we exposed mostly from diet or dust? Researchers in Boston collected breast milk samples from 46 first-time moms, vacuumed up samples of dust from their homes, and questioned them about their diets. The researchers found that both were likely to blame. Diet-wise, a number of animal products were implicated. This is consistent with what's been found worldwide. For example, in Europe, these flame retardant chemical pollutants are found mostly in meat, including fish, and other animal products. It's similar to what we see with dioxins--they are mostly found in fish and other fatty foods, with a plant-based diet offering the lowest exposure.

If that's the case, do vegetarians have lower levels of flame retardant chemical pollutants circulating in their bloodstreams? Yes. Vegetarians may have about 25% lower levels. Poultry appears to be the largest contributor of PBDEs. USDA researchers compared the levels in different meats, and the highest levels of these pollutants were found in chicken and turkey, with less in pork and even less in beef. California poultry had the highest levels, consistent with the state's strict furniture flammability codes. But it's not like chickens are pecking at the sofa. Chickens and turkeys may be exposed indirectly through the application of sewer sludge to fields where feed crops are raised, contamination of water supplies, the use of flame-retarded materials in poultry housing, or the inadvertent incorporation of fire-retardant material into the birds' bedding or feed ingredients.

Fish have been shown to have the highest levels overall, but Americans don't eat much fish, so fish don't contribute as much to the total body burden in the United States. Researchers have compared the levels of PBDEs found in meat-eaters and vegetarians. The amount found in the bloodstream of vegetarians is noticeably lower, as you can see in my video Flame Retardant Pollutants and Child Development. To give you a sense of chicken's contribution: higher-than-average poultry eaters have higher levels than omnivores as a whole, and lower-than-average poultry eaters have lower levels.

What are the PBDE levels in vegans? We know the intake of many other classes of pollutants comes almost exclusively from the ingestion of animal fats in the diet. What if we take them all out of the diet? It works for dioxins: vegan dioxin levels appear markedly lower than those of the general population. What about the flame retardant chemicals? Vegans have lower levels than vegetarians, with those who've been vegan around 20 years having even lower concentrations. This tendency for chemical levels to decline the longer one eats plant-based suggests that food of animal origin contributes substantially. But note that levels never get down to zero, so diet is not the only source.

The USDA researchers note that there are currently no regulatory limits on the amount of flame retardant chemical contamination in U.S. foods, "but reducing the levels of unnecessary, persistent, toxic compounds in our diet is certainly desirable."

I've previously talked about this class of chemicals in Food Sources of Flame Retardant Chemicals. The same foods seem to accumulate a variety of pollutants:

Many of these chemicals have hormone- or endocrine-disrupting effects. See, for example:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Mitchell Haindfield / Flickr. This image has been modified.

Treating Kidney Stones with Diet

Studies suggest that excessive consumption of animal protein poses a risk of kidney stone formation, likely due to the acid load contributed by the high content of sulfur-containing amino acids in animal protein, a topic I explore in my video, Preventing Kidney Stones with Diet. What about treating kidney stones, though? I discuss that in How to Treat Kidney Stones with Diet. Most stones are calcium oxalate, formed like rock candy when the urine becomes supersaturated. Doctors just assumed that if stones are made out of calcium, we simply have to tell people to reduce their calcium intake. That was the dietary gospel for kidney stone sufferers until a 2002 study published in the New England Journal of Medicine pitted two diets against one another--a low-calcium diet versus a diet low in animal protein and salt. The restriction of animal protein and salt provided greater protection, cutting the risk of having another kidney stone within five years in half.

What about cutting down on oxalates, which are concentrated in certain vegetables? A recent study found there was no increased risk of stone formation with higher vegetable intake. In fact, greater dietary intakes of whole plant foods, fruits, and vegetables were each associated with reduced risk, independent of other known risk factors for kidney stones. This means we may get additional benefits from bulking up on plant foods, in addition to just restricting animal foods.

A reduction in animal protein not only reduces the production of acids within the body, but should also limit the excretion of urate, uric acid crystals that can act as seeds to form calcium stones or create entire stones themselves. (Uric acid stones are the second most common kidney stones after calcium.)

There are two ways to reduce uric acid levels in the urine: a reduction of animal protein ingestion, or a variety of drugs. Removing all meat--that is, switching from the standard Western diet to a vegetarian diet--can remove 93% of uric acid crystallization risk within days.

To minimize uric acid crystallization, the goal is to get our urine pH up to ideally as high as 6.8. A number of alkalinizing chemicals have been developed for just this purpose, but we can naturally alkalinize our urine up to the recommended 6.8 through purely dietary means: by removing all meat, someone eating the standard Western diet can go from a pH of 5.95 to the target of 6.8 simply by eating plant-based. As I describe in my video, Testing Your Diet with Pee & Purple Cabbage, we can inexpensively test our own diets with a little bathroom chemistry, for not all plant foods are alkalinizing and not all animal foods are equally acidifying.

A Load of Acid to Kidney Evaluation (LAKE) score has been developed to take into account both the acid load of foods and their typical serving sizes. It can be used to help people modify their diet for the prevention of both uric acid and calcium kidney stones, as well as other diseases. What did researchers find? The single most acid-producing food is fish, like tuna. Then, in descending order, come pork, poultry, cheese (though milk and other dairy are much less acidifying), and beef, followed by eggs. (Eggs are actually more acidifying than beef, but people tend to eat fewer eggs in one sitting.) Some grains, like bread and rice, can be a little acid-forming, but pasta is not. Beans are significantly alkaline-forming, though not as much as fruits or, even better, vegetables, which are the most alkaline-forming of all.

Through dietary changes alone, we may be able to dissolve uric acid stones completely and cure patients without drugs or surgery.

To summarize, the most important things we can do diet-wise are to drink 10 to 12 cups of water a day, reduce animal protein, reduce salt, and eat more vegetables and more vegetarian.

Want to try calculating your LAKE score for the day? Just multiply the number of servings you have of each of the food groups listed in the graph in the video by its score.
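That serving-counting arithmetic can be sketched in a few lines. Note that the food groups and per-serving scores below are made-up placeholders for illustration only; the actual per-serving values come from the graph in the video:

```python
# Hypothetical LAKE-style tally: multiply the servings of each food
# group by its per-serving acid-load score, then sum. Positive scores
# are acid-forming, negative scores are alkaline-forming.
def lake_score(servings, scores):
    """Sum of servings x per-serving score across all food groups eaten."""
    return sum(servings[food] * scores[food] for food in servings)

# Placeholder per-serving scores -- NOT the published values.
example_scores = {"fish": 10, "cheese": 6, "bread": 1, "beans": -2, "vegetables": -4}
# One day's servings of each group.
example_day = {"fish": 1, "cheese": 2, "bread": 3, "beans": 1, "vegetables": 4}

print(lake_score(example_day, example_scores))  # 10 + 12 + 3 - 2 - 16 = 7
```

A lower (more negative) total means a more alkaline-forming day; swapping the acid-forming entries for more vegetables drives the total down.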

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

Plant versus Animal Iron

It is commonly thought that those who eat plant-based diets may be more prone to iron deficiency, but it turns out that they're no more likely to suffer from iron deficiency anemia than anybody else. This may be because not only do those eating meat-free diets tend to get more fiber, magnesium, and vitamins like A, C, and E, but they also get more iron.

The iron found predominantly in plants is non-heme iron, which isn't absorbed as well as the heme iron found in blood and muscle, but this may be a good thing. As seen in my video, The Safety of Heme vs. Non-Heme Iron, avoidance of heme iron may be one of the key elements of plant-based protection against metabolic syndrome, and may also be beneficial in lowering the risk from other chronic diseases such as heart disease.

The data linking coronary heart disease and the intake of iron, in general, has been mixed. This inconsistency of evidence may be because of where the iron comes from. The majority of total dietary iron is non-heme iron, coming mostly from plants. So, total iron intake is associated with lower heart disease risk, but iron intake from meat is associated with significantly higher risk for heart disease. This is thought to be because iron can act as a pro-oxidant, contributing to the development of atherosclerosis by oxidizing cholesterol with free radicals. The risk has been quantified as a 27% increase in coronary heart disease risk for every 1 milligram of heme iron consumed daily.
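To make that per-milligram figure concrete, here is a minimal sketch of how such a number is often translated into a relative risk. It assumes the 27% increase compounds multiplicatively with each additional daily milligram, which is a simplifying, log-linear assumption about the dose-response shape, not the study's actual model:

```python
# Rough illustration only: assume coronary heart disease risk is
# multiplied by 1.27 for each 1 mg/day of heme iron consumed
# (a simplifying log-linear assumption, not the study's model).
def relative_risk(mg_heme_per_day, rr_per_mg=1.27):
    """Relative risk versus consuming no heme iron at all."""
    return rr_per_mg ** mg_heme_per_day

print(round(relative_risk(1), 2))  # 1.27 -> 27% higher risk at 1 mg/day
print(round(relative_risk(2), 2))  # 1.61 -> the risk compounds with intake
```

Under this assumption, even modest daily heme iron intakes multiply into substantial relative risks, which is why the sourcing of dietary iron matters.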

The same has been found for stroke risk. The studies on iron intake and stroke have had conflicting results, but that may be because they had never separated out heme iron from non-heme iron... until now. Researchers found that the intake of meat (heme) iron, but not plant (non-heme) iron, was associated with an increased risk of stroke.

The researchers also found that higher intake of heme iron--but not total or plant (non-heme) iron--was significantly associated with greater risk for type 2 diabetes. There may be a 16% increase in risk for type 2 diabetes for every 1 milligram of heme iron consumed daily.

The same has also been found for cancer, with up to 12% increased risk for every milligram of daily heme iron exposure. In fact, we can actually tell how much meat someone is eating by looking at their tumors. To characterize the mechanisms underlying meat-related lung cancer development, researchers asked lung cancer patients how much meat they ate and examined the gene expression patterns in their tumors. They identified a signature pattern of heme-related gene expression. Although they looked specifically at lung cancer, they expect these meat-related gene expression changes may occur in other cancers as well.

We do need to get enough iron, but only about 3% of premenopausal white women have iron deficiency anemia these days. However, the rates are worse among African American and Mexican American women. Taking into account our leading killers--heart disease, cancer, and diabetes--the healthiest source of iron appears to be non-heme iron, found naturally in abundance in whole grains, beans, split peas, chickpeas, lentils, dark green leafy vegetables, dried fruits, nuts, and seeds.

But how much money can be made on beans? The processed food industry came up with a blood-based crisp bread, made out of rye flour and blood from cattle and pigs, which is one of the most concentrated sources of heme iron--about two-thirds more than blood from chickens. If blood-based crackers don't sound particularly appetizing, you can always snack on cow blood cookies. And there are always blood-filled biscuits, whose filling has been described as "a dark-colored, chocolate flavored paste with a very pleasant taste." (It's dark-colored because spray-dried pig blood can have a darkening effect on a food product's color.) The worry is not the color or taste; it's the heme iron, which, because of its potential cancer risk, is not considered safe to add to foods intended for the general population.

Previously, I've touched on the double-edged iron sword in Risk Associated With Iron Supplements and Phytates for the Prevention of Cancer. It may also help answer Why Was Heart Disease Rare in the Mediterranean?

Those eating plant-based diets get more of most nutrients since whole plant foods are so nutrient dense. See Nutrient-Dense Approach to Weight Management.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

The Five Most Important Dietary Tweaks

Generally, adherence to healthy lifestyle patterns has decreased during the last 18 years. Obesity is up, exercise is down, and the number of people eating just five servings of fruits and veggies a day dropped like a rock. And we didn't start out that great to begin with.

Only 3% of Americans at the turn of the 21st century had the following four healthy lifestyle characteristics: not smoking, not overweight, five daily servings of fruits and vegetables, and exercising a half hour a day at least five days a week. Whether people were wealthy or college-educated didn't matter; no sub-group even remotely met clinical or public health recommendations.

Where are people falling down the most? You can see in my video What Percent of Americans Lead Healthy Lifestyles?. If you look at heart disease risk factors, for example, most people don't smoke and about half are exercising. But if we look at the healthy diet score--which is based on things like drinking less than four cups of soda a week and runs on a scale of zero to five--only about 1% of Americans score a four or five. The American Heart Association's aggressive 2020 target to improve that by 20% would bring us up to just 1.2%.

Since we've known for decades that advanced coronary artery disease may be present by age 20--with atherosclerosis often even present in young children--it is particularly disturbing that healthy lifestyle choices are declining rather than improving in the U.S.

In terms of life expectancy, the U.S. ranks around 27th or 28th out of the 34 OECD free-market democracies. The people of Slovenia live a year longer than citizens of the United States. Why? According to the most rigorous analysis of risk factors ever published, the number one cause of death and disability in the United States is our diet.

It's the food.

According to the Global Burden of Disease study, the worst five things about our diet are: we don't eat enough fruit, we don't eat enough nuts and seeds, we eat too much salt, too much processed meat, and not enough vegetables.

Studies that have looked at diet quality and chronic disease mortality risk found that those scoring higher (e.g., eating more whole plant foods) had a reduced risk of dying prematurely from heart disease, cancer, and all causes of death combined. There is now an overwhelming body of clinical and epidemiological evidence illustrating the dramatic impact of a healthy lifestyle on reducing all-cause mortality and preventing chronic diseases such as coronary heart disease, stroke, diabetes, and cancer.

Why do we eat so poorly? Aren't we scared of dying from these horrible chronic diseases? It's almost as if we're eating as though our future didn't matter. And there's actually data to back that up, from a study entitled Death Row Nutrition.

The growing macabre fascination with speculating about one's "last meal" offers a window into one's true consumption desires when one's value of the future is discounted close to zero. In contrast to pop culture anecdotes, a group of Cornell researchers created a catalog of actual last meals: the final food requests of 247 individuals executed in the United States during a recent five-year period. Meat was the most common request. The researchers go out of their way to note that tofu never made the list, and no one asked for a vegetarian meal. In fact, if you compare the last meals to what Americans normally eat, there's not much difference.

If we continue to eat as though they were our last meals, eventually, they will be.


A few years ago I did a video called Nation's Diet in Crisis. It's sad that it doesn't seem like much has changed. How Many Meet the Simple Seven? is another video in which you can see how your own habits stack up.

For more on fruits and veggies and living longer, see Fruits and Longevity: How Many Minutes per Mouthful? Surprised that nuts made the longevity list? See Nuts May Help Prevent Death. What about legumes? See Increased Lifespan from Beans.

The reason public health professionals are so keen on measuring lifestyle characteristics is because modest improvements may have extraordinary effects. See, for example:

Didn't know the beginnings of heart disease may already be present in children? See my video Heart Disease Starts in Childhood. Think that's tragic? Check out Heart Disease May Start in the Womb. Is it too late if we've been eating poorly most of our lives? It's Never Too Late to Start Eating Healthier.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank / Flickr. This image has been modified.

Original Link

The Five Most Important Dietary Tweaks

The Five Most Important Dietary Tweaks.jpeg

Generally, adherence to healthy lifestyle patterns has decreased during the last 18 years. Obesity is up, exercise is down, and the number of people eating just five servings of fruits and veggies a day dropped like a rock. And we didn't start out that great to begin with.

Only 3% of Americans at the turn of the 21st century had the following four healthy lifestyle characteristics: not smoking, not overweight, five daily servings of fruits and vegetables, and exercising a half hour a day at least five days a week. Whether people were wealthy or college-educated didn't matter; no sub-group even remotely met clinical or public health recommendations.

Where are people falling down the most? You can see in my video What Percent of Americans Lead Healthy Lifestyles?. If you look at heart disease risk factors, for example, most people don't smoke and about half are exercising. But if we look at the healthy diet score--which is based on factors like drinking fewer than four cups of soda a week, scored on a scale of zero to five--only about 1% of Americans score a four or five. The American Heart Association's aggressive 2020 target of a 20% improvement would bring that up to just 1.2%.

Since we've known for decades that advanced coronary artery disease may be present by age 20--with atherosclerosis often even present in young children--it is particularly disturbing that healthy lifestyle choices are declining rather than improving in the U.S.

In terms of life expectancy, the U.S. ranks around 27th or 28th out of the 34 OECD free-market democracies. The people of Slovenia live a year longer than citizens of the United States. Why? According to the most rigorous analysis of risk factors ever published, the number one cause of death and disability in the United States is our diet.

It's the food.

According to the Global Burden of Disease study, the worst five things about our diet are: we don't eat enough fruit, we don't eat enough nuts and seeds, we eat too much salt, too much processed meat, and not enough vegetables.

Studies that have looked at diet quality and chronic disease mortality risk found that those scoring higher (e.g., eating more whole plant foods) had a lower risk of dying prematurely from heart disease, cancer, and all causes of death combined. There is now an overwhelming body of clinical and epidemiological evidence illustrating the dramatic impact of a healthy lifestyle on reducing all-cause mortality and preventing chronic diseases such as coronary heart disease, stroke, diabetes, and cancer.

Why do we eat so poorly? Aren't we scared of dying from these horrible chronic diseases? It's almost as if we're eating as though our future didn't matter. And there's actually data to back that up, from a study entitled Death Row Nutrition.

The growing macabre fascination with speculating about one's "last meal" offers a window into one's true consumption desires when one's value of the future is discounted close to zero. In contrast to pop-culture anecdotes, a group of Cornell researchers created a catalog of actual last meals--the final food requests of 247 individuals executed in the United States during a recent five-year period. Meat was the most common request. The researchers go out of their way to note that tofu never made the list, and no one asked for a vegetarian meal. In fact, if you compare the last meals to what Americans normally eat, there's not much difference.

If we continue to eat as though they were our last meals, eventually, they will be.


A few years ago I did a video called Nation's Diet in Crisis. It's sad that it doesn't seem like much has changed. How Many Meet the Simple Seven? is another video in which you can see how your own habits stack up.

For more on fruits and veggies and living longer, see Fruits and Longevity: How Many Minutes per Mouthful? Surprised that nuts made the longevity list? See Nuts May Help Prevent Death. What about legumes? See Increased Lifespan from Beans.

Public health professionals are so keen on measuring lifestyle characteristics because even modest improvements may have extraordinary effects.

Didn't know the beginnings of heart disease may already be present in children? See my video Heart Disease Starts in Childhood. Think that's tragic? Check out Heart Disease May Start in the Womb. Is it too late if we've been eating poorly most of our lives? It's Never Too Late to Start Eating Healthier.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank / Flickr. This image has been modified.

Original Link

Clostridium difficile in the Food Supply

Clostridium difficile in the Food Supply.jpeg

Clostridium difficile is one of our most urgent bacterial threats, sickening a quarter million Americans every year, and killing thousands at the cost of a billion dollars a year. And it's on the rise.

As shown in C. difficile Superbugs in Meat, uncomplicated cases have been traditionally managed with powerful antibiotics, but recent reports suggest that hypervirulent strains are increasingly resistant to medical management. There's been a rise in the percentage of cases that end up under the knife, which could be a marker of the emergence of these hypervirulent strains. Surgeons may need to remove our colon entirely to save our lives, although the surgery is so risky that the operation alone may kill us half the time.

Historically, most cases appeared in hospitals, but a landmark study published in the New England Journal of Medicine found that only about a third of cases could be linked to contact with an infected patient.

Another potential source is our food supply.

In the US, the frequency of contamination of retail chicken with these superbugs has been documented to be as high as one in six packages off of store shelves. Pig-derived C. diff strains, however, have garnered the greatest attention from public health personnel, because the same human strain that's increasingly emerging in the community outside of hospitals is the major strain among pigs.

Since the turn of the century, C. diff has increasingly been reported as a major cause of intestinal infections in piglets and is now one of the most common causes of intestinal infections in baby piglets in the US. Particular attention has been paid to pigs because of high rates of C. diff shedding into their waste, which can lead to the contamination of retail pork. The U.S. has the highest levels of C. diff meat contamination tested so far anywhere in the world.

Carcass contamination by gut contents at slaughter probably contributes most to the presence of C. diff in meat and meat products. But why is the situation so much worse in the US? Slaughter techniques differ from country to country, with those in the United States evidently being more of the "quick and dirty" variety.

Colonization or contamination of pigs by superbugs such as C. difficile and MRSA at the farm production level may be more important than at the slaughterhouse level, though. One of the reasons sows and their piglets may have such high rates of C. diff is cross-contamination with feces in the farrowing crate, the narrow metal cage mother pigs are kept in while their piglets are nursing.

Can't you just follow food safety guidelines and cook the meat through? Unfortunately, current food safety guidelines are ineffective against C. difficile. To date, most food safety guidelines recommend cooking to an internal temperature as low as 63°C--the official USDA recommendation for pork--but recent studies show that C. diff spores can survive extended heating at 71°C. Therefore, the guidelines should be raised to take this potentially deadly infection into account.

One of the problems is that sources of C. diff food contamination might include not only fecal contamination on the surface of the meat, but transfer of spores from the gut into the actual muscles of the animal, inside the meat. Clostridia bacteria like C. diff comprise one of the main groups of bacteria involved in natural carcass degradation, so by colonizing muscle tissue before death, C. diff can not only transmit to new hosts that eat those muscles, like us, but also get a head start on carcass breakdown.

Never heard of C. diff? That's the Toxic Megacolon Superbug I've talked about before.

Another foodborne illness tied to pork industry practices is yersiniosis. See Yersinia in Pork.

MRSA (methicillin-resistant Staph aureus) is another so-called superbug in the meat supply.

There's more on the scourge of antibiotic resistance and what can be done about it in my other videos on the topic.

How is it even legal to sell foods with such pathogens? See Salmonella in Chicken & Turkey: Deadly But Not Illegal and Chicken Salmonella Thanks to Meat Industry Lawsuit.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: USDA / Flickr. This image has been modified.

Original Link
