Who Should Avoid Coffee?


Do coffee drinkers live longer than non-coffee drinkers? Is it "wake up and smell the coffee" or don't wake up at all? I discuss these questions in my video, Coffee and Mortality.

The largest study ever conducted on diet and health put that question to the test, examining the association between coffee drinking and subsequent mortality among hundreds of thousands of older men and women in the United States. Coffee drinkers won, though the effect was modest: a 10-15% lower risk of death for those drinking six or more cups a day. This was due specifically to lower risk of dying from heart disease, respiratory disease, stroke, injuries and accidents, diabetes, and infections.

However, in another study, that amount of coffee was found to increase the death rate among people under age 55. It may be prudent, then, to recommend drinking no more than four cups a day. But if you review all the studies, the bottom line is that coffee consumption is associated with no change or a small reduction in mortality starting around one or two cups a day, for both men and women. The risk of dying was 3% lower for each daily cup of coffee consumed, which should help allay the concern that coffee drinking might adversely affect health, or at least longevity.

A recent population study found no link between coffee consumption and symptoms of GERD (gastroesophageal reflux disease), such as heartburn and regurgitation. If you actually stick a tube down people's throats and measure pH, though, coffee induces significant acid reflux, whereas tea does not. Is this just because tea has less caffeine? No. If you reduce the caffeine content of coffee down to that of tea, coffee still causes significantly more acid reflux. Decaf causes even less, so GERD patients might want to choose decaffeinated coffee or, even better, opt for tea.

Coffee intake is also associated with urinary incontinence, so a decrease in caffeine intake should be discussed with patients who have the condition. The amount of caffeine in about two cups of coffee a day may worsen urinary leakage.

A 2014 meta-analysis suggested that daily coffee consumption was associated with a slightly increased risk of bone fractures in women but a decreased risk of fractures in men. However, no significant association was found between coffee consumption and the risk of hip fracture specifically. Tea consumption may actually protect against hip fracture, though it appears to have no relationship with fracture risk in general.

Certain populations, in particular, may want to stay away from caffeine, including those with glaucoma or a family history of glaucoma, individuals with epilepsy, and, not surprisingly, people who have trouble sleeping. Even a single cup at night can cause a significant deterioration in sleep quality.

We used to think caffeine might increase the risk of an irregular heart rhythm called atrial fibrillation, but that was based on anecdotal case reports like one of a young woman who suffered atrial fibrillation after "chocolate intake abuse." These cases invariably involved the acute ingestion of very large quantities of caffeine. As a result, the notion that caffeine ingestion may trigger abnormal heart rhythms had become "common knowledge," and this assumption led to changes in medical practice.

We now have evidence that caffeine does not increase the risk of atrial fibrillation. Low-dose caffeine--defined as less than about five cups of coffee a day--may even have a protective effect. Tea consumption also appears to lower cardiovascular disease risk, especially when it comes to stroke. But given the proliferation of energy drinks that contain massive quantities of caffeine, one might temper any message that suggests that caffeine is beneficial. Indeed, 12 highly caffeinated energy drinks within a few hours could be lethal.


To learn more about various health aspects of coffee, see my videos Coffee and Cancer, What About the Caffeine?, Preventing Liver Cancer with Coffee?, and Coffee and Artery Function.

What else can we consume to live longer? Check out Nuts May Help Prevent Death, Increased Lifespan from Beans, Fruits and Longevity: How Many Minutes per Mouthful?, and Finger on the Pulse of Longevity.

And, for more on controlling acid reflux, see Diet and GERD Acid Reflux Heartburn and Diet and Hiatal Hernia.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Original Link

Brown Fat: Losing Weight Through Thermogenesis


During World War I, it was discovered that many of the chemicals used in new explosives had toxic or even lethal effects on workers in the munitions factories. Chemicals such as dinitrophenol (DNP) can boost metabolism so much that workers were too often found wandering along the road after work, covered in sweat, with temperatures of 106 to 109 degrees Fahrenheit, before they died. Even after death, their temperatures kept going up, as if they were having a total-body meltdown. At subacute doses, however, workers claimed to have grown notably thin after several months of working with the chemical.

That got some Stanford pharmacologists excited about the "promising metabolic applications" of DNP. Our resting metabolic rate jumps up 30% after a single dose of DNP, making it an actual fat-burning drug. People started losing weight, as you can see in my video Brown Fat: Losing Weight Through Thermogenesis, with no apparent side effects. They felt great... and then thousands of people started going blind, and users started dropping dead from hyperpyrexia, a fatal fever caused by the heat created by the burning fat. Of course, it continued to be sold. Ad copy read:

"Here, at last, is a [weight] reducing remedy that will bring you a figure men admire and women envy, without danger to your health or change in your regular mode of living....No diet, no exercise!"

It did work, but the therapeutic index--the difference between the effective dose and the deadly dose--was razor thin. It was not until thousands suffered irreversible harm that it was pulled from the market. Unavailable, that is, until it was brought back by the internet for those dying to be thin.

There is, however, a way our body naturally burns fat to create heat. When we're born, we go from a nice tropical 98.6 degrees in our mother's womb straight to room temperature, while we're still all wet and slimy. As an adaptive mechanism to maintain warmth, a unique organ appeared around 150 million years ago that allows mammals to maintain our high body temperatures.

That unique organ is called brown adipose tissue, or BAT, and its role is to consume fat calories by generating heat in response to cold exposure. The white fat in our bellies stores fat, but brown fat, located up between our shoulder blades, burns fat. BAT is essential for thermogenesis--the creation of heat--in newborns, but it has been considered unnecessary in adults, who have higher metabolic rates and more muscle mass for shivering to warm us up when we get chilled. We used to think brown fat just shrank away when we grew up, but if it were still there, it could potentially make a big difference in how many calories we burn every day.

When PET scans were invented to detect metabolically active tissues like cancer, oncologists kept finding hot spots in the neck and shoulder regions that on CT scans turned out not to be cancer, just fat. Then, some observant radiologists noticed they appeared in patients mostly during the cold winter months. When they looked closer at tissue samples taken from people who had undergone neck surgery, they found it: brown fat in adults.

The common message from a number of studies is that BAT is present and active in adults, and the more we have and the more active it is, the thinner we are. And we can rapidly activate our fat-burning brown fat by exposure to cold temperatures. For example, if you hang out in a cold room for two hours in your undies and put your legs on a block of ice for four minutes every five minutes, you can elicit a marked increase in energy expenditure, thanks to brown fat activation. So, the studies point to a potential "natural" intervention to stimulate energy expenditure: Turn down the heat to burn calories (and reduce the carbon footprint in the process).

Thankfully, for those of us who would rather not lay our bare legs on blocks of ice, our brown fat can also be activated by some food ingredients such as those that are covered in my Boosting Brown Fat Through Diet video.


I briefly touch on the role cold temperatures can play in weight loss in The Ice Diet and talk more about calories in (Nutrient-Dense Approach to Weight Management) and calories out (How Much Exercise to Sustain Weight Loss).

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Original Link

Comparing Pollutant Levels Between Different Diets


The results of the CHAMACOS (Center for the Health Assessment of Mothers and Children of Salinas) study were published recently. This study of a California birth cohort investigated the relationship between exposure to flame retardant chemical pollutants in pregnancy and childhood, and subsequent neurobehavioral development. Why California? Because California children's exposures to these endocrine disruptors and neurotoxins are among the highest in the world.

What did they find? The researchers concluded that both prenatal and childhood exposures to these chemicals "were associated with poorer attention, fine motor coordination, and cognition" (particularly verbal comprehension) by the time the children reached school age. "This study, the largest to date, contributes to growing evidence suggesting that PBDEs [polybrominated diphenyl ethers, flame retardant chemicals] have adverse impacts on child neurobehavioral development." The effects may extend into adolescence, again affecting motor function as well as thyroid gland function. The effect on our thyroid glands may even extend into adulthood.

These chemicals get into moms, then into the amniotic fluid, and then into the breast milk. The more that's in the milk, the worse the infants' mental development may be. Breast milk is still best, but how did these women get exposed in the first place?

The question has been: Are we exposed mostly from diet or dust? Researchers in Boston collected breast milk samples from 46 first-time moms, vacuumed up samples of dust from their homes, and questioned them about their diets. The researchers found that both were likely to blame. Diet-wise, a number of animal products were implicated. This is consistent with what's been found worldwide. For example, in Europe, these flame retardant chemical pollutants are found mostly in meat, including fish, and other animal products. It's similar to what we see with dioxins--they are mostly found in fish and other fatty foods, with a plant-based diet offering the lowest exposure.

If that's the case, do vegetarians have lower levels of flame retardant chemical pollutants circulating in their bloodstreams? Yes. Vegetarians may have about 25% lower levels. Poultry appears to be the largest contributor of PBDEs. USDA researchers compared the levels in different meats, and the highest levels of these pollutants were found in chicken and turkey, with less in pork and even less in beef. California poultry had the highest levels, consistent with the state's strict furniture flammability codes. But it's not like chickens are pecking at the sofa. Chickens and turkeys may be exposed indirectly through the application of sewer sludge to fields where feed crops are raised, contamination of water supplies, the use of flame-retarded materials in poultry housing, or the inadvertent incorporation of fire-retardant material into the birds' bedding or feed ingredients.

Fish have been shown to have the highest levels overall, but Americans don't eat a lot of fish, so it doesn't contribute as much to the total body burden in the United States. Researchers have compared the levels of PBDEs found in meat-eaters and vegetarians. The amount found in the bloodstream of vegetarians is noticeably lower, as you can see in my video Flame Retardant Pollutants and Child Development. Just to give you a sense of the contribution of chicken, higher-than-average poultry eaters have higher levels than omnivores as a whole, and lower-than-average poultry eaters have levels lower than omnivores.

What are the PBDE levels in vegans? We know the intake of many other classes of pollutants is almost exclusively from the ingestion of animal fats in the diet. What if we take them all out of the diet? It works for dioxins. Vegan dioxin levels appear markedly lower than the general population. What about for the flame retardant chemicals? Vegans have levels lower than vegetarians, with those who've been vegan around 20 years having even lower concentrations. This tendency for chemical levels to decline the longer one eats plant-based suggests that food of animal origin contributes substantially. But note that levels never get down to zero, so diet is not the only source.

The USDA researchers note that there are currently no regulatory limits on the amount of flame retardant chemical contamination in U.S. foods, "but reducing the levels of unnecessary, persistent, toxic compounds in our diet is certainly desirable."

I've previously talked about this class of chemicals in Food Sources of Flame Retardant Chemicals. The same foods seem to accumulate a variety of pollutants:

Many of these chemicals have hormone- or endocrine-disrupting effects. See, for example:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Mitchell Haindfield / Flickr. This image has been modified.

Original Link

Clostridium difficile in the Food Supply


Clostridium difficile is one of our most urgent bacterial threats, sickening a quarter million Americans every year, and killing thousands at the cost of a billion dollars a year. And it's on the rise.

As shown in C. difficile Superbugs in Meat, uncomplicated cases have been traditionally managed with powerful antibiotics, but recent reports suggest that hypervirulent strains are increasingly resistant to medical management. There's been a rise in the percentage of cases that end up under the knife, which could be a marker of the emergence of these hypervirulent strains. Surgeons may need to remove our colon entirely to save our lives, although the surgery is so risky that the operation alone may kill us half the time.

Historically, most cases appeared in hospitals, but a landmark study published in the New England Journal of Medicine found that only about a third of cases could be linked to contact with an infected patient.

Another potential source is our food supply.

In the US, contamination of retail chicken with these superbugs has been documented in up to one in six packages off store shelves. Pig-derived C. diff, however, has garnered the greatest attention from public health personnel, because the same human strain that's increasingly emerging in the community outside of hospitals is the major strain among pigs.

Since the turn of the century, C. diff has increasingly been reported as a major cause of intestinal infections in piglets and is now one of the most common causes of intestinal infections in baby piglets in the US. Particular attention has been paid to pigs because of high rates of C. diff shedding into their waste, which can lead to the contamination of retail pork. The US has the highest levels of C. diff meat contamination tested so far anywhere in the world.

Carcass contamination by gut contents at slaughter probably contributes most to the presence of C. diff in meat and meat products. But why is the situation so much worse in the US? Slaughter techniques differ from country to country, with those in the United States evidently being more of the "quick and dirty" variety.

Colonization or contamination of pigs by superbugs such as C. difficile and MRSA at the farm production level may be more important than at the slaughterhouse level, though. One of the reasons sows and their piglets may have such high rates of C. diff is cross-contamination with feces in farrowing crates, the narrow metal cages that mother pigs are kept in while their piglets are nursing.

Can't you just follow food safety guidelines and cook the meat through? Unfortunately, current food safety guidelines are ineffective against C. difficile. To date, most food safety guidelines recommend cooking to an internal temperature as low as 63°C--the official USDA recommendation for pork--but recent studies show that C. diff spores can survive extended heating at 71°C. Therefore, the guidelines should be raised to take this potentially deadly infection into account.

One of the problems is that sources of C. diff food contamination might include not only fecal contamination on the surface of the meat, but also the transfer of spores from the gut into the actual muscles of the animal, inside the meat. Clostridia bacteria like C. diff comprise one of the main groups of bacteria involved in natural carcass degradation, so by colonizing muscle tissue before death, C. diff can not only transmit to new hosts that eat the muscles, like us, but also get a head start on carcass breakdown.

Never heard of C. diff? That's the Toxic Megacolon Superbug I've talked about before.

Another foodborne illness tied to pork industry practices is yersiniosis. See Yersinia in Pork.

MRSA (Methicillin-resistant Staph aureus) is another so-called superbug in the meat supply:

More on the scourge of antibiotic resistance and what can be done about it:

How is it even legal to sell foods with such pathogens? See Salmonella in Chicken & Turkey: Deadly But Not Illegal and Chicken Salmonella Thanks to Meat Industry Lawsuit.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: USDA / Flickr. This image has been modified.

Original Link

Clostridium difficile in the Food Supply

Clostridium difficile in the Food Supply.jpeg

Clostridium difficile is one of our most urgent bacterial threats, sickening a quarter million Americans every year, and killing thousands at the cost of a billion dollars a year. And it's on the rise.

As shown in C. difficile Superbugs in Meat, uncomplicated cases have been traditionally managed with powerful antibiotics, but recent reports suggest that hypervirulent strains are increasingly resistant to medical management. There's been a rise in the percentage of cases that end up under the knife, which could be a marker of the emergence of these hypervirulent strains. Surgeons may need to remove our colon entirely to save our lives, although the surgery is so risky that the operation alone may kill us half the time.

Historically, most cases appeared in hospitals, but a landmark study published in the New England Journal of Medicine found that only about a third of cases could be linked to contact with an infected patient.

Another potential source is our food supply.

In the US, contamination of retail chicken with these superbugs has been documented in up to one in six packages off store shelves. Pig-derived C. diff, however, has garnered the greatest attention from public health personnel, because the same human strain that is increasingly emerging in the community outside of hospitals is the major strain among pigs.

Since the turn of the century, C. diff has increasingly been reported as a major cause of intestinal infections in piglets and is now one of the most common causes of intestinal infections in baby piglets in the US. Particular attention has been paid to pigs because of their high rates of C. diff shedding into their waste, which can lead to the contamination of retail pork. The US has the highest levels of C. diff meat contamination tested so far anywhere in the world.

Carcass contamination by gut contents at slaughter probably contributes most to the presence of C. diff in meat and meat products. But why is the situation so much worse in the US? Slaughter techniques differ from country to country, with those in the United States evidently being more of the "quick and dirty" variety.

Colonization or contamination of pigs by superbugs such as C. difficile and MRSA at the farm production level may be more important than at the slaughterhouse level, though. One of the reasons sows and their piglets may have such high rates of C. diff is cross-contamination with feces in farrowing crates, the narrow metal cages in which mother pigs are kept while their piglets are nursing.

Can't you just follow food safety guidelines and cook the meat through? Unfortunately, current food safety guidelines are ineffective against C. difficile. To date, most food safety guidelines recommend cooking to an internal temperature as low as 63°C (the official USDA recommendation for pork), but recent studies show that C. diff spores can survive extended heating at 71°C. Therefore, the guidelines should be raised to take this potentially deadly infection into account.

One of the problems is that sources of C. diff food contamination might include not only fecal contamination on the surface of the meat, but also the transfer of spores from the gut into the actual muscles of the animal, inside the meat. Clostridia bacteria like C. diff comprise one of the main groups of bacteria involved in natural carcass degradation, so by colonizing muscle tissue before death, C. diff can not only transmit to new hosts that eat the muscle, like us, but also get a head start on carcass breakdown.

Never heard of C. diff? That's the Toxic Megacolon Superbug I've talked about before.

Another foodborne illness tied to pork industry practices is yersiniosis. See Yersinia in Pork.

MRSA (Methicillin-resistant Staph aureus) is another so-called superbug in the meat supply:

More on the scourge of antibiotic resistance and what can be done about it:

How is it even legal to sell foods with such pathogens? See Salmonella in Chicken & Turkey: Deadly But Not Illegal and Chicken Salmonella Thanks to Meat Industry Lawsuit.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: USDA / Flickr. This image has been modified.

Original Link

Can You Eat Too Much Fruit?

Can You Eat Too Much Fruit?.jpeg

In my video If Fructose is Bad, What About Fruit?, I explored how adding berries to our meals can actually blunt the detrimental effects of high glycemic foods, but how many berries? The purpose of one study out of Finland was to determine the minimum level of blueberry consumption at which a consumer may realistically expect antioxidant benefits after eating blueberries with a sugary breakfast cereal. If we eat a bowl of corn flakes with no berries, so many free radicals are created within two hours that it puts us into oxidative debt. The antioxidant power of our bloodstream drops below where we started before breakfast, as the antioxidants in our bodies get used up dealing with such a crappy breakfast. As you can see in my video How Much Fruit is Too Much?, a quarter cup of blueberries didn't seem to help much, but a half cup did.

What about fruit for diabetics? Most guidelines recommend a diet with a high intake of fiber-rich foods, including fruit, because fruits are so healthy: they are rich in antioxidants, fight inflammation, improve artery function, and reduce cancer risk. However, some health professionals have concerns about the sugar content of fruit and therefore recommend restricting fruit intake. So, let's put it to the test! In a study from Denmark, diabetics were randomized into two groups: one told to eat at least two pieces of fruit a day, and the other told to eat at most two. The reduced-fruit group did indeed cut their fruit consumption, but it had no effect on the control of their diabetes or their weight, and so, the researchers concluded, fruit intake should not be restricted in patients with type 2 diabetes. An emerging literature has shown that low-dose fructose may actually benefit blood sugar control: having a piece of fruit with each meal would be expected to lower, not raise, the blood sugar response.

The threshold for toxicity of fructose may be around 50 grams a day. The problem is that's the current average adult fructose consumption, so about half of all adults are likely above the threshold for fructose toxicity, and adolescents currently average 75 grams. Is that limit for added sugars or for all fructose? If we don't want more than 50 grams and there are about ten grams in a piece of fruit, should we eat no more than five pieces of fruit a day? Quoting from the Harvard Health Letter: "the nutritional problems of fructose and sugar come when they are added to foods. Fruit, on the other hand, is beneficial in almost any amount." What do they mean, almost? Can we eat ten pieces of fruit a day? How about twenty?
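For the numerically inclined, the back-of-the-envelope arithmetic above can be sketched in a few lines of Python. The 10-grams-per-piece figure is a rough estimate, as the paragraph notes, so treat the result as illustrative, not a dietary prescription:

```python
# Back-of-the-envelope check of the fructose arithmetic above.
# Assumption: a typical piece of fruit carries roughly 10 g of fructose.
THRESHOLD_G_PER_DAY = 50    # proposed adult fructose toxicity threshold
FRUCTOSE_PER_FRUIT_G = 10   # rough per-piece estimate

max_pieces = THRESHOLD_G_PER_DAY // FRUCTOSE_PER_FRUIT_G
print(max_pieces)  # 5, i.e., five pieces of fruit before crossing 50 g
```

Of course, as the studies below suggest, whole fruit may not count against that threshold the way added sugars do.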

It's actually been put to the test.

Seventeen people were made to eat 20 servings of fruit a day. Despite the extraordinarily high fructose content of this diet (presumably about 200 grams a day, the amount in eight cans of soda), the investigators reported no adverse effects, and possibly even benefits, for body weight, blood pressure, and insulin and lipid levels after three to six months. More recently, Jenkins and colleagues put people on a diet of about 20 servings of fruit a day for a few weeks and found no adverse effects on weight, blood pressure, or triglycerides, and an astounding 38-point drop in LDL cholesterol.

There was one side effect, though. Given the 44 servings of vegetables they ate on top of all that fruit, the subjects recorded the largest bowel movements apparently ever documented in a dietary intervention.

Cutting down on sugary foods may be easier said than done (see Are Sugary Foods Addictive?), but it's worth it. For more on the dangers of high levels of fructose in added sugars, see How Much Added Sugar Is Too Much?.

What's that about being in oxidative debt? See my three-part series on how to pull yourself out of the red:

Ironically, when it comes to diabetes, fat may be more of a problem than sugar. See:

In health,
Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank / Flickr. This image has been modified.

Original Link
