The Food Safety Risk of Organic versus Conventional


The stated principles of organic agriculture are "health, ecology, fairness, and care," but if you ask people why they buy organic, the strongest predictor is concern for their own health. People appear to spend more for organic foods out of self-interest rather than altruism. Although organic foods may not offer more nutrients per dollar (see my video Are Organic Foods More Nutritious?), consuming organic foods may reduce exposure to pesticide residues and antibiotic-resistant bacteria.

In terms of food safety, researchers found no overall difference in the risk of contamination with food-poisoning bacteria. Both organic and conventional animal products have been found to be commonly contaminated with Salmonella and Campylobacter, for example. Most chicken samples (organic and conventional alike) were found to be contaminated with Campylobacter, and about a third with Salmonella, but the risk of exposure to multidrug-resistant bacteria was lower with the organic meat. Both may carry the same risk of making us sick, but food poisoning from organic meat may be easier for doctors to treat.

What about the pesticides? There is a large body of evidence on the relation between exposure to pesticides and elevated rates of chronic diseases such as different types of cancer, diabetes, and neurodegenerative disorders like Parkinson's, Alzheimer's, and ALS, as well as birth defects and reproductive disorders--but these studies were largely of people who live or work around pesticides.

Take the Salinas Valley in California, for example, where they spray a half million pounds of pesticides. Being pregnant in an agricultural community like that may impair childhood brain development: pregnant women with the highest pesticide levels running through their bodies (as measured in their urine) gave birth to children with an average deficit of about seven IQ points. Twenty-six out of 27 studies showed negative effects of pesticides on brain development in children, including attention problems, developmental disorders, and short-term memory difficulties.

Even in urban areas, kids born with higher levels of a common insecticide in their umbilical cord blood show brain anomalies. And these were city kids, so presumably the exposure came from residential pesticide use.

Using insecticides inside your house may also be a contributing risk factor for childhood leukemia. Pregnant farmworkers may double the odds of their child getting leukemia and increase the risk of their child developing a brain tumor. This has led authorities to advocate that awareness of the potentially negative health outcomes for children be increased among populations occupationally exposed to pesticides, though I don't imagine most farmworkers have much of a choice.

Conventional produce may be bad for the pregnant women who pick it, but what about our own families when we eat it?

Just because we spray pesticides on our food in the fields doesn't necessarily mean they end up in our bodies when we eat it--at least, we didn't know whether they did until a study was published in 2006. Researchers measured the levels of two pesticides running through children's bodies by measuring specific pesticide breakdown products in their urine. In my video, Are Organic Foods Safer?, you can see the levels of pesticides flowing through the bodies of three- to eleven-year-olds over a few days on a conventional diet. The kids then went on an organic diet for five days before returning to the conventional diet. As you can see, eating organic provides a dramatic and immediate protective effect against exposure to pesticides commonly used in agricultural production. The study was subsequently extended, and it's clear from the later graph in the video when the kids were eating organic versus conventional. What about adults, though? We didn't know... until now.

Thirteen men and women consumed a diet of at least 80% organic or conventional food for seven days and then switched. No surprise: during the mostly organic week, pesticide exposure dropped by nearly 90%.

If it can be concluded that consumption of organic foods provides protection against pesticides, does that also mean protection against disease? We don't know; the studies just haven't been done. In the meantime, though, eating organic provides a logical precautionary approach.

For more on organic foods:

For more on the infectious disease implications of organic versus conventional, see Superbugs in Conventional vs. Organic Chicken. Organic produce may be safer, too; see Norovirus Food Poisoning from Pesticides. Organic eggs may also carry a lower risk of Salmonella, which causes an egg-borne epidemic every year in the US; see my video Who Says Eggs Aren't Healthy or Safe?

More on Parkinson's and pesticides in Preventing Parkinson's Disease With Diet.

Those surprised by the California data might have missed my video California Children Are Contaminated.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: IFPRI -IMAGES / Flickr. This image has been modified.


What Animal Protein Does in Your Colon


There's a take-off of the industry slogan, "Beef: It's What's For Dinner"--"Beef: It's What's Rotting in Your Colon." I saw this on a shirt once with some friends, and I was such a party pooper--no pun intended--explaining to everyone that meat is fully digested in the small intestine and never makes it down into the colon. It's no fun hanging out with biology geeks.

But I was wrong!

It's been estimated that with a typical Western diet, up to 12 grams of protein can escape digestion, and when it reaches the colon, it can be turned into toxic substances like ammonia. This degradation of undigested protein in the colon is called putrefaction, so a little meat can actually end up putrefying in our colon. The problem is that some of the by-products of this putrefaction process can be toxic.

It's generally accepted that carbohydrate fermentation--of the fiber and resistant starches that reach our colon--results in beneficial effects because of the generation of short-chain fatty acids like butyrate, whereas protein fermentation is considered detrimental. Protein fermentation mainly occurs in the lower end of the colon and results in the production of potentially toxic metabolites. That may be why colorectal cancer and ulcerative colitis tend to happen lower down--because that's where the protein is putrefying.

Probably the simplest strategy to reduce the potential harm of protein fermentation is to reduce dietary protein intake. But the accumulation of these toxic byproducts of protein metabolism may also be attenuated by the fermentation of undigested plant matter. In my video, Bowel Wars: Hydrogen Sulfide vs. Butyrate, you can see a study out of Australia showing that if you give people foods containing resistant starch, you can block the accumulation of potentially harmful byproducts of protein metabolism. Resistant starch resists digestion in the small intestine and so makes it down to our colon, where it can feed our good bacteria. Resistant starch is found in cooked beans, split peas, chickpeas, lentils, raw oatmeal, and cooled cooked pasta (like macaroni salad). Apparently, the more starch that ends up in the colon, the less ammonia is produced.

Of course, there's protein in plants too. The difference is that animal proteins tend to have more sulfur-containing amino acids like methionine, which can be turned into hydrogen sulfide in our colon. Hydrogen sulfide is the rotten egg gas that may play a role in the development of the inflammatory bowel disease, ulcerative colitis (see Preventing Ulcerative Colitis with Diet).

The toxic effects of hydrogen sulfide appear to result from its blocking the ability of the cells lining our colon to utilize butyrate, which is what our good bacteria make from the fiber and resistant starch we eat. It's a constant battle in our colon between the bad metabolite of protein, hydrogen sulfide, and the good metabolite of carbohydrates, butyrate. Using human colon samples, researchers were able to show that the adverse effects of sulfide could be reversed by butyrate. So we can either cut down on meat, eat more plants, or both.

There are two ways hydrogen sulfide can be produced, though. It's mainly present in our large intestine as a result of the breakdown of sulfur-containing proteins, but the rotten egg gas can also be generated from inorganic sulfur preservatives like sulfites and sulfur dioxide.

Sulfur dioxide is used as a preservative in dried fruit, and sulfites are added to wines. We can avoid sulfur additives by reading labels or by just choosing organic, since they're forbidden from organic fruits and beverages by law.

More than 35 years ago, studies started implicating sulfur dioxide preservatives in the exacerbation of asthma. This so-called "sulfite-sensitivity" seems to affect only about 1 in 2,000 people, so I recommended those with asthma avoid it, but otherwise I considered the preservative harmless. I am now not so sure, and advise people to avoid it when possible.

Cabbage family vegetables naturally have some sulfur compounds, but thankfully, after following more than a hundred thousand women for over 25 years, researchers concluded cruciferous vegetables were not associated with elevated colitis risk.

Because of animal protein and processed food intake, the standard American diet may contain five or six times more sulfur than a diet centered around unprocessed plant foods. This may help explain the rarity of inflammatory bowel disease among those eating traditional whole food, plant-based diets.

How could companies just add things like sulfur dioxide to foods without adequate safety testing? See Who Determines if Food Additives are Safe? For other additives that may be a problem, see Titanium Dioxide & Inflammatory Bowel Disease and Is Carrageenan Safe?

More on this epic fermentation battle in our gut in Stool pH and Colon Cancer.

Does the sulfur-containing amino acid methionine sound familiar? You may remember it from such hits as Starving Cancer with Methionine Restriction and Methionine Restriction as a Life Extension Strategy.

These short-chain fatty acids released by our good bacteria when we eat fiber and resistant starches are what may be behind the second meal effect: Beans and the Second Meal Effect.

I mentioned ulcerative colitis. What about the other inflammatory bowel disease Crohn's? See Preventing Crohn's Disease With Diet and Dietary Treatment of Crohn's Disease.

In health,

Michael Greger, M.D.


Image Credit: Sally Plank / Flickr. This image has been modified.


Antioxidant- and Folate-Rich Foods for Depression


According to the Centers for Disease Control and Prevention, the rates of all of our top 10 killers have fallen or stabilized except for one: suicide. As shown in my video, Antioxidants & Depression, accumulating evidence indicates that free radicals may play important roles in the development of various neuropsychiatric disorders, including major depression, a common cause of suicide.

In a study of nearly 300,000 Canadians, for example, greater fruit and vegetable consumption was associated with lower odds of depression, psychological distress, self-reported mood and anxiety disorders, and poor perceived mental health. The researchers concluded that since a healthy diet with a high intake of fruits and vegetables is rich in antioxidants, it may consequently dampen the detrimental effects of oxidative stress on mental health.

But that study was based on asking people how many fruits and veggies they ate. Maybe people were just telling the researchers what they thought they wanted to hear. What if you actually measure the levels of carotenoid phytonutrients in people's bloodstreams? The same relationship is found. Testing nearly 2,000 people across the United States, researchers found that a higher total blood carotenoid level was indeed associated with a lower likelihood of elevated depressive symptoms, and there appeared to be a dose-response relationship, meaning the higher the levels, the better people felt.

Lycopene, the red pigment predominantly found in tomatoes (but also present in watermelon, pink grapefruit, guava, and papaya), is the most powerful carotenoid antioxidant. In a test tube, it's about 100 times more effective at quenching free radicals than a more familiar antioxidant like vitamin E.

Do people who eat more tomatoes have less depression, then? Apparently so. A study of about a thousand older men and women found that those who ate the most tomato products had only about half the odds of depression. The researchers conclude that a tomato-rich diet may have a beneficial effect on the prevention of depressive symptoms.

Higher consumption of fruits and vegetables has been found to lead to a lower risk of developing depression, but if it's the antioxidants, can't we just take an antioxidant pill? No.

Only food sources of antioxidants were protectively associated with depression--not antioxidants from dietary supplements. Although plant foods and food-derived phytochemicals have been associated with health benefits, antioxidants from dietary supplements appear to be less beneficial and may, in fact, be detrimental to health. This may indicate that the form and delivery of the antioxidants are important. Alternatively, the observed associations may be due not to antioxidants but to other dietary factors, such as folate, that also occur in plant-rich diets.

In a study of thousands of middle-aged office workers, eating lots of processed food was found to be a risk factor for at least mild to moderate depression five years later, whereas a whole food pattern was found to be protective. Yes, it could be because of the high content of antioxidants in fruits and vegetables, but it could also be the folate in greens and beans, as some studies have suggested an increased risk of depression in folks who may not have been getting enough folate.

Low folate levels in the blood are associated with depression, but since most of the early studies were cross-sectional, meaning a snapshot in time, we didn't know if the low folate led to depression or the depression led to low folate. Maybe when you have the blues you don't want to eat the greens.

But since then, a number of cohort studies following people over time have been published. They show that a low dietary intake of folate may indeed be a risk factor for severe depression--as much as a threefold higher risk. Note this is for dietary folate intake, not folic acid supplements; those with higher levels were actually eating healthy foods. If you give people folic acid pills, they don't seem to work. This may be because folate is found in dark green leafy vegetables like spinach, whereas folic acid is the oxidized synthetic compound used in food fortification and dietary supplements because it's more shelf-stable. It may have different effects on the body, as I previously explored in Can Folic Acid Be Harmful?

These kinds of findings point to the importance of antioxidant food sources rather than dietary supplements. But there was an interesting study giving people high-dose vitamin C. In contrast to the placebo group, those given vitamin C experienced a decrease in depression scores and also greater FSI. What is FSI? Frequency of sexual intercourse.

Evidently, high-dose vitamin C improves mood and intercourse frequency, but only in sexual partners who don't live with one another. In the placebo group, those not living together had sex about once a week, and those living together a little more often, about once every five days, with no big change on vitamin C. But those not living together, on vitamin C? Every other day! The differential effect for non-cohabitants suggests that the mechanism is not a peripheral one, meaning outside the brain, but a central one--some psychological change that motivates the person to venture forth to have intercourse. The mild antidepressant effect they found was unrelated to cohabitation or frequency, so it does not appear that the depression scores improved just because of the improved FSI.

For more mental health videos, see:

Anything else we can do to enhance our sexual health and attractiveness? See:

In health,

Michael Greger, M.D.


Image Credit: Sally Plank / Flickr. This image has been modified.


How Milk May Contribute to Childhood Obesity


For more than 30 years, we've known that breastfed infants may be protected against obesity later in life--but why? It may be the formula. Giving infants formula based on cow's milk presents an unusual situation. Cow's milk is designed to put nearly two pounds a day onto a growing calf--40 times the growth rate of human infants (see Formula for Childhood Obesity).

The perfect food for human infants, finely tuned over millions of years, is human breast milk. Remarkably, among all mammalian species, the protein content of human milk is the lowest. The excessive protein content of cow's milk-based formula may be what sets the child up for obesity later in life.

And then, instead of being weaned, we continue to drink milk. The question thus arises as to whether consuming a growth-promoting substance from another species throughout childhood fundamentally alters processes of human growth and maturation. A study out of Indiana University, for example, found evidence that greater milk intake is associated with an increased risk of premature puberty; girls drinking a lot of milk started to get their periods earlier. Thus, cross-species milk consumption continuing into childhood may trigger unintended consequences.

Only human milk allows appropriate metabolic programming and protects against diseases of civilization in later life, whereas consumption of cow's milk and dairy products during adolescence and adulthood is an evolutionarily novel behavior that may have long-term adverse effects on human health.

Teens exposed to dairy in the form of casein, whey, or skim milk experienced a significant increase in BMI and waist circumference compared to a control group. In contrast, not a single study funded by the dairy industry found a result unfavorable to milk.

The head of the Obesity Prevention Center at Boston Children's Hospital and the chair of Harvard's nutrition department recently wrote an editorial in the AMA's Pediatrics journal questioning the role of cow's milk in human nutrition. They stated the obvious: humans have no requirement for other animals' milk. In fact, dairy may play a role in certain cancers due to the high levels of reproductive hormones in the U.S. milk supply.


So what's The Best Baby Formula? Click on the link and find out!

More on dairy and infancy:

And in childhood: Childhood Constipation and Cow's Milk and Treating Infant Colic by Changing Mom's Diet

In adolescence: Saving Lives By Treating Acne With Diet

Before conception: Dairy Estrogen and Male Fertility

During pregnancy: Why Do Vegan Women Have 5x Fewer Twins?

And in adulthood:

In health,

Michael Greger, M.D.


Image Credit: Sergey Novikov © 123RF.com. This image has been modified.

Original Link

How Milk May Contribute to Childhood Obesity

How Milk May Contribute to Childhood Obesity.jpeg

We've known that breastfed infants may be protected against obesity later in life for more than 30 years, but why? It may be the formula. Giving infants formula based on cow's milk presents an unusual situation. Cow's milk is designed to put nearly two pounds a day onto a growing calf, 40 times the growth rate of human infants (see Formula for Childhood Obesity).

The perfect food for humans, finely tuned over millions of years, is human breast milk. Remarkably, among all mammalian species, the protein content of human milk is the lowest. The excessive protein content of cow's milk-based formula is thought to be what sets the child up for obesity later in life.

And then, instead of being weaned, we continue to drink milk. The question thus arises as to whether consumption of a growth-promoting substance from another species throughout childhood fundamentally alters processes of human growth and maturation. A study out of Indiana University, for example, found evidence that greater milk intake is associated with an increased risk of premature puberty; girls drinking a lot of milk started to get their periods earlier. Thus, consuming another species' milk throughout childhood may trigger unintended consequences.

Only human milk allows appropriate metabolic programming and protects against diseases of civilization in later life, whereas consumption of cow's milk and dairy products during adolescence and adulthood is an evolutionarily novel behavior that may have long-term adverse effects on human health.

Teens exposed to dairy proteins such as casein, skim milk, or whey experienced a significant increase in BMI and waist circumference compared to a control group. In contrast, not a single study funded by the dairy industry found a result unfavorable to milk.

The head of the Obesity Prevention Center at Boston Children's Hospital and the chair of Harvard's nutrition department recently wrote an editorial in the AMA's journal Pediatrics questioning the role of cow's milk in human nutrition. They stated the obvious: humans have no requirement for other animals' milk; in fact, dairy may play a role in certain cancers due to the high levels of reproductive hormones in the U.S. milk supply.


So what's The Best Baby Formula? Click on the link and find out!

More on dairy and infancy:

And in childhood: Childhood Constipation and Cow's Milk and Treating Infant Colic by Changing Mom's Diet

In adolescence: Saving Lives By Treating Acne With Diet

Before conception: Dairy Estrogen and Male Fertility

During pregnancy: Why Do Vegan Women Have 5x Fewer Twins?

And in adulthood:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sergey Novikov © 123RF.com. This image has been modified.

Original Link

How Exactly Does Type 2 Diabetes Develop?

How Exactly Does Type 2 Diabetes Develop.jpeg

Insulin resistance is the cause of both prediabetes and type 2 diabetes. OK, so what is the cause of insulin resistance? Insulin resistance is now accepted to be closely associated with the accumulation of fat within our muscle cells. This fat toxicity inside our muscles is a major factor in causing insulin resistance and type 2 diabetes, as it interferes with the action of insulin. I've explored how fat makes our muscles insulin resistant (see What Causes Insulin Resistance?), how that fat can come from the fat we eat or the fat we wear (see The Spillover Effect Links Obesity to Diabetes), and how not all fats are the same (see Lipotoxicity: How Saturated Fat Raises Blood Sugar). It's the type of fat found predominantly in animal fats, relative to plant fats, that appears to be especially deleterious with respect to fat-induced insulin insensitivity. But this insulin resistance in our muscles starts years before diabetes is diagnosed.

In my video, Diabetes as a Disease of Fat Toxicity, you can see that insulin resistance starts over a decade before diabetes is actually diagnosed, as blood sugar levels slowly start creeping up. And then, all of a sudden, the pancreas conks out, and blood sugars skyrocket. What could underlie this relatively rapid failure of insulin secretion?

At first, the pancreas pumps out more and more insulin, trying to overcome the fat-induced insulin resistance in the muscles, and high insulin levels can lead to the accumulation of fat in the liver, called fatty liver disease. Before diagnosis of type 2 diabetes, there is a long silent scream from the liver. As fat builds up in our liver, it also becomes resistant to insulin.

Normally, the liver is constantly producing blood sugar to keep our brain alive between meals. As soon as we eat breakfast, though, the insulin released to deal with the meal normally turns off liver glucose production, which makes sense since we don't need it anymore. But when our liver is filled with fat, it becomes insulin resistant like our muscles, and doesn't respond to the breakfast signal; it keeps pumping out blood sugar all day long on top of whatever we eat. Then the pancreas pumps out even more insulin to deal with the high sugars, and our liver gets fatter and fatter. That's one of the twin vicious cycles of diabetes. Fatty muscles, in the context of too many calories, lead to a fatty liver, which leads to an even fattier liver. This is all still before we have diabetes.

Fatty liver can be deadly. The liver starts trying to offload the fat by dumping it back into the bloodstream in the form of something called VLDL, and that starts building up in the cells in the pancreas that produce the insulin in the first place. Now we know how diabetes develops: fatty muscles lead to a fatty liver, which leads to a fatty pancreas. It is now clear that type 2 diabetes is a condition of excess fat inside our organs, whether we're obese or not.

The only thing keeping us from diabetes (unchecked, skyrocketing blood sugars) was that the pancreas was working overtime, pumping out extra insulin to overcome insulin resistance. But as the insulin-producing beta cells in the pancreatic islets are killed off by the fatty buildup, insulin production starts to fail, and we're left with the worst of both worlds: insulin resistance combined with a failing pancreas. Unable then to overcome the resistance, blood sugar levels go up and up, and boom: type 2 diabetes.

This has implications for cancer as well. Obesity leads to insulin resistance and our blood sugars start to go up, so our pancreas starts pumping out more insulin to try to force more sugar into our muscles, and eventually the fat spills over into the pancreas, killing off the insulin-producing cells. Then we develop diabetes, in which case we may have to start injecting insulin at high levels to overcome the insulin resistance, and these high insulin levels promote cancer. That's one of the reasons we think obese women get more breast cancer. It all traces back to fat getting into our muscle cells, causing insulin resistance: fat from our stomach (obesity) or fat going into our stomach (saturated fats in our diet).

Now it should make sense why the American Diabetes Association recommends reduced intake of dietary fat as a strategy for reducing the risk for developing diabetes.


The reason I'm going into all this detail is that I'm hoping to empower both those suffering from the disease and those treating sufferers so as to better understand dietary interventions to prevent and treat the epidemic.

Here are some videos on prevention:

And here are some on treatment:

In health,
Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Nephron. This image has been modified.

Original Link

Why Is Milk Consumption Associated with More Bone Fractures?

Why Is Milk Consumption Associated with More Bone Fractures?.jpg

Milk is touted to build strong bones, but a compilation of all the best studies found no association between milk consumption and hip fracture risk. Drinking milk as an adult might not help our bones, then, but what about in adolescence? Harvard researchers decided to put it to the test.

Studies have shown that greater milk consumption during childhood and adolescence contributes to peak bone mass, and is therefore expected to help avoid osteoporosis and bone fractures in later life. But that's not what researchers have found (as you can see in my video Is Milk Good for Our Bones?). Milk consumption during teenage years was not associated with a lower risk of hip fracture, and if anything, milk consumption was associated with a borderline increase in fracture risk in men.

It appears that the extra boost in total body bone mineral density from getting extra calcium is lost within a few years, even if you keep the calcium supplementation up. This suggests a partial explanation for the long-standing enigma that hip fracture rates are highest in populations with the greatest milk consumption. That may explain why fracture rates aren't lower, but why would they be higher?

This enigma irked a Swedish research team, puzzled because studies again and again had shown a tendency toward a higher risk of fracture with a higher intake of milk. Well, there is a rare birth defect called galactosemia, where babies are born without the enzymes needed to detoxify the galactose found in milk, so they end up with elevated levels of galactose in their blood, which can cause bone loss even as kids. So maybe, the Swedish researchers figured, even in normal people who can detoxify the stuff, it might not be good for the bones to be drinking it every day.

And galactose doesn't just hurt the bones. Galactose is what scientists use to cause premature aging in lab animals--it can shorten their lifespan and cause oxidative stress, inflammation, and brain degeneration--with just the equivalent of the galactose in one to two glasses of milk a day. We're not rats, though. But given the high amount of galactose in milk, recommendations to increase milk intake for the prevention of fractures could present a conceivable contradiction. So, the researchers decided to put their theory to the test, looking at both fracture risk and mortality in relation to milk intake.

A hundred thousand men and women were followed for up to 20 years. Researchers found that milk-drinking women had higher rates of death, more heart disease, and significantly more cancer with each additional glass of milk. Three glasses a day was associated with nearly twice the risk of premature death, and these women had significantly more bone and hip fractures. More milk, more fractures.

Men in a separate study also had a higher rate of death with higher milk consumption, but at least they didn't have higher fracture rates. So, the researchers found a dose-dependent higher rate of both mortality and fracture in women, and a higher rate of mortality in men, with milk intake, but the opposite for fermented dairy products like soured milk and yogurt. That would go along with the galactose theory, since bacteria can ferment away some of the lactose. To prove it, though, we would need a randomized controlled trial examining the effect of milk intake on mortality and fractures. As the accompanying editorial pointed out, we had better find out soon, since milk consumption is on the rise around the world.

What can we do for our bones, then? Weight-bearing exercise such as jumping, weight-lifting, and walking with a weighted vest or backpack may help, along with getting enough calcium (Alkaline Diets, Animal Protein, & Calcium Loss) and vitamin D (Resolving the Vitamin D-Bate). Eating beans (Phytates for the Prevention of Osteoporosis) and avoiding phosphate additives (Phosphate Additives in Meat Purge and Cola) may also help.

Maybe the galactose angle can help explain the findings on prostate cancer (Prostate Cancer and Organic Milk vs. Almond Milk) and Parkinson's disease (Preventing Parkinson's Disease With Diet).

Galactose is a milk sugar. There's also concern about milk proteins (see my casomorphin series) and fats (The Saturated Fat Studies: Buttering Up the Public and Trans Fat in Meat and Dairy) as well as the hormones (Dairy Estrogen and Male Fertility, Estrogen in Meat, Dairy, and Eggs and Why Do Vegan Women Have 5x Fewer Twins?).

Milk might also play a role in diabetes (Does Casein in Milk Trigger Type 1 Diabetes? and Does Bovine Insulin in Milk Trigger Type 1 Diabetes?) and breast cancer (Is Bovine Leukemia in Milk Infectious?, The Role of Bovine Leukemia Virus in Breast Cancer, and Industry Response to Bovine Leukemia Virus in Breast Cancer).

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank / Flickr. This image has been modified.

Original Link