Can Dehydration Affect Our Mood?


Water is by far the number-one nutrient in our diet. Studies have suggested that proper hydration may lower our risk of heart disease and cancer, and may even make us better kissers. Brushing artificial skin against the lips of young women, researchers found that hydrated lips showed greater sensitivity to light touch.

Although it is well known that water is essential for human survival, it's only recently that we have begun to understand its role in the maintenance of brain function. It makes sense. Our brain is 75% water. When we get dehydrated, our brain actually shrinks. Even mild dehydration, which can be caused by simply exercising on a hot day, has been shown to change brain function.

I've talked about the role of hydration for cognitive function in Does a Drink of Water Make Children Smarter?, but current findings suggest that our mood states may also be positively influenced by water consumption.

The effects of dehydration in real life have not been well documented. It wasn't until 2013 that the first study to investigate the effects of mild dehydration on a variety of feelings was published. What did the researchers find? The most important effects of fluid deprivation were increased sleepiness and fatigue, lower levels of vigor and alertness, and increased confusion. But as soon as they gave the subjects some water, the deleterious effects on alertness, happiness, and confusion were immediately reversed.

Water absorption actually happens very rapidly, within 5 minutes from mouth to bloodstream, peaking around minute 20. Interestingly, the temperature of the water appears to affect this speed. Which do you think is absorbed more rapidly--cold water or warm, body temperature water? It turns out cold water gets sucked in about 20% faster!

How can we tell if we're dehydrated or not? Well, why don't we ask our bodies? If we chug down some water and then turn around and just pee it all out, presumably that would be our body's way of saying, "I'm good, all topped off." But if we drink a bunch of water and our body keeps most of it, then presumably our tank was low. Researchers from the University of Connecticut formalized the technique. You empty your bladder, chug down 11 milliliters per kilogram of body weight (about three cups of water for an average-sized person), and then an hour later see how much you pee. Basically, if you drink 3 cups and pee out less than 1, there's a good chance you were dehydrated. You can see the findings of this chug-and-pee test around minute 3 in my Can Dehydration Affect Our Mood? video.
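
Just to make the arithmetic concrete, here's a minimal sketch of that chug-and-pee test in Python. The 11 mL/kg dose and the "3 cups in, less than 1 cup out" read-out come straight from the description above; the function names, the 240 mL cup conversion, and the one-third framing are illustrative assumptions, not part of the original protocol.

```python
ML_PER_CUP = 240  # roughly 240 mL per cup (illustrative conversion)

def water_challenge_dose_ml(body_weight_kg):
    """Water to drink after emptying the bladder: 11 mL per kg of body weight."""
    return 11 * body_weight_kg

def likely_dehydrated(dose_ml, urine_ml_after_1h):
    """If you keep most of the water (pee out less than ~1/3 of the dose,
    i.e. less than about 1 cup out of 3), your tank was probably low."""
    return urine_ml_after_1h < dose_ml / 3

dose = water_challenge_dose_ml(70)          # ~770 mL for a 70 kg person
print(round(dose / ML_PER_CUP, 1), "cups")  # ~3.2 cups
print(likely_dehydrated(dose, 200))         # True: most of the water was kept
```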


For more on water, see my How Many Glasses of Water Should We Drink a Day?, Does a Drink Of Water Make Children Smarter?, and Treating Dry Eyes with Diet: Just Add Water?

Other healthy beverages include hibiscus tea (Hibiscus Tea vs. Plant-Based Diets for Hypertension) and green tea (Dietary Brain Wave Alteration and Benefits of Green Tea for Boosting Antiviral Immune Function).

What else can affect our mood?

What about the omega-3s in fish? That's the subject of another video: Fish Consumption and Suicide.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Original Link

The 3 Vitamins that Prevent Brain Loss


By our seventies, one in five of us will suffer from cognitive impairment. Within five years, half of those cognitively impaired will progress to dementia and death. The earlier we can slow or stop this process, the better.

Although an effective treatment for Alzheimer's disease is unavailable, interventions just to control risk factors could prevent millions of cases. An immense effort has been spent on identifying such risk factors for Alzheimer's and developing treatments to reduce them.

In 1990, a small study of 22 Alzheimer's patients reported high concentrations of homocysteine in their blood. The homocysteine story goes back to 1969 when a Harvard pathologist reported two cases of children, one dating back to 1933, whose brains had turned to mush. They both suffered from extremely rare genetic mutations that led to abnormally high levels of homocysteine in their bodies. Is it possible, he asked, that homocysteine could cause brain damage even in people without genetic defects?

Here we are in the 21st century, and homocysteine is considered "a strong, independent risk factor for the development of dementia and Alzheimer's disease." Having a blood level over 14 µmol/L may double our risk. In the Framingham Study, researchers estimate that as many as one in six Alzheimer's cases may be attributable to elevated homocysteine in the blood, which is now thought to play a role in brain damage and cognitive and memory decline. Our body can detoxify homocysteine, though, using three vitamins: folate, vitamin B12, and vitamin B6. So why don't we put them to the test? No matter how many studies find an association between high homocysteine and cognitive decline, dementia, or Alzheimer's disease, a cause-and-effect role can only be confirmed by interventional studies.

Initially, the results were disappointing. Vitamin supplementation did not seem to work, but the studies were tracking neuropsychological assessments, which are more subjective compared to structural neuroimaging--that is, actually seeing what's happening to the brain. A double-blind randomized controlled trial found that homocysteine-lowering by B vitamins can slow the rate of accelerated brain atrophy in people with mild cognitive impairment. As we age, our brains slowly atrophy, but the shrinking is much accelerated in patients suffering from Alzheimer's disease. An intermediate rate of shrinkage is found in people with mild cognitive impairment. The thinking is if we could slow the rate of brain loss, we may be able to slow the conversion to Alzheimer's disease. Researchers tried giving people B vitamins for two years and found it markedly slowed the rate of brain shrinkage. The rate of atrophy in those with high homocysteine levels was cut in half. A simple, safe treatment can slow the accelerated rate of brain loss.

A follow-up study went further by demonstrating that B-vitamin treatment reduces, by as much as seven-fold, the brain atrophy in the regions specifically vulnerable to the Alzheimer's disease process. You can see the amount of brain atrophy over a two-year period in the placebo group versus the B-vitamin group in my Preventing Brain Loss with B Vitamins? video.

The beneficial effect of B vitamins was confined to those with high homocysteine, indicating a relative deficiency in one of those three vitamins. Wouldn't it be better to not become deficient in the first place? Most people get enough B12 and B6. The reason these folks were stuck at a homocysteine of 11 µmoles per liter is that they probably weren't getting enough folate, which is found concentrated in beans and greens. Ninety-six percent of Americans don't even eat the minimum recommended amount of dark green leafy vegetables, about the same pitiful percentage who don't meet the minimum recommendation for beans.

If we put people on a healthy diet--a plant-based diet--we can drop their homocysteine levels by 20% in just one week, from around 11 µmoles per liter down to 9 µmoles per liter. The fact that they showed rapid and significant homocysteine lowering without any pills or supplements implies that multiple mechanisms may have been at work. The researchers suggest it may be because of the fiber. Every gram of daily fiber consumption may increase folate levels in the blood nearly 2%, perhaps by boosting vitamin production in the colon by all our friendly gut bacteria. It also could be from the decreased methionine intake.
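
As a quick sanity check on those figures, here's a tiny illustrative helper (nothing study-specific, just the arithmetic):

```python
def percent_drop(baseline, new):
    """Percent reduction relative to a baseline value."""
    return 100 * (baseline - new) / baseline

# Going from ~11 down to ~9 µmol/L of homocysteine:
print(round(percent_drop(11, 9)))  # 18, i.e. roughly the ~20% drop reported
```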

Methionine is where homocysteine comes from. Homocysteine is a breakdown product of methionine, which comes mostly from animal protein. If we give someone bacon and eggs for breakfast and a steak for dinner, we can get spikes of homocysteine levels in the blood. Thus, decreased methionine intake on a plant-based diet may be another factor contributing to lower, safer homocysteine levels.

The irony is that those who eat plant-based diets long-term, not just at a health spa for a week, have terrible homocysteine levels. Meat-eaters are up at 11 µmoles per liter, but vegetarians at nearly 14 µmoles per liter and vegans at 16 µmoles per liter. Why? The vegetarians and vegans were getting more fiber and folate, but not enough vitamin B12. Most vegans were at risk for suffering from hyperhomocysteinaemia (too much homocysteine in the blood) because most vegans in the study were not supplementing with vitamin B12 or eating vitamin B12-fortified foods, which is critical for anyone eating a plant-based diet. If you take vegans and give them B12, their homocysteine levels can drop down below 5. Why not down to just 11? The reason meat-eaters were stuck up at 11 is presumably because they weren't getting enough folate. Once vegans got enough B12, they could finally fully exploit the benefits of their plant-based diets and come out with the lowest levels of all.
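
To tie those group averages back to the 14 µmol/L threshold mentioned earlier (above which risk may roughly double), here's a small illustrative sketch; the numbers are the ones quoted above, but the dictionary and the comparison are just my own framing:

```python
DOUBLED_RISK_THRESHOLD = 14  # µmol/L; level above which dementia risk may roughly double

# Approximate group averages quoted above, in µmol/L of homocysteine
groups = {
    "meat-eaters (enough B12, low folate)": 11,
    "vegetarians (low B12)": 14,           # "nearly 14"
    "vegans without B12 supplementation": 16,
    "vegans with B12 supplementation": 5,  # "below 5"
}

for group, level in groups.items():
    status = "over" if level > DOUBLED_RISK_THRESHOLD else "at or below"
    print(f"{group}: ~{level} µmol/L ({status} the {DOUBLED_RISK_THRESHOLD} µmol/L mark)")
```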

This is very similar to the findings in my video Vitamin B12 Necessary for Arterial Health.

For more details on ensuring a regular reliable source of vitamin B12:

There are more benefits to lowering your methionine intake. Check out Methionine Restriction as a Life Extension Strategy and Starving Cancer with Methionine Restriction.

For more on brain health in general, see these videos:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Thomas Hawk / Flickr. This image has been modified.

Original Link

Comparing Pollutant Levels Between Different Diets


The results of the CHAMACOS (Center for the Health Assessment of Mothers and Children of Salinas) study were published recently. This study of a California birth cohort investigated the relationship between exposure to flame retardant chemical pollutants in pregnancy and childhood, and subsequent neurobehavioral development. Why California? Because California children's exposures to these endocrine disruptors and neurotoxins are among the highest in the world.

What did they find? The researchers concluded that both prenatal and childhood exposures to these chemicals "were associated with poorer attention, fine motor coordination, and cognition" (particularly verbal comprehension) by the time the children reached school age. "This study, the largest to date, contributes to growing evidence suggesting that PBDEs [polybrominated diphenyl ethers, flame retardant chemicals] have adverse impacts on child neurobehavioral development." The effects may extend into adolescence, again affecting motor function as well as thyroid gland function. The effect on our thyroid glands may even extend into adulthood.

These chemicals get into moms, then into the amniotic fluid, and then into the breast milk. The more that's in the milk, the worse the infants' mental development may be. Breast milk is still best, but how did these women get exposed in the first place?

The question has been: Are we exposed mostly from diet or dust? Researchers in Boston collected breast milk samples from 46 first-time moms, vacuumed up samples of dust from their homes, and questioned them about their diets. The researchers found that both were likely to blame. Diet-wise, a number of animal products were implicated. This is consistent with what's been found worldwide. For example, in Europe, these flame retardant chemical pollutants are found mostly in meat, including fish, and other animal products. It's similar to what we see with dioxins--they are mostly found in fish and other fatty foods, with a plant-based diet offering the lowest exposure.

If that's the case, do vegetarians have lower levels of flame retardant chemical pollutants circulating in their bloodstreams? Yes. Vegetarians may have about 25% lower levels. Poultry appears to be the largest contributor of PBDEs. USDA researchers compared the levels in different meats, and the highest levels of these pollutants were found in chicken and turkey, with less in pork and even less in beef. California poultry had the highest levels, consistent with the state's strict furniture flammability codes. But it's not like chickens are pecking at the sofa. Chickens and turkeys may be exposed indirectly through the application of sewer sludge to fields where feed crops are raised, contamination of water supplies, the use of flame-retarded materials in poultry housing, or the inadvertent incorporation of fire-retardant material into the birds' bedding or feed ingredients.

Fish have been shown to have the highest levels overall, but Americans don't eat a lot of fish, so it doesn't contribute as much to the total body burden in the United States. Researchers have compared the level of PBDEs found in meat-eaters and vegetarians. The amount found in the bloodstream of vegetarians is noticeably lower, as you can see in my video Flame Retardant Pollutants and Child Development. Just to give you a sense of the contribution of chicken, higher-than-average poultry eaters have higher levels than omnivores as a whole, and lower-than-average poultry eaters have levels lower than omnivores.

What are the PBDE levels in vegans? We know the intake of many other classes of pollutants is almost exclusively from the ingestion of animal fats in the diet. What if we take them all out of the diet? It works for dioxins. Vegan dioxin levels appear markedly lower than the general population. What about for the flame retardant chemicals? Vegans have levels lower than vegetarians, with those who've been vegan around 20 years having even lower concentrations. This tendency for chemical levels to decline the longer one eats plant-based suggests that food of animal origin contributes substantially. But note that levels never get down to zero, so diet is not the only source.

The USDA researchers note that there are currently no regulatory limits on the amount of flame retardant chemical contamination in U.S. foods, "but reducing the levels of unnecessary, persistent, toxic compounds in our diet is certainly desirable."

I've previously talked about this class of chemicals in Food Sources of Flame Retardant Chemicals. The same foods seem to accumulate a variety of pollutants:

Many of these chemicals have hormone- or endocrine-disrupting effects. See, for example:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Mitchell Haindfield / Flickr. This image has been modified.

Original Link

Using a Smell Test to Diagnose Alzheimer’s Disease


Alzheimer's disease (AD) pathology appears to start in the part of the brain that handles smell before subsequently spreading to additional brain regions and then, ultimately, taking over much of the rest of the brain. This led some to speculate that Alzheimer's disease may begin in the nose. Perhaps there's some environmental agent that might enter the brain through some portal in the nostrils?

This is the so-called olfactory vector hypothesis. The anatomy of the nose is well suited for the transfer of things directly into the brain, since the olfactory nerves that stick out into the nose project directly into the brain, bypassing the blood-brain barrier. The nose was actually a major infection route for the polio virus. Public health officials started cauterizing the nasal passages of schoolchildren by spraying caustic chemicals up their noses in an effort to prevent the disease.

The concern is that if people breathe in some ionized metals like aluminum dust, for example, it could be transported into the brain through these olfactory nerves at a rate of about 2 millimeters an hour, which is practically 2 inches a day. Doubt has been cast on this theory, however, by a case report of a woman born with a birth defect in which she had no smell nerves yet still developed Alzheimer's-like pathology. And so, to date, all the supporting evidence is really just circumstantial. It is clear, though, that changes in the sense of smell are among the first clinical signs of Alzheimer's, occurring during the preclinical phase--that is, before there's any noticeable cognitive decline. Could we use these changes to predict or diagnose the disease?

For years, researchers have been trying to find markers of brain illness hidden in people's ability to smell using all sorts of fancy gadgets. For example, functional MRI scans can detect differences in brain activation in response to an odor. In my video, Peanut Butter Smell Test for Alzheimer's, you can see the responses to lavender. You'll see a representation of a normal brain's responses to the odor versus an Alzheimer's brain. This unequivocally demonstrates that we can pick up changes in smell function due to Alzheimer's. But do we really need a million-dollar machine?

An ingenious group of researchers at the University of Florida discovered all we may need is some peanut butter and a ruler.

Considering that the left side of the brain primarily processes what we smell through our left nostril and the right side of our brain covers the right nostril, and understanding that Alzheimer's strikes the left side more than the right, what if you performed the following experiment: Close your eyes and mouth, breathe normally through the nose, then close one nostril, and hold a foot-long ruler out from the open nostril. Once your eyes, mouth, and one nostril are closed, open a container of peanut butter at the bottom of the ruler (one foot away from your open nostril). Move the peanut butter closer by 1 centimeter upon each exhale until you can detect the odor. Then repeat the whole procedure again using your other nostril.

This is exactly what the University of Florida researchers did with their subjects. What did they find? The normal elderly control subjects in the study smelled the peanut butter as soon as it came within an average of 18 centimeters (about 7 inches) from either nostril. It was about the same, roughly 7 inches, in the right nostrils of Alzheimer's patients. But in their left nostrils, it was a mere 2 inches! The peanut butter had to be only 2 inches away before the Alzheimer's patients could detect it through their left nostrils. This happened every single time. Indeed, the researchers found that a "left nostril impairment of odor detection was present in all the patients with probable AD." There was no left-right difference in the control group; they could smell the peanut butter when it was the same distance away from both their left and right nostrils. In the Alzheimer's group, however, there was a 12-centimeter difference.
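
Here's a minimal sketch of that left-right comparison; the distances are the averages quoted above, but the function name and the example cutoff are purely illustrative (the study reported a group difference, not a validated clinical threshold):

```python
CM_PER_INCH = 2.54

def nostril_asymmetry_cm(left_detect_cm, right_detect_cm):
    """Difference between the distances at which an odor is first detected
    by the right versus the left nostril (larger = worse left-sided smell)."""
    return right_detect_cm - left_detect_cm

# Averages quoted above: controls detected at ~18 cm on both sides;
# Alzheimer's patients at ~7 inches on the right but only ~2 inches on the left.
control = nostril_asymmetry_cm(18, 18)                               # 0 cm
alzheimers = nostril_asymmetry_cm(2 * CM_PER_INCH, 7 * CM_PER_INCH)  # ~12.7 cm

HYPOTHETICAL_CUTOFF_CM = 10  # illustrative only, not a validated threshold
print(control >= HYPOTHETICAL_CUTOFF_CM)                            # False
print(round(alzheimers, 1), alzheimers >= HYPOTHETICAL_CUTOFF_CM)   # 12.7 True
```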

The disparity was so great that we may be able to set a cutoff value for the diagnosis of Alzheimer's. The researchers reported that "[c]ompared to patients with other causes of dementia this nostril asymmetry of odor detection...was 100% sensitive and 100% specific for probable AD," meaning no false positives and no false negatives. Compared to healthy people, it was 100% sensitive in picking up cases of probable Alzheimer's and 92% specific. What exactly does that mean? In this study, if you had Alzheimer's, there was a 100% chance of having that wide left-right discrepancy. But about 8% of the healthy controls showed the discrepancy, too, without having the disease. This means there were some false positives.
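
To make the sensitivity/specificity distinction concrete, here's a small sketch with made-up counts; the 100% and 92% figures are the ones reported above, but the cohort sizes are purely hypothetical:

```python
def sensitivity(true_pos, false_neg):
    """Share of people with the disease who test positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Share of people without the disease who test negative."""
    return true_neg / (true_neg + false_pos)

# Hypothetical cohort: 25 probable-AD patients and 100 healthy controls.
# 100% sensitivity -> all 25 patients show the asymmetry (no false negatives);
# 92% specificity  -> 92 controls do not show it, but 8 falsely do.
print(sensitivity(true_pos=25, false_neg=0))  # 1.0
print(specificity(true_neg=92, false_pos=8))  # 0.92
```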

The reason it's only "probable" Alzheimer's is that the only way we can really confirm someone has the disease is on autopsy. The current criteria for diagnosing Alzheimer's require an extensive evaluation, combined with fancy positron emission tomography (PET) scans and spinal taps. All of these tests are expensive and hard to get, can be invasive, and can have potential complications. On top of that, they are neither highly sensitive nor specific. The left-right nostril/peanut butter odor detection test, however, was fast, simple, non-invasive, and inexpensive. The researchers concluded that this may make peanut butter an ideal instrument for the early detection of Alzheimer's disease.

Does all this sound a bit too good to be true? It may be. A University of Pennsylvania research team was unable to replicate the results. Click here to read their paper. So at this point, the data are mixed. I'll do another post once more studies are published and we have a better handle on whether it's useful or not.

Of course, it's better to prevent Alzheimer's in the first place. Check out these videos for more information.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

Original Link

What’s the Mediterranean Diet’s Secret?


The Mediterranean Diet is an "in" topic nowadays in both the medical literature and the lay media. As one researcher put it, "Uncritical laudatory coverage is common, but specifics are hard to come by: What is it? Where did it come from? Why is it good? Merits are rarely detailed; possible downsides are never mentioned." So, let's dig in....

After World War II, the government of Greece asked the Rockefeller Foundation to come in and assess the situation. Impressed by the low rates of heart disease in the region, nutrition scientist Ancel Keys--after whom "K" rations were named--initiated his famous Seven Countries Study. In this study, he found the rate of fatal heart disease on the Greek isle of Crete was 20 times lower than in the United States. They also had the lowest cancer rates and fewest deaths overall. What were they eating? Their diets were more than 90% plant-based, which may explain why coronary heart disease was such a rarity. A rarity, that is, except for a small class of rich people whose diet differed from that of the general population--they ate meat every day instead of every week or two.

So, the heart of the Mediterranean diet is mainly plant-based, and low in meat and dairy, which Keys considered the "major villains in the diet" because of their saturated fat content. Unfortunately, no one is really eating the traditional Mediterranean diet anymore, even in the Mediterranean. The prevalence of coronary heart disease skyrocketed by an order of magnitude within a few decades in Crete, blamed on the increased consumption of meat and cheese at the expense of plant foods.

Everyone is talking about the Mediterranean diet, but few do it properly. People think of pizza or spaghetti with meat sauce, but while Italian restaurants may brag about the healthy Mediterranean diet, what they serve is a travesty of it. If no one's really eating this way anymore, how do you study it?

Researchers came up with a variety of Mediterranean diet adherence scoring systems to see if people who are eating more Mediterranean-ish do better. You get maximum points the more plant foods you eat, and effectively you get points deducted for eating just a single serving of meat or dairy a day. So it's no surprise that those who score higher on the scale have a lower risk of heart disease, cancer, and death overall. After all, the Mediterranean diet can be considered to be a "near vegetarian" diet. "As such, it should be expected to produce the well-established health benefits of vegetarian diets." That is, less heart disease, cancer, death, and inflammation; improved arterial function; a lower risk of developing type 2 diabetes; and a reduced risk for stroke, depression, and cognitive impairment.
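
Just to illustrate how this kind of adherence scoring works, here's a deliberately simplified toy score; the food groups, cutoffs, and point values are my own assumptions, not any of the published Mediterranean diet indices:

```python
def toy_med_score(daily_servings):
    """Toy adherence score: one point per plant-food group eaten daily,
    one point deducted for a daily serving of meat or of dairy."""
    plant_groups = ["vegetables", "fruit", "legumes", "whole_grains", "nuts"]
    score = sum(1 for g in plant_groups if daily_servings.get(g, 0) >= 1)
    if daily_servings.get("meat", 0) >= 1:
        score -= 1
    if daily_servings.get("dairy", 0) >= 1:
        score -= 1
    return score

print(toy_med_score({"vegetables": 3, "fruit": 2, "legumes": 1,
                     "whole_grains": 2, "nuts": 1}))            # 5
print(toy_med_score({"vegetables": 1, "meat": 2, "dairy": 1}))  # -1
```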

How might it work? I've talked about the elegant studies showing that those who eat plant-based diets have more plant-based compounds, like aspirin, circulating within their systems. Polyphenol phytonutrients in plant foods are associated with a significantly lower risk of dying. Magnesium consumption is also associated with a significantly lower risk of dying, and is found in dark green leafy vegetables, as well as fruits, beans, nuts, soy, and whole grains.

Heme iron, on the other hand--the iron found in blood and muscle--acts as a pro-oxidant and appears to increase the risk of diabetes, whereas plant-based, non-heme iron appears safe. Similarly, with heart disease, animal-based iron was found to significantly increase the risk of coronary heart disease, our number one killer, but not plant-based iron. The Mediterranean diet is protective compared to the Standard American Diet--no question--but any diet rich in whole plant foods and low in animal-fat consumption could be expected to confer protection against many of our leading killers.

Here are some more videos on the Mediterranean Diet:

For more information on heme iron, see Risk Associated With Iron Supplements.

More on magnesium is found in How Do Nuts Prevent Sudden Cardiac Death? and Mineral of the Year--Magnesium.

And more on polyphenols can be seen in videos like How to Slow Brain Aging by Two Years and Juicing Removes More Than Just Fiber.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Couleur / Pixabay. This image has been modified.

Original Link