Treating Kidney Stones with Diet

Studies suggest that excessive consumption of animal protein poses a risk of kidney stone formation, likely due to the acid load contributed by the high content of sulfur-containing amino acids in animal protein, a topic I explore in my video, Preventing Kidney Stones with Diet. What about treating kidney stones, though? I discuss that in How to Treat Kidney Stones with Diet. Most stones are calcium oxalate, formed like rock candy when the urine becomes supersaturated. Doctors long assumed that, since the stones are made of calcium, patients simply had to reduce their calcium intake. That was the dietary gospel for kidney stone sufferers until a 2002 study published in the New England Journal of Medicine pitted two diets against one another--a low-calcium diet versus a diet low in animal protein and salt. The restriction of animal protein and salt provided greater protection, cutting in half the risk of having another kidney stone within five years.

What about cutting down on oxalates, which are concentrated in certain vegetables? A recent study found no increased risk of stone formation with higher vegetable intake. In fact, greater intakes of whole plant foods, fruits, and vegetables were each associated with reduced risk, independent of other known risk factors for kidney stones. This means we may get additional benefit from bulking up on plant foods, beyond just restricting animal foods.

A reduction in animal protein not only reduces the production of acids within the body, but should also limit the excretion of urate, which can crystallize into uric acid crystals that act as seeds for calcium stones or form entire stones themselves. (Uric acid stones are the second most common kidney stones after calcium stones.)

There are two ways to reduce uric acid levels in the urine: a reduction of animal protein ingestion, or a variety of drugs. Removing all meat--that is, switching from the standard Western diet to a vegetarian diet--can remove 93% of uric acid crystallization risk within days.

To minimize uric acid crystallization, the goal is to get our urine pH up to, ideally, 6.8. A number of alkalinizing chemicals have been developed for just this purpose, but we can naturally alkalize our urine to the recommended 6.8 through diet alone: by removing all meat, someone eating the standard Western diet can go from a urine pH of 5.95 to the target of 6.8 simply by eating plant-based. As I describe in my video, Testing Your Diet with Pee & Purple Cabbage, we can inexpensively test our own diets with a little bathroom chemistry, for not all plant foods are alkalinizing and not all animal foods are equally acidifying.
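
For readers who want to play along at home, here is a minimal sketch of that bathroom chemistry in code. The color-to-pH bands are rough, illustrative assumptions about how a red-cabbage indicator behaves (reddish in acid, purple near neutral, blue to green as things turn alkaline); they are not calibrated values from the video, so treat the output as a ballpark reading only.

```python
# Rough sketch: translate the color of a urine + purple cabbage juice test
# into a ballpark pH and compare it against the ~6.8 target. The color-to-pH
# bands below are illustrative assumptions, not calibrated values.

CABBAGE_COLOR_TO_PH = {
    "red":    4.5,   # strongly acidic (assumed band)
    "pink":   5.5,   # typical on a standard Western diet
    "purple": 6.5,   # approaching neutral
    "blue":   7.5,   # mildly alkaline
    "green":  8.5,   # alkaline (unusual for urine)
}

TARGET_PH = 6.8  # goal for minimizing uric acid crystallization

def interpret(color):
    """Turn an observed indicator color into a rough reading against the target."""
    ph = CABBAGE_COLOR_TO_PH.get(color.lower())
    if ph is None:
        return f"Unrecognized color: {color}"
    status = "at or above" if ph >= TARGET_PH else "below"
    return f"~pH {ph}: {status} the {TARGET_PH} target"

print(interpret("pink"))   # ~pH 5.5: below the 6.8 target
print(interpret("blue"))   # ~pH 7.5: at or above the 6.8 target
```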

A Load of Acid to Kidney Evaluation (LAKE) score has been developed to take into account both the acid load of foods and their typical serving sizes. It can be used to help people modify their diet for the prevention of both uric acid and calcium kidney stones, as well as other diseases. What did researchers find? The single most acid-producing food is fish, like tuna. Then, in descending order, come pork, poultry, cheese (though milk and other dairy are much less acidifying), and beef, followed by eggs. (Gram for gram, eggs are actually more acid-forming than beef, but people tend to eat fewer eggs in one sitting.) Some grains, like bread and rice, can be a little acid-forming, but pasta is not. Beans are significantly alkaline-forming, but not as much as fruits or, even better, vegetables, which are the most alkaline-forming of all.

Through dietary changes alone, we may be able to dissolve uric acid stones completely and cure patients without drugs or surgery.

To summarize, the most important things we can do diet-wise are to drink 10 to 12 cups of water a day, reduce animal protein, reduce salt, and eat more vegetables and more vegetarian.

Want to try calculating your LAKE score for the day? Just multiply the number of servings you have of each of the food groups listed in the graph in the video by its score, and add them up.
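
For those who like to tinker, here is a minimal sketch of that tally in code. The servings and per-serving scores below are made-up placeholders just to show the arithmetic; the actual per-serving LAKE values are the ones in the graph in the video (with alkaline-forming foods scoring negative and acid-forming foods scoring positive).

```python
# Minimal sketch of a daily LAKE tally: multiply each food group's servings
# by its per-serving score, then add everything up. The numbers here are
# hypothetical placeholders -- substitute the per-serving scores from the
# graph in the video.

daily_intake = {
    # food group: (servings eaten today, assumed LAKE score per serving)
    "fish":       (1, 10.0),   # acid-forming (positive score)
    "cheese":     (1,  5.0),
    "bread":      (2,  1.0),
    "beans":      (1, -2.0),   # alkaline-forming (negative score)
    "fruit":      (2, -3.0),
    "vegetables": (3, -4.0),
}

def lake_score(intake):
    """Sum of servings times per-serving score across all food groups."""
    return sum(servings * score for servings, score in intake.values())

total = lake_score(daily_intake)
print(f"Daily LAKE score: {total:+.1f}")
print("Net acid-forming day" if total > 0 else "Net alkaline-forming day")
```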

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

Plant versus Animal Iron

It is commonly thought that those who eat plant-based diets may be more prone to iron deficiency, but it turns out that they're no more likely to suffer from iron deficiency anemia than anybody else. This may be because not only do those eating meat-free diets tend to get more fiber, magnesium, and vitamins like A, C, and E, but they also get more iron.

The iron found predominantly in plants is non-heme iron, which isn't absorbed as well as the heme iron found in blood and muscle, but this may be a good thing. As seen in my video, The Safety of Heme vs. Non-Heme Iron, avoidance of heme iron may be one of the key elements of plant-based protection against metabolic syndrome, and may also be beneficial in lowering the risk from other chronic diseases such as heart disease.

The data linking iron intake, in general, to coronary heart disease have been mixed. This inconsistency may be explained by where the iron comes from. The majority of total dietary iron is non-heme iron, coming mostly from plants. So, total iron intake is associated with lower heart disease risk, but iron intake from meat is associated with significantly higher risk for heart disease. This is thought to be because iron can act as a pro-oxidant, contributing to the development of atherosclerosis by oxidizing cholesterol with free radicals. The risk has been quantified as a 27% increase in coronary heart disease risk for every 1 milligram of heme iron consumed daily.

The same has been found for stroke risk. The studies on iron intake and stroke have had conflicting results, but that may be because they had never separated out heme iron from non-heme iron... until now. Researchers found that the intake of meat (heme) iron, but not plant (non-heme) iron, was associated with an increased risk of stroke.

The researchers also found that higher intake of heme iron--but not total or plant (non-heme) iron--was significantly associated with greater risk for type 2 diabetes. There may be a 16% increase in risk for type 2 diabetes for every 1 milligram of heme iron consumed daily.

The same has also been found for cancer, with up to 12% increased risk for every milligram of daily heme iron exposure. In fact, we can actually tell how much meat someone is eating by looking at their tumors. To characterize the mechanisms underlying meat-related lung cancer development, researchers asked lung cancer patients how much meat they ate and examined the gene expression patterns in their tumors. They identified a signature pattern of heme-related gene expression. Although they looked specifically at lung cancer, they expect these meat-related gene expression changes may occur in other cancers as well.
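
As a rough back-of-the-envelope illustration of how those per-milligram figures scale with intake, here is a small sketch. It assumes the reported increases behave like relative risks that compound multiplicatively with each additional milligram of daily heme iron, which is the usual convention for per-unit estimates but is my assumption here, and the 2 mg/day example intake is simply a placeholder, not a figure from the studies.

```python
# Back-of-the-envelope scaling of the per-milligram heme iron risk estimates
# cited above. Assumes the increases compound multiplicatively per mg/day
# (the usual convention for per-unit relative risks -- an assumption here).

PER_MG_INCREASE = {
    "coronary heart disease": 0.27,  # +27% per 1 mg/day of heme iron
    "type 2 diabetes":        0.16,  # +16% per 1 mg/day
    "cancer":                 0.12,  # up to +12% per 1 mg/day
}

def relative_risk(outcome, heme_mg_per_day):
    """Relative risk versus zero heme iron, under the compounding assumption."""
    return (1 + PER_MG_INCREASE[outcome]) ** heme_mg_per_day

example_intake = 2.0  # mg/day -- placeholder value for illustration only
for outcome in PER_MG_INCREASE:
    rr = relative_risk(outcome, example_intake)
    print(f"{outcome}: ~{(rr - 1) * 100:.0f}% higher risk at {example_intake} mg/day")
```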

We do need to get enough iron, but only about 3% of premenopausal white women have iron deficiency anemia these days, though rates are higher among African-American and Mexican-American women. Taking into account our leading killers--heart disease, cancer, and diabetes--the healthiest source of iron appears to be non-heme iron, found naturally in abundance in whole grains, beans, split peas, chickpeas, lentils, dark green leafy vegetables, dried fruits, nuts, and seeds.

How much money can be made on beans, though? The processed food industry came up with a blood-based crisp bread, made out of rye flour and blood from cattle and pigs, which is one of the most concentrated sources of heme iron--about two-thirds more than blood from chickens. If blood-based crackers don't sound particularly appetizing, you can always snack on cow blood cookies. And there are always blood-filled biscuits, whose filling has been described as "a dark-colored, chocolate flavored paste with a very pleasant taste." (It's dark-colored because spray-dried pig blood can have a darkening effect on a food product's color.) The worry is not the color or taste; it's the heme iron, which, because of its potential cancer risk, is not considered safe to add to foods intended for the general population.

Previously, I've touched on the double-edged iron sword in Risk Associated With Iron Supplements and Phytates for the Prevention of Cancer. It may also help answer Why Was Heart Disease Rare in the Mediterranean?

Those eating plant-based diets get more of most nutrients since whole plant foods are so nutrient dense. See Nutrient-Dense Approach to Weight Management.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

Should Cancer Patients Avoid Raw Fruits and Vegetables?

Back in the 1960s, a patient isolator unit was developed for cancer patients undergoing chemotherapy. Because our immune system cells are often caught in the friendly fire, patients' immune systems could become so compromised that up to 50% died of infections before they could even complete the chemo. So, a bubble boy-like contraption was developed. The patient was shaved, dipped in disinfectant, rinsed off with alcohol, had antibiotic ointment rubbed into every orifice, and was placed on a rotating regimen of a dozen of the most powerful antibiotics available. Procedures were performed through plastic sleeves on the sides of the unit, and everything in and out had to be sterilized and passed through airlocks. So, the patient wasn't allowed any fresh fruits or vegetables.

People went crazy cooped up in these bubble-like units, with 38% even experiencing hallucinations. Fifteen years later the results were in: it simply didn't work. People were still dying at the same rate, so the whole thing was scrapped--except the diet. The airlocks and alcohol baths were abandoned, but they continued to make sure no one got to eat a salad.

Neutrophils are white blood cells that serve as our front line of defense. When we're immunocompromised and don't have enough neutrophils, we're called "neutropenic." So, the chemotherapy patients were put on a so-called neutropenic diet without any fresh fruits and vegetables. The problem is there's a glaring lack of evidence that such a neutropenic diet actually helps (see my video Is a Neutropenic Diet Necessary for Cancer Patients?).

Ironically, the neutropenic diet is the one remaining component of those patient isolator unit protocols that's still practiced, yet it has the least evidence supporting its use. Why? The rationale goes: there are bacteria in salads, bacteria cause infections, immunocompromised patients are at increased risk for infections, and therefore, no salad. What's more, clinicians were actually glad there weren't any studies on this, because it could be way too risky to give a cancer patient an apple or something. So, its continued use seems to be based on a "better safe than sorry" philosophy.

The problem is that kids diagnosed with cancer are already low in dietary antioxidants, so the last thing we should do is tell them they can't have any fresh fruit or veggies. In addition to the lack of clinical evidence for this neutropenic diet, there may be some drawbacks. Restricting fruits and vegetables may even increase the risk of infection and compromise their nutritional status.

So, are neutropenic diets for cancer patients "reasonable prudence" or "clinical superstition"? Starting in the 1990s, there was a resurgence of research when greater importance was placed on the need to "support clinical practice with evidence."

What a concept!

Three randomized controlled trials were published, and not one supported the neutropenic diet. In the biggest study, an all-cooked diet was compared to one that allowed raw fruits and veggies, and there was no difference in infection and death rates. As a result of the study, the principal investigator at the MD Anderson Cancer Center described how their practice has changed and now everyone is allowed to eat their vegetables--a far cry from "please don't eat the salads" 31 years earlier.

Today, neither the Food and Drug Administration, the Centers for Disease Control and Prevention, nor the American Cancer Society support the neutropenic diet. The real danger comes from pathogenic food-poisoning bacteria like Campylobacter, Salmonella, and E. coli. So we still have to keep patients away from risky foods like undercooked eggs, meat, dairy, and sprouts. At this point, though, there really shouldn't be a debate about whether cancer patients should be on a neutropenic diet. Nevertheless, many institutions still tell cancer patients they shouldn't eat fresh fruits and veggies. According to the latest survey, more than half of pediatric cancer doctors continue to prescribe these diets, though it's quite variable even among those at the same institution.

Why are doctors still reluctant to move away from the neutropenic diet? There are several reasons why physicians may be hesitant to incorporate evidence-based medicine into their practices. They may have limited time to review the literature. They'd like to dig deep into studies, but simply don't have the time to look at the evidence. Hmm, if only there was a website... :)

Bone marrow transplants are the final frontier. Sometimes it's our immune system itself that is cancerous, such as in leukemia or lymphoma. In these cases, the immune system is wiped out on purpose and rebuilt from scratch. So, inherent in the procedure is a profound immunodeficiency for which a neutropenic diet is often recommended. This, too, had never been tested--until now.

Not only did it not work, but a strict neutropenic diet was actually associated with an increased risk of infection, maybe because you don't get the good bugs from fruits and vegetables crowding out the bad guys in the gut. So not only was the neutropenic diet found to be unbeneficial; there was a suggestion that it has the potential to be harmful. This wouldn't be the first time an intervention strategy made good sense theoretically but, when put to the test, was ultimately ineffective.

Unfortunately, there's an inertia in medicine that can result in medical practice that is at odds with the available evidence. Sometimes this disconnect can have devastating consequences. See, for example, Evidence-Based Medicine or Evidence-Biased? and The Tomato Effect.

The reason it is so important to straighten out the neutropenic diet myth is that fruits and vegetables may actually improve cancer survival:

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

How Much Nutrition Education Do Doctors Get?

In the United States, most deaths are preventable and related to nutrition. Given that the number-one cause of death and the number-one cause of disability in this country is diet, surely nutrition is the number-one subject taught in medical school, right? Sadly, that is not the case.

As shown in my video, Physicians May Be Missing Their Most Important Tool, a group of prominent physicians wrote in 2014 that "nutrition receives little attention in medical practice" and "the reason stems, in large part, from the severe deficiency of nutrition education at all levels of medical training." They note this is particularly shocking since it has been proven that a whole foods, plant-based diet low in animal products and refined carbohydrates can reverse coronary heart disease--our number-one killer--and provide potent protection against other leading causes of death such as cancer and type 2 diabetes.

So, how has medical education been affected by this knowledge? Medical students are still getting less than 20 hours of nutrition education over 4 years, and even most of that has limited clinical relevance. Thirty years ago, only 37 percent of medical schools had a single course in nutrition. According to the most recent national survey, that number has since dropped to 27 percent. And it gets even worse after students graduate.

According to the official list of all the requirements for those specializing in cardiology, Fellows must perform at least 50 stress tests, participate in at least 100 catheterizations, and so on. But nowhere in the 34-page list of requirements is there any mention of nutrition. Maybe they leave that to the primary care physicians? No. In the official 35-page list of requirements for internal medicine doctors, once again, nutrition doesn't get even a single mention.

There are no requirements for nutrition before medical school either. Instead, aspiring doctors need to take courses like calculus, organic chemistry, and physics. Most of these common pre-med requirements are irrelevant to the practice of medicine and are primarily used to "weed out" students. Shouldn't we be weeding out based on skills a physician actually uses? An important paper published in the Archives of Internal Medicine states: "The pernicious and myopic nature of this process of selection becomes evident when one realizes that those qualities that may lead to success in a premedical organic chemistry course...[like] a brutal competitiveness, an unquestioning, meticulous memorization, are not necessarily the same qualities that are present in a competent clinician."

How about requiring a course in nutrition instead of calculus, or ethics instead of physics?

Despite the neglect of nutrition in medical education, physicians are considered by the public to be among the most trusted sources for information related to nutrition. But if doctors don't know what they're talking about, they could actually be contributing to diet-related disease. If we're going to stop the prevailing trend of chronic illness in the United States, physicians need to become part of the solution.

There's still a lot to learn about the optimal diet, but we don't need a single additional study to take nutrition education seriously right now. It's health care's low-hanging fruit. While we've had the necessary knowledge for some time, what we've been lacking is the will to put that knowledge into practice. If we emphasized the powerful role of nutrition, we could dramatically reduce suffering and needless death.

Take, for example, the "Million Hearts" initiative. More than 2 million Americans have a heart attack or stroke each year. In 2011, U.S. federal, state, and local government agencies launched the Million Hearts initiative to prevent 1 million of the 10 million heart attacks and strokes expected to occur over the next 5 years. "But why stop at a million?" a doctor asked in the American Journal of Cardiology. We already possess all the information needed to eradicate atherosclerotic disease, our number-one killer, which is virtually nonexistent in populations who consume plant-based diets. Some of the world's most renowned cardiovascular pathologists have stated we just need to get our cholesterol low enough in order to not only prevent--but also reverse--the disease in more than 80% of patients. We can open up arteries without drugs or surgery, and stabilize or improve blood flow in 99% of those who choose to eat healthfully and clean up their bad habits. We can essentially eliminate our risk of having a heart attack even in the most advanced cases of heart disease.

Despite this, medical students aren't even taught these concepts while they're in school. Instead, the focus is on cutting people open, which frequently provides only symptomatic relief because we're not treating the actual cause of the disease. Fixing medical education is the solution to this travesty. Knowledge of nutrition can help doctors eradicate the world's leading killer.

I've previously addressed how Doctors Tend to Know Less Than They Think About Nutrition, which is no surprise given most medical schools in the United States fail to provide even a bare minimum of nutrition training (see Medical School Nutrition Education), with mainstream medical associations even actively lobbying against additional nutrition training.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank

Best Foods for Acid Reflux

Gastroesophageal reflux disease (GERD) is one of the most common disorders of the digestive tract. The two most typical symptoms are heartburn and regurgitation of stomach contents into the back of the throat, but GERD is not just burning pain and a sour taste in your mouth. It causes millions of doctor visits and hospitalizations every year in the United States. The most feared complication is cancer.

You start out with a normal esophagus. If the acid keeps creeping up, your esophagus can become inflamed, resulting in esophagitis. Esophagitis can transform into Barrett's esophagus, a precancerous condition that can then turn into adenocarcinoma (a type of cancer). To prevent all that, we need to prevent the acid reflux in the first place.

In the last three decades, the incidence of this cancer in the US has increased six-fold, an increase greater than that of melanoma, breast, or prostate cancer. This is because acid reflux is on the rise. In the United States, we're up to about 1 in 4 people suffering at least weekly heartburn and/or acid regurgitation, compared to around 5% in Asia. This suggests that dietary factors may play a role.

In general, high fat intake is associated with increased risk, whereas high-fiber foods appear to be protective. The reason fat intake may be associated with GERD symptoms and erosive esophagitis is that when we eat fatty foods, the sphincter at the top of the stomach that's supposed to keep the food down becomes relaxed, so more acid can creep up into the esophagus. In my video Diet & GERD Acid Reflux Heartburn, you can see a study in which researchers fed volunteers a high-fat meal--a McDonald's sausage and egg McMuffin--compared to a low-fat meal (McDonald's hot cakes), and significantly more acid squirted up into the esophagus after the high-fat meal.

In terms of later stages of disease progression, over the last twenty years 45 studies have been published on the association between diet and Barrett's esophagus and esophageal cancer. In general, they found that meat and high-fat meals appeared to increase cancer risk. Different meats were associated with cancers in different locations, though. Red meat was more associated with cancer in the esophagus, whereas poultry was more associated with cancer at the top of the stomach. Plant-based sources of protein, such as beans and nuts, were associated with a significantly decreased risk of cancer.

Those eating the most antioxidant-rich foods have half the odds of esophageal cancer, while there is practically no reduction in risk among those who used antioxidant vitamin supplements, such as vitamin C or E pills. The most protective produce may be red-orange vegetables, dark green leafies, berries, apples, and citrus. The benefit may come from more than just eating plants. Eating healthy foods crowds out less healthy foods, so it may be a combination of both.

Based on a study of 3,000 people, the consumption of non-vegetarian foods (including eggs) was an independent predictor of GERD. Egg yolks cause an increase in the hormone cholecystokinin, which may overly relax the sphincter that separates the esophagus from the stomach. The same hormone is increased by meat, which may help explain why plant-based diets appear to be a protective factor for reflux esophagitis.

Researchers found that those eating meat had twice the odds of reflux-induced esophageal inflammation. Therefore, plant-based diets may offer protection, though it's uncertain whether it's attributable to the absence of meat in the diet or the increased consumption of healthy foods. Those eating vegetarian consume greater amounts of fruits and vegetables containing innumerable phytochemicals, dietary fiber, and antioxidants. They also restrict their consumption of animal sources of food, which tend to be fattier and can thus relax that sphincter and aggravate reflux.

GERD is common; its burdens are enormous. It relapses frequently and can cause bleeding, strictures, and a deadly cancer. The mainstay of treatment is proton pump inhibitor drugs, which rake in billions of dollars: we spend four billion dollars on Nexium alone, three billion on Prevacid, two billion on Protonix, and one billion on Aciphex. These drugs can cause nutrient deficiencies and increase the risk of pneumonia, food poisoning, and bone fractures. Thus, it is important to find correctable risk factors and correct them. Known correctable risk factors include obesity, smoking, and alcohol consumption. Until recently, though, there hadn't been studies on specifically what to eat and what to avoid, but now we have other correctable factors to help prevent this disease.

For more on GERD, see: Diet & Hiatal Hernia, Coffee & Mortality, and Club Soda for Stomach Pain & Constipation.

I also have a video about esophageal cancer, detailing the extraordinary reversal of the kinds of precancerous changes that lead to the devastating condition--with nothing but strawberries: Strawberries versus Esophageal Cancer.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: PDPics / Pixabay. Image has been modified.

The Best Diet to Prevent Kidney Stones

In my video How to Prevent Kidney Stones With Diet, you can see what the jagged surface of a kidney stone looks like under a microscope. Imagine one of those scraping down your urinary tract! Kidney stones affect approximately 1 in 11 people in the United States. Twenty years ago it was only 1 in 20, a dramatic increase in the prevalence of the disease that started rising after World War II. Our first clue as to why came from a study published in the 1970s, which found a striking relationship between stone incidence and the consumption of animal protein. This was a population study, though, so it couldn't prove cause and effect.

That study inspired researchers in Britain to do an interventional study, adding animal protein to subjects' diets, such as an extra can of tuna fish a day, and measuring stone-forming risk factors in their urine. Participants' overall probability of forming stones increased 250% during those days they were eating that extra fish. And the so-called "high animal protein diet" was just enough to bring intake up to that of the average American. So Americans' intake of meat appears to markedly increase the risk of kidney stones.

What about consuming no meat at all? By the late 1970s, we knew that the only dietary factor consistently associated with kidney stones was animal protein. The higher the intake of animal protein, the more likely an individual was not only to get a first kidney stone, but then to suffer multiple subsequent stones. This effect was not found for high protein intake in general, but specifically for high animal protein intake. Conversely, a diet low in animal protein may dramatically reduce the overall probability of forming stones. This may explain the apparently low incidence of stones in vegetarian societies, so researchers advocated "a more vegetarian form of diet" as a means of reducing the risk.

It wasn't until 2014 that vegetarian kidney stone risk was studied in detail, though. Using hospital admissions data, researchers found that vegetarians were indeed at a lower risk of being hospitalized for kidney stones. It's not all or nothing, though. Among meat-eaters, increasing meat intake is associated with a higher risk of developing kidney stones, whereas a high intake of fresh fruit, fiber, and magnesium may reduce the risk.

Which animal protein is the worst? People who form kidney stones are commonly advised to restrict the intake of red meat to decrease stone risk, but what about chicken and fish? Despite compelling evidence that excessive animal protein consumption enhances the risk of stone formation, the effect of different sources of animal protein had not been explored until another study in 2014. Researchers compared the effects of salmon and cod, chicken breast meat, and burger and steak. In terms of uric acid production, they found that gram for gram fish may actually be worse. However, the overall effects were complex. Basically, stone formers should be counseled to limit the intake of all animal proteins, and not by just a little bit. Only those who markedly decrease their animal protein intake may expect to benefit.

Making our urine more alkaline can also help prevent the formation of kidney stones (and even dissolve and cure uric acid stones). How can you tell the pH of your urine? See my video Testing Your Diet with Pee & Purple Cabbage.

For more on kidney stones, see How to Treat Kidney Stones with Diet and Do Vitamin C Supplements Prevent Colds but Cause Kidney Stones?. And check out my overview of kidney health in How Not to Die from Kidney Disease.

Uric acid can also crystallize in our joints, but the good news is that there are natural treatments. See Gout Treatment with a Cherry on Top and Treating Gout with Cherry Juice.

Kidney stones are just one more reason that Plant Protein is Preferable.

In health,

Michael Greger, M.D.

PS: If you haven't yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

Image Credit: Sally Plank / Flickr. This image has been modified.
