Thursday, December 30, 2010

Grain Consumption By Neanderthals

The recent PNAS publication of "Microfossils in calculus demonstrate consumption of plants and cooked foods in Neanderthal diets" by Amanda Henry, Alison Brooks, and Dolores Piperno has caused a stir in the paleo diet community and, of course, among paleo diet opponents.  The abstract of this article:

The nature and causes of the disappearance of Neanderthals and their apparent replacement by modern humans are subjects of considerable debate. Many researchers have proposed biologically or technologically mediated dietary differences between the two groups as one of the fundamental causes of Neanderthal disappearance. Some scenarios have focused on the apparent lack of plant foods in Neanderthal diets. Here we report direct evidence for Neanderthal consumption of a variety of plant foods, in the form of phytoliths and starch grains recovered from dental calculus of Neanderthal skeletons from Shanidar Cave, Iraq, and Spy Cave, Belgium. Some of the plants are typical of recent modern human diets, including date palms (Phoenix spp.), legumes, and grass seeds (Triticeae), whereas others are known to be edible but are not heavily used today. Many of the grass seed starches showed damage that is a distinctive marker of cooking. Our results indicate that in both warm eastern Mediterranean and cold northwestern European climates, and across their latitudinal range, Neanderthals made use of the diverse plant foods available in their local environment and transformed them into more easily digestible foodstuffs in part through cooking them, suggesting an overall sophistication in Neanderthal dietary regimes.

It seems that whenever any evidence arises for grain consumption by prehistoric hominins in the Upper Paleolithic, someone asserts, or wonders, whether this constitutes evidence (no matter how slim) that, contrary to the widely accepted paleo principle, humans have adapted to eating grains.  But before I get to that, let me comment on a part of this abstract.

Henry et al imply in this abstract that some people think that the Neanderthals went extinct because of "the apparent lack of plant foods in Neanderthal diets."  The idea that a lack of plant foods caused nutritional deficiencies that wiped out the Neanderthals was entertained by a CNN reporter, Samira Said:

Researchers found starch granules from plant grains in their teeth, leading them to believe the early humans did not -- as previously thought -- have an exclusively meat-based diet. It also debunks the theory that Neanderthals became extinct because of dietary deficiencies.

Let's have some fun unraveling the non sequiturs in this passage.  So you find some starch on the teeth of some Neanderthal remains, which shows that those individuals definitely ate some plants.  Now you leap to the conclusion that "early humans did not...have an exclusively meat-based diet."  This is like finding starch granules on the teeth of some domestic cats fed commercial cat foods, then concluding that earlier wild cats did not have a carnivorous diet.  It is entirely possible that these particular Neanderthals were at that point in time eating some plant foods (out of desperation), while earlier Neanderthals, or Neanderthals in richer ecosystems, ate an almost exclusively meat diet.

By the way, what does Said mean by the phrase "exclusively meat-based diet"?  Strictly speaking, "meat-based" means just that: based on, or composed primarily of, meat.  To wit, a diet that is 80% meat certainly is "meat-based."  It does not mean exclusively meat, any more than "plant-based" means "exclusively plants."  Thus, "exclusively meat-based" means "exclusively based on meat," which means that the "exclusively" is not even redundant, just plain unnecessary.

I really don't know how any anthropologist could seriously entertain the idea that dietary deficiencies due to lack of plant foods would cause the extinction of Neanderthals, who according to unrefuted stable isotope studies were almost exclusively carnivorous apex predators.  Just for the record, in their report on their isotopic studies, Richards et al stated that the Neanderthals "occupied the top trophic level, obtaining nearly [italics mine] all of their dietary protein from animal sources."  See that "nearly"?  To refresh the memories of Henry et al, "nearly" means "almost."  In other words, from isotopic studies we already knew that Neanderthals ate some plant foods.


Further, previous researchers had already established that Neanderthals consumed grass seeds, legumes, and other plant foods, as discussed by Dr. BG in her blog post Neanderthals Consumed Grains and Legumes.  However, those researchers also established that "cereal grains were an insignificant food source" for Neanderthals, making me wonder why these three researchers presented their findings as if they established that grains were a mainstay and a hedge against nutritional deficiencies for the Neanderthals.  Do they have an axe to grind?
 
Back to those who entertain the idea that dietary deficiencies due to lack of plant foods would cause the extinction of Neanderthals.  Have they no knowledge of the Inuit?  Inuit people clearly show that humans can live indefinitely and reproduce successfully for millennia eating almost nothing but animal products.  How could lack of plant foods kill off a species that obviously can obtain all its nutritional requirements from consumption of the various parts of animals?  It's like believing that lions could go extinct for lack of salads and goji berries.  It could only happen indirectly, i.e., not enough plants for their prey.

Henry et al are referring to Neanderthal remains found in Shanidar Cave in Iraq and Spy Cave in Belgium.
Shanidar Cave.  Source:  Wikipedia

The Shanidar III remains they examined date to 60K to 80K years before present (YBP), and the Spy specimens are dated to about 36K YBP.    This means they lived in the Middle to Upper Paleolithic.

Now for a little prehistory lesson.  Humans have been evolving for more than 2 million years.  Most of that time the Earth maintained an ice age environment. According to Wikipedia:

The current ice age, the Pliocene-Quaternary glaciation, started about 2.58 million years ago during the late Pliocene when the spread of ice sheets in the Northern Hemisphere began. Since then, the world has seen cycles of glaciation with ice sheets advancing and retreating on 40,000- and 100,000-year time scales called glacial periods, glacials or glacial advances, and interglacial periods, interglacials or glacial retreats. The earth is currently in an interglacial, and the last glacial period ended about 10,000 years ago. 

During that 2.6 million years, periods of glacial advance exceeded interglacial periods, which typically lasted only 12K years, although some lasted up to 30K years.  During periods of glacial advance, the cold, dry climate favored the growth of grasses over other plants; thus it favored the survival of animals that could eat grass, or animals that could eat grass-eating animals.  Isotopic studies such as those I cited above, and discussed here as well, clearly show that humans belong in the latter group.  As reported in "A brief review of the archaeological evidence for Palaeolithic and Neolithic subsistence" published here:

There have only been two studies of Palaeolithic modern humans, Homo sapiens sapiens. A study of the isotope values of humans from the late Upper Palaeolithic (ca 13 000 years old) site of Gough's and Sun Hole Cave in Southern England (Richards et al, 2000a) indicated, again by the delta15N values, that the main source of dietary protein was animal-based, and most likely herbivore flesh. The second study (Richards et al, 2001) was a survey of isotope values of humans from Gravettian and later (approximately 30 000-20 000 years old) Eurasian sites. The delta13C and delta15N values here indicated high animal protein diets....


Between 50K and 21K years before present (YBP), the earth entered a period of full glaciation, during which the climate became colder and drier.   As reported by Science News, between 50K and 3K YBP, 65% of mammal species weighing over 44kg, together with a smaller proportion of mammals of lesser size, went extinct, and it appears that this climate change played a major role in the extinction of the large mammals previously hunted by humans. 


So far, it is only during the Middle and Upper Paleolithic that we find evidence of both Neanderthal and Cro-Magnon humans consuming cereal grains.  Given the isotopic studies cited above and known climatic and faunal changes, I would conclude that this situation reflects humans choosing between starving to death and trying to live on previously unexploited plant foods, not a choice of a "more balanced, healthier diet," nor what Henry et al decided in their abstract to call "an overall sophistication in Neanderthal dietary regimes."

That line just about made me laugh.  So Henry et al think that if Neanderthal people ate a diet composed almost exclusively of meat, then the Neanderthal diet was not "sophisticated," but since they ate some plants, they had a "sophisticated dietary regime."  It's as if they think Neanderthals are more respectable if they ate plants than if they didn't.  Lions must not be "sophisticated" either, since they avoid plants.  Who cares about sophistication?  What about adaptation?

Which brings me to my speculation.  Why did Neanderthals go extinct?  Maybe you could connect some dots.  Climate change and the hunting prowess of migrating modern humans, equipped with more sophisticated tools, language, shamans, and domesticated dogs, resulted in rapidly declining stocks of the large mammals that were the primary food source for the Neanderthals.  Like other carnivores, they were supremely adapted to hunting mammals, but apparently, unlike the African transplants, they did not know how to hunt smaller game or seafood, and did not adapt to more diverse or plant-based diets.  Cro-Magnon proved to have the upper hand in hunting in the same ecosystems as Neanderthals, depleting herds rapidly, before Neanderthals could get their hands on them.  Just like any other true carnivore, they may have gone extinct for lack of meat, not for lack of plants.

In my view, the starch on their teeth is not a mark of sophistication, but of desperation.

Then Homo sapiens sapiens (my, we have a high opinion of ourselves) went through the same process.  We think we proved smarter and more adaptable by (sort of) adapting grains to us (by cooking), but Nature always has the last laugh.  We've hung on with grains for 10K years, but this proves nothing; 10K years is a blink of the evolutionary eye.  Extinction is more the rule than the exception, and agriculture may yet prove to put the proud one out on his ass.

Unless we come to our primal senses.

P.S.  If you like this post and want to see more like it, please consider making a small donation or a recurring subscription payment using the PayPal buttons in the right hand column.  Fighting fallacies is a full time job I love to do, but I need support to continue doing it.  Also consider sending a link to this post to all of your Facebook and other friends.  

Fiber Choice & Beano: The Perfect Pair!

Tracy just found this ad page in one of those local circulars, and commented "Perfect, buy the Beano to deal with the fiber you can't digest!"

Wednesday, December 29, 2010

A Practically Primal Perspective on Conventional Beef, Part 1: Hormones


7/13/11 update:  I decided that I don't want to endorse or appear to endorse the use of any meat produced by conventional methods of feeding the livestock grains, primarily corn and soybeans.  Since animals consume 80% of the grain and soy produced by U.S. agriculture, this system drives the ongoing destruction of our topsoil both through crops and through grazing.  Animal food production consumes 87% of all freshwater used in the U.S. each year, and thus is the primary driver of depletion of water reserves.  This system also produces most of the water pollution occurring in the U.S.  Our conventional livestock production system has enormous costs detailed in this article from Cornell University.  Since I have known of these costs for more than 20 years, I feel embarrassed and remorseful that I wrote this series and other articles that endorsed the use of conventional animal products.  I served as part of a system that promotes a completely unsustainable approach to human nutrition. 

---------------------------------------------------------------
Some people wonder if one can safely eat conventional meat as a part of a practically paleo lifestyle.   Documentary films like Food Inc, along with a plethora of anti-meat, pro-vegetarian literature, have given people the impression that conventional meat products come from inhumane production systems and that conventional meat contains hazardous amounts of hormones, antibiotics, and pesticide residues.  Shall we believe this?  Let’s take a look, starting with hormones.

Hormones

Does conventional meat contain hazardous levels of hormones?  Short answer:  No.  For details, read on.

First, a primal perspective.

I once had a native African explain to me that among her people, they have a taboo against hunting female animals.  This taboo makes a lot of sense for a tribe dependent on hunting.  If you kill a female, you eliminate a bunch of potential offspring at the same time, while killing a few bulls will have essentially no effect on the fecundity of the herd.

So hunters would have preferred eating bulls to cows, and in Europe still today some producers raise bulls for meat.  Similarly, in the U.S. we get most of our beef from steers—castrated bulls—while we save the cows for calving and milk production.

Bulls and steers differ hormonally.  Bull meat samples tested by Fritsche and Steinhart [1] contained medians of 0.34 mg/kg testosterone and 0.32 mg/kg epitestosterone, while steer meat samples (from unsupplemented steers) contained medians of 0.01 mg/kg testosterone and 0.12 mg/kg epitestosterone.  Bull meat had up to 1.05 mg/kg testosterone.  Thus, bull meat contains a median of 34 times more testosterone and more than twice as much epitestosterone as steer meat, and bull meat might have up to 105 times as much testosterone as steer meat.

Their data indicate that at least some of the meat typically eaten by hunter-gatherers would have had between eight and one hundred times more endogenous steroid hormone—in the form of testosterone—than an untreated modern "organic" steer.  Thus, it would seem that through evolution humans adapted to consumption of meat containing considerably greater levels of steroid hormone than what we find in modern untreated or "organic" steer meat.
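For readers who want to check those ratios, here is a minimal sketch in Python; the inputs are the median and maximum values quoted above from Fritsche and Steinhart [1], and the variable names are mine:

```python
# Quick check of the bull vs. steer ratios quoted above,
# using the values reported by Fritsche and Steinhart [1] (all in mg/kg).
bull_testosterone_median = 0.34
bull_testosterone_max = 1.05
steer_testosterone_median = 0.01
bull_epitestosterone_median = 0.32
steer_epitestosterone_median = 0.12

print(round(bull_testosterone_median / steer_testosterone_median))  # 34 ("34 times more")
print(round(bull_testosterone_max / steer_testosterone_median))     # 105 ("up to 105 times")
print(round(bull_epitestosterone_median / steer_epitestosterone_median, 1))  # 2.7 ("more than twice")
```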

Since testosterone promotes muscle tissue growth, a steer has much less growth potential than a bull.  However, testosterone also makes bulls aggressive and harder to handle, so we can more easily control steers.  Modern husbandry attempts to restore the growth potential without the aggression by substituting for a small amount of the lost testosterone other anabolic but less androgenic steroids, primarily estrogens.

Currently the FDA allows the use of five hormones in cattle for meat production: progesterone, testosterone, estradiol-17β, zeranol, and trenbolone acetate.

The first three are natural hormones, the same as produced by an intact animal.   Zeranol occurs naturally also,  produced by fungi.   It acts as a non-steroidal estrogen agonist, meaning it acts like estrogen. Trenbolone acts as an androgenic steroid that promotes muscle growth.

Does the use of these in raising cattle result in meat or dairy products with unusually high, potentially harmful levels of dietary estrogens or other steroids? 

According to Doyle [2], four studies have found that muscle meat from an untreated steer provides estradiol in a range of 2.8-14.4 pg/g.  Two other studies found estradiol at concentrations of 12 pg/g in liver and 12.6 pg/g in kidney.  Doyle also cites an FAO report finding that meat from implanted steers had 9.7 pg/g estradiol at 15 days after implantation, and 7.3 pg/g at 61 days after implantation.  In short, the levels of estradiol in hormone-treated meat fall within the normal range found in meat from untreated cattle.  Doyle comments:



“Estradiol levels in edible tissues of implanted cattle are usually significantly higher than in controls but the increases are small, in the ng/kg range. The greatest increases reported in an FAO report on estradiol residues were 0.002, 0.0065, 0.005, and 0.0084 mg/kg for implanted bulls, steers, heifers, and calves, respectively. These increases are well below the FDA recommended limits listed in the table on p. 2 and well below estradiol concentrations in muscles of pregnant heifers (0.016 to 0.033 mg/kg).”


Similarly, Hartmann, Lacorn, and Steinhart examined the "Natural occurrence of steroid hormones in food." [3]  Using gas chromatography-mass spectrometry, they measured the levels of twelve steroids occurring in market-sourced meats, milk products, plants, yeast, and alcoholic beverages, including both naturally occurring hormones and residues of hormones used in production.  They tested beef (bull, steer, heifer), veal, pork, poultry, eggs, fish, and plants (potatoes, wheat, rice, soybeans, haricot beans, mushrooms, olive oil, safflower oil, and corn oil).

They found no significant difference in hormone levels between meat from hormone-treated and untreated animals.  In the typical diet, meat, poultry, and eggs proved to supply lower levels of hormones than non-paleo milk products:


“Meat does not play a dominant role in the daily intake of steroid hormones. Meat, meat products and fish contribute to the hormone supply according to their proportion in human nutrition (average about one quarter). The main source of estrogens and progesterone are milk products (60-80%). Eggs and vegetable food contribute in the same order of magnitude to the hormone supply as meat does.”


Thus, if you eat a paleo diet, and avoid milk products, you actually would eliminate the greatest dietary source of estrogens, and might reduce your total dietary intake of estrogens by more than 50%.

As for health effects of these hormones in foods, Hartmann et al compared the intake of hormones from all dietary sources to natural human hormone production levels, concluding that the steroid content of food is insignificant compared to endogenous production [p. 18]:


“These values [amounts provided by diet] are far exceeded by the human steroid production (Table 10). Children, who show the lowest production of steroid hormones, produce about 20 times the amount of progesterone and about 1000 times the amount of testosterone and estrogens that are ingested with food on average per day. It has further to be taken into consideration that about 90% of the ingested hormones are inactivated by the first-pass-effect of the liver.  This leads to the conclusion that no hormonal effects, and as a consequence no tumor promoting effects, can be expected from naturally occurring steroids in food.”


For example, a prepubertal boy, most vulnerable to adverse effects of excess dietary estrogens, produces about 100 micrograms of estrogen daily.  Beef muscle meat contains less than 0.02 micrograms of estrogens per kilogram.  To get from beef an intake of estrogens equal to just one percent of his endogenous estrogen production, i.e. 1 microgram, he would have to consume 50 kilograms (110 pounds) of beef in a day!

According to Doyle [2], the lowest observed effect level of exogenous estrogen is 5 micrograms of estrogens per kilogram of body weight.  Thus, a 40 kg prepubertal boy would have to consume 200 micrograms of estrogens daily to observe an effect.  To get this from conventional hormone-treated beef, he would have to consume 10,000 kg of beef every day.  Clearly he isn't going to get feminized by eating beef.
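To make the arithmetic in the last two paragraphs explicit, here is a minimal sketch in Python; all inputs are the figures quoted above from Hartmann et al [3] and Doyle [2], and the variable names are mine:

```python
# Worked version of the prepubertal-boy estrogen arithmetic above.
endogenous_production = 100.0  # micrograms of estrogen a prepubertal boy makes daily
beef_estrogens = 0.02          # micrograms of estrogens per kg of beef muscle (upper bound)

# Beef needed to match just 1% of his own daily production:
one_percent = 0.01 * endogenous_production        # 1 microgram
print(round(one_percent / beef_estrogens))        # 50 kg (~110 lb) of beef in a day

# Beef needed to reach the lowest observed effect level (5 micrograms/kg body weight):
daily_effect_dose = 5.0 * 40.0                    # 200 micrograms for a 40 kg boy
print(round(daily_effect_dose / beef_estrogens))  # 10000 kg of beef every day
```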

Opponents of the use of hormones in modern animal husbandry often claim that the use of hormones in production of meat and dairy products causes early puberty, obesity, and cancer in modern industrial nations.

Cornell University has a website discussing these Consumer Concerns About Hormones in Foods:

Can steroid hormones in meat affect the age of puberty for girls?
Early puberty in girls has been found to be associated with a higher risk for breast cancer. Height, weight, diet, exercise, and family history have all been found to influence age of puberty (see BCERF Fact Sheet #08, Childhood Life Events and the Risk of Breast Cancer). Steroid hormones in food were suspected to cause early puberty in girls in some reports. However, exposure to higher than natural levels of steroid hormones through hormone-treated meat or poultry has never been documented. Large epidemiological studies have not been done to see whether or not early puberty in developing girls is associated with having eaten growth hormone-treated foods.
A concern about an increase in cases of girls reaching puberty or menarche early (at age eight or younger) in Puerto Rico, led to an investigation in the early 1980s by the Centers for Disease Control (CDC). Samples of meat and chicken from Puerto Rico were tested for steroid hormone residues. One laboratory found a chicken sample from a local market to have higher than normal level of estrogen. Also, residues of zeranol were reported in the blood of some of the girls who had reached puberty early. However, these results could not be verified by other laboratories. Following CDC's investigation, USDA tested 150 to 200 beef, poultry and milk samples from Puerto Rico in 1985, and found no residues of DES, zeranol or estrogen in these samples.
In another study in Italy, steroid hormone residues in beef and poultry in school meals were suspected as the cause of breast enlargement in very young girls and boys. However, the suspect beef and poultry samples were not available to test for the presence of hormones. Without proof that exposure to higher levels of steroid hormones occurred through food, it is not possible to conclude whether or not eating hormone-treated meat or poultry caused the breast enlargement in these cases.
Can eating meat from hormone-treated animals affect breast cancer risk?
Evidence does not exist to answer this question. The amount of steroid hormone that is eaten through meat of a treated animal is negligible compared to what the human body produces each day. The breast cancer risk of women who eat meat from hormone-treated animals has not been compared with the risk of women who eat meat from untreated animals.

Similarly, we don't have any studies comparing the prostate cancer risk of men who eat meat from hormone-treated animals to that of men who eat meat from untreated animals.

In Hyperinsulinemic Diseases of Civilization:  More Than Just Syndrome X, Cordain, Eades, and Eades [4] point out that current evidence actually implicates high carbohydrate intake as the promoter of these hormone-related disorders: high carbohydrate intake raises insulin levels, which increases levels of insulin-like growth factors and increases endogenous production of steroids (by far the main source of steroid exposure), while reducing the sex hormone-binding globulins that restrain steroid activity.  In the abstract they summarize:

Specifically, hyperinsulinemia elevates serum concentrations of free insulin-like growth factor-1 (IGF-1) and androgens, while simultaneously reducing insulin-like growth factor-binding protein 3 (IGFBP-3) and sex hormone-binding globulin (SHBG). Since IGFBP-3 is a ligand for the nuclear retinoid X receptor a, insulin-mediated reductions in IGFBP-3 may also influence transcription of anti-proliferative genes normally activated by the body’s endogenous retinoids. These endocrine shifts alter cellular proliferation and growth in a variety of tissues, the clinical course of which may promote acne, early menarche, certain epithelial cell carcinomas, increased stature, myopia, cutaneous papillomas (skin tags), acanthosis nigricans, polycystic ovary syndrome (PCOS) and male vertex balding. Consequently, these illnesses and conditions may, in part, have hyperinsulinemia at their root cause and therefore should be classified among the diseases of Syndrome X. [Italics added]

Using the paleo principle to evaluate this claim, we should expect the ill effects of hormones in meat to appear in heavy meat-eating hunter-gatherer groups, since they ate meat from intact animals, particularly bulls, which carry on the order of 10 to 100 times as much testosterone as modern steers.

Alas for the hormone hypothesis, hunter-gatherers eating strictly native foods had no obesity, and, so far as we can tell, no cancer. [5]  Further, in The Paleolithic Prescription, S. Boyd Eaton, M.D., Melvin Konner M.D., Ph.D., and Marjorie Shostak present data on reproductive milestones among recent hunter-gatherers.  Among three recent hunter-gatherer tribes (Agta, !Kung, Ache), the average age of menarche (onset of menses) is about 16 years of age, compared to 12.5 years in the U.S. according to Wikipedia.

Since hunter-gatherers eating bull meat on a regular basis and consuming around 50% of energy from meat had none of the problems attributed to hormones in meat, it seems unlikely that hormones in meat can account for cancer or any other hormone-related problem in modern people.

I prefer to see people eat meat from animals not treated with hormones or hormone analogues, but if for budget reasons you choose to eat conventional meat (hormone treated or not) instead of grass fed, I think you don't need to worry that its hormone content will harm you in any way.  You should worry more that by avoiding the meat, you will consume too many carbohydrates that will much more profoundly alter your endocrine system in harmful directions.

But before you get meat from hormone-treated animals, check to see if you can find a supplier of meat from animals raised on typical feeds (corn, soy, etc.) but without added hormones.  In Phoenix, we have at least two markets--Sprouts and Sunflower--that supply such meat.  These markets sell their meats at prices comparable to, and sometimes lower than, what I see at more conventional supermarkets where the meat comes from hormone-treated animals.

We will look at antibiotics and other issues in upcoming posts.

Thanks to Matt Schoeneberger, co-author of S.P.E.E.D. Weight Loss Book, for help accessing one of the articles I used as a reference for this article.  

Notes:

1. Fritsche S and Steinhart H. Differences in natural steroid hormone patterns of beef from bulls and steers.  J. Anim. Sci. 1998. 76:1621–1625.  Full text: jas.fass.org/cgi/reprint/76/6/1621.pdf


2.  Doyle E.  Human Safety of Hormone Implants Used to Promote Growth in Cattle: A Review of the Scientific Literature.  Food Research Institute, University of Wisconsin, Madison, WI 53706.  Full text:  fri.wisc.edu/docs/pdf/hormone.pdf

3.  Hartmann S, Lacorn M, and Steinhart H.  Natural occurrence of steroid hormones in food.  Food Chemistry 1998;62(1):7-20.

4.  Cordain L, Eades M, Eades M.  Hyperinsulinemic diseases of civilization: More than just syndrome X.  Comparative Biochemistry and Physiology Part A 136 (2003) 95-112.  PDF available here.

Why should a man die while sage grows in his garden?


Salvia spp.


Sage 
Botanical Name: Salvia officinalis
Plant Family: Mint, Lamiaceae
Properties: Bitter, pungent, astringent, oily, warming and cooling, antiseptic 






Wherever sage grows around the world, it is used and revered as medicine. There are many varieties of sage; this post mainly discusses Salvia officinalis, or culinary sage. This plant is native to the Mediterranean and southern Europe and is now cultivated all over the world. 



For most people in North America, sage gets used just a few times a year as a complement to the turkey stuffing, but sage has a long history of use. 

Its name comes from the Latin root meaning “to save” or “to heal”. Maud Grieve writes in A Modern Herbal that sage was even sometimes known as Salvia Salvatrix (Sage the Savior). 

In medieval times there was a saying, 

“Cur moriatur homo cui Salvia crescit in horto?”

which translates to “Why should a man die while sage grows in his garden?”

Let’s take a look at the properties of sage so that we can better understand how this plant is used. 

The taste of sage is both bitter and pungent. Often, when we see these two attributes together we know this plant can be used to promote digestion. And sage is a wonderful carminative. It can ease gas and bloating, can move stagnant digestion (when you eat food and it feels like you have an immobile rock in your belly), and can even relieve painful cramping in the gut. Sage is especially appropriate for people who cannot digest fats well. It can be taken as a tea prior to or following a meal or simply used as a spice within the meal. 

Sage is a wonderful astringent herb and can be used to tighten and tone tissues. It has a special affinity for the mouth and can be used to relieve pain and heal mouth ulcers, canker sores, bleeding gums, spongy gums, and cold sores. Sage is a common ingredient in many tooth powder and mouthwash recipes. 

The astringent and antiseptic qualities of sage make it perfect for sore throats. I like to combine sage tea with some lemon and honey for this purpose. 

In fact, sage is effective for a variety of discomforts that can accompany a cold or flu. Taken as a warm tea it acts as a stimulating diaphoretic, making it a good choice for fevers when the person feels cold and is shivering. As an herbal steam it can help to decongest the sinuses and loosen congestion in the lungs. 
 
You’ll notice that sage is listed as being both warming and cooling. Taken as a warm tea, sage can open your pores and increase sweating. But taken as a lukewarm or slightly cold tea it has the opposite effect; it actually decreases excessive secretions. 

These two qualities may seem contradictory in nature but sage, like so many plants, has the ability to bring balance to the body. 

Herbalist Kiva Rose explains: 
When reading some of the seemingly contradictory actions and indications in the description of Sage, it will be helpful to keep in mind that the herb seems to act primarily as a balancer of fluids in the body, whether there is too much or not enough. 

This ability to stop excessive secretions can be helpful in a variety of ways. 


  • To stop excessive diarrhea. 
  • To stop the excessive sweating related to night sweats or fevers. 
  • To stop the flow of breast milk in the weaning process. 
  • To stop the excessive flow of mucous in the sinuses. 
  • To stop excessive vaginal discharge, for example discharge associated with yeast infections.

The red sage of China is well known for its affinity for the blood, but our culinary sage can also be used for a variety of stagnant blood conditions, such as blood clots and varicose veins. In The Earthwise Herbal, herbalist Matthew Wood describes using sage on multiple occasions for dissolving blood clots. The term for this in traditional herbalism is “blood mover”.

As a blood mover, sage can be used for those with poor circulation resulting in cold hands and feet. Improving the blood flow to all parts of the body, including the brain, sage is also well known to improve cognitive function and can help prevent memory loss and clear foggy thinking. 

Sage is singularly good for the head and the brain, it quickeneth the senses and memory, strengthening the sinews, restoreth health to those that have the palsy, and taketh away shakey trembling of the members. 
Gerard

Sage is commonly used for menopausal complaints such as night sweats and hot flashes. Herbalist Phyllis Light says sage is 

“specific for the transition from fertility levels of estrogen to post-menopausal levels. In other words, it helps the adrenal cortex take over the manufacture of sexual hormones as the gonads atrophy”. It is specific for symptoms of “drying out”.

Sage can relieve pain and increase circulation and, keeping this in mind, we can see that sage lends itself well to external use. It can be infused into oil and used for massage.  It can be infused in witch hazel or alcohol and used for varicose veins. Infused in vinegar it can both relieve the pain of sunburns and soothe the dry heat. A tea of the leaves or a diluted vinegar infusion can be used as a wash to remove dandruff. 

Sage can also be used for a variety of pain ailments. Culpeper recommends it for headaches, rheumatic pains, and joint pains.

Maud Grieve lists this sage recipe as a cure for sprains. 

Bruise a handful of sage leaves and boil them in a gill of vinegar for five minutes; apply this in a folded napkin as hot as it can be borne to the part affected. 

There are many sage varieties from around the world that are revered as medicine. 

In China, Dan shen (Salvia miltiorrhiza) has been used for thousands of years as a blood tonic. 

Salvia miltiorrhiza

The white sage of California (Salvia apiana) is used extensively, both historically and in present day, in ceremonies. 

Botanical Description
Sage grows anywhere from 1-3’ high. It’s a member of the mint family and has square stems and opposite leaves. 

The leaves are lanceolate (lance-shaped) and grayish green in color. 


Sage has blue to purple flowers on terminal spikes. 
Typically, sage is gathered just before flowering. 



There are so many ways to use sage.


tea
tincture
oil, salve
wash
steam inhalation
essential oil
tooth powder
vinegar
butter
wine

Sage is generally considered safe for everyone; however, it is contraindicated in pregnancy. It can also dry up the flow of milk during lactation, so, unless the mother wishes to wean, it is generally contraindicated during nursing.  


Resources used: 
Earthwise Herbal by Matthew Wood
Writings by Kiva Rose
Herbal Medicine by Sharol Tilgner
Personal studies at East West School of Herbology

Tuesday, December 21, 2010

My Meals 12/21/2010: More Budget Primal

On Saturday night I used some of the leftover $1.97/lb cross rib roast to make a stir-fry.   I sliced onions, carrots, and collards into strips, and cut the beef into bite-sized chunks.  I heated leftover bacon fat in the pan, then stir-fried them successively, while I reheated the leftover mashed squash in the convection oven.  My portion looked like this:


On Sunday night I took another of those roasts and used Richard Nikoley's prime rib method to prepare it. I heated the oven to 500 degrees F, then put the ~3 lb. roast in for an intended 15 minutes, setting the timer.  However, Tracy and I decided to go out for a short walk, and I forgot the roast until we returned some time later to the timer buzzing.  I wasn't sure how long we had been gone.  I checked the roast and found the surface well-browned, but not burned, so I turned the heat down to 150 degrees (the lowest mark on my oven) and left it overnight.  Despite my mistake and the small size of the roast, it turned out with a great rind,  a medium rare center, and a texture like smoked meat.  I prefer it rare, but in this case I feel glad it didn't get well done.  I deglazed the pan with vegetable stock and made a gravy by thickening it with arrowroot powder. Some of it appears in a meal below.

As I heated the oven to 500 for the meat, I put in 4 large sweet potatoes that I purchased from Smart & Final for $0.50 per pound, half what we usually pay for sweets.  They were done by the time we got back from our walk. 

This past Friday we got a fantastic deal on locally produced Hickman's eggs at Sunflower market; $0.99 for 18 eggs (6 cents per egg!) marked down because they were near the pull date:


We keep our eyes open for these kinds of deals when shopping. Eggs keep for about a month after the pull date if kept in the refrigerator, and we will easily use 3 dozen eggs in two weeks.  I have already had a dozen of these eggs myself in just 3 days.

Today Tracy put some of those eggs to good use.  She took some leftover vegetables and potatoes and heated them in an omelet pan using olive oil and butter:


While the vegetables heated she beat 5 eggs then poured them over the vegetables:

She covered the pan and let it cook on medium-low heat until it was almost firmed up:

Meanwhile she set the oven on broil and when the eggs were almost ready, she removed the cover and put them under the broiler.  She then served it with some remaining leftover potatoes, and some of the slow-roasted meat covered with gravy.  Here's her plate:

And mine:
I put some salsa atop one wedge of the egg dish, and we both spread some sour cream on the eggs. I got the sour cream on sale at Sprouts market, 16 ounces for $1.99.  After finishing this plate I had another wedge of the egg dish, and a small bowl of pineapple, 3 prunes, and some walnuts.  I ate that at 10am after heavy strength training on an empty stomach; as I finish this post it is almost 4pm and I'm still satisfied, 6 hours later.

Thursday, December 16, 2010

Institute Of Medicine Vitamin D Panelist Has Conflicts of Interest

On November 30, 2010, the Institute of Medicine released its "Consensus Report" suggesting updated Dietary Reference Intakes for Calcium and Vitamin D, in which it "reviewed" the evidence on vitamin D and health and came to the conclusion:

The IOM finds that the evidence supports a role for vitamin D and calcium in bone health but not in other health conditions. Further, emerging evidence indicates that too much of these nutrients may be harmful, challenging the concept that “more is better.”
The IOM decided that the "evidence" indicates that people do not require more than 600 IU of vitamin D daily, and that 4000 IU is the "Upper Level Intake" which we should not exceed.  This conclusion could have come about only by ignoring the research of vitamin D experts, which has determined that the average person uses about 4000 IU of vitamin D daily, and produces 10,000 IU by endogenous synthesis stimulated by 20-30 minutes of mid-day summer sun exposure [check here].  

If you were wondering what happened, the Alliance for Natural Health may have an answer for you:

A pharmaceutical company is developing a patentable man-made vitamin D analog—yes, a synthetic drug version of vitamin D. And Glenville Jones, PhD, one of the committee members who determined the new vitamin D guidelines and who is quoted as saying that under these guidelines, most people “probably don’t have vitamin D deficiency” and “We think there has been an exaggeration of the public’s interest in vitamin D deficiency,” is an advisor for that same pharmaceutical company.

Wednesday, December 15, 2010

Weekly Strength Training Provides Long-Term Cognitive and Economic Benefits

Joyce Mar works on her strength training at the Langara Family YMCA in Vancouver on Tuesday.

Photograph by: Steve Bosch, Vancouver Sun

Today the Archives of Internal Medicine published results of a study which found that women aged 65 to 75 years who engaged in progressive strength training once or twice weekly over 12 months showed improved executive cognitive function and lower medical care costs compared with control women when evaluated again one year later.

Today's publication provides a follow-up on the Brain Power study, which the Archives of Internal Medicine published in its January 2010 issue.  The original study demonstrated that 12 months of once-weekly or twice-weekly progressive strength training improved executive cognitive function in women aged 65 to 75 years.  

One report of the study results released today states:

Both studies were led by Teresa Liu-Ambrose, principal investigator at the Centre for Hip Health and Mobility and Brain Research Centre at Vancouver Coastal Health and UBC, and assistant professor in the Department of Physical Therapy at UBC's Faculty of Medicine. The one year follow-up study found the cognitive benefits of strength training persisted, and with two critical findings.

"We were very surprised to discover the group that sustained cognitive benefits was the once-weekly strength training group rather than the twice-weekly training group," says Liu-Ambrose, who's also a Michael Smith Foundation for Health Research scholar. "What we realized was that this group was more successful at being able to maintain the same level of physical activity achieved in the original study."

In this study, one group did once-weekly strength training, another did twice-weekly strength training, and a third "control" group did something the authors refer to as "balance and tone," which according to a Vancouver Sun article featured "stretching, range of motion, core strength, balance and relaxation exercises." 

According to the report, only the once-weekly group maintained the cognitive benefits at the follow-up.  Strength training may keep you smarter than doing "balance and tone" training.

 Regarding economic benefits:

The second important finding relates to the economic benefits of once-weekly strength training. Using the data from the Brain Power Study and the one-year follow-up study, health economists Jennifer Davis and Carlo Marra, research scientists with the Collaboration for Outcomes Research and Evaluation at St. Paul's Hospital and UBC Faculty of Medicine, were able to show that the economic benefits of once-weekly strength training were sustained 12 months after its formal cessation. Specifically, the researchers found the once-weekly strength group incurred fewer health care resource utilization costs and had fewer falls than the twice-weekly balance and tone group.

"This suggests that once-weekly resistance training is cost saving, and the right type of exercise for seniors to achieve maximum economic and health benefits," says Davis.

The study found that the once-weekly strength training group had the fewest falls and the lowest medical care utilization costs. 

Thus,  this study shows that standard strength training helps seniors maintain balance more effectively than exercises supposedly dedicated to "core" strength, balance, and range of motion.  That's partly because maintaining balance requires muscular strength.  If your inner ear detects that you are off balance, but you lack the strength required to correct the movement, you will fall.  Moreover, a properly designed strength training routine will itself provide so-called "core" strength, balance training, and movement through full ranges of motion.  

In other words, strength training does it all in just one session per week.

Stay strong, stay smart, stay healthy.

The Blood Type Diet: A Critical Perspective

Thanks to Peter J. D’Adamo, N.D., author of the book Eat Right 4 Your Type, many people believe that blood type determines your dietary requirements and that only people with O-type blood should eat a paleo/primal diet.  

In this post I will discuss all the errors in this blood type hypothesis. 

The Blood Type Diet Hypothesis

D’Adamo’s hypothesis can be distilled down to four main claims and a conclusion drawn from these claims.

Claim A: Evolutionary adaptation to diet patterns resulted in the ABO blood groups, so each of the four types is adapted to a different type of diet and set of foods.
Claim B: People of different blood types have different antibodies in their blood and each blood type has a different susceptibility to diseases.
Claim C: Foods contain lectins that mimic blood group antigens and selectively cause blood agglutination (i.e. each food affects each blood group differently), and this causes diseases.
Claim D: Exposure to foods containing lectins incompatible with your blood type will cause agglutination of your blood which will cause diseases. As D’Adamo puts it, the lectins “target  an organ or bodily system and begin to agglutinate blood cells in that area.”

Conclusion: Therefore, people require diets tailored to their blood types, eliminating foods that have harmful lectins for their blood type.


The Four Diets

Blood type O

According to D’Adamo,  O-type blood is the “Original” blood type which evolved when humans lived by hunting supplemented by gathering.  He says these people should base their diets on lean meat and fish, supplemented with a selection of fruits and vegetables he deems suitable for this blood type.  O-types should avoid or greatly minimize dairy products and minimize or avoid grains and beans, particularly wheat.  According to D’Adamo, people with O-type don’t tolerate wheat “at all” yet he also says that they can eat sprouted wheat products (?).

Blood type A

D’Adamo claims that A-type blood arose as an adaptation to agriculture; conveniently, A is for agriculture.  He claims that this type can eat grains (except wheat), beans, most seafoods, many vegetables and  fruits, but should avoid dairy, meat, wheat, kidney beans, and lima beans.  He states that only people with A-type blood can and should eat a vegetarian diet.
           
Blood type B

D’Adamo refers to B-type as “The Nomad.”  He implies that this type arose as an adaptation to pastoral lifestyles based on use of dairy products and meat from domesticated animals.  He states that B-types are adapted to a “balanced omnivore” diet that includes meat (but not chicken), eggs, dairy, beans, fruits, and vegetables.  He advises people having the B blood type to avoid chicken, corn, lentils, peanuts, sesame, buckwheat, and wheat.

Blood type AB

D’Adamo calls type AB blood “The Enigma” and states that people having this blood type are adapted to a “mixed diet in moderation.”  He advises that they can safely eat lamb, mutton, rabbit, turkey, pheasant, most seafood, dairy, beans, grains, fruits, and vegetables, but should avoid beef, chicken, kidney beans, lima beans, seeds, corn, and buckwheat.

Basic Errors

As someone basing his whole approach to diet on blood types, D’Adamo appears disturbingly ignorant of basic facts about the evolution of the ABO types.

First, he claims that the O-type is the original blood type. He published his book in 1994, but by 1990 molecular biologists had determined that A-type is the original blood type.   In “Evolution of Primate ABO Blood Type Genes and Their Homologous Genes [full text available free],” Saitou and Yamamoto state [p. 405]:


“...the common ancestral gene for the hominoid and Old World monkey ABO blood group is A type, and three B alleles evolved independently on the human, gorilla, and baboon lineages.”


The fact that both A-type and B-type antedate O-type seems predictable from the fact that O-type blood carries antibodies to A-type and B-type blood; for this to happen, O-type blood had to have emerged in an environment in which A-type and B-type antigens (the markers on A- and B-type blood cells) already existed. 

D’Adamo suggests that the blood types arose in humans as adaptations to dietary variations, implying that they are unique to humans.  In fact, ABO blood types occur not only in humans, but also in other primates.  Again, according to Table 5 in Saitou and Yamamoto, which presents data dating to 1964, more than 30 years before D’Adamo published his books:
· The A phenotype occurs in chimps, orangutans, gibbons, baboons, Java macaques, Sulawesi crested macaques, and squirrel monkeys (7 species).
· The B phenotype occurs in gorillas, orangutans, gibbons, baboons, Rhesus macaques, pigtailed macaques, Java macaques, Sulawesi crested macaques, and cebus monkeys (9 species).
· The AB phenotype occurs in orangutans, gibbons, baboons, and Java macaques (4 species).
· The O phenotype occurs in chimps, Java macaques, squirrel monkeys, and cebus monkeys (4 species).

Since none of these primates have ever practiced agriculture or domesticated dairy animals, it is clear that the A allele did not evolve as an adaptation to agriculture nor did the B allele emerge as an adaptation to consumption of dairy products. 

As noted above, the B-type has been found in the largest selection of species (9), followed by the A-type (7), and both the O- and AB-types occur in the smallest selections (4 species each).  According to Saitou and Yamamoto, among this list of primates checked for ABO blood type, the A phenotype occurred in 191 individuals, B in 75 individuals, AB in 44 individuals, and O in 20 individuals. 

According to Saitou and Yamamoto, “It seems that the time of gene duplication producing ABO and GAL genes may be around the emergence of vertebrates (ca. 500 MYA).”  [p.408]  In other words, these blood types have a much longer history than human dietary variations. 

Human Data

It’s possible that D’Adamo did not mean to imply that the blood types arose only in humans.  Perhaps he meant that human adoption of a hunting lifestyle selected for O-type, agriculture for A-type, and pastoralism for B-type.  In this case, we should find a predominance of O-type among all ethnic groups with a long history of living by hunting, especially those eating little plant food; a predominance of A-type among ethnic groups having the longest histories of practicing agriculture; and a predominance of B-type among ethnic groups with long histories of pastoralism.

Unfortunately, the available data does not support this either.  So far as we know, Eskimos have never practiced agriculture or animal husbandry, and have long lived primarily by hunting, a diet enforced by their environment.  D’Adamo’s hypothesis would predict that Eskimos would have predominantly O-type blood.  According to Bloodbook.com, this is true of Greenland Eskimos (54% O, 36% A), but not of Alaskan Eskimos (38% O, 44% A), who have more A-type than O-type individuals, contradicting D'Adamo's hypothesis.

According to Wikipedia and EveryCulture.com, people of the Blackfoot or Niitsítapi tribe traditionally lived primarily on buffalo meat; classic hunters.  From Wikipedia:


“While the Niitsitapi were in the Great Plains, they came to depend on the buffalo (American bison) as their main source of food. The bison are the largest mammals in North America and stand about 6 ½ feet tall and weigh up to 2,200 pounds.[6] Before the introduction of horses, the Niitsitapi had to devise ways of sneaking up close to the buffalo without the animals' noticing so they could get in range for a good shot. The first and most common way for them to hunt the buffalo was using the buffalo jump. The hunters would round up the buffalo into V-shaped pens and drive them over a cliff (they hunted prong-horned antelopes in the same way). After the buffalo went over the cliff, the Indians would go to the bottom and take as much meat as they needed and could carry back to camp. They also used camouflage for hunting.[6] The hunters would take buffalo skins from previous hunting trips and drape them over their bodies to blend in and mask their scent. By subtle moves, the hunters could get close to the herd. When close enough, the hunters would shoot the bison with arrows, or use lances and spears to bring them down.

They used virtually all parts of the body and skin. They prepared the meat for food: by boiling, roasting and drying for jerky. This prepared it to last a long time without spoiling, and they depended on bison meat to get through the winters.[7] The winters were long, harsh, and cold due to the lack of trees in the Plains, so the people stockpiled the meat when they had the chance.[8] The hunters often ate the bison heart minutes after the kill, as part of their hunting ritual. The skins were prepared and used to cover the tepee. The tepee was made of log poles with the skin draped over it. It remained warm in the winter and cool in the summer, and a great shield against the wind.[9] With further preparation of tanning and softening, the women made special clothing from the skins: robes and moccasins. They rendered bison fat to make soap. Both men and women made utensils, sewing needles and tools from the bones, using tendon for fastening and binding. The stomach and bladder were cleaned and prepared for use as containers for storing liquids. Dried bison dung was fuel for fires. The Niitsitapi used almost every part of the buffalo and considered it a sacred animal, integral to their lives.[10]”


From this D’Adamo would predict that they had primarily O-type blood.  Wrong again.  Bloodbook.com states that 82% of Blackfoot people have A-type blood, and only 17% have O-type. 

Looking from the other side, D’Adamo’s hypothesis would predict that Cantonese Chinese would have a higher incidence of A-type and lower incidence of O-type because they have lived for millennia on a rice-based agricultural diet.  Bloodbook.com states that Cantonese Chinese have 46% O- and 23% A- types, the reverse of the D’Adamo prediction. 

Meanwhile, the northern Chinese (Peking) have 29% O, 27% A, and 32% B, a distribution which according to D’Adamo’s hypothesis would predict that the population has a long history of living on dairy products.  Unfortunately for D’Adamo, the pastoral lifestyle is not common in China at all, only practiced by Chinese in the western, mountainous provinces.
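To tie these counterexamples together, here is a small illustrative sketch in Python; the frequencies are the Bloodbook.com figures quoted above, and the "predicted" entries encode D’Adamo’s subsistence claims (hunters should show mostly O, agriculturalists mostly A):

```python
# Check D'Adamo's prediction (modal blood type follows subsistence history)
# against the Bloodbook.com frequencies (percent) quoted above.
populations = {
    "Greenland Eskimo (hunting)": {"predicted": "O", "O": 54, "A": 36},
    "Alaskan Eskimo (hunting)":   {"predicted": "O", "O": 38, "A": 44},
    "Blackfoot (hunting)":        {"predicted": "O", "O": 17, "A": 82},
    "Cantonese (agriculture)":    {"predicted": "A", "O": 46, "A": 23},
}

for name, data in populations.items():
    predicted = data.pop("predicted")
    observed = max(data, key=data.get)  # the more common type in the sample
    verdict = "supports" if observed == predicted else "contradicts"
    print(f"{name}: predicted {predicted}, observed {observed} -> {verdict} the hypothesis")
```

Only the Greenland Eskimo figures come out as predicted; the other three populations contradict the hypothesis.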

Just as one black swan is sufficient to disprove the claim that all swans are white, these few counterexamples are sufficient to disprove the claims that hunting-based subsistence favors O-type blood, while agricultural subsistence favors A-type blood.  D’Adamo is just wrong when he asserts that dietary differences drove the differences in distribution of blood types among humans. 

No Anatomical Evidence

All humans, regardless of blood type, cultural background, or dietary history, have the same basic gut design, dentition (number and type of teeth, type of enamel), and types of saliva and digestive enzymes.  This is why we call them humans.  

For example, scientists have found no nutritionally relevant anatomical, physiological, or biochemical differences between Chinese and Eskimos. 

Nor are there any such differences between people with A-type and O-type blood.  

Medical Evidence?

D’Adamo claims that O-type people get ulcers more frequently than people with A-type because O-type individuals, according to him, produce more stomach acid than people with A-type blood.  He says that O-types produce this greater amount of acid as an adaptation to a high meat diet; A-types have less acid because they are adapted to a low protein, low meat diet. 

Apparently he missed the memo when research discovered that gastric ulcers arise from infection with H. pylori bacteria, not excessive stomach acid production.  While it is true that group O individuals have approximately 35% greater risk of gastrointestinal ulcer when compared to group A individuals, this is not because O-types produce more acid than A-types.   The ulcer-causing bacterium, H. pylori, can more easily attach to the G.I. lining of group O individuals, because it has a protein structure that mimics the group O host (which confuses the host’s immune system).  In contrast, the immune system of A-type individuals more easily recognizes the bacterium as a foreign invader, making them more resistant to this infection.

D’Adamo correctly states that group A individuals have a higher risk of cancer than group O individuals.  Relative to group O individuals, group A individuals have higher risks of cancers of the stomach, colon, ovary, uterus, cervix, and salivary glands (relative risks 1.2, 1.11, 1.28, 1.15, 1.33, and 1.64, respectively).  D’Adamo implies that this difference arises due to the influence of dietary lectins. Presumably, the more vegetarian diet he prescribes for A-types will protect them from cancer by reducing their exposure to harmful lectins.

First of all, if dietary lectins are the cause of cancers in anyone, it is very hard to understand how a vegetarian diet based on grains and beans will help prevent cancer.  Lectins are carbohydrate-binding proteins found in the highest concentrations in carbohydrate-rich foods like grains, legumes, and potatoes, not meat.  From the Wikipedia entry on lectins:


“The toxicity of lectins has been identified by consumption of food with high content of lectins, which can lead to diarrhoea, nausea, bloating, vomiting, even death (as from ricin). Many legume seeds have been proven to contain high lectin activity, termed as hemagglutinating activity. Soybean is the most important grain legume crop, the seeds of which contain high activity of soybean lectins (soybean agglutinin or SBA). SBA is able to disrupt small intestinal metabolism and damage small intestinal villi via the ability of lectins to bind with brush border surfaces in the distal part of small intestine. Heat processing can reduce the toxicity of lectins, but low temperature or insufficient cooking may not completely eliminate their toxicity, as some plant lectins are resistant to heat.”



Therefore, the diet he apparently prescribes to A-types to reduce their risk of cancer actually provides MORE potentially hazardous dietary lectins than the diet he prescribes to O-types.

Secondly, D’Adamo doesn’t appear to know that the reason for increased risk of cancers in individuals with A-type blood is similar to the reason for the increased risk of ulcers in individuals with O-type blood.  Tumors often express an A-like antigen that the immune system of an A group individual will accept as “self” while the immune system of an O or B group individual will attack any cell with an A antigen. 

Note also that group O and B individuals still get these tumors. For example, the relative risk of stomach cancer is only 20% greater among A-type individuals compared to O-type.  This means that if 5 of 100 individuals having blood type O get stomach cancer, 6 of 100 individuals having blood type A get the same cancer; if 50 type Os get the cancer, 60 type As will get it.  In absolute numbers, the difference between these groups is small. 
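Here is a quick sketch of that relative-versus-absolute point in Python; the 1.2 relative risk is quoted above, while the baseline rate of 5 cases per 100 is purely hypothetical, chosen only for illustration:

```python
# Relative risk scales a baseline rate; the absolute difference stays small.
relative_risk = 1.2      # stomach cancer, type A vs. type O (quoted above)
baseline_per_100 = 5.0   # hypothetical baseline: 5 cases per 100 type-O individuals

type_a_per_100 = baseline_per_100 * relative_risk
print(type_a_per_100)                     # 6.0 cases per 100 type-A individuals
print(type_a_per_100 - baseline_per_100)  # 1.0 extra case per 100 people
```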

Do Lectins Cause Selective Agglutination?

A linchpin of D’Adamo’s hypothesis is the claim that lectins selectively cause agglutination in different blood types.  For example, according to D’Adamo, people with B-type blood should avoid chicken because it contains lectins that will agglutinate blood in these individuals, but not in people with other blood types.

We don’t have any evidence for this.  In fact, published data indicates that any individual lectin will affect all blood types in essentially the same way. 

Wikipedia discusses this topic:


“D'Adamo claims there are many ABO specific lectins in foods.[14] This claim is unsubstantiated by established biochemical research, which has not found differences in how the lectins react with a given human ABO type. In fact, research shows that lectins which are specific for a particular ABO type are not found in foods (except for one or two rare exceptions, e.g. lima bean), and that lectins with ABO specificity are more frequently found in non-food plants or animals.[15][16]

The Nachbar Study[17] has been cited in support of D'Adamo's theories, because it reports that the edible parts of 29 of 88 foods tested, including common salad ingredients, fresh fruits, roasted nuts, and processed cereals were found to possess significant lectin-like activity (as assessed by hemagglutination and bacterial agglutination assays). However, almost all of the 29 foods agglutinated all ABO blood types, and were not ABO blood type specific. Since D'Adamo's theory has to do with lectins in food that are "specific for a certain ABO blood type", this study does not support his claim that there are many ABO specific lectins in foods.”

Reference 15 in this excerpt refers to The Handbook of Plant Lectins.

So Why Do People Feel Better?
  
D’Adamo advises O, A, and B blood types to avoid wheat. Collectively, these comprise 96% of all people in the U.S.  Therefore, most people who read the book will get advice to avoid wheat, and they may try it.  This alone will improve health for some people who are wheat sensitive. 

Besides this, most people adopting the blood type diet will simply make general improvements to their diets, like reducing sugar intake, eating less processed and more unprocessed foods. D’Adamo suggests these steps to all blood types.  The general steps will help most people feel significantly better and perhaps lose some body fat. 

Bottom Line

The blood type diet does not have a solid leg on which to stand.  The hypothesis is riddled with errors. 

Addendum 10/28/11


Michael Klaper, M.D.: Challenges to the Blood Type Diet