
BEYOND VEGETARIANISM

Compiled for www.beyondveg.com


Reports from veterans of vegetarian and raw-food diets, veganism, fruitarianism, and instinctive eating, plus new science from Paleolithic diet research and clinical nutrition.

CONTENTS

Assessing Claims and Credibility / Raw and Alternative Diets
The Raw Vegan Pledge
Comparative Anatomy & Physiology Up to Date
Looking at Ape Diets: Myths, Realities, and Rationalizations
The Fossil Record Evidence about Human Diet
Intelligence, Evolution of the Human Brain, and Diet
Comparative Anatomy
Insights about Human Nutrition & Digestion
Health and Disease in Hunter-Gatherers
Paleolithic Diet vs. Vegetarianism
Dietary Macronutrient Ratios & Their Effect on Biochemical Indicators of Risk for Heart Disease
Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?
Metabolic Evidence of Human Adaptation to Increased Carnivory
Longevity & Health in Ancient Paleolithic vs. Neolithic People
Is Cooked Food Poison?
Does Cooked Food Contain Less Nutrition?
What Every Vegan Should Know About Vitamin B12
The Calorie Paradox of Raw Veganism
Reality Check
Functional and Dysfunctional Lunch Attitudes
Satire
Selected Myths of Raw Foods
Troubleshooting: Avoiding & Overcoming Problems
References

Assessing Claims and Credibility in the Realm of Raw & Alternative Diets

Who Should You Believe?

by Tom Billings

The purpose of this paper is to address the relevant questions:

What criteria can be used to evaluate individuals ("experts" or "diet gurus") making dietary and health claims--particularly when attempting to assess the credibility of information that either lacks scientific documentation, has not yet been well-researched, or remains controversial even where it has been investigated to some degree? Also, why should you assign any more credibility to the information provided on this site than to information provided by those we characterize as dietary extremists?

Gaining a sense of the issues involved in such concerns about credibility, and what to believe, is important, as anyone will know who has ever been faced with what can be a bewildering array of choices about what dietary approach to follow in their own life. An immediate but not fully satisfying answer to the question of this site's approach is to describe it as: an attempt to be realistic; frank and honest about shortcomings and difficulties of alternative diets; moderate, common-sense, sane; and open to new information (even when it contradicts established beliefs of the past). These are the qualities frequently lacking in the approach of dietary extremists.

The context: When science doesn't have the answer to every question

The role of scientific information


Before delving deeper into the question of credibility in the alternative dietary world in general, it is appropriate to examine the context in which the question arises. For some issues in the raw and vegan world, there is ample, relevant, scientific research that addresses the issue(s), and which provides an answer. Sample issues of this type are: Is the diet of the great apes completely fruitarian and/or vegan? Are humans natural vegans and/or did humans evolve as vegans or fruitarians? Is modern cultivated fruit "natural"? Does fruit have a nutritional composition similar to mother's milk?

Some of the material on this site addresses the specific questions above, among others. (By the way, the answer to all the questions listed immediately above is "no," although the answer to the third question may vary depending on your definition of the word "natural.") The above issues, and other issues for which there is adequate scientific research, are important because they provide an effective gauge for determining if an "expert," in general, accepts reality or is in denial of reality.

The role of anecdotal information


However, many important issues in the raw and vegan world have not been the subject of peer-reviewed, published, quality scientific research. Sample issues in this category are: How well does the diet work in the long-term? For which health conditions are (specific) raw diets a good short-term therapy, and for which conditions would it be better to avoid raw diets? What does it mean that so very few succeed long-term on 100% raw vegan diets? And which is the better alternative: a steady 50-75% raw diet, or an on/off pattern, i.e., switching from 100% raw to much less than 100%? As behavioral problems appear to be common on 100% raw diets, what are the mental-health effects of long-term, 100% raw vegan diets?

For such issues, most evidence is anecdotal: personal experience and the experience of others. Hence the question of primary interest here is--when adequate science is not available--whose anecdotal evidence to consider seriously: the evidence presented on this site, or the idealistic views promoted by dietary extremists? That is the central issue addressed in this article.

Structure of this article


The information and evaluative approaches we'll cover are presented in several major sections: The first section provides a contrast between the views found on this site and some of the extreme views one may encounter in the raw and/or vegan movements. The objective of the contrast is to help in assessing the credibility and usefulness of information offered by the various diet gurus or advocates one might encounter. The next major section discusses a tool that can be useful in analyzing whether a promoted diet is credible, or incredible. Then a section deals with the often irrational, hostile behavior of dietary extremists, and discusses the implications such behavior has regarding credibility. Finally, the closing sections discuss how discarding dietary dogma can be not simply of practical benefit on the road to better health, but just as importantly can be a personally liberating experience as well.

Most of the material is presented in summary form to minimize redundancy with other articles on this site. Site articles that provide related information include Idealism vs. Realism in Raw Foods and Selected Myths of Raw Foods.

TABLE OF CONTENTS

Examining extremist attitudes and dogma in raw veganism
  o Convenient and contradictory attitudes toward science and evidence
  o Logical gaps, internal contradictions, and leaps of faith
  o Signs of emotional instability and denial of reality
  o Appeals based on the "cult of personality": authoritarian absolutism, worship of "success" figures/gurus, trashing of "failures" and dissidents

Examining the personal diets of raw vegan gurus: a potential tool for assessing credibility
  o Does the diet really add up? (Calories don't lie)

Raw vegan extremist behavior patterns: the darker side of rawism
  o Introduction: rose-colored glasses vs. the unpleasant realities
  o Some raw/fruitarian "experts" are unable to function in a civil manner
  o Threats, bullying, and harassment: another extremist tactic
  o Does your diet guru promote "dietary racism"?
  o Fruitarian plagiarism: a sign of moral bankruptcy among extremists and "tolerant" followers
  o In Bad Faith: dishonest information and debate tactics in raw foods--personal experiences

Skepticism in raw foods vs. the "golden cage" of dietary dogma
  o Credibility is a continuum
  o Skepticism can spark personal liberation
  o A special message about, and for, extremists
  o Is your "ideal" dietary dogma really a sandcastle?

Examining extremist attitudes and dogma in raw veganism

Convenient and contradictory attitudes toward science and evidence


Examine the position of the extremists on issues for which there is significant scientific evidence. Do they acknowledge reality, or are they in denial? Extremists often display a general pattern of picking and choosing when to accept scientific evidence, and when to reject it. Typically, science (i.e., specific papers or findings) will be accepted by such individuals (and even promoted) when it assists in promoting their idealistic dietary dogma. But it will also be rejected as invalid, e.g., via such mindless slogans as "science is cooked," or claiming the results are invalid because they are for "mixed diets," whenever science debunks their dogma. Additionally, extremists may display an unreasonably aggressive anti-establishment mindset, equating establishment science with evil, and thereby selectively rejecting specific research for political (rather than scientific) reasons. Examples of this are as follows:

Ignoring the extensive evidence from published field observation studies of the great apes, showing that at least a modicum of animal food is a normal part of most ape diets. Instead, the evidence is rejected on the grounds that the extremist believes the apes are "in error," the behavior is "maladaptive," or other flimsy, logically invalid rationalizations are given. Who knows better what is the natural diet of the apes: the wild apes eating animal foods to survive, or an irrational dietary extremist whose reason is clouded by idealistic dogma?

Rejecting evolution as a scientific theory and adopting creationism, solely because evolution (and the fossil record covering millions of years of evidence) shows that humans are natural omnivores/faunivores, and have been ever since the inception of the human genus, Homo, over two million years ago. Adopting creationism for sincere spiritual reasons (i.e., not related to dietary dogma) could perhaps be regarded as legitimate in some sense when limited to the sphere of religion; however, adopting it solely because evolution debunks your dietary dogma is a form of intellectual dishonesty.

Those who reject the published, peer-reviewed scientific research of Victor Herbert, M.D., simply because they disagree with Herbert's political efforts against alternative diets and medicine. Even if one disagrees with Herbert's politics and goals, his research should be evaluated independently of his political views. (Herbert has published several significant papers on vitamin B-12, a topic of great interest to vegans.)

The dietary proponent who demands hard scientific proof (as a debating tactic) when only anecdotal evidence is available (e.g., when others criticize the diet), while accepting as scientific fact (and, in effect, as "holy gospel" as well) some of the unscientific nonsense promoted by "experts" of the past--e.g., T.C. Fry claiming wheatgrass juice is toxic, Herbert Shelton claiming that cooking makes all minerals inorganic, Herbert Shelton claiming all spices are toxic, etc. (The dishonest debating style of a fruitarian "expert" who uses this tactic is discussed in a later section of this paper.)

Extremists who selectively quote scientific research papers out of context, or misrepresent the results of research, to promote their diets. The raw and veg*n communities have the dubious (but mostly appropriate) reputation of tolerating and (in the case of certain extremists) promoting crank science and junk science. An interesting exercise one can perform is to examine the references cited in rawist writings. You will find that the references fall into a number of categories, as follows:

o References to other rawist books (non-scientific, of course). The point here is that much rawist "research" consists of a small group of people quoting each other in a logically circular fashion.

o References to newspapers, magazines, and popular books; i.e., citations that are not peer-reviewed scientific or professional journals, nor even non-peer-reviewed but otherwise relatively well-informed publications by legitimate scientists.

o References to peer-reviewed, scientific journals. Such citations are scarce, but are worth checking when they appear, because you may find, if the writer is an extremist, that he/she has misrepresented or misinterpreted the material in the cited paper.

The primary point of the above list is that much of the available writing on raw diets consists of individuals referencing each other in circular fashion, or is based on non-technical sources. Further, when actual references to scientific journals are provided, you may be surprised to find that even in these instances the material is being misused, particularly if the writer is a hardened fanatic.

A consistent pattern of misuse is an important consideration. Finally, a note of caution here: there is a certain level of subjectivity inherent in language. Thus, a few instances of a person appearing to misinterpret research papers are not proof the person is deliberately misusing the material. Instead, you need to observe a long-term pattern of misuse before concluding that the writer might be an extremist or otherwise untrustworthy. Of course, whenever such a pattern of misuse is present, it points to one or more of the following: intellectual dishonesty; chronic sloppiness and lack of due diligence (which often goes hand-in-hand with intellectual dishonesty); or serious prejudice on the part of the "expert."

The pattern of promoting science when convenient, and ignoring or denouncing it when inconvenient, raises further questions regarding the lack of credibility of true-believing dietary promoters. The above examples further illustrate that such individuals are in denial of reality, an apparent occupational hazard for those with extreme views.

Logical gaps, internal contradictions, and leaps of faith

Extremists often have inconsistent views on the "proper" use of the human intellect. You might encounter extremists who want you to: (1) use your intellect to agree with them that the raw vegan diet is the optimal, "natural" diet you should follow; then, (2) discard all of your intellect, even the most basic human tool-making skills, and limit your food choices to fruits and leaves, which are (falsely) claimed to be the only food choices of mythical "naked apes" (humans) without tools. Needless to say, the above "logic" is really inconsistent nonsense. Even chimps use tools; should we humans voluntarily take ourselves lower than the chimps? The naked-ape hypothesis, presented as self-evident or "revealed truth" by some, is so ridiculous that it might be hilarious, except that some people actually believe it. (By the way, naked apes without tools are not limited to eating only fruits and leaves: one can collect a wide variety of animal products--eggs, insects, snails, etc.--in that manner.) The extremist view of human intellect is similar to their view of science--use it when it helps to promote dietary dogma, ignore it at other times. Again, this reflects an underlying lack of rigor in failing to pause and think through the evidence and logic behind their views, and thus a parallel lack of credibility.

Examine extraordinary claims using basic common sense and logic. This provides an excellent credibility check. In other words, if you stop long enough to begin thinking carefully and critically about what those touting some exclusive "ideal" diet say, you will immediately reject many erroneous extremist positions. (By the way, we encourage you to use logic and common sense when evaluating the material on this site as well. We make no claims to infallibility, and welcome civil, coherent comment and criticism.) Some examples of extremist views that contradict logic and/or common sense are as follows.

Some promote the idea that sweet fruit is the "ideal" food for humans because it is supposedly very similar to human mother's milk.
This is of course nonsense; the nutritional compositions of milk and of sweet fruit are quite different. (See the article Fruit Is Not Like Mother's Milk on this site for a comprehensive analysis and breakdown of the nutritional components of both.) The credibility issue is relevant here on simple logical grounds when those promoting such crackpot mal-nutritional theories turn around and denounce all milk (except for human mother's milk) as unnatural and a bad food. Why? Consider the logic here: If a food is supposed to be good because its composition is like human mother's milk, then goat's milk, or even cow's milk, is a better food than fruit, as its composition is "closer" to human milk than fruit's is. Conversely, if these other animals' milk is unnatural and bad, then sweet fruit is even worse, because it is even less like mother's milk. Here, simple common sense and logic show how poorly thought-out this lopsided theory is--another reason to question the credibility of those promoting it.

In the case of protein (and perhaps fat/starch as well), one can find crank science: crackpot nutritional theories alleging to prove that these items are "toxic" or harmful. Meanwhile, there are such things as essential fatty acids and essential amino acids, the lack of which is known to cause deficiency diseases and symptoms. Consider the credibility of any extremist who tells you that almost everyone on the planet (except them and their followers) has a diet of mostly "toxic" foods, and that food components scientifically proven to be nutritional requirements are supposedly "toxic."

One can find "passionate" rawists claiming that the world's problems are caused by cooked-food consumption, and that if we all become raw vegans, there will be world peace, health, and happiness.
Hmmm... world peace, health, and happiness--all will come from what we put on our lunch plate? That's an awful lot to expect from what is on your lunch plate! If you have any doubts regarding the above, ask yourself: Was World War II an argument over, say, the recipes for egg salad and rice pudding? Has there ever been a war fought specifically over cooked food (vs. raw)? Do people rob, rape, or murder other people specifically because they eat cooked food? The answer to each of the preceding questions is a very clear "no." Is it really even believable on a more indirect level that the cooking of foods might affect the mentality of human beings to such a degree that those eating cooked foods would somehow be psychologically more warlike than those eating nothing but raw foods? In sharp contrast to such unsupported suppositions, and the more peaceful cooked-food eaters actually encountered online, this writer has been the target of hostile, vicious personal attacks (and threats) by allegedly "compassionate" vegan/raw-fooders. If anything, the extensive personal experience of this writer (with extremists) suggests that following a 100% raw vegan diet may make one less peaceful, and more hateful and aggressive. So much for claims of a new, raw "Garden of Eden." To summarize: How credible are those who claim the world will become an Eden if only we all become (100%) raw vegans? The question is particularly relevant to individuals and groups who have a negative, hostile, aggressive style.

How credible are claims that nature will ABSOLUTELY follow the simplistic dogma and bogus "laws of nature" promoted by the dietary extremist?
Nature is not limited to the narrow, absolute, simplistic models promoted by so-called "experts," who are often nothing more than idealistic dietary promoters in denial of reality. Nature does not care what we think of it, or what our models or "laws" say: Nature simply IS, and it is not bound by our simplistic ideas and delusions. Nature also does not care how "nice" or "logical" our theories appear to be. Logic alone cannot prove a theory; theories about nature must also agree with evidence--that is, reality. Don't you think that dietary advocates who claim to know everything about nature--that nature is limited to, and perfectly explained by, their delusions of simplistic absolutist "laws"--are merely expressing their own extraordinary arrogance and ignorance? Nature has many mysteries--that much is obvious to those who are not deluded by simplistic, idealistic dietary dogma. We obviously do not (not yet, at least, and not for the foreseeable future) perfectly understand nature, or its "laws." If, then, science is still continuing to plumb the unknown, how much less so can oversimplistic, dogmatic pronouncements about nature adequately reflect its operation?

Some "experts" make unrealistic claims about their diets: That it will bring perfect health, a perfect body, and/or will cure any/all diseases, and so on. Are such claims credible, or simply incredible?
Ask the proponent to give you a clear, testable, non-trivial definition of perfect health/perfect body (which, by the way, should be interesting in and of itself). Now for a way to test the credibility of those who make such claims: Ask them to sign a legally binding contract, as follows: You agree to follow their diet for a period of, say, 3 years. At the end of that time, if you have perfect health/body, and/or your illness is cured, you will pay a fee of $1,000. However, if at the end of that time you are not cured, or you don't have perfect health, then the "expert" is legally obligated to pay you $1,000. Such a dietary promoter will probably refuse such a contract, making the excuse that they cannot verify your compliance with the diet. But, of course, the same concern applies to them. Raw-food advocates, particularly the fruitarian wing, are notorious for not following the diet they promote, for binge-eating, and for the questionable honesty of certain extremists. If such individuals are as absolutely certain about the efficacy of the diet as they say they are, they should be happy to make an easy $1,000. But they won't, because even in their deep denial, they know that the diet they promote, when put into practice in the real world, often does not work "as advertised"--a fact revealed by their refusal to risk paying a meaningful penalty if the diet fails.

Does the proponent claim his or her diet is based on, or uses, "eternal health principles"?
If so, ask the advocate for a list of these eternal principles. Is there any scientific proof for at least some of them, or do they have, in effect, the status of revealed principles that are supposedly "self-evident"? If they are revealed principles, are they presented explicitly as religion (potentially legitimate, at least when confined to that level) or as science (not legitimate)? A "health science" is different from a religion that has diet or health teachings. If the extremist cannot tell the difference between the two, what does that suggest about his/her credibility?

Does the dietary "expert" retreat into convoluted pseudo-reason to support their claims?
This is readily apparent when an obvious explanation--the diet does not work as well as claimed--is a realistic answer under the circumstances. Meanwhile, the "expert" will come up with fanciful explanations which may contain some surface logic for why it does not work, but which break down when examined for the plausibility and probability that good alternate explanations must have. (Examples of this phenomenon are given below, under the discussion regarding how extremists react when told their ideal diet does not work.)

Signs of emotional instability and denial of reality


Those who promote (obsessive) fear as a motivation for their "ideal, perfect" diet are promoting eating disorders (mental illness) rather than a healthy diet.
One does not have to go far in the raw vegan movement to find people promoting rawism using pathological fear as a major motivating factor. In particular, the following fears are often actively promoted: fear of cooked food, fear of protein, fear of mucus. Fear, if it becomes strong enough (which can happen if one believes the "party line" strongly enough), can turn a raw diet into an eating disorder similar to anorexia nervosa. Fear is not the basis for a diet; it is, however, the basis for an eating disorder.

Fear is an unsuitable motivation for a diet because of the central role of diet in our lives. There is a difference between rejecting a food based on calm reason, and rejecting a food from an emotional stance based on fear or other negative emotions. If you habitually base a food decision on negative emotions, those negative emotions become a part of your psyche. Another way to say this is that if your diet is motivated by fear, you are effectively bringing fear to the table when you eat. In this way, you are placing fear at the very center of your life: this is a very bad idea, as habitual fear is a toxic emotion.

Some readers might claim that fear of unhealthy foods is in fact a good motivation because it will keep you away from bad foods. However, if you think about it, you will reject such reasoning as false, as outlined in the following discussion: Consider the common example of junk food. You, personally, choose whether or not to eat junk food. There are no gangs roaming the streets forcing people to eat junk food at gunpoint. The government does not send police or soldiers into your home to force you to eat junk food. Not only is junk-food consumption your free choice, but unless you make it yourself, you have to go out and buy it/pay money for it so that you can eat it. Now, since junk-food consumption is a free choice, you can also simply recognize that it is not good for you, and, without any fear, choose not to eat it. Instead of feeling fear or other strong negative emotions when you see (or think of) junk food, there is the calm, rational knowledge that it is not good for you, so it is best to avoid it, and you simply choose to eat other (better) food instead. Your reaction here, and your food choices, should be based on calm reason, not the implicit promotion of paranoid/pathological fear (or extremist slogans that implicitly promote fear).

To summarize this point: Those who advocate a diet based on the implicit promotion of pathological fear (by "demonizing" certain foods or components of foods, e.g., protein) are promoting eating disorders, a type of mental illness. Are such "experts" credible? Also consider: Will promoting fear really make the world a better place?

When told that their "ideal, perfect, natural diet" is not working for you, how does the extremist react? What does their reaction say about them, and their credibility?
Extremists often retreat into rationalizations and excuses, e.g., they may say it's your fault, you are not "pure" enough yet, you need coaching, you need more faith/positive thinking to succeed, among other responses. Alternately, such individuals may speculate and say the cause of the failure is non-optimal living conditions, or other non-diet factors. Let's evaluate these responses.

Apply common sense. If you are attempting to follow a raw vegan diet which some claim (in varying degrees) is the "perfect, ideal" diet, the diet you are "designed or created for," why would you ever need much coaching, faith, or positive thinking to succeed on it? Many other people go through life on a diet of meat and potatoes, and they don't need coaching or faith to succeed (to the degree that they do) on such a diet. Does the cow need coaching, faith, or positive thinking to thrive on a diet of grass? Does the tiger need coaching, faith, or positive thinking to thrive on a diet of sambar (deer) meat? Of course not--they are eating their natural diets.

Only very few thrive in the long term on raw vegan diets: What does that suggest about the claim of naturalness of such diets? According to the extremists, the raw vegan diet is supposedly your natural, "perfect" diet. If that's true, then one would expect that, after a transition (some weeks or months, but not years), the raw vegan diet should begin to work extremely well for you, and for others following the same dietary path. However, if you talk to very many of those trying the diet, you will quickly learn that such diets often don't work as well "as advertised." More precisely, raw vegan diets may be good short-term healing/cleansing diets, but are frequently problematic in the long term. The numerous excuses offered to explain this away are simply a type of hand-waving, an effort to distract you from the fact that the diet is apparently not working for you, by suggesting that you are being impatient. What is actually happening here, however, is that the extremists simply cannot, and will not, admit the truth--that raw-food vegan diets may be good for some, but not for everyone; that they are not our natural diet per se, but are a narrow restriction of our natural, evolutionary diet.

There is an additional danger presented by the claims one often hears that detox may take many years: You may develop actual nutritional deficiencies, or disease, and ignore it because you think it is detox.
This belief has led some raw-fooders to actual harm. The idea that detox takes a very long, indefinite time period is also used by zealous dietary advocates as an unfalsifiable tool to explain bad results on the diet: No matter how long you detox, it is never enough! Further, the fact that you are not getting good results is, to them, proof you need further detox (note the circular logic here). Additionally, the idea that it takes several years for the body to detox enough to get significant results places severe limits on the "self-healing is the only healing" myth that some promote.

Further, by blaming you if the diet does not work for you, such blindly emotional advocates are telling you something else as well: namely, that their dietary dogma is more important to them than you--a person--are!

Some dietary oracles effectively worship dietary teachers of the past: T.C. Fry, Herbert Shelton, Arnold Ehret, and others. Some of the teachings of such individuals are worthwhile, but others are outdated or simply incorrect. The idea that "Shelton said it, so therefore it is TRUTH" is narrow-minded, and promotes authoritarianism rather than independent thought. In the field of natural hygiene, the American Natural Hygiene Society is updating and revising, at least in part, some of Shelton's teachings to reflect new information. This is necessary--growth and change are a part of life; if they are absent, there is stagnation. Within the natural hygiene movement, however, there is now a major division as a result--those who want to keep natural hygiene current with new scientific findings vs. those who view any attempts at change as heretical attacks against the movement's grand synthesist (Shelton). The attitude of modern disciples or dietary missionaries regarding the teachers of the past provides some insight into whether a person clings to old, outdated dogma or actively investigates reality.


The desire for absolute certainty or "eternal" revealed truths in the field of diet and health is an earmark of religion rather than science.
Science is not just an end result, but an ongoing process of change and the willingness to reevaluate old beliefs (theories) in the light of new information. The missionary who presents unchanging, absolute, simplistic "laws" regarding diet and health is really presenting their false religion as scientific fact. Perhaps such individuals are simply religious fanatics out of place. That is, their diet has become their religion, and they are fundamentalist zealots, out to save the world from the "evil demons" of cooked food and/or animal foods. (If that is the case, how/when did cooked foods and animal foods become demons?) Note: The above is not intended as a criticism of religion; people have the right to choose their religion. Instead, the above is intended to point out that some extremists may be promoting their "dietary religion" as science, and doing so in a negative, fanatical manner.

Another trait of extremists is that they tend to promote themselves as the proof, or at least as prime examples, that their diet works and is "best."
Yet given the reality that very few thrive on such diets in the long term, the claim by extremists that "it works for me, hence it can and will work for anybody/everybody" is very dubious, and in any case based on invalid logic. The claim is that a few exceptions (such as themselves) somehow prove a general rule. However, exceptions are used in reality to illustrate how the exception differs from the general rule, i.e., to clarify the general rule itself better--what cases it covers, what it doesn't, etc. In other words, exceptions do not prove a rule; because if they did, they would be bona fide examples of the rule, rather than the exceptions they actually are.

An example may clarify the preceding. To be frank, the experience of the writers on this site regarding the few supposedly successful fruitarians, the "exceptions," is that such claims are often illusory or dubious, and usually fail (eventually) to hold up for one or more of the following reasons:

o The person has been on the diet only a very short time--a few weeks or months. This is very common among wanna-be raw "personalities." Even a year or two or three is really not a long enough time to make definitive assessments about the long-term workability of a diet. Riding high on a wave of (perhaps understandable) enthusiasm, the individual may offer themselves up as an example, speaking out long before they have accumulated--or had a chance to observe, in others--much noteworthy experience. Even when they do hear of others' less-than-exemplary results, blinded by their own initial flush of short-term success, the poor track record of others on the diet often simply fails to register or is infinitely rationalized.

o The person claims to be a fruitarian, but uses a different definition of fruit and/or fruitarian than the one used here, i.e., their diet is less than 75+% fruit, or they use a broader definition of the word fruit. (Often so broad--nuts, seeds, leafy greens, etc.--that their "fruitarian" diet is in some cases little different from that of a raw-foodist in general.)

o The person actually eats a more varied diet than they claim, i.e., their actual diet is different from the claimed diet, and/or they binge-eat, cheat, or make "exceptions" to their nominally 100% raw and/or "fruitarian" regime.

o The person's physical health might not be as good as claimed. (T.C. Fry defending fruitarianism while he was on his deathbed comes to mind here--see "The Life and Times of T.C. Fry" in the Nov. 1996 issue of the Health & Beyond newsletter for details.)

o The mental health or psychological balance and well-being of the fruitarian may be questionable (e.g., hateful fanaticism, denial of reality, extreme emotional fragility--just challenge their diet and you may see this firsthand).

Obviously, success claims made when any one of the first four of the above five conditions exists automatically invalidate themselves, and claims by those exhibiting the behavior outlined in the final condition have to be considered highly suspect. (Often one finds under closer questioning that one of the other conditions turns out to apply anyway, when all is said and done.) Here, the heavily self-oriented and self-promotional emphasis on the guru's own success (or that of a very few associates) is proffered as the ultimate justification for their diet. Again, this simply illustrates another facet of the often extreme subjectivity (and consequent lack of concern for a more balanced attempt at objectivity) that typifies the lack of credibility of such advocates.

Extremists may characterize some of the contributors to this site as "failures," because all of the principal writers here eat limited amounts of cooked food, and some of us have experienced problems with raw diets.
Horrors! We eat some cooked foods! We are impure! Call the dietary purity police right now to report this terrible crime! ;-) And while you're at it, also call a mental health professional to discuss how dietary purity has somehow become a critical, but warped and dysfunctional part of your personal self-identity. :-) Reminder: The principal writers on this site have long experience in raw diets. In many cases, we have more experience than some of the prominent extremists.

Have you noticed just how rare strict long-term 100% raw vegans are?
How very few people have been on the diet more than a few years full-time, and without any binges, cheating, "exceptions," or "backsliding"? Have you searched for ex-fruitarians, or ex-rawists, and found them to be more plentiful and much easier to find than those currently on the diet? The point of these questions is to illustrate that fruitarianism rarely if ever works in the long term. (Note that other raw diets tend to work better long-term than fruitarianism. Also, raw diets that are less than 100% raw often work better than 100% raw--surprising but true.) Those who have followed the diet (relatively strictly) continuously for a short term--less than, say, 5 years (a description of some of the extremists advocating fruitarianism)--cannot realistically call themselves a success. Is someone on the diet for a short term a credible example of success?

There is a more serious side to the issue of success, a side that deserves discussion. The argument may be made that "fruitarianism did not work for you, but it works well for me" (where me = a fruitarian advocate).
The problem is, as discussed above, that if one closely examines some of the "role models" of fruitarianism, one often finds, in time, that many of these role models display eating-disorder behavior (binges and/or lying about eating), or other signs (in my opinion) of possible mental unbalance. So, when someone says that "fruitarianism works for me," it can be very hard to believe. Somewhere there may be credible fruitarians who follow the diet very strictly for very long periods, and who are physically healthy and mentally balanced, but I have yet to meet any. To be complete, I should say that I have met a very few non-strict fruitarians who appear to be relatively healthy. Perhaps the key to success on a fruitarian diet is to not be so strict, and to include a variety of foods in the diet?


Further, there is something else here that should be obvious. When an extremist tells you their raw diet is "ideal," and derogatorily brands someone else (e.g., any of the contributors on this site) a failure or backslider because the diet didn't work for them, you should immediately conclude that if the diet does not work for you, then YOU will be branded a failure as well.
Think about that, if you are considering trying fruitarianism, and you are looking for a teacher or mentor. This type of behavior, when apparent, is about the clearest evidence one could ask for to confirm that such a guru cares more about their dogma than the health of their followers.

Examining the personal diets of raw vegan gurus: a potential tool for assessing credibility
DOES THE DIET REALLY ADD UP? (CALORIES DON'T LIE)

Looking closely at an individual's self-reported diet may be a powerful technique, at least in some cases, to assess the credibility of a raw vegan "expert" or "diet guru." However, in attempting to use such information for assessment purposes, one faces the central problem of the raw "experts," i.e., the reality that what they claim to eat is frequently not the same as what they actually eat. In other words, self-reports of diet (especially from extremists) may be unreliable, particularly as such reports may conveniently omit details of "exceptions" or binges.

Compare calorie requirements vs. intake. However, even rough estimates of the calorie levels of self-reported diets may give some insight into the person's credibility. The site article, The Calorie Paradox of Raw Veganism, provides comprehensive information that can potentially help you evaluate the diet of certain "experts." The general evaluation procedure is as follows.

First, using the age, weight, gender, and claimed level of physical activity, make a rough estimate of the daily calorie requirements for the individual in question. Standard sources that one can use to derive estimates of calorie expenditure include NRC [1989, pp. 24-38] and the U.K. Department of Health [1991, pp. 15-38, especially tables 2.5-2.8]. In general, however, most individuals will require 2,000 calories/day as a minimum (anorexics and extremely emaciated individuals excepted, perhaps; even in such unusual cases, however, it would be rare to find anyone consuming less than 1,600-1,800 calories per day on a long-term basis).

Second, ask the person about their typical daily diet--get as detailed a description as possible, one that includes amounts consumed. (Amounts are important; without such information you cannot make a reliable estimate.) Then use the information in the "Calorie Paradox" article to make a rough estimate of their daily calorie intake.

Next, compare estimated calorie requirements with the estimate for calorie intake. If there is a large discrepancy, i.e., calorie intake is far below calorie requirements, then there are three possible explanations:

o The "expert" is getting energy from some mystical source.
o They have anorexia nervosa or a similar eating disorder.
o They are not being honest with you about the quantity/types of foods consumed (e.g., the "expert" binges and/or cheats on the diet, and won't admit it).
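To make the comparison concrete, here is a minimal sketch, in Python, of the arithmetic involved. It is illustrative only: the BMR formula used (Mifflin-St Jeor) and the rough per-100-g calorie densities are standard reference values substituted here for the NRC/U.K. tables cited above, and the sample "reported diet" is entirely hypothetical.

    # Rough credibility check: estimated calorie needs vs. a self-reported diet.
    # Illustrative sketch: uses the Mifflin-St Jeor BMR formula and approximate
    # food-table calorie densities; the "claimed_diet" below is hypothetical.

    KCAL_PER_100G = {"banana": 89, "apple": 52, "orange": 47, "lettuce": 15}

    def daily_requirement(weight_kg, height_cm, age, sex, activity_factor=1.5):
        """Estimate daily calorie needs: BMR (Mifflin-St Jeor) x activity factor."""
        bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if sex == "m" else -161)
        return bmr * activity_factor

    def reported_intake(diet_grams):
        """Sum calories from a {food: grams per day} self-report."""
        return sum(KCAL_PER_100G[food] * grams / 100 for food, grams in diet_grams.items())

    # Hypothetical self-report: ~2.3 kg of fruit and greens per day.
    claimed_diet = {"banana": 1000, "apple": 500, "orange": 500, "lettuce": 300}

    need = daily_requirement(weight_kg=70, height_cm=175, age=40, sex="m")
    intake = reported_intake(claimed_diet)
    print(f"Estimated requirement: {need:.0f} kcal/day")
    print(f"Reported intake:       {intake:.0f} kcal/day")
    if intake < 0.75 * need:
        print("Large shortfall: mystical energy, an eating disorder, "
              "or an unreliable self-report.")

Run on the hypothetical report above, the intake estimate comes to roughly 1,400 kcal/day against a requirement near 2,400 kcal/day--exactly the kind of discrepancy the procedure is designed to expose.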


The simple procedure above will disqualify (or raise serious doubts about) a number of raw, especially fruitarian, diet gurus. No doubt such advocates will make rationalizations and excuses in their defense. However, the simple explanation that the diet gurus are not being honest about their own diet is far more plausible than unsupported rationalizations. Obviously, someone who claims to thrive on a (long-term) diet whose calorie content is at or below starvation level has no credibility.

If the "expert" reports adequate caloric intake. A few raw vegan proponents report food intake
patterns that imply adequate calories, and on the face of it this is a positive sign, at least initially. But there are also a few remaining hurdles when assessing the claims. In probing further, other information from the "Calorie Paradox" article applies, and one should evaluate additional related dietary factors:

High-bulk diet? Does the person claim to eat massive amounts of raw vegetables each day? If
so, is that credible, and/or do you want to eat similar amounts every day yourself? Fruit diets: excess urination? If your "expert" reports consuming large amounts of sweet fruit every day, an interesting exercise is to make an estimate of the daily liquid volume of urine implied by the diet. If you are considering a diet of 4-6 kg. (8-13 lbs.) of fruit per day (the amount that would have to be eaten to obtain adequate calories), note that a substantial portion of that daily input will be excreted as urine. In most people, this will produce "excess" urination. In this case, see if you can obtain information about frequency of urination experienced on the individual's "ideal" fruit diet. Note: Some who follow a fruitarian diet may actually get habituated to frequent urination, and may regard the need to urinate several times per hour as "normal" or "healthy." This needs to be considered in discussing the above with fruitarian proponents.

Long-term maintenance diet vs. short-term diet. A further distinction one needs to
consider in evaluating such diets is that a short-term healing diet might not provide adequate calories. However, a long-term maintenance diet must provide adequate calories. Thus you may not need to analyze a diet advertised only as a short-term therapy. However, you may still find that an analysis of the long-term maintenance diet the "expert" claims is ideal nevertheless provides insight into their overall credibility.

The above are relevant issues to consider in evaluating the credibility of those peddling "ideal" raw/veg*n diets.
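As a rough illustration of the urination exercise mentioned in the list above, the following back-of-the-envelope sketch (in Python, continuing the earlier example) estimates the urine volume implied by a 4-6 kg/day fruit diet. The water fraction and the allowance for non-urine water losses are round assumed figures, not clinical measurements.

    # Back-of-the-envelope estimate of urine output implied by a high-fruit diet.
    # Assumed round numbers: sweet fruit is ~85% water by weight, and roughly
    # 1 liter/day of water leaves via breath, skin, and stool rather than urine.

    FRUIT_WATER_FRACTION = 0.85  # assumed typical water content of sweet fruit
    NON_URINE_LOSSES_L = 1.0     # assumed daily breath/skin/stool water losses

    def urine_estimate_liters(fruit_kg_per_day):
        water_in = fruit_kg_per_day * FRUIT_WATER_FRACTION  # 1 kg water ~ 1 liter
        return max(water_in - NON_URINE_LOSSES_L, 0.0)

    for kg in (4, 5, 6):  # the 4-6 kg/day range discussed above
        print(f"{kg} kg fruit/day -> roughly {urine_estimate_liters(kg):.1f} L urine/day")

Under these assumptions, the 4-6 kg/day range implies roughly 2.4-4.1 liters of urine per day, versus a typical output on the order of 1-2 liters--consistent with the "excess urination" point above.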

Raw vegan extremist behavior patterns: the darker side of rawism


Introduction: rose-colored glasses vs. the unpleasant realities.
Many of us are attracted to and/or get involved in raw vegan diets because of the positive idealism and optimistic (albeit simplistic) outlook that is a part of the "party line." It can be disconcerting, therefore, especially for those who are new to the diet and filled with enthusiasm, to learn that raw vegan diets are not cure-alls, are not perfect, and so on. Even more disconcerting, though, is when one learns that the behavior of certain raw diet gurus is a betrayal of the positive moral qualities that (in theory) underlie vegan diets in general; and further, that the behavior of some diet gurus is a massive betrayal of the trust people place in them, as well.

Accordingly, readers are warned that this section will not appeal to those who prefer the myth of raw diets to the reality. There is a huge gulf between the "party line" of raw vegan diets and the reality. The behavior of extremists discussed here, e.g., hostility, threats, attacks, plagiarism, "dietary racism," etc., is not a positive or pleasant topic. (Names are omitted here for the obvious reasons.)


Thus if a frank discussion of the irrational, negative behavior of certain raw gurus offends you, then you might prefer to skip this section. However, the purpose behind it is that the material which follows may help you avoid needless grief, wasted time, or compromised health from learning the hard way. In that spirit, let's take a look at the darker side of involvement with some of the gurus of rawism.

Some raw/fruitarian "experts" are unable to function in a civil manner.


Some of the writers on this site have been attacked on multiple occasions in the past by hateful extremists (most of whom are vegans and/or fruitarians). A number of such raw/fruitarian diet proponents have earned deservedly bad reputations for their intense, chronic hostility.

Extremists may become socially disabled by their "ideal" dietary dogma. What does chronic hostility suggest about the credibility of such individuals? If they frequently react to challenges to their dogma with hateful personal attacks, extreme anger, threats, plagiarism, copyright infringement, character assassination by quoting others out of context, and/or dishonesty (all of which are discussed herein)--then the obvious implication is that the "expert" may be emotionally impaired or disabled by his/her "lunch philosophy." Certainly it is reasonable to check for the presence or absence of emotional balance in such dietary advocates. Can a dietary proponent who is emotionally disabled by dogma have much credibility? Would it be a good idea for you to adopt the dietary dogma of such an "expert"?

The fact that a number of raw/fruitarian extremists have been thrown off (multiple) email lists on the Internet for hostile behavior provides evidence for the hypothesis that at least some of these dietary prophets are unable to function socially, i.e., are at least partially socially impaired by their "ideal" dietary dogma. Perhaps certain extremists are proud of being thrown off email lists; they may see it as proof that they are "radical." However, in reality it simply reflects how hateful and foolish their behavior is. It's worth pointing out that at least two of the extant raw email lists exist because their owners, moderators, or founders were thrown off other email lists (for cause), and had no choice but to form their own separate lists. Of course, such individuals often cry "censorship" when they are booted from email lists for repeated disregard of the charters that govern list conduct. Such claims are bogus, of course, as censorship is a government function, and the email lists are privately owned--their members residing on the email list at the grace and generosity of the list-owner and moderator.

Threats, bullying, and harassment: another extremist tactic.


The use of threats and bullying provides dietary extremists with yet another weapon to harass people and to try to silence those who dare to challenge their idealistic, "compassionate" dietary dogma. Just a few examples are provided, as follows.

This writer was the target of what could have been interpreted as an implicit death threat from a raw/fruitarian group widely considered (opinion) to behave at times in ways similar to a hate group (the time period here was Summer 1997). While the implied threat was couched in hypothetical terms, and could also have been easily interpreted as mere bluster, an individual from the same group did eventually threaten actual physical harm at a later date. (See "1998 Raw Expo" below.)

One vegan extremist, widely regarded as a "kook" on the Internet, is notorious for threatening people with libel lawsuits. Although most people don't pay serious attention to the individual, the individual may be an example of another socially disabled dietary fanatic.

Threats surrounding the 1998 Raw Expo (San Francisco). At the request of the organizer of a Raw Expo in 1998 (San Francisco), this writer did not attend because the raw/fruitarian group mentioned above was reportedly threatening to disrupt the event and/or possibly physically assault this writer if he appeared at the Expo. This "Expo affair" provides an interesting view into the warped thinking and behavior that is condoned in some raw circles. Consider the situation: a raw/fruitarian group threatened possible violence if this writer, a critic of fruitarianism (and a former long-time fruitarian), attended the event. They used such threats to intimidate the Expo organizer into requesting I not attend. Then, in response, instead of taking simple security measures or simply advising the raw/fruitarian group that they were not welcome to attend the Expo themselves given such threats, the organizer chose to ask the target of the threats (this writer) not to attend. The rationalization offered was that the raw/fruitarian group had already reserved their booth and was to give a lecture at the Expo--and the importance of such economic interests was emphasized by the organizer. I eventually acceded to the Expo organizer's request as a gesture of courtesy to them.

The moral issues illuminated by this event are crystal clear: Some true-believing dietary promoters are so hostile and mentally deranged as to resort to threats of violence when their simplistic dietary philosophy is challenged. Further, some other rawists--e.g., those who continue to ally themselves with such extremists, either in spirit or simply out of financial interests--appear to be willing to tolerate those who use threats of violence and bullying as tools to promote the "compassionate"(?) raw vegan diet. In summary, what transpired in this situation illustrates clearly that some rawists believe the raw vegan diet is far more important than honesty and decency.

Does your diet guru promote "dietary racism"?


Let us begin by defining the term, as follows.

Dietary racism: A form of bigotry that is the analogue, in dietary terms, of racism; i.e., hatred of other groups of people because their diet is different from your diet or a hypothetical "ideal" diet, and/or feeling superior to other people because your diet is different.

Some raw and veg*n diet gurus actively discuss compassion and/or how the diet they promote will make the world a better place. However, the reality is that their message (e.g., in the case of certain extremists) may be seriously undermined by the presence of blatant or subtle dietary racism. A couple of the more blatant examples are:

You are "inferior" unless you follow the advocated diet. Some raw/fruitarian proponents
like to insinuate that unless you have a 100% raw vegan and/or fruitarian diet, then you are "inferior" and/or your DNA "mutates," and you become something less than fully human (i.e., like an animal). The rhetoric of bigotry. Some raw/fruitarian advocates refer to those who eat cooked foods as "cooked-food morons," "cooked-food addicts," or say they are "failures" or "degenerates." Extremist view: crank science good, "cooked science" bad. Very common is for raw or fruitarian promoters of crackpot crank science theories to reject conventional, legitimate science that debunks their dietary dogma, on the grounds that the scientists involved eat cooked foods, hence their work is unreliable and cannot be trusted. This may be expressed in other terms, e.g., saying the scientists have "damaged brains," they deal in "cooked science," and so on. If the bigotry in such assertions is not clear to you, consider Hitler denouncing "Jewish science," and the analogy should then be readily apparent.

Somewhat more subtle examples of dietary racism are:

o Pride in compassion = self-righteousness = feelings of superiority. Veg*ns who actively claim they are more "compassionate" individuals may come to express outright disdain for, and implicitly behave as if they feel they are superior to, those people who eat meat.

o Vegans may claim that those who eat meat are "murderers," because "meat is murder." (This specific claim is not very common.)

Many, and probably most, of the individuals who follow raw/veg*n diets are adamantly opposed to racism, and would not tolerate hateful racist attacks on ethnic groups. Why, then, are so many silent when raw/veg*n extremists openly engage in dietary racism (which can be very vicious, at times)?


Assessing diet gurus who display dietary racist behavior. If you find significant and long-term evidence of such "dietary racism," you should consider the impact that has on credibility, and whether you really want to have such a person as your "diet guru." Does the hatred and bigotry implicit in dietary racism reflect your personal views and values? If you would not follow a regular racist, why would you follow a dietary racist? Do you seriously think dietary racism, as a tactic, will help make the world a better place? Finally, the use of the term "dietary racism" here is meant to clearly illuminate the bigotry and hatred of others that, sadly, is close to the heart of the message promoted by certain raw/veg*n extremists. (The intent is not to diminish or slight the tragedy that regular racism represents.)

Raw/fruitarian plagiarism: a sign of moral bankruptcy among extremists and "tolerant" followers.
It is widely known in the raw community on Internet, and is becoming more widely known outside Internet (and in general veg*n circles) that one of the most heavily promoted recent books on raw-food diets/fruitarianism is in fact a massive plagiarism of the book Raw Eating, which was written by Arshavir Ter Hovannessian and published in English in Iran in the 1960s. The original book Raw Eating is long outof-print, although it may occasionally be found in used bookstores in some countries. Plagiarism is a form of theft, as it comprises the use (and, in this case, the sale) of the intellectual property of another, all done without proper attribution. Further, presenting oneself as the author of plagiarized material is misrepresentation (i.e., lying). If the plagiarism, which so far has been limited to one small extremist raw/fruitarian group, were widely condemned by most other raw/fruitarian "experts," then there would be little reason to discuss the matter. However, that is not what has happened as news of the plagiarism has become widely known. Instead, many raw/fruitarian "experts" have rationalized the plagiarism, and have either sided with the plagiarists or expressed tolerance for (i.e., condoned) the plagiarism. (Note: Isn't it surprising that some raw and fruitarian "experts" who are trying to sell books and tapes--ones they are authors of--would closely ally

Direct effect: can you really put much stock in anything a plagiarist, or one who defends a plagiarist, claims? The alleged raw/fruitarian "experts" who present themselves as "authors" of
themselves with plagiarists?) Plagiarized material, and other alleged raw experts who are closely allied with, or actively support the plagiarists, is implicitly establishing a very important principle with moral implications: that it is acceptable and legitimate to engage in plagiarism and misrepresentation, if the end result of such efforts is that you persuade others to adopt a fruitarian/raw vegan diet. More explicitly, the principle and its ramifications are:

Extremist Principle: Misrepresentation is, or at least can be tolerated as, a promotional tool if it serves a "larger good"--that is, convincing people to adopt a fruitarian or raw vegan diet. Once this code of behavior is accepted, one has to conclude that
every claim made by such extremists (plagiarists as well as allies of plagiarists) is suspect, and nothing they say can be completely trusted. After all, if the "expert" thinks it's permissible to engage in misrepresentation, how can you assess whether they are telling the truth when they claim they follow a 100% raw/fruitarian diet, that their diet is so great, and so on? Indirect effect: Selling one's virtue for a raw lunch. The acceptance of misrepresentation-which constitutes complicity by those who either side with, or in some manner defend or actively publish the message of such plagiarists--establishes yet another moral principle (which logically follows from the above). In outline form, the elements of it are: A. Principle #1: Misrepresentation is okay if it promotes raw diets. B. Misrepresentation = dishonesty = giving up virtue. A and B together, comprise a restatement to the effect that:


C. Giving up virtue is OK if it promotes raw diets.

We also have:

D. Giving up virtue = exchanging virtue = selling virtue.

Note that the word selling here is quite appropriate because an exchange, in figurative terms, has been made: the extremist exchanges his or her honesty and integrity (i.e., virtue) for an increase in the ability to convince others to adopt the 100% raw/fruitarian diet. This brings us to:

Principle #2: Selling your virtue is legitimate, if the price you charge is that more people eat a raw vegan diet.

The point of the above exercise is simply to show that even extremists who do not plagiarize, but who merely condone it, are still, in effect, selling out their own virtue (i.e., their own honesty and credibility) for the power to recruit more people to their diets (by recommending the works of plagiarists to others, or helping to spread word about the authors or "their" works, etc.). In selling out their virtue, then, such people are, in a sense, "prostituting" themselves for the sake of promoting their "ideal" diet at the price of integrity and honest representation of facts.

Ends do not justify the means. The two principles discussed above could be seen as versions
of the tired old "the ends justify the means" excuse, which might theoretically be used in an attempt to justify any deviant behavior, from petty theft to murder. Be aware of this, should "the ends justify the means" argument actually be brought up as an explicit defense by overzealous diet promoters.

"Refusal to judge" where plagiarism is concerned is hypocritical and simply a cover-up for denial. One of the common excuses raised by rawists or fruitarians who have
elected to refrain from comment, or from taking a stance, with regard to the plagiarism mentioned above has been to assert that they are simply being "neutral" when they "refuse to judge" the plagiarists. However, this is really an attempt (albeit, probably, a mostly subconscious one) to divert attention from the important issues involved. Basically, the "refusal to judge" excuse fails for the following reasons:

The decision to eat a raw diet is, obviously, in itself an instance of judgment.
Those who use the "refusal to judge" excuse are simply being silly. Judgments are necessary in life, and we make them all the time. Here, however, a double standard is promoted: one should engage in judgment when selecting a diet to follow, but should not judge the relevant behavior of one's erstwhile "diet guru." What this really indicates is a (perhaps subconscious) refusal to be honest with oneself about the fact that we make judgments all the time; and of course, such a double standard is hypocritical. Here, "remaining neutral" is simply a cop-out. Additionally, the refusal to judge when there is clear evidence of plagiarism or other dishonest behavior is simply a form of denial of reality. In a way, this is not surprising, as (far too) many raw diet gurus appear to be in varying stages of denial of reality themselves.

Short-term gains at the expense of long-term growth/credibility. A very important underlying issue here is whether plagiarized writings (or any that may be misrepresented in other ways, for that matter) form a solid basis for the long-term growth of the raw/fruitarian community, or whether plagiarism marginalizes the community and sharply limits growth. One would expect a credible raw advocate to oppose plagiarism on the grounds--ethical considerations completely aside--that it is harmful, in the long run, to the entire raw community in terms of credibility.


Because of this, those who make excuses and ally themselves with plagiarists are in effect relinquishing their own credibility as well.

If you follow the advice of a particular dietary guru, where do they stand on the issue? Because it has significant bearing on the credibility of your diet guru, you may find it
relevant to determine whether they side with those who have plagiarized, or have engaged in such behavior themselves. This is primarily of relevance to those in the raw/fruitarian community; it is not relevant at present to the conventional veg*n community. Note here, of course, that it's possible you might not get an honest answer if you directly question your diet guru on this issue. Thus, you may need to ask around the raw community for information. With a bit of searching and detective work, you will find out what is going on.

"So nobody's perfect": What's the difference between forgivable offenses/human error vs. real extremism/lack of credibility?
It should be acknowledged here that everyone makes mistakes on occasion. After all, no one is perfect. It's only human that sometimes people may get caught up in the emotion of the moment, and say or do things they later regret. At other times, or in other ways, unfortunate errors in judgment may occasionally be made. In fact, for the record, we ought to 'fess up here that one of the Beyond Veg site editors was once briefly suspended from an email listgroup for an egregious practical joke that got out of hand, causing others on the list considerable anguish. As was the case in that particular situation, however, one needs to examine whether regret is expressed, a genuine apology made and accepted, something learned from the situation, and the behavior rectified.

It is not the intention here to promote intolerance for human errors in judgment, or for emotions getting out of hand on occasion. Nor should people get too bent out of shape about the occasional flame (so long as they remain occasional), become oversensitive about a few sporadic heated exchanges, or be unable to take things in stride on email listgroups or elsewhere. If we couldn't take such things in stride, it would be a world of impossibly inhuman expectations that few could live up to.

How does one distinguish between actual bad faith or lack of credibility, and simple human mistakes? There is an important key by which one can assess if a particular behavior is coming out of
extremism and lack of credibility or not. Very simply, it is when the person regrets what they have done and has the character to offer a genuine apology when having "crossed the line," and gives some indication of having learned something, in that their behavior thereafter changes. With regard to extremist behavior, what distinguishes bad-faith tactics is that this almost never happens. At least to date, it's not something that's been in evidence with the fanatical types of behavior under discussion here. (Also, extremists rarely ever will admit they were wrong about anything when information and evidence are at issue either. Here, admissions of error in owning up to having missed the mark are very similar in their psychology to admissions of apology for errant behavior.)

Rationalizations rather than apologies or admission of error indicate extremism. Why the
unwillingness to apologize? Here, what is probably even more interesting is that you will find when extremists are asked about the kind of behaviors that have been discussed here, they generally simply do not see them as a problem--or at least not much of one--even in the face of feedback from numerous others that they are. An additional key here, then, is: not only do such inflexible absolutists not apologize, they don't see any problem with such behavior in the first place. That is, the behavior is simply rationalized: i.e., it's "radical," "passionate," "raw courage"; plagiarism may be "proactive marketing," etc.; but rarely ever is it a mistake that is admitted, regretted, or apologized for. And, sadly, apology itself may even be labeled a sign of weakness, rather than the sign of character strength and credibility it actually is.


In Bad Faith: dishonest information and debate tactics in raw foods--personal experiences.
Let's now consider a raw diet promoter who decides to attack some writings that he/she disagrees with--say, for instance, some of the writings on this site, which such individuals are almost certain to dispute. If an "expert" challenges statements made here via the debate tactic of repeatedly demanding "proof" (implicitly meaning published scientific evidence) for specific claims for which only anecdotal evidence is available, then one must question the credibility of said "expert." After all, a real expert would have knowledge of at least part of the scientific evidence relevant to raw diets, and should recognize that for many issues in raw, only anecdotal evidence is available. Like it or not, the reality is that anecdotal evidence is all that is available for some questions, and those who are credible acknowledge reality. A chronic lack of such acknowledgment, therefore, suggests that the alleged "expert" is using (whether deliberately or more unconsciously) deceptive tactics as a debate strategy, hence may be intellectually dishonest. But this is only one tactic among several. As it happens, these sorts of tactics have already become familiar to a couple of contributors to this website from direct experience.

Extremist debate tactic: Quote out of context, then misrepresent the material. The
tactic mentioned in the above two paragraphs can be illustrated with examples from this writer's personal experience. On a few occasions in the past, certain of my articles have been republished--without permission (copyright infringement)--and angrily attacked by a fruitarian "expert." In such republications (on an email forum), the articles were interspersed with numerous angry demands for "proof" from the alleged "expert." The articles were also misrepresented and misinterpreted--by the exclusion, of course, of specifically relevant parts of the article that would have revealed the distortions this individual engaged in. Relevant comments on these attacks are as follows.

One such article (Vegan Attitudes Toward Instinctive Eating, available on this site in since-updated form) was (and still is) labeled explicitly as personal opinion--needless to say, no one is obligated to "prove" their opinions to an angry, irrational extremist.

Hostile attacks may result from underlying eating-disorder behavior. One angry attack included brief quotes from the article Health Food Junkie, by Stephen Bratman, available on this site. Inasmuch as the "Health Food Junkie" article describes how obsession with diet can become something akin to a mental illness (i.e., an eating disorder), in this instance it can be rather humorously observed (in my opinion) that the angry attack merely provided an excellent example of the type of unbalanced behavior typical of those with such eating disorders.

Extremist debate tactic: demand proof NOW even when an article has been explicitly presented as a summary only, with further evidence scheduled for future publication. Another article (Selected Myths of Raw Foods, also available on
this site) was specifically labeled as an introductory "capsule presentation" only. Made explicit in the very first paragraph of the article was a statement that a more detailed treatment with supporting evidence would be provided in other articles, to be available later. When the "expert" republished extensive parts of the article that they proceeded to dispute, they conveniently left out the above notice regarding evidence to follow at a later date, and repeatedly demanded "proof" from the article. At yet another point in their sliced-and-diced republication of the article, the "expert" deleted still another explicit statement that evidence for the specific issue being discussed in that passage would be provided in a scheduled future article, while inserting their own demand for (immediate) "proof" of the statement. Consider: Are such actions fair and honest, or are they unfair, deceptive, and intellectually dishonest?


(Note: The examples mentioned in the three above bullet points occurred some time ago, in 1997-1998.)

Harassment via spamming. Besides the above unauthorized republications, the fruitarian "expert" also involuntarily subscribed this writer, for a short period, to their email list (in my opinion, a crank-science nutrition forum), demonstrating (in my opinion) their unbalanced emotional state by repeatedly attacking this writer there in a vituperative, insulting manner. Needless to say, such behavior constitutes spamming and deliberate harassment.

To put such behavior in perspective: Would you consider it reasonable if someone were to take three articles from a popular health magazine (at least one of which is identified as editorial opinion), and republish them on the Internet (in violation of copyright), interspersed with numerous angry, egocentric demands for "proof," and including insults and denunciations of the articles' authors? Consider: Would that be the action of a rational, sane person--or would it be more accurate to describe such behavior as irrational, insane, and hateful? Is this the behavior one would expect from a serious, "scientific" critic (which the fruitarian "expert" here appears to pretend to be)? Or is such behavior what one would expect, rather, from a raving lunatic or crank? Finally, would that be the act of an honest, credible individual? What credibility does a raw/fruitarian "expert" have who (frequently) engages in such irrational and unfair behavior regarding the writings of others?

Extremists often cannot/do not meet the standard of proof they demand from others. To make matters worse, such behavior is also hypocritical, as the alleged "expert" habitually does not provide--and has not provided (to date)--published scientific studies on fruitarians (i.e., "proof," by their own standards) for ANY of their claims about fruitarianism. (There are almost no published research studies done on fruitarians.) This illustrates a hypocritical double standard: the extremist demands "scientific" proof of common anecdotal observations that fruitarianism does not work, while they have not yet presented any published scientific evidence that it does work. It also neatly reverses the burden of proof--a logically invalid step by accepted standards of evidence. That is, the efficacy of fruitarianism has never been proven, hence the logical burden is on the advocates of fruitarianism to provide credible proof, not on others to prove it doesn't work. To date, no such proof--other than crank-science theories, of course--has ever been presented. Meanwhile, extensive anecdotal evidence, combined with extensive indirect scientific evidence, suggests that 100% raw vegan fruitarian diets are nutritionally deficient and highly unlikely to be successful in the long run.

Example of raw/fruitarian debater relying on plagiarized source material. In another incident, involving a different raw/fruitarian proponent, one "diet guru" challenged a different writer on this site to debate evolution. The challenge was made in a long article, posted to a raw listgroup, that criticized evolution for allegedly being unscientific. However, it turned out that over half of the material in the posting the raw/fruitarian writer claimed authorship of (and put a copyright notice on, no less) was in fact plagiarized from the creationist book Darwin on Trial, by Phillip Johnson. Detailed side-by-side comparisons of passages from the raw/fruitarian advocate's post with those from the book, made in a reply to the fruitarian's post, left no doubt about the extensive plagiarism. Is there any reason to debate a known plagiarist?

That such dietary extremists are willing to stoop to such tactics in defense of the dogma they promote merely underscores that they are not interested in reality or decency. Instead, they have the "one true religion and science of the 100% raw fruit/vegan lunch" to promote; and dishonest debate tactics, hostility,


and blatant hypocrisy are standard operating procedure for certain of these so-called raw/fruitarian "experts." What credibility can they be given on other issues, when they so blatantly disregard any standard of truthful disclosure in instances such as this? The obvious answer for any thinking person is that such "experts" forfeit any claim to credibility they might otherwise have had.

Rationalizations of extremist behavior.


No doubt some will try to rationalize their negative behavior via the tired, old excuse that they are merely "passionate" (about their lunch, no less). Ask yourself whether disagreements over lunch philosophy can justify the above aggressive, dishonest acts. The extremists call it "passion"; however, most other people would regard such behavior as reflecting intense hatred and mental toxicity. Note the emphasis here on emotional and mental balance/health. There is a reason for such emphasis--the inability to function in a civil manner when one's dietary dogma is challenged is a common condition among such individuals, and is a sign they may be mentally UNhealthy, and hence have little credibility.

Mainstream rawists are usually not extremists.


Implicit in the usage of the word "extreme" is that it denotes unusual, not-so-common circumstances. Here this means that although the raw and veg*n movements are infested with extremists, especially the fruitarian wing, it is not the case that every rawist is an extremist. Although it may be redundant, the preceding should be stated explicitly, as follows.

Most rawists are not extremists.
Most veg*ns are not extremists.
Many fruitarians are not extremists.

Having asserted that most rawists are not extremists, it should be noted that in my opinion, at least some of the self-appointed "experts" or "diet gurus" are, and the raw and veg*n movements suffer from the burden of their negative influence. While extremists themselves may not be common in raw, unfortunately they do tend to comprise its most visible element--and the extremist attitudes, myths, and misconceptions they promote have filtered out into the movement and are very commonly reflected in various rawist beliefs, myths, etc. Do such factors help the raw community to grow, or do such fanatical attitudes and behavior patterns simply relegate raw diets to the "lunatic fringe" of the alternative health world?


The Raw Vegan Pledge of Extremism, Ignorance & Denial


by Tom Billings
Copyright 1998 by Thomas E. Billings. All rights reserved.

If you've just surfed into the Beyond Veg site and this is the first article on the site that you read, we'll give fair warning here that without a background context for the piece, you might falsely assume the Pledge is anti-vegan. (Vegans are known for--how to say it--taking their lunch far too seriously at times.) In reality, the Pledge is a call for reform in the raw vegan movement, disguised as sarcastic satire. To avoid any misunderstanding, we suggest you read through the other material pertaining to raw veganism on this site before reading the Pledge. However, if you've already read a number of articles on the site, or you're particularly adventurous, then you MAY be ready for the sarcastic humor of the Pledge. How can you tell? It's not hard: If the material on this website does NOT appeal to you; if it offends you by challenging many of the things you believe in; if it offends you by criticizing your lunch--then perhaps you should seriously consider actually TAKING the following pledge. On the other hand, if you appreciate the material on this site, you might find the lampooning here of raw vegan extremists to be very funny and "on target." Enjoy! :-)

Note: The following is intended as a sharp, sarcastic satire of the current state (1998) of extremism in the raw vegan world. If you are easily offended, you may want to skip this article.

The International Raw Alliance of Vegan Extremists (IRAVE) MEMBERSHIP PLEDGE


WHEREAS:

1. The raw vegan diet is the one true religion and science of perfect health.
2. The vegan diet is based on high moral principles, particularly compassion.
3. Compassion should be promoted toward all living beings on this planet, with the obvious exception of those murderous meat-eaters and the "cooked," those who consume cooked foods. (Those who eat meat or cooked foods are degenerates, hence are "fair game" for insults, harassment, threats, and other similar forms of enlightened, compassionate interaction.)
4. The International Raw Alliance of Vegan Extremists (referred to hereafter by the initials IRAVE) is an organization devoted to promoting the glorious 100% raw vegan diet.
5. IRAVE has extremely strict criteria for membership.

AND: I wish to become a full-fledged member of IRAVE, including the rights and responsibilities thereof (i.e., the right to believe and follow, without challenge or question, the teachings of IRAVE, and the responsibility to regularly send money to the IRAVE leadership).

IN RECOGNITION OF THE ABOVE:


I do hereby, of my own free will, swear or affirm my complete allegiance to the following pledge:

PLEDGE
I will follow a 100% raw vegan diet, with no backsliding or exceptions! If asked whether I ever cheat on the diet, I will DENY it, and I will IGNORE the fact that others backslide or cheat. Remember one of the many brilliant slogans taught by the IRAVE leadership: Ignorance is bliss!

I will IGNORE the post-1970 research that shows animal foods are a normal part of the diet of the great apes, our closest primate relatives. If challenged on this, I will DENY the validity of current research and claim that the apes eating animal foods are "in error," "perverted," or give other imaginative rationalization(s).

I will IGNORE the scientific evidence that our prehistoric ancestors were omnivores. I will cite outdated (and retracted) dental wear studies to support my views. If necessary, I will DENY science and evolution, and adopt creationism, for the sole reason that it can be molded to support my false dietary dogma--that humans are natural vegans/fruitarians.

I will IGNORE the evidence of comparative anatomy, which suggests that animal foods can be a natural part of the human diet. Instead, I will use the misleading human-lion-cow comparison, and claim that it shows we are natural vegetarians because we are "closer" to cows than lions.

I will IGNORE any health problems that occur on the diet, and will consider them a sign of detox, a sign from above that I am not fully complying with the glorious 100% raw vegan diet in the theologically correct manner. I will resolve to follow the diet ever more closely, and will resist all temptations by the evil demon of cooked foods!

If 100% raw veganism does not work for someone, and the "detox" excuse fails, then I will use another unfalsifiable excuse: food quality. I will claim that their organic food was grown on a former toxic waste dump, or demand they produce a log of Brix readings and soil tests for all food in their diet. In so doing, I will IGNORE the reality that high-quality foods are but one factor among many in a diet, and do not guarantee success on any diet; and I will DENY the reality that such demands are arrogant, hostile, and stupid.

I will observe, and keep holy, the sacred sacrament of food combining.

I will tell others that the diet is ideal, perfect, and natural, and will DENY the real experience of those who have problems on the diet, or for whom the diet does not work. The diet is Perfect: any problems simply must be the fault of the individual!

I will DENY that modern fruit contains excess sugar; I will IGNORE the reality that people on high-% fruit diets report the symptoms of excess sugar consumption (sugar highs/blues, fatigue, excess urination, thirst, etc.). I will claim that fruit is the "highest food," and will IGNORE the reality that modern cultivated fruit has been bred for generations (in some cases, literally thousands of years) for high sugar content.

I will IGNORE the intellectual dishonesty of those who make unrealistic claims for the diet--that it will cure all diseases, bring personal happiness, perfect health, world peace, etc.

I will promote, or tolerate the promotion of, raw vegan diets using fear as a major motivator: fear of cooked foods, protein, or mucus. In so doing, I will IGNORE/DENY that such motivations are mentally toxic, pathological, and promote mental illness.

I will IGNORE or DENY the obvious reality that blaming the problems of the world on cooked foods is intellectually dishonest, and that promoting the fear/hatred of those who eat cooked foods (or meat) is analogous to racism.
I will IGNORE or attack those who criticize the glorious 100% raw vegan diet, especially if they cite scientific research. Science is the product of the minds of cooked-food consumers, thus it is a product of the evil demon of cooked foods! Beware the monstrous demon of cooked foods!

I will IGNORE those who encourage me to think for myself. Independent thought is similar to science, hence another trick by the demon of cooked foods! Instead, I will mindlessly follow all the teachings and guidance of the IRAVE leadership. What a wonderful blessing--I don't have to think anymore!


I will IGNORE the reality that some raw vegan dogma is anti-common-sense, and logically invalid/inconsistent. Once again, common sense and logic are like science--yet another trap by the clever demon of cooked foods.

I will attack those who criticize the diet using anecdotal evidence, IGNORING the reality that only anecdotal evidence is available for many issues. Instead, I will demand published scientific studies on raw-fooders as proof, even though I cannot produce such studies to support my own claims. If my approach is challenged as inconsistent, I will IGNORE/DENY it, as, after all, intellectual honesty and consistency are also products of the evil demon of cooked foods!

I will never accept, and will always DENY, the reality that raw veganism is not the natural diet of humans, but is instead a restriction of that natural diet.

I will IGNORE or condone those who promote the raw vegan diet via negative, hostile methods (harassment, threats, bullying, insults, etc.). I will claim that such tactics are a joke, a marketing ploy, or that the end justifies the means, if it saves people from the demon of cooked foods!

If I, or other extremist members, are criticized because our methods are anti-compassion, I will DENY that veganism has anything to do with compassion. I will claim that veganism is based only on animal liberation, and will IGNORE the obvious reality that, without compassion, animal liberation is meaningless--the equivalent of automobile liberation.

If any of the leaders of IRAVE engage in plagiarism in furthering their agenda, as has been known to happen, I will IGNORE it. Although plagiarizing a book is stealing (of intellectual property), and presenting oneself as author of plagiarized material is lying, I will IGNORE (or even support) such activities if they promote the one true religion and science that is the 100% raw vegan diet.

As a member of IRAVE, I will IGNORE and DENY the need for integrity. Integrity is yet another clever trick by the demon of cooked foods! By joining IRAVE, I adopt instead "raw integrity" and "raw courage," a radical lunch-based system of ethics in which most any means--including lying and falsification of scientific information, cheating, etc.--are all good if they promote the glorious end we uphold: the 100% raw vegan lunch.

I pledge to support the efforts of other members of IRAVE, even if they promote the glorious 100% raw vegan diet through such imaginative claims as "protein is poison," "fruit is just like mother's milk," and so on. If such claims are challenged, I will DENY the reality that they are crank science, and will instead assert that they are valid science. Anything repeated often and loudly enough MUST be true!

If a member of IRAVE claims to thrive on a diet whose calorie content is patently or demonstrably below starvation level, I will not challenge their claim, but will defend it when questioned. If necessary, I will IGNORE and DENY that such claims are apparent evidence of secret binge-eating and dishonesty by those making them.

Whenever reality contradicts raw vegan dogma, I pledge to IGNORE reality and live in DENIAL. After all, a life of IGNORANCE and DENIAL is a life of pure joy and bliss, and a superior way of life, per the teachings of the IRAVE leadership.

I pledge that raw vegan dogma (obsession with food and dietary purity) is the most important thing in my life, and in the world. Further, raw vegan dogma is more important than the safety, health, nutritional status, or well-being of other people.
I will attack those who challenge my obsessions, and will DENY the reality that my attitudes are mentally UNhealthy, and a sign of serious mental UNbalance.

In certification of my complete personal allegiance to the above solemn pledge, I hereby sign my name using one of my own precious bodily fluids:
____________________ (Signature)    ____________________ (Date)


Skepticism in raw foods vs. the "golden cage" of dietary dogma

Credibility is a continuum
The discussion here has covered many issues and behaviors. It is appropriate to remember, therefore, that credibility is a continuum, and not a strictly binary classification (credible vs. not credible). Evaluation of potential extremist "gurus" must be done on an individual basis, as only some of the material here might apply to a particular advocate. Also, someone who is generally credible on most issues might be less credible on a few select issues, due to (differences in) personal philosophy, or ordinary individualism. Evaluation of credibility requires assessing a wide range of issues and behaviors, and making an evaluation based on an overall pattern. Basing an evaluation on only one point may be too narrow and inflexible. Of course, narrowness and inflexibility are trademarks of extremism, and traits to be avoided. So, get as much information as possible when evaluating potential "diet gurus." You probably don't want to trust your health and well-being to an extremist!

Skepticism can spark personal liberation

Why so much skepticism here?


Most of the writers on this site generally follow mostly raw-food-type diets (paleo, instincto, raw lactovegetarian; none of us are 100% raw at present). At one stage or another, we were raw vegans, some of us 100% raw, for varying lengths of time. We even promoted raw diets, though not in the self-conscious "missionary" style that characterizes the current extremists. In other words, the writers on this site had much in common (in the past) with today's extremists. (Of course there were differences--we hopefully had much better manners than many of today's obnoxious and toxic extremists; at least we sure hope the perception by others was that we did! ;-) )

What happened, then, to make the writers here so skeptical about the efficacy of raw vegan diets? Very simply: We had too many problems on the diets, and we saw too many problems in others following the diets, over the long term. Reality intervened, and showed us that raw vegan diets, especially 100% raw vegan diets, and their associated dogma, often do not work as advertised (in the long term). Further, recent scientific research, plus ordinary logic and common sense applied to raw and vegan dogma, debunk major parts of the raw vegan "party line." Because of this, we came to the awareness that portions of the philosophies that underlie rawism and veganism are intellectually bankrupt (and may be ethically bankrupt as well).

Seeing the whole picture brings wider options and more freedom.
This brings us to the motivations of this site:

To communicate to you that dietary extremists, blinded by idealism, are not telling you the whole story.

To discuss some of the problems that can occur on such diets.

To examine the beliefs of rawism/veganism in an open, dogma-free manner.

To document that the idea that (raw) veganism (or even vegetarianism) is your "natural" diet is not in accord with scientific evidence. (One can still be a vegetarian for spiritual reasons, of course. And vegetarian diets can certainly be healthier than the SAD as well, though contrary to popular perception within the vegan movement at large, vegetarian diets don't necessarily work well for everyone.)

Finally, the major objective of this site is to motivate you to do a thorough, critical re-examination of your diet and any dietary dogma that might accompany it. Regardless of whether or not you change your diet because of what you read here, if you begin to think critically about dietary idealism and the claims of the "experts" or "diet gurus," then we have accomplished a great deal--for then, you have transcended the stance of "follower" and are thinking for yourself. That is a major step forward, and the first step toward freedom and liberation from (false) dogma.


Having to think for yourself frees you from the constricting mental prison of narrow dietary dogma.
Thinking for oneself on dietary matters may be frightening at first. It can be scary to go from the absolute certainty involved in being a follower of cult-like raw dietary extremists, whose aggressive and hostile style actively discourages individual thought, to ditching them and thinking for yourself. However, in that move, you can discover the seeds of freedom, independence, and enhanced self-respect. The "party line" of 100% raw vegan dogma does not "set you free"--instead it binds you to a set of strict rules and a narrow mindset that promotes food obsessions. Thinking for yourself may be a challenge at first, but the rewards make it very worthwhile.

Life is full of uncertainty--a condition very different from that of the extremists, who foolishly (and falsely) claim complete knowledge of nature and its laws (or of the only ones that they believe matter). Some decisions will be major challenges. However, that is the point--and, quite literally, the fun--of life. A life without challenge, lived according to narrow dietary dogmas, is no fun at all. Take that as advice from one who has personally "been there, done that, no thanks!"

We invite you to embark on the ongoing, individual effort to develop an open, honest relationship with nature and reality. The first step is to begin a critical examination of the dietary dogma that you may have been taught in the past.

A special message about, and for, extremists

Extremism and black-and-white morality generate a destructive us/them mentality.


The vegan movement has been criticized for a tendency to divide the world into "us" (good vegans) vs. "them" (bad meat-eaters). This negative practice is, to a certain extent, also reflected in the raw community. The nature of this article is such that comparisons are necessary to convey the essential information. Although this writer has had many unpleasant experiences with hateful dietary extremists, it is not the goal here to condemn "them," i.e., extremists. Probably none of us are immune to the tendency to be overzealous at times, so the intent here is not to focus criticism on the person, but rather on the (pattern of) behavior.

Further, it should be noted that those who advocate views based in extremism do so of their own free choice. They can also review the extremism of their positions, and personally choose to change their behavior. As those of us contributing to this site have personally learned through hard experience of our own (experience that we occasionally have to re-learn :-) ), giving up one's extremist mindset can be a very liberating act that releases you from the "golden cage" of narrow, restrictive dietary dogma.

The "golden cage" of dietary dogma.


A life lived in the golden cage of dietary dogma is one of intense, life-controlling obsession with diet/food, a life in which one foolishly self-identifies with one's diet. Ultimately, such a life degenerates into self-righteousness as well. The end result of this is often a serious mental imbalance, and the hostile actions of certain extremists simply confirm their unfortunate condition. Can you imagine two people on SAD diets fighting over whether corn chips or potato chips are better? Wouldn't most people regard such an argument as truly insane? But that is exactly what extremists are doing when they make hostile attacks and threats against those who challenge their not-so-perfect diets. Is that kind of behavior mentally healthy, or unhealthy?


However, whether the individual in question is oneself or another, there is hope--extremism can always be released in favor of tolerance, by the simple act of loving and accepting oneself more fully, as one currently is, without insisting on instant change, or demanding that one comply with a (new/different) standard of perfectionism. Loving and accepting yourself as you are right now is the first step to loving and accepting others as they are (even if they eat meat or cooked foods).

Also, note that lasting, positive changes come more easily and reliably from gentle, consistent efforts than from sudden, radical changes (e.g., demanding strict conformance to a new standard of behavior--like a new but difficult and narrow diet). Sudden changes often lead to burnout, whether physical or mental. In the long run, slow, gentle change is healthier for body and mind than is extremism. Extremism may show good short-term results, but at a high cost, and with negative side effects (e.g., long-term physical and/or mental problems).

Note: The use of a golden cage as a metaphor above is deliberate. See the book The Golden Cage: The Enigma of Anorexia Nervosa, by Hilde Bruch (1978, Harvard University Press, Cambridge, MA) for the source of the metaphor.

Is your "ideal" dietary dogma really a sandcastle?


Finally, a general observation. In figurative terms, following the false dogma promoted by dietary extremists can be characterized as living in a small sandcastle built right by the waves, at low tide, on the shoreline of the ocean of truth. But now things are changing in the dietary world. The recent scientific information (post-1970) confirming that apes eat animal foods as part of their natural diets is becoming more generally known. Additionally, the fact that our evolutionary diet indicates that humans are--and always have been--natural omnivores/faunivores only began receiving wide attention in the scientific journals in 1985, with first dissemination to the public at large (via a popular book) in 1988. The vegetarian movement was very small in the U.S. prior to 1970, and it is only now that the varying potential long-term results of vegetarianism (depending on the individual) are becoming clear.

What this means, in figurative terms, is that high tide is now beginning in the ocean of truth. This high tide will ultimately sweep away the sandcastles of false dietary dogma, and redefine the shoreline of the ocean of truth. When such major changes occur, you have a choice--stay with the enchanting but dissolving sandcastle, or move to higher ground. Here, the higher ground refers to a place where there is no dogma, save truth and honest concern for your health. Ultimately, these two concerns--truth and your health--are the ones that matter the most, as you search for a diet that serves your body, mind, and spirit in the best way possible.

--Tom Billings

Comparative Anatomy and Physiology Brought Up to Date


Are Humans Natural Frugivores/Vegetarians, or Omnivores/Faunivores?
by Tom Billings
Copyright 1999 by Thomas E. Billings. All rights reserved.


An Invitation to Readers
If you are--or have ever been--involved in alternative diets or vegetarianism, you have probably heard or read claims that comparative anatomy and/or comparative physiology "proves" or (in more conservative language) "provides powerful evidence" that humans are "natural" fruitarians, vegetarians, or even omnivores. This paper will assess such claims and the evidence supporting them, but we first need to ask whether comparative anatomy and physiology provide actual hard proof of the precise composition of the "natural" human diet, or whether they merely provide general indications of possible diets. Then, we want to go beyond the typical simplistic analyses presented in the vegetarian and alternative-diet lore, and reexamine what information comparative anatomy and physiology actually provide regarding the natural diet of humans. Further, a number of related claims are often made as part of such comparative "proofs." Some of the claims addressed here are:

Does "instinct" (whatever that is) "prove" that we are natural vegetarians?

Does research showing that the typical Western meat-based diet is unhealthy prove that all omnivore diets are unhealthy?

And, since it is mentioned in the subtitle, just what is a faunivore, anyway?

If these questions interest you, I invite you along for the ride. But first, open your mind and fasten your seat belt--the ride may be bumpier than you expect!

If your time is limited, or you have specific interests...


As this paper addresses numerous, diverse topics by necessity (some in depth), it is lengthy and this may present obstacles for some readers. Also, considerable background information must be covered before the main claims of the major comparative "proofs" of diet can be addressed. To help navigate the numerous topics, to gain a quick bird's-eye view, or to read the paper in sections as time permits, the comprehensive Table of Contents below provides direct links to all the major sections and subsections of the paper.

TABLE OF CONTENTS

PART 1: Brief Overview: What is the Relevance of Comparative Anatomical and Physiological "Proofs"?
PART 2: Looking at Ape Diets: Myths, Realities, and Rationalizations
PART 3: The Fossil-Record Evidence about Human Diet
PART 4: Intelligence, Evolution of the Human Brain, and Diet
PART 5: Limitations on Comparative Dietary Proofs
PART 6: What Comparative Anatomy Does and Doesn't Tell Us about Human Diet
PART 7: Insights about Human Nutrition and Digestion from Comparative Physiology
PART 8: Further Issues in the Debate over Omnivorous vs. Vegetarian Diets
PART 9: Conclusions: The End, or The Beginning of a New Approach to Your Diet?


PART 1: Brief Overview--What is the Relevance of Comparative Anatomical/Physiological "Proofs"?


Comparative anatomy and physiology (as well as more general comparative approaches that incorporate additional information) are often cited by advocates of particular diets as providing "proof" that the diet they promote is humanity's "natural" diet, and hence best or optimal in some sense. The object of this paper is to examine, in detail, the claims made in the assorted "proofs" based on comparative studies. To be fair, it should also be pointed out that some of those presenting comparative studies openly admit that such studies do not provide hard proof of the natural human diet, but instead merely contribute evidence in favor of the particular diet they advocate.

Along the way, this paper will address many of the claims made in:

The Comparative Anatomy of Eating, by Milton R. Mills, date unknown (a popular comparative "proof" on the Internet among the raw/vegetarian communities).

Fit Food For Humanity, no author cited, 1982, Natural Hygiene Press. This appears to be an update of Fit Food for Man, by Arthur Andrews, 1970.

In addition, some of the claims made in What Might Be the "Natural" Diet for Human Beings, by Ted Altar, are briefly discussed here as well.

Other claims: Over and above claims made in the above "classic" sources in the alternative dietary literature of the past, an extensive cataloguing of other claims typically made as part of such comparative "proofs" is also reviewed and discussed. As well, a number of other assertions that go beyond comparative anatomy/physiology itself, but that often come up in discussions related to the issue of omnivorous vs. vegetarian diets, are covered here for completeness, since they will be of keen interest to those interested in these subjects. Citations for these claims are not given, as in some cases the sources are, in my opinion, extremists (often fruitarians); and citing such works by author's name may needlessly inflame and detract from a focus on the issues themselves, or might merely incite personal attacks by hostile but allegedly "compassionate" extremists. The approach taken here will be to paraphrase the latter claims, trusting that readers will be able to recognize such claims and their proponents should they subsequently run across them elsewhere.

The Logic and Structure of Comparative "Proofs"


The basic argument underlying comparative anatomy/physiology "proofs" is that comparing humans with other species--via a list of anatomical and/or physiological features--somehow "proves" that humans have a particular, often quite specific/narrow, diet. This conclusion is allegedly reached if sufficient features match in the comparison list.

Subjective nature of traditional "proofs." However, examination of the various "proofs" reveals
considerable variation in length of lists used for the comparisons, in the level of detail in each list, and in the number of supposedly matching features provided as "proof." In addition to the question of whether the method is logically valid to "prove" the human diet is restricted to a narrow range, one immediately observes that there appears to be considerable subjectivity in determining the level of detail in the comparison list. And, for that matter, just how many "matches" are required for "proof," anyway? (And what validation is there for such a "magic" number of matches?)
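To make the subjectivity concrete, here is a toy sketch (our illustration, not taken from any of the "proofs" under discussion) of the match-counting logic such comparisons rely on. The feature names and values below are hypothetical placeholders, not real anatomical data; the point is only that the verdict flips depending on which features are listed and how many matches are declared sufficient.

def comparative_proof(species_a, species_b, features, required_matches):
    # Count how many of the chosen features agree, then apply the
    # advocate's (arbitrary) threshold for declaring "proof."
    matches = sum(species_a[f] == species_b[f] for f in features)
    return matches >= required_matches

# Hypothetical trait tables -- placeholders, not real data.
species_a = {"trait_1": True, "trait_2": True, "trait_3": False,
             "trait_4": False, "trait_5": True}
species_b = {"trait_1": True, "trait_2": True, "trait_3": False,
             "trait_4": True, "trait_5": False}

# A short, cherry-picked list: 3 of 3 features match -> "proof!"
print(comparative_proof(species_a, species_b,
                        ["trait_1", "trait_2", "trait_3"], 3))  # True

# A longer (equally arbitrary) list: only 3 of 5 match -> no "proof."
print(comparative_proof(species_a, species_b,
                        ["trait_1", "trait_2", "trait_3",
                         "trait_4", "trait_5"], 5))  # False

The same pair of species "passes" or "fails" purely on the advocate's choice of feature list and threshold--which is precisely the objectivity problem noted above.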


What do comparative "proofs" actually show? As we will see later, comparative anatomy and
physiology are not so precise as to give definitive, narrow answers to the question of what is the natural diet of humanity. Instead, comparative anatomy and physiology provide evidence of associations and possibilities. We will also see that those who present simplistic comparisons, and claim they absolutely "prove" humans naturally must follow a narrow, restricted diet (e.g., fruitarianism), are not telling you the whole story. Unfortunately, this pattern--giving you only a small part of the information available (in figurative terms, half-truths)--is common in the raw and vegan movements at present (in my opinion).

Comparative Anatomy and Physiology are Legitimate Tools


It should be emphasized, and very clear to readers, that comparative anatomy and comparative physiology are legitimate, useful, and important tools for scientific inquiry and research. In particular, the following are relevant:

Comparative anatomy and physiology are used, legitimately, in biology, paleoanthropology, and other fields. It is not the intent of this paper to criticize, minimize, or denigrate the valuable (and important) disciplines of comparative anatomy and physiology.

The intent of this paper is to analyze whether the simplistic comparative studies presented by dietary advocates actually prove that the "natural" human diet is as specific as claimed by the dietary advocates. This paper will look at some of the limitations of such comparative analyses. Further, it will provide background information for some of the topics covered in comparative studies, so you can understand just how simplistic the "proofs" provided by dietary advocates usually are.

Comparative anatomy is a valid tool, but simplistic applications are often fallacious. The
basic question of this paper is not whether comparative anatomy and physiology are valid tools (they clearly are, though we will see that applying comparative anatomy and physiology to humans is problematic), but whether the simplistic analyses presented by dietary advocates are legitimate or "scientific" (we will see later that they are not). So, the question here is not whether the tool is valid, but the specific application of the tool--i.e., whether the tool is being used honestly and properly.

Emphasis on Primates
Modern human beings, species Homo sapiens, are classified as belonging to the primates. Accordingly, as this paper focuses on comparisons, the non-human primates will often be used for comparison. Humanity's prehistoric ancestors--extinct primates, including both human and non-human species--are also discussed as appropriate. When necessary, non-primate species are discussed as well, but most attention here is on primates.

Helpful Notes for the Reader, and Special Terminology


This paper uses the shorthand notation "veg*n" as an inclusive abbreviation for "vegan and/or vegetarian" (where "vegan" means no foods of animal origin whatsoever, and "vegetarian" means dairy and/or eggs may be consumed). This notation is coming into use in some sectors of the online (Internet) vegan/vegetarian community, and is used here to avoid needless repetition and cumbersome phraseology.


The abbreviations "SAD" (for "standard American diet") and "SWD" (for "standard Western diet") are also used extensively throughout this paper to describe the by-now well-known unhealthful standard diets consumed by the majority of the populace in most economically well-off countries. Such diets are typically high in overall fats, particularly saturated fats; high in hydrogenated oils ("trans" fats); high in refined sweeteners and denatured (highly processed) foods; low in essential polyunsaturated fats of the omega-3 variety; and low in fruits and vegetables, particularly fresh fruits and vegetables.

Readers should be aware that this writer is a long-time vegetarian (since 1970). My personal motivations for being a vegetarian are spiritual, rather than "scientific" or based on philosophical naturalism.

Intended audience. This paper is written for both the conventional and raw veg*n communities. In particular, some of the information here contradicts and discredits the claims of certain extremists, many (but not all) of whom are raw fruitarians. If you are part of the large majority of conventional veg*ns who in fact are not extremists, then I hope that the references made herein to fruitarian extremists will be educational. They provide what I hope you will find to be interesting insights into the strange crank-science claims and phony naturalistic philosophy that, unfortunately, are often a part of the basis for vegan fruitarianism. Also, quite a number of these claims are often utilized as underpinnings for conventional veganism itself, at least by those who are more extreme in their thinking about it; thus this information should be of interest to the wider vegetarian community as well.

Also, as fruitarian extremists will be mentioned here, please note that some fruitarians are moderates and are not extremists. The fanaticism promoted by certain extremist fruitarians does not reflect the more moderate beliefs and practices of at least some mainstream fruitarians.

Defining Fruitarianism. Inasmuch as the term fruitarian has significantly different meanings
when used by different people, it is appropriate to specify the definition used here: a diet that is 75+% raw fruit (where fruit has the common definition) by weight, with the remaining foods eaten being raw and vegan. A common rather than botanical definition of fruit is used--i.e., the reproductive part of a plant that includes the seed and a juicy pulp. This definition includes sweet juicy fruit; oily fruit like avocados and olives; and so-called "vegetable fruits" like cucumbers and tomatoes; but it excludes nuts, grains, and legumes (all of which are "fruits" per the more general botanical definition, but not as used here). Note that most fruitarians eat other raw foods to comprise the non-fruit part of their diet; such foods may include (by individual choice) raw vegetables, nuts, sprouted grains, legumes, seeds, and so on. The definition used here is somewhat strict, but it closely matches the idealistic and puritanical diets advocated by the more extreme fruitarians. Some individuals use a different definition; when discussing diet with a fruitarian, the first thing to determine is what the diet actually is, as it can vary substantially from one individual to another.
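For readers who prefer to see the definition stated operationally, here is a minimal sketch that simply encodes the rule given above (75+% raw fruit by weight, with the remainder raw and vegan). The food records and the function name are our own hypothetical illustration, not anything from the fruitarian literature.

def is_fruitarian(foods):
    # foods: list of (name, grams, is_fruit, is_raw_vegan) tuples.
    # "Fruit" here follows the common definition used above (includes
    # avocados, olives, tomatoes; excludes nuts, grains, legumes).
    total_grams = sum(grams for _, grams, _, _ in foods)
    fruit_grams = sum(grams for _, grams, is_fruit, _ in foods if is_fruit)
    all_raw_vegan = all(is_rv for _, _, _, is_rv in foods)
    return total_grams > 0 and fruit_grams / total_grams >= 0.75 and all_raw_vegan

# A hypothetical day's intake: 1100 of 1250 g (88%) is fruit -> True.
day = [("bananas", 900, True, True),
       ("avocado", 200, True, True),
       ("lettuce", 150, False, True)]
print(is_fruitarian(day))

As the surrounding text notes, the 75% cutoff is one definition among several in use; change the cutoff and the same day's intake may earn a different label.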

I hope that you find this paper to be interesting, that the material here is "mentally digestible," and that you will use the information as a positive opportunity to examine the assumptions that underlie your personal dietary philosophy. Enjoy!

PART 2: Looking at Ape Diets--Myths, Realities, and Rationalizations



Dietary Classifications & Word Gamesmanship

Dietary categories are not strict in nature

A good place to begin our exploration of the topic of natural diet as it relates to morphology (anatomical form) is with dietary classifications. In nature, dietary classifications are rarely strict. Herbivores (folivores) routinely consume significant amounts of insects (animal matter) on the leaves they eat. Some folivores--e.g., gorillas, including the mountain gorilla--may deliberately eat insects. Carnivores may seek out and deliberately consume grasses and other plant matter on occasion, and they may consume the stomach contents (vegetation) of herbivores they prey on. Frugivores normally eat some leaves and/or animal matter in addition to their primary food of fruit. Extreme diets--diets that are 100% of a specific food or narrow food category--are not very common in nature.

In sharp contrast to the spectrum of broad-based diets that one finds in nature, however, one can easily find dietary advocates promoting (human) diets that have a relatively narrow basis: 100% vegan, 100% raw vegan, 75+% fruit, and so on.

High variability of primate diets


Primates may frequently switch dietary categories
When discussing dietary categories, it is very easy to forget that extremes are rare in nature, and instead to focus exclusively on the central food (or food category) consumed, ignoring the other foods consumed. If one classifies chimps as frugivores, it is easy to forget that they also consume smaller but nutritionally significant quantities of leaves and animal foods (both invertebrates, i.e., insects, and vertebrates, e.g., other mammals). The fact that humans are primates is also relevant, for primate diets tend to be highly variable on a month-to-month basis in the wild. Chapman and Chapman [1990] reviewed 46 long-term studies of wild primates, with attention on monthly variations in diet. They noted (pp. 121, 126):

...primates do not consistently combine the same kinds of foods in their diets, as many past characterizations would suggest, but rather, that they often switch between diet categories (e.g., fruit, insects, etc.)... Our review of primate diets on a monthly temporal scale suggests that primates do not always consistently include the same kinds of foods in their diets. Instead, primate populations frequently switch between diet categories.

Animal food consumption common


Harding [1981], noting the widespread reports of predatory behavior and meat consumption by non-human primates, makes the interesting comment (p. 191; my explanatory comments are in brackets [ ] below):

It is now clear that several primate populations make regular and substantial use of precisely the type of food [animal flesh] which the early theories described as instrumental in the emergence of the hominids. If the diets of these particular nonhuman primates are more broadly based than we had thought, then how accurate is it to characterize contemporary primate diets in general as "vegetarian"?... As Teleki points out (1975: 127ff.), such terms are first used as shorthand references to a particular dietary specialization but then gradually become inclusive descriptions of an animal's entire diet. Essential elements of the diet are ignored, and the result is a generally misleading impression of what a group or population actually eats. As this article shows, diversity rather than specialization is typical of primate diets.


Drawing boundaries between diet categories is non-trivial


As mentioned above, extreme diets are rare in nature. Instead, there is a spectrum of diets, and drawing lines or boundaries in the multi-dimensional dietary spectrum to distinguish between, say, frugivores and faunivores [faunivore = meat-eater] is a non-trivial problem. The reality that primates may frequently switch dietary categories makes the situation even more complicated. Chapman and Chapman [1990, p. 123], citing Mackinnon [1974, 1977], report that in one month, orangutan diet in Sumatra was frugivorous (90% fruit by feeding time), but the same population at a different time was folivorous (75% leaves by feeding time). One month a frugivore, later a folivore: how to classify? Additional evidence on the variability of orangutan diets is given in Knott [1998, p. 40]. Further, there is even some academic disagreement over the definition and use of the term "omnivore." These topics are addressed later in this paper.
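As a small illustration of why such labels are slippery, the sketch below (ours, not a standard classification method) assigns a monthly label by whichever food category dominates feeding time. Fed the two reported orangutan months, the same population earns two different labels; the "other" remainders are our filler so the shares sum to 1.

def monthly_category(shares, threshold=0.5):
    # shares: dict mapping diet category -> fraction of feeding time.
    # Label the month by the dominant category, if it clears the
    # (arbitrary) threshold; otherwise call the diet "mixed."
    top = max(shares, key=shares.get)
    return top if shares[top] >= threshold else "mixed"

# Approximate figures from the Sumatran orangutan example above.
print(monthly_category({"fruit": 0.90, "leaves": 0.05, "other": 0.05}))  # fruit
print(monthly_category({"fruit": 0.15, "leaves": 0.75, "other": 0.10}))  # leaves

Shift the threshold or the time window and the label shifts with it--the boundary, not the animal, is what changed.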

Common vs. scientific diet category terms


Use of the cultural terms "vegan" and "fruitarian" versus "folivore" and "frugivore"
Another relevant aspect of dietary classifications: The terms vegan, fruitarian, and vegetarian are all human cultural terms, and are used, at least by most dietary advocates, with narrow definitions. One may hear an advocate claiming that "apes are vegans," then later the same advocate might criticize others for using the same terminology--i.e., "apes are NOT vegans"; the specific criticism being that a biological term is more appropriate (e.g., folivore, frugivore, faunivore). In this article, a primary objective is to communicate concepts clearly and accurately but without undue complexity. Hence, since most readers would generally understand that the phrase "apes are NOT vegans" means that a human who eats the same diet as the apes would not qualify as a vegan, here we won't worry about the technical terms unless/until they are necessary to assist the reader in understanding the material. (Technical terms are used extensively in certain sections of this paper, but will be defined as we go along.)

To Summarize:

In general, dietary classifications in nature are not as distinct/clear as the narrow, simplistic, precisely defined diets promoted by certain dietary advocates.

Because dietary classifications are not strict in nature, a frugivore might deliberately eat some animal foods, hence not qualify as "vegetarian" as the term applies in human culture (similar remarks apply to folivores).

Primate diets tend to be highly variable in the wild.

The Evidence of Ape/Primate Diets


Preface. A comparison of humans vs. allegedly "vegetarian" anthropoid apes is frequently a part of comparative anatomy/physiology "proofs" that assert humans are natural vegetarians. ("Anthropoid" means the most human-like apes in the "great apes" family, which includes the chimpanzee, gorilla, and orangutan.) However, as knowledge of the actual diet of wild primates/anthropoid apes (from field observation studies) increases, the reality that most primates/apes include some animal foods (even if only insects) in their normal, natural diet is becoming better known. As a result of this, the myth of the "vegetarian ape" is slowly slipping away.


Ape diets, with emphasis on chimpanzees, are summarized in the article "Setting the Scientific Record Straight on Humanity's Evolutionary Prehistoric Diet and Ape Diets" on this site. Additionally, some of the rationalizations by dietary advocates that have been advanced as part of attempts to preserve the myth of "vegetarian apes" are discussed in "Selected Myths of Raw Foods." The current section serves primarily to supplement the above articles, and to address other, newer rationalizations and misinformation promoted by dietary advocates who stubbornly cling to the "apes are vegetarians" myth.

Claims Made in Fit Food for Humanity


The booklet Fit Food for Humanity [Natural Hygiene Press 1982] makes what could be considered the classical comparative argument: humans vs. anthropoid apes. The booklet argues that: (1) anthropoid apes are natural vegetarians (pp. 6-7); and (2) humans are very similar to anthropoid apes, a similarity established via a comparative anatomy/physiology table (pp. 8-9).

The booklet then reaches the conclusion that humans are natural vegetarians, i.e., that the vegetarian diet is the (only) diet in accord with human anatomy. However, the line of argument given above is incorrect and logically insufficient to establish the claimed result. The structural problems in the above type of argument are addressed in later sections. This section will focus specifically on the issue of ape diets.

"Vegetarian" apes: a misconception of the past First, as mentioned above, the idea that apes (or primates in general) are strict vegetarians in the normal human sense of the word is a misconception of the past. This point is clarified in Sussman [1987, pp. 166, 168]: In fact, most species of primate are omnivorous (see Harding [1981]) and omnivory should be considered an evolutionarily conservative and generalized trait among primates. Primates evolved from insectivores.... Thus, omnivorous primates are mainly frugivorous and, depending upon body size, obtain most of their protein from insects and leaves. In all large, omnivorous, nonhuman primates, animal protein is a very small but presumably necessary component of the diet. In the above, the term omnivore has the usual definition; e.g., from Milton [1987, p. 93]: "By definition, an omnivore is any animal that takes food from more than one trophic level. Most mammals are in fact omnivorous...". ["Trophic" refers to the different levels of the food chain.] Note that some experts use a different, more precise definition for the term omnivore, and disagree that mammals are omnivores--instead they suggest using the term faunivore for animals that regularly include fauna (other animals) in their diet.

Insect food
Regarding consumption of animal foods by primates, Hamilton and Busse [1978, p. 761] note: Many primate species once considered herbivorous are now known to expand the animal-matter portion of their diet to high levels when it is possible to do so... Insect food is the predominant animal matter resource for primates. Insects are eaten by all extant apes, i.e., chimpanzees (e.g., Lawick-Goodall 1968), orang-utans (Gladikas-Brindamour [1]), gorillas (Fossey [2]), gibbons (Chivers 1972, R.L. Tilson [3]), and the siamang (Chivers 1972). The amount of insect matter in most primate diets is small, but may expand to more than 90% of the diet when insects are abundant and easily captured... Preference for animal matter seems confirmed. Note that the footnote numbers in the quote above refer only to Hamilton and Busse [1978].

Rationalizations about Dietary Deviations among Primates


Fit Food for Humanity does include notes on "Dietary Deviations Among the Primates" (pp. 11-12). It is interesting to note that most (but not all) of the references cited therein are encyclopedia entries--which usually do not reflect the latest research. The response in Fit Food for Humanity to the information that anthropoid apes are not strict vegetarians could be characterized as reliance on outdated information, rationalizations, and hand-waving. Let's review some of the claims. ("FFH" is used as an abbreviation for Fit Food for Humanity in the material below.)

FFH: Gorillas are total vegetarians. REPLY/COMMENTS: Both lowland and mountain gorillas consume insects, both deliberately and indirectly (i.e., on the vegetation they consume). The above quote from Hamilton and Busse [1978] cites Fossey (personal communication) regarding insect consumption by mountain gorillas. Tutin and Fernandez [1992] report consumption of insects by lowland gorillas in the Lope Reserve, Gabon: termites (whose remains were contained in 27.4% of gorilla feces) and weaver ants. Note that both insects mentioned are social insects; the consumption of social insects is efficient, as their concentration in nests allows easy harvesting of significant quantities. Of further interest here is the information that termites are known to contain significant quantities of vitamin B-12; see Wakayama et al. [1984] for details. Insectivory by mountain gorillas is discussed further later in this section.

FFH: Orangutans consume 2% insects; from p. 11: "the 2% digression may be seen as incidental and insignificant." REPLY/COMMENTS: The quote from FFH does not specify whether the 2% is by weight or feeding time. Due to the difficulties in estimating weights of foods consumed, the 2% figure is probably by feeding time. Galdikas and Teleki [1981] report that orangutans at Tanjung Puting Reserve in Indonesia consumed 4% fauna (insects, eggs, meat) by feeding time. Kortlandt [1984] reports that (p. 133), "orang-utans eat honey, insects and, occasionally, bird's eggs, but no vertebrates." For photos of a wild orangutan eating insects, see Knott [1998], p. 42; and for a photo of a wild orangutan eating a vertebrate--a rare event--see Knott [1998], p. 54. The claim that insect consumption by orangutans is "insignificant" is clearly an unproven assumption. Insects and other animal foods are nutrient-dense foods: they supply far more calories and nutrients per gram of edible portion than the same weight of most of the plant foods commonly consumed (i.e., fruits other than oily fruits, and leaves).

FFH: The principal rationalizations given for termite- and meat-eating by the chimps of Gombe Preserve are: (1) the Gombe Preserve is small, and surrounded by human-populated areas; and (2) chimps are "natural" imitators.


FFH then implies (assumes) that the behavior of the chimps of Gombe is in imitation of human behavior. Other writers (elsewhere, not in FFH) suggest that chimps eat meat in imitation of baboons.

REPLY/COMMENTS: The reality is that predation on vertebrates by chimpanzees is widespread throughout tropical Africa. Regarding chimpanzee predation, Teleki [1981, p. 305] reports that: Moreover, predatory behavior involving vertebrate prey has now been recorded at all major study sites in equatorial Africa, from Uganda and Tanzania to Sierra Leone and Senegal. I expect that the known geographical distribution of predatory behavior will continue to expand as new chimpanzee projects are launched, though it is probable that some populations practice this behavior little or not at all, while others do so regularly and systematically (Teleki, 1975). Note in the above remark that predation by chimps has been found at all major study sites, although it is possible that some groups of chimps hunt rarely, or not at all. Over and above the reality that predation and meat-eating by chimps is widespread, the claim that chimps do so in imitation of humans or baboons is both unproven and dubious. Chimps have lived in proximity to both humans and baboons for approximately 2-2.5 million years. This alone suggests that sufficient time has elapsed, in evolutionary terms, for chimps to adapt to such allegedly "imitation" behavior. Once evolutionary adaptation occurs, the "imitation" behavior would no longer be an imitation (supposing it were that, in some hazily conceived past)--it is natural. This reasoning suggests that the "imitation" argument is dubious at best. Another problem with the "imitation" argument is that imitative learning in captive chimps is common, but in wild chimpanzees it is rare; see Boesch and Tomasello [1998] for discussion on this point.

Chimp/Baboon Interaction. The interaction between baboons and chimps is quite interesting and serves to illuminate the shallow nature of the "imitation" argument. Teleki [1981, pp. 330-331] comments on: ...the anomalous nature of an interspecific relationship that includes play with baboons, consumption of baboons, and competition with baboons for at least one kind of prey... [T]he Gombe chimpanzees removed 8% of the local baboon population in 1968-1969 (Teleki, 1973a) and 8-13% of the local colobus population in 1973-1974 (Busse, 1977). How is it possible, then, that the primates serving as prey to chimpanzees in Gombe National Park, and possibly also at other sites, have not developed more successful defensive tactics? Any answer other than the proposition that chimpanzees have only recently acquired predatory inclinations, for which there is no supportive evidence at all (Teleki, 1973a), would be welcome. Note: The above quote is included to specifically inform readers that there is no evidence that predation by chimps is a "new" behavior, and that there is extensive, complex, baboon/chimp interaction.

Insect Consumption by Chimps is Universal. Kortlandt [1984] also discusses insect consumption by chimps (p. 133): Chimpanzees, on the other hand, spend a remarkable amount of time, mental effort and tool use on searching out insects and feeding on them in every place where they have been intensively studied. Hladik and Viroben (1974) have shown that this insect food is nutritionally important in order to compensate for a deficiency of certain amino acids in the plant foods, even in the rich environment of the Gabon rain-forest.


Other "Apes Are Vegetarians" Claims


Outdated quote from Stevens and Hume [1995]
On occasion, one sees the following quote from Stevens and Hume [1995, p. 76] cited in support of the "apes are vegetarians" myth: The gorilla, chimpanzee, and orangutan are generally considered to be strict herbivores, although there is evidence that chimpanzees may hunt and eat termites and smaller monkeys (van Lawick-Goodall 1968). The Stevens and Hume [1995] reference (a book, Comparative Physiology of the Vertebrate Digestive System) is, in general, a good source. However, the specific quote above is clearly outdated. (In a later section, we will discuss another quote from Stevens and Hume [1995] that is also misused by certain dietary advocates.)

Less predominant foods often "assumed away" by fiat. We have already (briefly) discussed insect consumption by gorillas and orangutans. Though insects are a small component of their diet, we note the following: (1) such a dietary component cannot be "assumed away," as it may be nutritionally significant, as discussed above; and (2) a human who consumes insects (live insects are obviously a "living food"!) would not be considered a vegan or vegetarian according to the strict dietary definitions promoted by dietary advocates.

Given that the Stevens and Hume [1995] quote mentions Jane van Lawick-Goodall [1968], the dated nature of the quote is easily illustrated by van Lawick-Goodall [1971] herself, where she reports that the chimps of Gombe consume a wide array of insects, bird's eggs, and chicks, and the following vertebrate prey (from pp. 282-283): Most common prey animals are the young of bushbucks (Tragelaphus scriptus), bushpigs (Potamochoerus porcus) and baboons (Papio anubis), and young or adult red colobus monkeys (Colobus badius). Occasionally the chimpanzees may catch a redtail monkey (Cercopithecus ascanius) or a blue monkey (Cercopithecus mitis). As the information has begun to spread within the raw vegan community that ape diets include some animal foods, even if only a limited amount of insects, certain raw vegan advocates have reacted with brand-new rationalizations and claims, which we examine next.

Chimp predation is supposedly "an error" or "maladaptive"


One rationalization made is that the chimps eating meat are acting "in error" or are "perverted." The logical fallacies of such crank/bogus claims are discussed briefly in Selected Myths of Raw Foods. (One wonders what irrational beliefs drive those fruitarian extremists who appear to think they know more about chimp diet than the chimps themselves?) Yet another rationalization promoted by fruitarian extremists is that meat-eating by chimps is "maladaptive," i.e., anti-evolution. Those who promote this claim have no credible evidence to support their views, and they conveniently ignore that the consumption of insects--an animal food--is apparently universal (or nearly universal) among adult chimpanzees. The comments of Kortlandt [1984, p. 134] address the above rationalizations: We should consider here that hunting and killing vertebrates involves health hazards resulting from injury, and that eating meat implies the risks of parasitic and contagious diseases. Such behavior, therefore, strongly indicates certain deficiencies in the composition of the available vegetarian diet.... Similarly, the preference for eating the brain of the victim suggests a lack of lipids and possibly other components in the diet.

Meet the mountain gorilla, the new "model ape"


Recently, some new claims have surfaced regarding the mountain gorilla: (1) mountain gorillas are the "closest" to humans of the anthropoid apes; and (2) mountain gorillas are strict vegans--Schaller [1963] (a by-now outdated source of information on this particular point) is usually cited as the reference for this claim.

The new claims cited above are interesting. While there have been claims in the alternative diet movement in the somewhat distant past emphasizing gorillas as models of fruitarian diet (see Dinshah [1976-1977], for example), advocates since then have tended to concentrate more on the chimpanzee. Now, however, after years of using chimps as the "model vegetarian ape," such advocates are changing course. The above claims appear to attempt once again to designate the mountain gorilla as the NEW "model vegetarian ape," and thereby divert attention from the (now more widely acknowledged) news of chimps who eat meat and termites.

Examining the Claims about Gorillas

DNA analysis of hominoids


Let's now examine the above claims. The claim that mountain gorillas are the "closest" to humans is simply false, and DNA evidence is available to settle the point. Sibley and Ahlquist [1984] analyzed the DNA sequences of hominoid primates: humans, chimpanzees (including bonobos), gorillas, orangutans, etc. (Note: "Hominoid" is a grouping that includes both the "anthropoid apes"--also known as the "great apes"--plus humans as one related family of primates based on their shared "human-like" traits.) Their research produced a matrix of Delta T(50)H values--a distance function that measures differences between DNA sequences. The distances they found (from Sibley and Ahlquist [1984, p. 9]) are:

SPECIES COMPARED        GENETIC DISTANCE (Delta)
Human / Chimpanzee      1.9
Human / Bonobo          1.8
Human / Gorilla         2.4
Human / Orangutan       3.6

Note that the human/chimp and human/bonobo DNA distances are lower numbers than the human/gorilla DNA distance. Hence we observe that chimps and bonobos are "closer" to humans than gorillas are. Sibley and Ahlquist expanded their 1984 research with similar results; see Sibley and Ahlquist [1987] for details (the paper also includes a review of related research papers). Felsenstein [1987] provides even more insight into the Sibley and Ahlquist research via a statistical analysis of their data set. Similar results come from earlier research. Goodman [1975] compared hemoglobin amino-acid chains, and computed amino-acid distances for humans versus selected non-human primates. Goodman's methods were slightly different but the results were similar to the above: chimps are "closer" to humans than gorillas are. (See Goodman [1975, p. 224] for the amino-acid distance values.)
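A minimal sketch restating the table (the values are copied directly from the Sibley and Ahlquist [1984] figures above); "closest to humans" is simply the species with the smallest distance:

    # Delta T(50)H distances to humans, from the table above.
    # Smaller distance = more similar DNA.
    delta_t50h = {
        "chimpanzee": 1.9,
        "bonobo": 1.8,
        "gorilla": 2.4,
        "orangutan": 3.6,
    }

    closest = min(delta_t50h, key=delta_t50h.get)
    print(closest, delta_t50h[closest])  # bonobo 1.8 -- not the gorilla (2.4)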

Mountain gorillas eat insects


Beyond the erroneous genetic-distance assertion, the claim that mountain gorillas are strict vegans (in the normal human sense) is incorrect as well. Mountain gorillas are folivores, and their predominant food is leaves; however, at least some mountain gorillas deliberately consume insects. Watts [1989] observed mountain gorillas feeding on driver ants in the Virunga Volcanoes region of Rwanda and Zaire.


Deliberate consumption. Watts [1989 (pp. 121, 123)] notes:


Not all gorillas in the Virungas population eat ants... Gorillas become very excited during ant-eating sessions, and even silverbacks who are not eating ants sometimes give chest-beating displays... Some mountain gorillas eat ants with striking eagerness and excitement, and the rate of food intake per unit feeding time is higher than for other means of insectivory (Harcourt and Harcourt, 1984). But ant-eating is so rare that, like searches for egg cases inside dead plant stems (ibid), it probably is not nutritionally important. It may differ in this respect from termite-eating by lowland gorillas in Gabon (Tutin and Fernandez, 1983)... Fossey and Harcourt [1977] report that mountain gorillas in Rwanda consume grubs.

Inadvertent consumption more important? Harcourt and Harcourt [1984] analyzed inadvertent insect consumption by gorillas in the Virunga Volcanoes area. They note (pp. 231-232): ...adding all sources of invertebrate material produces a daily consumption of about 2g of animal matter per adult gorilla... Clearly gorillas inadvertently eat many invertebrates during their daily vegetarian diet, and far more than they eat deliberately... [T]he daily 2g of invertebrates makes up just 0.01% of the intake... When invertebrates occur at high concentration in the environment they are deliberately sought by the gorillas, and not a little time sometimes expended in their procurement... Invertebrate consumption could be necessary to satisfy trace element requirements in a species with a near-monotypic vegetarian diet [Wise, 1982]. Harcourt and Harcourt [1984] also suggest that deliberate insect consumption by gorillas is (probably) nutritionally unnecessary since the prevailing level of inadvertent insectivory provides adequate nutrition.

Insect consumption by lowland gorillas


The situation is somewhat different for lowland gorillas. Lowland gorillas are known to consume social insects more heavily. Tutin and Fernandez [1992] analyzed the insect consumption of lowland gorillas and chimpanzees in the Lope Reserve, Gabon. They found insect remains in 27.4% of gorilla feces, versus 20.2% in chimp feces. Tutin and Fernandez disagree with the suggestion by Harcourt and Harcourt [1984] that deliberate insectivory by gorillas may be nutritionally unnecessary. Tutin and Fernandez [1992] explain that lowland gorillas eat more fruit than mountain gorillas, and preprocess much of their herbaceous foods (using their hands). The result is that despite higher insect density in the lowland gorilla habitat, inadvertent insect consumption by lowland gorillas is less than that for mountain gorillas. They then conclude that deliberate insectivory is necessary for lowland gorillas. From Tutin and Fernandez [1992 (p. 36)], with my explanatory comments in brackets [ ]:


However, given the relatively frugivorous diet and the selective processing of their herbaceous foods, inadvertent insectivory by gorillas at Lope is likely to be minimal and [deliberate consumption of] social insects probably serve the same critical role in nutrition as they do for chimpanzees. [Hladik & Viroben, 1974; Redford, 1987].

Actual mountain gorilla diets vs. fruitarianism


The choice of mountain gorilla as the "model ape" by advocates of fruitarian diets is both ironic and humorous. The mountain gorilla is a folivore and consumes little fruit. Schaller [1963] observed the mountain gorillas feeding, and he actually tasted some of the foods consumed by the gorillas. Schaller [1963] provides a table (table #39, p. 372) listing the plant species and parts consumed, and how they tasted. Of 24 plant species/parts listed, only 6 were fully palatable, 1 was described as "mealy," and the remaining 17 (~71%) were described as one or more of: bitter, sour, astringent, or resinous. Additional evidence that the gorilla diet is not very palatable to humans is in Koshimizu et al. [1994], which reports that ~30% of mountain gorilla diet items are bitter in taste to humans. Koshimizu et al. [1994] also cite an interesting hypothesis by Harborne [1977] that the mountain gorilla tolerates or even likes bitter-tasting foods. The irony here is, of course, that the fruitarian advocates who use the mountain gorilla as "model ape" do not recommend a diet whose predominant tastes are bitter, sour, astringent, or resinous. Instead they often promote a diet that by default is high in sugar from modern, hybridized sweet fruits.

Bonobos: Last Stand for Extremists?


Now that the information that (regular) chimp diet includes some meat and insects is becoming well-known, some fruitarian extremists have adopted the diversionary strategy of disregarding chimps and trying to center attention on bonobos (the pygmy chimpanzee). The bonobo has not been studied as thoroughly as the chimpanzee. An additional attraction to the fruitarian extremist is the common figure of an 80%-fruit diet for bonobos. However, Chivers [1998, p. 326] reports that the figure of 80% fruit may be an overestimate: The mean fruit-eating score for the pygmy chimpanzee (Pan paniscus) may be inflated by being presented as a proportion of fruiting trees, rather than relative to feeding time (which may be closer to 50% than to 80%; see Susman, 1984). Note that the primary scientific research regarding bonobos is centered on their unique social organization, rather than their diet. More importantly, however, while ape diets are of value as a reference point in attempting to determine the diets of early (pre-human) hominids, diet varies considerably enough among primates that attempting to make definitive extrapolations from one species to another is unreliable. This is particularly the case with humans who, as we will see later, have a digestive system that is nearly unique among primates.

Other Implications of Ape Diets


Given that Fit Food for Humanity argues for vegetarianism on the basis that humans are very similar to the great apes, the new information that the great ape diets include at least some animal foods presents the advocates of comparative "proofs" with a problem. If the comparative method really "proves" that our diet should be that of the great apes, then the information that ape diets include some animal foods also "proves" that the human diet should include some animal foods as well. More precisely, it "proves" that we are not natural vegetarians, at least in the usual human sense. This point will be discussed further in a later section, where we will have some fun with the logic.

To Summarize:


The claim that "apes are vegetarians" is a misconception of the past. Virtually all of the great apes consume some animal food, even if only limited amounts of insects. Humans doing this would not be considered "vegetarian," yet attempts are often made to characterize apes as essentially vegetarian by falsely characterizing such insectivory as insignificant. The comparative analysis done in Fit Food for Humanity--humans versus the great apes-would actually indicate, if such comparisons are valid, that the natural human diet should include at least some animal foods (insects and/or vertebrate/mammalian flesh). The recent claims that humans are "closest" to allegedly "vegan" mountain gorillas are false. We might also note here the level of extremism inherent in the idealistic rationalizations above regarding ape diets: Despite the fact that animal foods average anywhere from approximately 1% to perhaps 5% or 6% of ape diet over the course of a year (depending on the species), even this modest level of animal products is not acknowledged as a "legitimate" aspect of their diets that humans might want to emulate, supposing one were intent on modeling their diets after comparative studies. What does this tell us about the mindset and open-mindedness of such advocates regarding the actual evidence?

PART 3: The Fossil-Record Evidence about Human Diet


Introduction
Meat a part of human diet for ~2.5 million years
The evidence of the fossil record is, by and large, clear: Since the inception of the earliest humans (i.e., the genus Homo, approximately 2.5 million years ago), the human diet has included meat. This is well-known in paleoanthropological circles, and is discussed in Setting the Scientific Record Straight on Humanity's Evolutionary Prehistoric Diet and Ape Diets. The current state of knowledge regarding the diet of our prehistoric ancestors is nicely summarized in Speth [1991, p. 265]:


[S]tone tools and fossil bones--the latter commonly displaying distinctive cut-marks produced when a carcass is dismembered and stripped of edible flesh with a sharp-edged stone flake--are found together on many Plio-Pleistocene archaeological sites, convincing proof that by at least 2.0 to 2.5 Ma [million years ago] before present (BP) these early hominids did in fact eat meat (Bunn 1986; Isaac and Crader 1981). In contrast, plant remains are absent or exceedingly rare on these ancient sites and their role in early hominid diet, therefore, can only be guessed on the basis of their known importance in contemporary forager diets, as well as their potential availability in Plio-Pleistocene environments (for example, see Peters et al. (1984); Sept (1984)). Thus few today doubt that early hominids ate meat, and most would agree that they probably consumed far more meat than did their primate forebears. Instead, most studies nowadays focus primarily on how that meat was procured; that is, whether early hominids actively hunted animals, particularly large-bodied prey, or scavenged carcasses... I fully concur with the view that meat was a regular and important component of early hominid diet. For this, the archaeological and taphonomic evidence is compelling.

Early hominid diet was mixed, not exclusive


The comments in Mann [1981, pp. 24-25] further illuminate the above: Nevertheless, given the available archaeological evidence and what is known of the dietary patterns of living gatherer/hunters and chimpanzees, it appears unlikely to me that all early hominids were almost exclusively carnivorous or herbivorous. It is more reasonable to suggest that the diet of most early hominids fell within the broad range of today's gatherer/hunter diets, but that within the wide spectrum of this adaptation, local environmental resources and seasonal scarcity may have forced some individual populations to become more dependent on vegetable or animal-tissue foods than others. The remarks by Mann remind us of the obvious: that early hominid diets, like hunter-gatherer diets, are a function of local flora and fauna; such diets are limited to the local food base (and to food acquired via trading).

"Natural" behavior a function of evolution The evidence that meat has been part of the human diet for ~2.5 million years, thus, directly implies that meat is a "natural" part of the human diet, where "natural" is defined as: those foods one is adapted to consume by evolution. (Side note to vegetarians: The fact that meat is a natural part of the evolutionary diet does not imply that one must, or even should, eat meat.) Some raw dietary advocates, in apparent denial of the evolutionary evidence, try to turn "opportunistic feeding" into a straw-man argument. The straw-man argument they construct is that the claim meat can be a natural part of the diet is based solely on the idea that humans can (and do) eat meat; they then claim it is circular logic, asserting that the "possibility" is not evidence it is "natural." However, this type of criticism or straw-man argument is based on a rather astonishing ignorance of--or at least certainly a denial of-evolutionary adaptation and how it occurs (discussed below). As such, the anti-"opportunistic feeding" straw-man argument is logically invalid.

Examining the rationalizations and denials of the fossil record evidence of human diet
As one might expect, the preceding information has been met with denial, crank science, and rationalizations from some in the raw vegan (and general veg*n) community. Some of these rationalizations and denials are as follows, with relevant comments.


CLAIM: What about the time before humans? What about Australopithecus? Aren't there tooth-wear studies that prove Australopithecus was a fruitarian? REPLY: If you go back far enough, you are dealing with a NON-HUMAN primate.
Australopithecus was a bipedal hominid in the line that led to humans, and was the last hominid prior to the evolution of Homo habilis, the first human in the genus Homo, but Australopithecus was itself still prehuman. If you go back somewhat further, you are dealing with apes that were primarily frugivorous, but as we will see below in discussing Australopithecus further, their diet would have included foods that were tougher than what we would generally call fruits today. Prior to these apes, if we go all the way back to the first primates 65 to 70 million years ago, from which all other primates evolved, then the diet was primarily insects. The important point regarding evolution is, as noted above, that the human diet has included meat since the inception of the human genus (Homo), and we have adapted to such a diet. If the diet of ancient frugivorous non-human primates is somehow supposed to be "very relevant" (more relevant, anyway, than diets of actual members of the genus Homo), then we are basically engaging in an absurd game of "moving the goalposts." If that's legitimate, then we might also say that the diet of the ancient insectivorous primates could be just as relevant.

Outdated tooth-wear studies often cited. As for Australopithecus, the highly publicized tooth-wear studies cited in fruitarian/vegetarian lore are dated. (See related discussion in Part 1 of the Paleolithic Diet vs. Vegetarianism series.) Newer isotope studies of Australopithecus fossils indicate an omnivorous diet: Sillen [1992] analyzed the strontium/calcium (Sr/Ca) ratios in Australopithecus fossils, and concluded (p. 495): When specimens of the fossil Australopithecus robustus were examined, Sr/Ca values were inconsistent with that of a root, rhizome or seed-eating herbivore, suggesting that the diet of this species was more diverse than previously believed, and almost certainly included the consumption of animal foods. The results of Sillen [1992] were confirmed in a separate study, using stable carbon isotopic analysis; see Lee-Thorp et al. [1994] for details. Also see Sillen et al. [1995] for a followup to Sillen [1992].

Ancient fruits of different character than today's. A further note on Australopithecus: The fruits in its diet were not like today's sweet hybrid fruits (such as one buys in a supermarket or produce stand). Instead, they included some very tough plant foods. Peters and Maguire [1981] analyzed the wild plant foods available in the Makapansgat (South Africa) area, and determined the structural strength of the foods available. They note (p. 565): Moreover, the most important potential plant food staples include very tough dry berries, beans and nuts, which require an average of 50-250 kg of compressive force to crack and crush them. Peters and Maguire concluded that even the strong (stronger than modern human) jaws of Australopithecus africanus were not strong enough to prepare all the tough plant foods, and simple stone tools would be necessary for survival in such an environment. (Needless to say, the fruits eaten by Australopithecus were markedly different from the diets advocated by modern promoters of fruitarianism--modern fruitarian diets emphasizing juicy, sweet fruits.)

CLAIM: Our prehistoric ancestors could not eat meat prior to the advent of tools. (Often stated very emotionally as: Well, what about the time BEFORE tools!?)

REPLY: Chimpanzees kill and eat small animals without tools. The method used is flailing; see Butynski [1981, table 2, p. 425] for details; also van Lawick-Goodall [1971, p. 283] for illustration. If chimpanzees can kill small prey without tools via flailing, humans are certainly capable of the same.

Naked apes without tools? Some fruitarian advocates promote the crank science/science-fiction claim that humans are merely "naked apes, without tools," and that the human diet should be limited to foods that can be obtained when one is nude, without tools. This bizarre fantasy is interpreted to mean that humans should limit their diets to soft fruits, leaves, and occasional seeds or nuts. However, even as stated, the fantasy actually implies the inclusion of a wide range of potential animal foods in the "natural" human diet. See the article, Selected Myths of Raw Foods, for further discussion of this, which includes a list of animal foods that meet the stated collection requirements. Note: A recent research report that capuchin monkeys are capable of making and using stone tools (see Westergaard and Suomi [1994]) demonstrates how utterly ridiculous the "naked ape without tools" myth is. One might regard such a myth as bizarre crank science; unfortunately a few fruitarian extremists promote the myth--often in a hostile manner. However, the most relevant point here is that humans have used tools since the inception of the human genus, so the question, "What about the time before tools?" is actually the question, "What about the time before the human genus existed?" This indicates the question is of limited relevance.

CLAIM: Eating meat is a perversion and UNnatural, but it does not affect reproduction. Hence, eating meat does not affect evolutionary selection. Thus, we are NOT adapted to meat-eating despite the dreadful "cultural" habit of eating meat for literally millions of years! (This argument is often stated in a highly emotional manner.)

REPLY: The above attitude is common in the raw veg*n movement. Such an attitude also reflects misunderstanding and ignorance of how evolution works. Even if eating meat is "cultural" rather than a required survival tactic (the evidence of hunter-gatherer tribes suggests it is necessary for survival when living in the wild), if meat-eating is universal, then meat-eating is, in effect (or by default), a part of the environment. Once meat-eating becomes a part of the long-term environment, then evolutionary selective pressure will favor genes that are best adapted to that environment. In the long term, genetic adaptation to such a diet, by evolution, is the inevitable result. To illustrate this point, let us consider yet another crank-science fantasy promoted by certain fruitarian advocates: that of a pure fruitarian human for whom protein foods are "toxic," meaning that protein, above quite low levels (far below the levels accepted as necessary even by more mainstream vegan advocates), is believed harmful because it causes the production of "toxic" metabolic by-products. (Those not familiar with fruitarian lore may find such claims to resemble science fiction; unfortunately, such nonsense is actively promoted in some quarters.)

Could a fruitarian human have survived and evolved in an ancient environment favoring meat-eating? Assume a genetic mutation occurs and a human is born into a meat-eating prehistoric group, but this human is--by pure chance--adapted to be a pure frugivore (fruit only), and protein is "toxic" (as described above) to that person. Clearly, such a person will not survive, long-term, in the meat-eating environment (the individual would eventually die from the toxic effects of the by-products of protein digestion) or would live a short, unhealthy life and be less likely to reproduce (hence the genes fail to survive long-term given that they will be out-competed reproductively). In sharp contrast to the preceding, any genetic mutations that enhance survival in the environment of meat-eating will exist in an environment where they can and will thrive--and will, over evolutionary time, outcompete those adaptations that do not enhance survival. This illustrates how interaction between behavior (culture) and the environment creates selection pressure that favors the universally prevalent diet (meat-eating) and disfavors a rare, narrow diet (e.g., fruit only) that is outside the range of the environmental support system. (Note: the remarks of Mann [1981] above are relevant here--I am not suggesting that the evolutionary human diet was 100% meat. Instead, it was a mix of animal and plant foods, varying according to location and seasonal availability of foods. The point is that meat was a significant part of the diet, hence "natural" in the most rigorous sense.)
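For readers who find the verbal argument abstract, here is a minimal toy simulation of the selection logic described above. The fitness values are invented purely for illustration (they come from no study); the point is only the direction of the outcome:

    def next_gen_freq(p, w_intolerant=0.6, w_tolerant=1.0):
        """One generation of selection in a simple two-genotype (haploid) model.
        p = frequency of the hypothetical protein-'intolerant' pure-frugivore genotype."""
        mean_fitness = p * w_intolerant + (1 - p) * w_tolerant
        return p * w_intolerant / mean_fitness

    p = 0.5  # start the mutant at an unrealistically generous 50% of the population
    for generation in range(30):
        p = next_gen_freq(p)
    print(f"frequency after 30 generations: {p:.6f}")  # effectively zero

Whatever fitness numbers one plugs in, any consistent reproductive disadvantage in the prevailing (meat-including) environment drives the narrow-diet genotype toward extinction over generations; that is the feedback the verbal argument describes.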


What does "survival of the fittest" actually mean? The claim that humans have not adapted to meat
in ~2.5 million years of consumption rests on a misunderstanding of the principle of survival of the fittest. The idea that eating meat may not at first glance appear to (favorably) affect the reproduction of individuals comes from a misapplication and limited examination of the principle of survival of the fittest--that is, it considers the question on an individual, short-term basis only. However, survival of the fittest is really a broad-based, long-term (multi-generational) proposition. Those traits--mutations and adaptations--that ensure or promote survival over time and over multiple generations are the traits that will be propagated the most successfully and survive; this is the real meaning of survival of the fittest. Further, evolutionary selective pressure is driven by survival--an amoral principle--and not by such moral considerations as whether eating meat is moral or immoral, "good" or "bad," whether it violates so-called animal rights, and so on.

Diet, Evolution, and Culture


Feedback loop between evolution and culture
As discussed above, those aspects of culture that are universal, or nearly universal (tool use, meat-eating, and after the discovery and regular use of fire, eating cooked food) can and will impact evolutionary selection pressure. It does not matter whether such behavior is absolutely required for survival in any possible imaginable circumstances. What counts is that such behavior is universal (or nearly so) and therefore effectively required for survival under the actual conditions (which necessarily includes behavioral conditions) that a given individual or species finds itself. Thus there is an important feedback loop between evolution and behavior (culture). For another perspective on this, note that humans are unique because of their high intelligence (discussed in the next section). Given such intelligence, human survival becomes a function of behavior. And because of our high intelligence, human behavior (which constitutes part of the ongoing evolutionary environment) can be a malleable choice rather than a compelling "instinct." Further, behavior can be, and is, a part of culture. (Culture can be seen as a set of group-specific behaviors that are acquired via social means; see McGrew [1998] for discussion.) The following figure illustrates the nature of the behavior/culture evolutionary feedback loop. It is a modification of fig. 29.1, p. 439, from Wood [1995].


[Figure: the behavior/culture evolutionary feedback loop, adapted from fig. 29.1, p. 439, of Wood [1995]; reproduced courtesy of Yale University Press.]

Diet and early social organization


Spuhler [1959] comments on the likely interaction of diet and culture (pp. 6-7): The change to a partially carnivorous diet had extremely broad implications for the social organization of early hominoids. [Note: "hominoids" (includes great apes and humans as a class) is probably a typo in the original here; "hominids" (which means humans and their bipedal predecessors) is likely what is meant.] Carnivores get a large supply of calories at each kill. This concentrated food is more easily transported to a central, continually used shelter than is low-calorie plant food, especially before containers were available... Compact animal protein high in calories is a good basis for food sharing...It is unlikely that the long dependency of human children--so important in the acquisition of culture by individuals--could develop in a society without food sharing. And the amount of information which needs to be transduced in a communication system for plant eaters like the gibbons is small compared to that needed in group-hunting of large animals.

Did the human brain give rise to culture and tool use, or vice versa? Washburn [1959] considers
the relationship between tool use and biological evolution, and observes (pp. 21, 29): [M]uch of what we think of as human evolved long after the use of tools. It is probably more correct to think of much of our structure as the result of culture than it is to think of men anatomically like ourselves slowly discovering culture... The general pattern of the human brain is very similar to that of ape or monkey. Its uniqueness lies in its large size and in the particular areas which are enlarged. From the immediate point of view, this human brain makes culture possible. From the evolutionary point of view, it is culture which creates the human brain.


The remarks of Spuhler and Washburn illuminate the culture-evolution feedback loop. The argument that "meat-eating does not (directly) impact the reproduction of individuals, hence we are not adapted," is erroneous because it ignores this important feedback loop.

Evidence of dietary culture in non-human primates


Some fruitarian extremists are quick to condemn all (human) cultural dietary habits as being allegedly unnatural. However, non-human primates show evidence of rudimentary, dietary cultural habits. Nishida [1991, p. 196] notes: Humans display diversified food culture: E. Rozin (1973, cited by P. Rozin, 1977) called "cuisine" culturally transmitted food practices... Nonhuman primate species also display diversified food habits within their own geographic distribution... The most easily observable result of cultural transmission processes is the difference in modes of behavior between different local populations of a species, which is uncorrelated with gene or resource distribution (Galef, 1976). Nishida [1991] provides an impressive listing of the differences in feeding behavior of chimpanzees, baboons, and monkeys. Of possible interest to raw-fooders is his description of the macaques of Koshima Island, Japan dipping food into seawater for seasoning (some raw-food advocates stridently condemn the practice of seasoning foods). Of course, that non-human primates display cultural food preferences suggests human cultural food preferences are an extension of that feature. There is an extensive body of scientific research on the topic of culture in non-human primates. For two recent papers that provide an entry to the research, consult McGrew [1998], which discusses culture in several non-human primates, and Boesch and Tomasello [1998], which compares chimpanzee and human cultures.

Fruitarian Extremists Retreat into Creationism


As the general information that the evolutionary diet (actually, range of diets) of humans included animal foods has spread into the raw vegan community, certain fruitarian extremists who want to continue to propagate the myth that 100% raw vegan/75+% fruit diets are the "most natural" diets for humans have made a hasty retreat into what could be called secular creationism. (This is similar to the sudden appearance of claims that the mountain gorilla is "closest" to humans, and vegan.) Relevant observations here are: (1) If you are a creationist because it is a part of your sincere spiritual or religious views, please understand that this writer has no argument with you. Sincere spiritual views can be respected, at least regarding their genuineness of belief. (2) Those fruitarian extremists who retreat into creationism for the sole purpose that it can be molded to fit their dietary dogma, however, appear to be engaging in blatant intellectual dishonesty. One fruitarian group advocating creationism went so far as to plagiarize portions of the book Darwin on Trial, by lawyer and creationist Phillip Johnson, in order to attack the evolutionary evidence against their diet.

Adaptation in Action: Fruitarian Reaction to New Information


It is interesting to observe that certain fruitarian extremists--after their claims that "chimps are vegans" and "we evolved on a purely fruit/vegan diet" were discredited--adapted very quickly. In a matter of months (less than a year) new myths evolved and were promoted: "Humans are just like vegan mountain gorillas" plus the idea that species are created (nature is a "mystery") rather than evolved.


Thus we can wryly observe that certain fruitarian extremists "adapted," in less than a year, to the debunking of their old mythology with a brand-new mythology. And yet these same folks claim that humans cannot and did not adapt, via evolution, over ~2.5 million years, to a diet that includes some meat. Admittedly, these are different adaptations--one intellectual, the other physical. But we see that intellectual "adaptation" can be very rapid indeed, and the contrast between rapid intellectual adaptation versus claims that physical adaptation absolutely CANNOT have occurred in ~2.5 million years (despite fossil record evidence of physical change) presents us with an example of "raw irony"--a not-uncommon feature of the more extreme wing of the raw vegan movement.

Ironic, contradictory views of "nature's laws"


To add further irony to the situation, some fruitarian extremists simultaneously claim both that: (1) we are created, not evolved, and nature is a deep, profound mystery; and (2) nature is simplistic (on the other hand) and MUST follow the narrow, bizarre versions of "nature's laws" promoted by the (same) fruitarian extremists.

Of course, the ability of fruitarian extremists to contradict themselves so clearly, and get away with it, rests on the gullibility and idealism of their followers. This is yet another example of "raw irony," though in this case a much more unfortunate one (for the followers).

To Summarize:
Since the inception of the human genus Homo, ~2.5 million years ago, the human diet has included meat.

Early hominid diet was probably like recent hunter-gatherer diets in the sense that it would have been a mixture of plant and animal foods, depending on habitat and season. There are no strictly vegetarian hunter-gatherers, and purely "carnivorous" hunter-gatherers (or nearly so) are rare (the traditional Inuit's 90-95% flesh diet being the only example); mixed diets predominate.

As a universal, long-term aspect of culture that includes consumption of animal foods, biological adaptation to a diet that includes animal foods is an inevitable outcome of evolution. This occurs as a result of the feedback loop between evolution and long-term behavior patterns (including culture as an important element).

As a significant part of the range of diets we are adapted to by evolution, meat--specifically the lean meat and organs of wild animals--can be considered natural (food). However, as humans are intelligent, we can use our intelligence to choose a different diet, if we wish.

PART 4: Intelligence, Evolution of the Human Brain, and Diet


Introduction: Claims of the Comparative "Proofs"
Human intelligence ignored or rationalized. One of the key systematic shortcomings of the alleged comparative anatomy/physiology "proofs" that promote particular diets is that such proofs generally do not consider the many important features that make humans unique in nature. In particular, human intelligence is usually ignored or dismissed (via rationalizations) in such "proofs." For example, Fit Food for Humanity asserts (p. 14): But merely having a superior brain does not alter our anatomy and physiology which, according to natural law, remain characteristic of a total-vegetarian mammal, meant to eat a variety of vegetables, nuts and seeds.


Brain size discounted. Le Gros Clark is sometimes quoted by those advocating comparative "proofs" of vegetarian diets. He also appears to minimize the importance of large brains in humans (the term is "encephalization," discussed later herein). From Le Gros Clark [1964, pp. 4-5]: In Homo the large size of the brain relative to the body weight is certainly a feature which distinguishes this genus from all other Hominoidea, but it actually represents no more than an extension of the trend toward a progressive elaboration of the brain shown in the evolution of related primates. The attitudes stated in the quote from Fit Food for Humanity reflect an underlying denial of the importance of human intelligence, in particular its impact on behavior (and, ultimately on morphology via evolution). The attitude one finds in some raw/veg*n circles is that human intelligence is suspect because it allows us to "make errors," i.e., to eat foods different from those promoted by dietary advocates (who often behave as if they are absolutely 100% certain that they know better than you what foods you should eat).

Hidden, contradictory views on the value of intelligence. An irony here is that the logic of certain dietary advocates is contradictory where intelligence is concerned. Some fruitarian extremists promote the alleged naturalness of fruitarian diets via the "humans are naked apes, without tools" myth discussed in the last section. This falsehood is often presented as actual science (needless to say, it is crank science) by those who promote it. Inasmuch as the advanced use of tools is an evolutionary characteristic of human intelligence, we can observe that those promoting the myth are saying that you should reject tool use in seeking your "natural" diet (this nonsense may even be presented as being scientific or logical). However, the preceding is equivalent to telling you to reject your intelligence, and even reject your status as a human being, in order to select the (allegedly) optimal diet. The argument made by fruitarian extremists is thus contradictory; the argument can be stated as: Use your intelligence to agree with the extremist that humans are "naked apes, without tools," and thus reject, in the future, your use of intelligence in food choices. Another irony here is that some of the extremists promoting this false myth present themselves as "scientific." Crank science (or science fiction) is a more accurate description for such myths, however. The contradictory logic of the "naked ape" myth is a good example of the ambivalent, confused attitude toward intelligence displayed by some dietary advocates.

Recent evolutionary research now emphasizes the interaction of diet and brain development, and has rendered the quotations above outdated. The remarks of Milton [1993] on the interaction between brain evolution and diet provide a brief introduction to a more modern perspective (p. 92): Specialized carnivores and herbivores that abound in the African savannas were evolving at the same time as early humans, perhaps forcing them [humans] to become a new type of omnivore, one ultimately dependent on social and technological innovation, and thus, to a great extent, on brain power. This section will review some of the research on the human brain, specifically: its evolution; how our brains compare to those of other animals (especially primates--using a comparative anatomy approach); and the dietary/metabolic factors required to support the large human brain.

We begin our review with the topic of encephalization, or brain size.

Encephalization
Introduction
The most significant features that make humans unique in all of nature are our high intelligence and "large" brains. Here "large" means the brain is large relative to body size. Encephalization, or the relative size of the brain, is analyzed using a measure known as the encephalization quotient.


"Expected" vs. actual brain size. In order to measure encephalization, statistical models have been
developed that compare body size with brain size across species, thereby enabling the estimation of the "expected" brain mass for a given species based on its body mass. The actual brain mass of a species compared to (divided by) its "expected" brain mass gives the encephalization quotient. Higher quotients indicate species with larger-than-expected brain sizes. Thus, a quotient greater than 1 indicates an actual brain mass greater than predicted, while quotients less than 1 indicate less-than-expected brain mass.

The encephalization quotient is important because it allows the quantitative study and comparison of brain sizes between different species by automatically adjusting for body size. For example, elephants, which are folivores, and certain carnivorous marine mammals have larger brains (actual physical mass) than humans. However, after adjusting for body size, humans have much "larger" brains than elephants or marine mammals. Additionally, the complexity of the brain is significant as well (and, of course, encephalization does not directly measure complexity--it only measures size).
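As a rough sketch of the calculation (assumptions flagged: the allometric exponent of ~0.76 is the brain/body scaling figure from Foley and Lee [1991] cited below; the coefficient and the brain/body masses are illustrative placeholders, not published fits):

    def encephalization_quotient(brain_g, body_kg, k=11.0, a=0.76):
        """EQ = actual brain mass / brain mass 'expected' for the body mass,
        where the expectation comes from a cross-species fit: k * body_kg**a."""
        expected_g = k * body_kg ** a
        return brain_g / expected_g

    # Illustrative (approximate) values, for demonstration only:
    print(round(encephalization_quotient(1350, 65), 2))  # human: well above 1
    print(round(encephalization_quotient(400, 45), 2))   # ape-sized brain: lower

Whatever the fitted constants, the logic is the same: a quotient above 1 means a brain larger than body size alone would predict.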

Kleiber's Law. Kleiber's Law expresses the relationship between body size--specifically body mass--and the body's metabolic energy requirements, i.e., RMR (resting metabolic rate), also known as BMR (basal metabolic rate). The form of the equation is: RMR = 70 * W^0.75, where RMR is measured in kcal/day and W = body weight in kg. (The above is adapted from Leonard and Robertson [1994].) An understanding of Kleiber's Law is important to several of the discussions in this paper.
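A short worked example of the formula as just given (the body weights are arbitrary round numbers):

    def kleiber_rmr(weight_kg):
        """Resting metabolic rate in kcal/day: RMR = 70 * W^0.75 (Kleiber's Law)."""
        return 70 * weight_kg ** 0.75

    print(round(kleiber_rmr(65)))   # ~1602 kcal/day for a 65-kg individual
    print(round(kleiber_rmr(100)))  # ~2214 kcal/day for a 100-kg individual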

Brain and digestive system compete for limited share of metabolic energy budget. A key observation to note about relative brain size when averaged across species is that the equation for how brain size varies in proportion to body size uses an exponential scaling factor almost identical to the one used in the equation for how an organism's basal metabolic rate (BMR) varies with body size, i.e., Kleiber's Law. (The exponential scaling coefficient used in the equation for how brain mass varies in relation to body mass is 0.76 [Foley and Lee 1991]; the analogous scaling coefficient for BMR is 0.75; Kleiber [1961] as cited in Foley and Lee [1991].) This is important because it directly implies that brain size is closely linked to the amount of metabolic energy available to sustain it [Milton 1988, Parker 1990]. This point will become central as we proceed. For now it is enough to observe that the amount of energy available to the brain is dependent on how the body's total energy budget has to be allocated between the brain and other energy-intensive organs and systems, particularly the digestive system. Further, how much energy the digestive system requires (and thus how much is left over for the brain and other "expensive" organs) is a function of the kind of diet that a species has developed to handle during its evolution. As we proceed, we will return to the ramifications of this for human diet as it relates to the evolution of the large human brain.

A comparative anatomy analysis of primate brains


Stephan [1972] provides a comparative anatomy analysis of primate brains, including modern humans, non-human primates, and our prehistoric ancestors. Below is a summary of the important points made in Stephan [1972]. Note here that the Stephan paper was done before the Martin research cited above; thus Stephan uses a slightly different measure of encephalization.

Humans at top of primate scale. Using measures of encephalization based on the encephalization of insectivorous primates, Stephan [1972] reports that humans are at the very top of the index (with an encephalization quotient, or EQ, of 28.8), while Lepilemur is at the bottom (EQ=2.4).

Large gap between humans and great apes. There is a large gap between the encephalization of modern humans and all extant (present-day) non-human primates, including our closest relatives, the great apes. This large gap is filled, however, by analysis of the encephalization of our prehistoric hominid ancestors.

Brain enlargement disproportional. The enlargement of the human brain vs. non-human primates is not proportional (thereby possibly contradicting the earlier quote from Le Gros Clark). Stephan [1972, p. 174] notes: The enlargement of the brain is not proportional; that is, all parts do not develop at the same rate. The neocortex is by far the most progressive structure and therefore used to evaluate evolutionary progress (= Ascending Primate Scale).

Also of interest here is the additional remark in Stephan [1972], regarding comparative anatomy in this context (p. 174): "It must be stressed, however, that because our scientific approach is indirect, it can provide only inferences, not proofs."

Factors in Encephalization: Energy (Metabolism) and Diet


The reality of encephalization--the relatively large human brain--with its correspondingly high intelligence, is readily apparent. The object of current research and debate, however, is the examination of what evolutionary factors have driven the development of increased human encephalization. Such research provides insight into our evolutionary diet, and also reveals why any comparative "proof" that ignores intelligence and the significant impact of brain size on metabolic requirements is logically dubious.

Life cycle and energy requirements


Parker [1990] analyzes intelligence and encephalization from the perspective of life history strategy (LHS) theory, a branch of behavioral ecology. LHS is based on the premise that evolutionary selection determines the timing of major life-cycle events--especially those related to reproduction--as the solution to energy optimization problems.

Extensive energy required for brain growth. Parker discusses the life history variables in nonhuman primates, and then examines how life history events relate to large brain size, gestation period, maturity at birth, growth rates and milk consumption, weaning and birth intervals, age of puberty, and other events. The motivation for studying such events is that the brain is the "pacemaker of the human life cycle" [Parker 1990, p. 144], and the slow pace of most human life history events reflects the extensive energy required for brain growth and maintenance.

Foley and Lee [1991] analyze the evolutionary pattern of encephalization with respect to foraging and dietary strategies. They clearly state the difficulty of separating cause and effect in this regard; from Foley and Lee [1991, p. 223]: In considering, for example, the development of human foraging strategies, increased returns for foraging effort and food processing may be an important prerequisite for encephalization, and in turn a large brain is necessary to organize human foraging behaviour.

Dietary quality is correlated with brain size. Foley and Lee first consider brain size vs. primate
feeding strategies, and note that folivorous diets (leaves) are correlated with smaller brains, while fruit and animal foods (insects, meat) are correlated with larger brains. The energetic costs, both daily and cumulative, of brains in humans and chimps, over the first 1-5 years of life are then compared. They note [Foley and Lee 1991, p. 226]: Overall the energetic costs of brain maintenance for modern humans are about three times those of a chimpanzee. Growth costs will also be commensurately larger.


Then they consider encephalization and delayed maturation in humans (compared to apes), and conclude, based on an analysis of brain growth, that the high energy costs of brain development are responsible for the delay in maturation.

Dietary shift beginning with Homo. Finally, they consider the dietary shifts that are found in the fossil
record with the advent of humans (genus Homo), remarking that [Foley and Lee 1991, p. 229]: The recent debate over the importance of meat-eating in human evolution has focused closely on the means of acquirement... but rather less on the quantities involved... In considering the evolution of human carnivory it may be that a level of 10-20% of nutritional intake may be sufficient to have major evolutionary consequences... Meat-eating, it may be argued, represents an expansion of resource breadth beyond that found in nonhuman primates... Homo, with its associated encephalization, may have been the product of the selection for individuals capable of exploiting these energy- and protein-rich resources as the habitats expanded (Foley 1987a). The last sentence in the preceding quote is provocative indeed--it suggests that we, and our large brains, may be the evolutionary result of selection that specifically favored meat-eating and a high-protein diet, i.e., a faunivorous diet.

How dietary quality relates to the brain's share of total metabolic budget
The research of Leonard and Robertson [1992, 1994] provides an in-depth analysis of brain and body metabolism energy requirements. Relevant points from their research:

Dramatic changes in last 4 million years. Leonard and Robertson [1992, p. 180] note:
Evidence from the prehistoric record indicates that dramatic changes have occurred in (1) brain and body size, (2) rates of maturation, and (3) foraging behavior during the course of hominid evolution, between four million years ago and the present. Consequently, it is reasonable to assume that significant changes in metabolic requirements and dietary changes have also occurred during this period.

Human brain's metabolic budget significantly different from apes. They point out that anthropoid primates use ~8% of resting metabolism for the brain, other mammals (excluding humans) use 3-4%, but humans use an impressive 25% of resting metabolism for the brain. This indicates that the human "energy budget" is substantially different from all other animals, even our closest primate relatives--the anthropoid apes.

In contrast, total human resting metabolism not significantly different. In order to understand the relationship between metabolism (energy budget) and body size, Leonard and Robertson collected relevant data on metabolism and body size in primates, humans, and other mammals. Some of the data collected included body size, brain size, resting metabolic rate (RMR), brain metabolic rate (brain MR), total energy expenditure (TEE), etc. They also collected activity and energy expenditure data on a few hunter-gatherer societies, and select non-human primates. Statistical analysis of the data showed that:


[L]arge-brained anthropoids [great apes], as a group, do not depart from the general mammalian metabolism/body size relationship. Several individual species, however, do deviate markedly from the RMRs predicted by the Kleiber relationship. To summarize, they found that human RMR (resting metabolic rate)--that is, the overall total energy budget--did not deviate significantly from the value predicted by the Kleiber relationship; in other words, resting metabolic rate for humans is comparable to that of other mammals of similar body mass.

Human brain MR 3.5 times higher than apes. However, in marked contrast, they found
that humans have a radically different brain energy metabolism than the other animals. From Leonard and Robertson [1992, p. 186]: The human brain represents about 2.5% of body weight and accounts for about 22% of resting metabolic needs... At 315 kcal (1318 kJ), humans use over 3.5 times more of RMR to maintain their brains than other anthropoids (i.e., a positive deviation of 255%). Clearly, even relative to other primate species, humans are distinct in the proportion of metabolic needs for the brain.
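
As a rough consistency check on these figures (a sketch only; the 65-kg reference weight is an assumption carried over from the Kleiber example above, and the paper's own RMR figures were measured rather than predicted):

```python
# Checking the quoted figures against the Kleiber prediction.
brain_mr = 315                # kcal/day for the human brain, per the quote
rmr_65kg = 70 * 65 ** 0.75    # Kleiber estimate for a 65-kg body: ~1602
print(f"Brain share of RMR: {100 * brain_mr / rmr_65kg:.0f}%")  # ~20%
# This lands in the 20-25% range cited in the text, versus ~8% in other
# anthropoid primates and 3-4% in most other mammals.
```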

Important changes in diet of Homo erectus. Leonard and Robertson [1992] also applied
their statistical models to our prehistoric ancestors. Their analysis points to important changes in diet. From Leonard and Robertson [1992, p. 191]: The clear implication is that the diet of Homo erectus [note: Homo erectus evolved roughly 1.7 Mya] was not simply an australopithecine diet with more meat; rather there were important changes in both animal and vegetable components of the diet... What made meat an important resource to exploit was not its high protein content, rather, its high caloric return... In short, the early hunting-gathering life-way associated with H. erectus was a more efficient way of getting food which supported a 35-55% increase in caloric needs (relative to australopithecines)...

In a followup paper, Leonard and Robertson [1994] expanded the analysis of their 1992 paper by looking at the relationship of dietary quality to body size and metabolic rates. Important points from their 1994 paper:

Body size and dietary quality (DQ). When the relationship between body size and dietary
quality (i.e., the energy and nutrient content of the diet) was analyzed, the general relationship found was that larger primates, e.g., gorillas, have low-quality diets (gorillas are folivores), while the smaller primates have higher-quality (insectivorous) diets. Leonard and Robertson [1994, p. 78] note: In general, there is a negative relationship between diet quality (the energy and nutrient density of food items) and body size (Clutton-Brock and Harvey, 1977; Sailer et al., 1985).

Humans depart from normal DQ/body-weight relationship. Next, they calculated dietary
quality (DQ) indices for 5 hunter-gatherer groups, and 72 non-human primate species, relative to body weight. The observed DQ values for the hunter-gatherer groups were higher than predicted for a primate that weighs as much as a human. Leonard and Robertson [1994, p. 79] comment that:


Humans, however, appear to depart substantially from the primate DQ-body weight relationship (Fig. 1)... [T]he diets of the five hunter-gatherer groups are of much higher quality than expected for primates of their size. This generalization holds for most all subsistence-level human populations, as even agricultural groups consuming cereal-based diets have estimated DQs higher than other primates of comparable size... [E]ven in human populations where meat consumption is low, DQ is still much higher than in other large-bodied primates because grains are much more calorically dense than foliage.
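
For readers who want to see how such a DQ index behaves, below is a hedged sketch. The weights (1 for structural plant parts, 2 for reproductive plant parts such as fruit, 3.5 for animal foods) follow the scheme usually attributed to Sailer et al. [1985], which Leonard and Robertson build on; treat the exact weights as an assumption for illustration rather than a verbatim reproduction of the paper's method.

```python
# Sketch of a diet-quality (DQ) index. Weights are assumptions (see text):
# structural plant parts (leaves, stems) = 1, reproductive plant parts
# (fruit, seeds) = 2, animal foods = 3.5.

def dq(structural, reproductive, animal):
    """DQ from diet proportions summing to 1; scaled so a pure folivore
    scores 100 and a pure faunivore scores 350."""
    assert abs(structural + reproductive + animal - 1.0) < 1e-9
    return 100 * (1.0 * structural + 2.0 * reproductive + 3.5 * animal)

print(dq(1.0, 0.0, 0.0))  # 100.0 -- folivore (gorilla-like diet)
print(dq(0.3, 0.5, 0.2))  # 200.0 -- mixed diet, modest animal foods
print(dq(0.0, 0.4, 0.6))  # 290.0 -- strongly faunivorous diet
```

The point of the weighting is visible immediately: shifting intake from foliage toward fruit, and especially toward animal foods, raises DQ sharply--which is why a large-bodied primate with a hunter-gatherer diet scores far above the primate trend line for its weight.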

Availability of grains vs. animal food. A note here regarding cereal (i.e., grain) consumption:
Prior to the development of agriculture, grains would have been only a minuscule fraction of the human diet, because the technology to farm, harvest, store, and process them in quantity did not exist. In some locations grains may have been available seasonally, but without such technology their consumption would necessarily have been sharply limited. Hence prior to the development of agriculture 10,000 years ago (a tiny fraction of the 2.5-million-year existence of the Homo genus), grains were not a feasible option for increasing DQ.

DQ and RMR. Another statistical comparison was run to analyze the relationship between
dietary quality and RMR, or resting metabolic rate. (The previous comparisons just discussed were based on body weight.) While the model fit was not very good, the data plot suggests that human DQ may also be higher than expected when using RMR as the yardstick for comparison in lieu of body weight. (As the model fit is not good, the preceding is a hypothesis only.)

The paradox: Where does the energy for the large human brain come from?
In any event, as we have seen, what begs explanation is that humans "spend" far more energy on the brain than other primates: 20-25% of RMR vs. roughly 8% in the great apes. Yet the total human RMR remains in line with predictions based purely on body size. This presents a paradox: where do humans get the extra energy to "spend" on our large brains? As we will see later in the research of Aiello and Wheeler, the most feasible hypothesis is that the answer lies in considerations of dietary efficiency and quality. Leonard and Robertson [1994, p. 83] conclude: These results imply that changes in diet quality during hominid evolution were linked with the evolution of brain size. The shift to a more calorically dense diet was probably needed in order to substantially increase the amount of metabolic energy being used by the hominid brain. Thus, while nutritional factors alone are not sufficient to explain the evolution of our large brains, it seems clear that certain dietary changes were necessary for substantial brain evolution to take place. In other words, while the evolutionary causes of the enlarging human brain themselves are thought to have been due to factors that go beyond diet alone (increasing social organization being prime among the proposed factors usually cited), a diet of sufficient quality would nevertheless have been an important prerequisite. That is, diet would have been an important hurdle--or limiting factor--to surmount in providing the necessary physiological basis for brain enlargement to occur within the context of whatever those other primary selective pressures might have been.

To summarize: The significance of Leonard and Robertson's research [1992, 1994] lies in their analysis of energy metabolism, which reveals the paradox: How do humans meet the dramatically higher energy needs of our brains, without a corresponding increase in RMR (which is related to our body size)? They argue that the factor that allows us to overcome the paradox is our higher-quality diet compared to other primates. Of course, prior to the advent of agriculture and the availability of grains, the primary source of such increased dietary quality was the consumption of fauna--animal foods, including insects.


The Relationship of Dietary Quality and Gut Efficiency to Brain Size


The Expensive Tissue Hypothesis of Aiello and Wheeler
Aiello and Wheeler [1995] is an excellent research paper that neatly ties together many of the threads addressed in the research papers discussed up to this point in this section. The insights available from this paper provide yet another view into the paradox of how humans can afford, in metabolic energy terms, the high "cost" of our large brains without a corresponding increase in BMR/RMR (basal metabolic rate, also known as resting metabolic rate). Aiello and Wheeler begin by comparing the actual size of the "average" human brain (for a body weight of 65 kg) with the size predicted by scaling models. The actual weight is ~1.3 kg, while the predicted weight is 268 g, based on the scaling model of Martin [1983, 1990] (as cited in Aiello and Wheeler [1995]; the Martin model relates brain mass to body mass). They then note that despite the large brain size with its consequent disproportionate demands on metabolism, the total BMR for humans is nevertheless well within the range expected for primates and other mammals of comparable body size. This of course brings us back to the paradox described above: Where does the "extra" metabolic energy come from to power the enlarged human brain, if the human body's total BMR is no higher than that of other comparably sized mammals?
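
The arithmetic behind the quoted figures can be reconstructed directly (a sketch; the scaling coefficient is simply solved from the 268-g prediction quoted above rather than taken from Martin's papers, which are not reproduced here):

```python
# Back-calculating from the figures quoted above: a ~1.3-kg actual human
# brain vs. a 268-g prediction for a 65-kg body. Exponent 0.76 is the
# brain-mass scaling exponent discussed earlier in this section.

body_g = 65_000                        # 65-kg body, in grams
implied_coeff = 268 / body_g ** 0.76   # coefficient implied by 268 g
print(f"Implied scaling coefficient: {implied_coeff:.4f}")   # ~0.059
print(f"Actual vs. predicted brain mass: {1300 / 268:.1f}x") # ~4.9x
```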

Large brain compensated for by decreased gut size. Aiello and Wheeler then analyze the other
"expensive" organs in the body (expensive in terms of metabolic energy): heart, kidneys, liver, and gastrointestinal tract, noting that together with the brain, these organs account for the major share of total body BMR. Next they analyze the "expected" sizes of these major organs for a 65-kg non-human primate, and compare these with the actual organ sizes for an average 65-kg human. Their figure 3 (below) illustrates the dramatic differences between the expected and actual sizes of the human brain and gut: The larger-than-expected size of the human brain is compensated for by a smaller-than-expected gut size.


Aiello and Wheeler [1995, pp. 203-205] note: Although the human heart and kidneys are both close to the size expected for a 65-kg primate, the mass of the splanchnic [abdominal/gut] organs is approximately 900 g less than expected. Almost all of this shortfall is due to a reduction in the gastrointestinal tract, the total mass of which is only about 60% of that expected for a similar-sized primate. Therefore, the increase in mass of the human brain appears to be balanced by an almost identical reduction in the size of the gastrointestinal tract.... Consequently, the energetic saving attributable to the reduction of the gastrointestinal tract is approximately the same as the additional cost of the larger brain (table 4).
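
A back-of-the-envelope energy balance shows why the quoted masses roughly "cancel." The mass-specific metabolic rates below (~11.2 W/kg for brain tissue, ~12.2 W/kg for the gut) are of the order used by Aiello and Wheeler (after Aschoff and Wenzel); treat the exact values as assumptions in this sketch.

```python
# Rough check of the expensive-tissue trade-off described above.
BRAIN_W_PER_KG = 11.2   # assumed mass-specific metabolic rate of brain
GUT_W_PER_KG = 12.2     # assumed mass-specific metabolic rate of gut

brain_increase_kg = 1.300 - 0.268  # actual minus "expected" brain mass
gut_decrease_kg = 0.900            # gut shortfall quoted from the paper

print(f"Extra brain cost: {BRAIN_W_PER_KG * brain_increase_kg:.1f} W")  # ~11.6
print(f"Gut saving:       {GUT_W_PER_KG * gut_decrease_kg:.1f} W")      # ~11.0
# The two figures roughly cancel: the metabolic cost of the enlarged
# brain is offset by the reduced gut, leaving total BMR unchanged.
```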

Less energy-intensive gut associated with higher dietary quality. The authors further point out
that the reduction in gut size is necessary to keep the BMR at the expected level. Additionally, they argue that the liver, heart, and kidneys cannot be significantly reduced in size to offset the energy costs of encephalization because of their highly critical functions; only the gut size can be reduced. Since gut size is associated with dietary quality (DQ), and the gut must shrink to support encephalization, this suggests that a high-quality diet is required for encephalization. That is, a higher-quality diet (more easily digested, and liberating more energy/nutrients per unit of digestive energy expended) allows a smaller gut, which frees energy for encephalization. Figure 5 from Aiello and Wheeler [1995] illustrates the diet/gut size/encephalization linkages.


Aiello and Wheeler deduce that the obvious way to increase DQ among early hominids is to increase the amount of animal foods (meat, organs, insects) in the diet. They note [Aiello and Wheeler 1995, p. 208]: A considerable problem for the early hominids would have been to provide themselves, as a large-bodied species, with sufficient quantities of high-quality food to permit the necessary reduction of the gut. The obvious solution would have been to include increasingly large amounts of animal-derived food in the diet (Speth 1989; Milton 1987, 1988).

Advent of cooking may have promoted further encephalization by reducing digestive energy required. Finally, in what will be controversial to raw-fooders, Aiello and Wheeler, after arguing that the
first major increase in encephalization was due to increased consumption of animal foods, next propose that the second major increase in brain size (with the appearance of archaic Homo sapiens) was due to the appearance of cooking practices. (Archaic Homo sapiens coincides with a timeframe of roughly 100,000 to 400,000 years ago.) Cooking neutralizes toxins and increases digestibility (of starch, protein, beta-carotene), and might make the digestion of cooked food (vs. raw) less "expensive" in metabolic energy terms--thereby freeing up energy for increased encephalization. It may be the case, therefore, that in evolutionary terms cooked food could have been responsible for some of the later increases in brain size and--since increased brain size is associated with increased technology--intelligence as well. The authors note [Aiello and Wheeler 1995, p. 210]: Cooking is a technological way of externalizing part of the digestive process. It not only reduces toxins in food but also increases its digestibility (Stahl 1984, Sussman 1987). This would be expected to make digestion a metabolically less expensive activity for modern humans than for non-human primates or earlier hominids. Cooking could also explain why modern humans are a bit more encephalized for their relative gut sizes than the non-human primates (see fig. 4). Note that while this particular aspect of Aiello and Wheeler's research is speculative, it is at least based on the existence of supportive evidence. Fruitarian readers might wish to compare the above analysis with the pejorative slogans about cooked food popular in the raw/fruitarian community, which are often based on little more than emotional vehemence.

Potential subterfuges by extremists. Note to readers: The 1995 paper by Aiello and Wheeler is
excellent and, if you have an interest in this topic, makes very worthwhile reading; it is available in many university libraries. At the end of the paper there is a series of comments by other scholars, followed by a rejoinder from Aiello and Wheeler, which gives good insight into the process of scientific inquiry and
debate. If you take the time to locate this paper, I would encourage reading not only the primary report, but the comments and rejoinder following. Reason: In my experience and opinion, there is enough of a tendency among raw/veg*n extremists to avoid the primary evidence in cases like this, while finding minor points to nitpick, that it would not be surprising if certain (fruitarian) extremists were to try to use, in an illicit manner, the criticisms by the commenting scholars--e.g., use the material without proper attribution, without consideration of the rejoinder remarks by Aiello and Wheeler, and possibly (the worst extremists of all) the material could be plagiarized or used in violation of copyright. This may sound inflammatory to some readers--please be assured, however, that this comment is based on hard experience with extremists. (See Previous Chapter)

Recent brain-size decreases in humans as further evidence of the brain/diet connection

Why has brain size decreased 11% in the last 35,000 years and 8% in the last 10,000?
Another interesting observation about the brain/diet connection comes from recently updated and more rigorous analysis of changes in human brain size over the last 1.8 million years. Ruff, Trinkaus, and Holliday [1997] found that the encephalization quotient (EQ) reached its peak with the first anatomically modern humans of approximately 90,000 years ago and has since remained fairly constant [see p. 174, Table 1]. Most surprisingly, however, absolute brain size has decreased by 11% since 35,000 years ago, with most of this decrease (8%) coming in just the last 10,000 years. (The decrease in absolute brain size has been paralleled by roughly similar decreases in body size during the same period, resulting in EQ values that have remained roughly the same as before.)

The significance of constant EQ vs. shrinking brain size in context. These data suggest two
points. The first point--relating to EQ--is subject to two possible interpretations, at least on the face of it. One interpretation (characterized by somewhat wishful thinking) might be that, if we disregard the absolute decrease in brain and body size, and focus only on EQ, we can observe that EQ has remained constant over the last 10,000-35,000 years. One could then further conjecture that this implies humans have in some sense been successful in maintaining dietary quality during this time period, even considering the significant dietary changes that came with the advent of the agricultural revolution (roughly the last 10,000 years). However, the problem with such an interpretation is exactly that it depends on disregarding the information that overall body size diminished along with brain size--a most important point which needs to be taken into account. The alternative, more plausible and genetically consistent interpretation begins by noting that EQ represents a genetically governed trait determined by our evolutionary heritage. Hence one would not expect EQ itself to have changed materially in just 10,000 years: such a brief period of evolutionary time is unlikely to have been long enough for the actual genetics governing EQ (that is, relative brain size compared to body size) to have changed significantly, regardless of dietary or other conditions.

Dietary/physiological mechanism may be responsible. This brings up the second point, which is
that the specific question here concerns a slightly different issue: the absolute decrease in brain size rather than the issue of EQ. Since the greatest majority of this decrease took place in just the last 10,000 years, a genetic mutation is no more likely as an explanation for the decrease in absolute brain size than it is for relative brain size, or EQ. This leaves us once again with a physiological/biochemical mechanism as the responsible factor, which of course puts diet squarely into the picture. (Not to mention that it is difficult to imagine plausible evolutionary selective pressures for brain size--primarily cultural/social/behavioral--that could conceivably be responsible for the reversal in brain size, since human cultural evolution has accelerated considerably during this period.)

Far-reaching dietary changes over the last 10,000 years. This leaves us with the indication that
there has likely been some kind of recent historical shortfall in some aspect of overall human nutrition--one that presents a limiting factor preventing the body/brain from reaching their complete genetic potential in terms of absolute physical development. The most obvious and far-reaching dietary change during the last 10,000 years has, of course, been the precipitous drop in animal food consumption (from perhaps 50% of diet to 10% in some cases) with the advent of agriculture, accompanied by a large rise in grain consumption--a pattern that persists today. This provides suggestive evidence that the considerable changes in human diet from the previous hunter-gatherer way of life have likely had--and continue to have--substantial consequences.

Brain growth dependent on preformed long-chain fatty acids such as DHA. The most plausible
current hypothesis for the biological mechanism(s) responsible for the absolute decrease in brain size is that the shortfall in consumption of animal foods since the late Paleolithic has brought with it a consequent shortfall in consumption of preformed long-chain fatty acids [Eaton and Eaton 1998]. Specifically, the brain depends on the fatty acids DHA (docosahexaenoic acid), DTA (docosatetraenoic acid), and AA (arachidonic acid) to support its growth during development, particularly in infancy. These are far more plentiful in animal foods than in plant foods. Eaton et al. [1998] analyze the likely levels of intake of EFAs involved in brain metabolism (DHA, DTA, AA) in prehistoric times, under a wide range of assumptions regarding possible diets and EFA contents. Their model suggests that the levels of EFAs provided in the prehistoric diets were sufficient to support the brain expansion and evolution from prehistoric times to the present, and their analysis also suggests that the current low levels of EFA intake (provided by agricultural diets) may explain the recent smaller human brain size.

Rate of synthesis of DHA from plant-food precursors does not equal amounts available in animal foods. Although the human body will synthesize long-chain fatty acids from precursors in the diet
when they are not directly available, the rates of synthesis generally do not match the levels obtained when these fatty acids are supplied directly in the diet. This is particularly critical in infancy, as human milk contains preformed DHA and other long-chain essential fatty acids, while plant-food-based formulas do not (unless they have been supplemented). Animal studies indicate that synthesis of DHA from plant-source precursor fatty acids does not match the levels of DHA observed when preformed DHA is included in the diet: Anderson et al. [1990] as cited in Farquharson et al. [1992], Anderson and Connor [1994], Woods et al. [1996]. Similar results are reported from studies using human infants as subjects: Carlson et al. [1986], Farquharson et al. [1992], Salem et al. [1996]. For a discussion of the above studies, plus additional studies showing low levels of EFAs in body tissues of vegans, see Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism: Essential Fatty Acids.

To summarize: The data showing that human brain size has fallen 11% in the last 35,000 years--with the bulk of that decrease (8%) coming in the last 10,000 years--furnishes suggestive corroborative support for the hypotheses explored earlier in this section: that increasing brain development earlier in human evolution is positively correlated with the level of animal food in the diet. It also indicates that animal food may be a key component of dietary quality (DQ) that cannot be fully substituted for by increasing other components of the diet (such as grains) in its absence. This indication is important to consider, because the evidence available on the food practices of more recent prehistoric humans (and, of course, humans today) can be assessed in more depth and at higher resolution than dietary inferences about earlier humans. In conjunction with the data on DHA synthesis in the body vs. obtaining it directly from the diet, this provides a potentially important point of comparison for assessing hypotheses about the brain/diet connection.

Fruitarian Evolution: Science Fact or Science Fiction?


Despite the material in this and the preceding sections, some fruitarian extremists are likely to continue to promote the tired, discredited "party line," i.e., that humans evolved under a fruitarian diet, and that the natural diet of humans is exclusively fruit or a very-high-percentage fruit diet. (Such claims were assessed and found unsupportable earlier in this paper.) However, there are a few aspects of these claims that merit additional discussion.

Vague claims about an ancient frugivorous primate ancestor


First, those who make such claims may refer to some vague, ancient, frugivorous primate ancestor, implying that such an ancestor somehow proves humans are natural fruitarians. There are two major problems with this:

The reference to an ancient frugivorous ancestor is so vague that it is meaningless. True, there were ancient frugivorous primates. However, the reference mixes up the diverse categories of primates, hominoids, and humans such that no meaningful statements can be made. Another problem here is that--as has been discussed earlier--the type of fruits eaten by earlier frugivorous apes included tougher and more fibrous fruits, considerably different in character from the highly bred and far sweeter varieties developed for commercial production in modern times.

Humans have been eating meat since the dawn of the Homo genus. Humans appeared with the advent of a brand-new genus (Homo) ~2.5 million years ago, and evolved on the savanna--a very different environment from the forest home of the great apes. From the very inception of our genus, humans have been eating animal foods; there is overwhelming scientific evidence to support this point. (Some of the evidence is discussed in this and the preceding section; also see Part 1 of the Paleolithic Diet vs. Vegetarianism interview series, available on this site, for additional information and citations.) The diet of some vague prehistoric frugivore that may or may not be an ancestor is irrelevant in light of the status of humans as a new genus with a different diet (i.e., eating more animal foods) and evolving in a different environmental niche. In contrast to the extensive fossil record evidence of meat in the evolutionary diet, there is virtually no credible scientific evidence of a strict fruitarian or veg*n diet among our prehistoric human (and australopithecine) ancestors.

No fruitarian, or even vegan, hunter-gatherer societies have ever been found. Further, there is
no evidence to indicate there ever existed, in the past, a fruitarian (or veg*n) hunter-gatherer society. Even in the tropical rainforest, hunter-gatherers eat meat. (The Ache of the Paraguayan forest, one of the best-studied of all hunter-gatherer tribes, are a prime example, with an average of over 50% meat consumption throughout the year, ranging from 47-77% depending on the season [Hill, Hawkes, Hurtado, and Kaplan 1984].) There is no evidence of any fruitarian societies, and--more to the point--the extensive anecdotal evidence (virtually the only evidence available) on modern attempts at (strict) fruitarianism indicates that it may work for a short while but almost always fails in the long run. (Even the fruitarian extremist "experts" often fail to follow the diet strictly, in the long term.)

Crank science and logical fallacies used in support of claims of fruitarian evolution
Returning to the false claims that humans evolved as fruitarians, the primary additional "evidence" presented in support of such claims consists of additional claims such as:

Egocentric claims. Unscientific, blatantly egocentric claims that eating foods other than fruit is "maladaptive," or "in error." (These claims were discussed earlier in this paper, and are also reviewed in the article Selected Myths of Raw Foods.)

Denial of evolutionary adaptation. Fallacious claims that humans did not or could not adapt, via evolution, to a diet that included animal foods (fauna), despite ~2.5 million years of evolution on such diets. (The topic of physiological adaptation is discussed later herein.)

Straw-man arguments based on SAD/SWD diets, i.e., citation of clinical research that shows conventional vegan diets (not fruitarian diets) are better than the standard Western diet. The logical fallacies of this approach (which uses the evidence against the typical Western diet to create a "straw man" to stand in for all omnivorous diets) are discussed in a later section.

Uninformed comparative anatomy. Claiming that comparative anatomy indicates we evolved to become fruitarians. This claim is addressed in the sections that follow.

"Protein is toxic" claims. The presentation of ancillary crank science claims, i.e., that "protein is toxic," in the sense that any excess protein--any above a very low standard, much lower than cited even in conventional vegan research--produces metabolic by-products that are poisonous and will harm you. This crank science theory is then used to "prove" that animal foods are unnatural because they allegedly contain excess protein. Such theories are based on amazing ignorance of the reality that the human body is very well equipped to dispose of the metabolic by-products of protein digestion. (Note: See the article "Is Protein Toxic?")

Paranoia-based theories. Another factor in such crank theories is that they appear to be based on an intensely paranoid, pathological fear of any/all possible "toxins" from food; hence such theories may actually (indirectly) promote obsession with unachievable ideals of total dietary purity, sometimes leading to eating disorders.

"Fruit is like mother's milk" analogies. Claiming that human milk (a food whose calories are predominantly from fat) is "just like" sweet fruit--a food whose calories are predominantly from sugar. (Most people seem easily enough able to tell the difference between fat and sugar, though apparently certain fruitarian extremists have major, inexplicable difficulties in perceiving this distinction.) This claim has been blown up into a major bulwark used by some to promote fruitarianism, and is another prime example of bogus crank science. (See the article Fruit is Not Like Mother's Milk for details.)

Fruitarian denial of physiological evolution/adaptation

What are the claims? Given the extensive fossil record evidence of meat consumption and physical evolution since the first appearance of the human genus ~2.5 million years ago, those who wish to promote the theory that humans evolved on, and are adapted to, a diet of only fruit (with perhaps a small amount of other plant foods) must make the following three unusual claims:

1. Despite the fossil record evidence of physical/anatomical evolution in the last ~2.5 million years, physiological evolution--specifically of the digestive system--did not and in fact could not have occurred.

2. Because of the preceding, humans are still adapted to a fruitarian diet because of the (mystical) alleged ancient frugivorous primate ancestor, discussed in a previous section.

3. Similarly, those who make the above two claims must also claim that meat-eating is "maladaptive," and that somehow the human genus survived and evolved (except for the digestive system, of course) over ~2.5 million years of evolution on a "maladaptive" diet! (Does common sense tell you anything about such far-fetched claims?)

Note that those who claim humans evolved to be strict veg*ns, whether fruitarian or not, also must make claims similar to the above to be able to somehow "explain away" the evidence of long-term meat consumption provided in the fossil record.

Analysis of the Fruitarian Claims


Let's now examine, in depth, some of the "evidence" and additional claims raised by promoters of the fruitarian evolution theory.

CLAIM: Morphological evolution is "easy," physiological evolution is nearly impossible.


The basic claim is that mutations that impact morphology--skin color, bone size, etc.--have a minor impact on the organism, while mutations that produce physiological changes occur under what seem to be more complicated interactions, hence are less likely to succeed or enhance survival.

REPLY: The claim is not only misleading and an implicit oversimplification, but also a good example of the way such claims are often hastily grasped at without being thought through carefully. Because changes within the human body occur in a system in which morphology and physiology are unified, physical changes often cannot be neatly divided into two distinct categories, morphological and physiological. In reality, morphology and physiology are closely interrelated. Such morphological parameters as body size and shape are in fact regulated by hormones, which are part of your physiology.

Hormone regulation of morphology/physiology. For example, bone growth and size of the
bones/body are regulated by pituitary growth hormones and sex hormones. In turn, the growth hormones are themselves regulated by releasing hormones produced by the hypothalamus, which is in the brain. (See Tortora and Anagnostakos [1981], and Tanner [1992] for relevant discussion of hormones.) This implies that any significant evolutionary change in morphology requires an associated simultaneous change in physiology, i.e., the relevant hormones must change as well. Thus we observe that a binary or "black-and-white" classification of changes as either morphological or physiological does not match well with reality. Once again, fruitarian extremists are engaging in the binary thinking and oversimplification that is characteristic of such dietary dogma. As a postscript to this topic, the remark of Tanner [1992b] is relevant: Differential growth rates are very often the mechanism of morphological evolution in primates... The above suggests that morphological changes are driven by changes in growth rates, which are determined by hormones (physiology). This is quite interesting, for it indicates that morphological evolution is driven by physiological evolution.

CLAIM: The digestive system is extremely complex. For it to evolve in a mere 2.5 million years is simply not possible.

REPLY: The above is an unsupported rationalization. Consider that the human brain is far more complex than the digestive system, and that the physiology of the entire body is regulated via the autonomic nervous system, which is controlled by the brain. More precisely, some of the control of the autonomic nervous system rests in the cortex [Tortora and Anagnostakos 1981, p. 374], and the cortex is the part of the human brain that is larger and has evolved further vis-a-vis the great apes [Stephan 1972]. Thus we observe that the "master controller" of human physiology is the brain, with the cortex playing a key role. Then we note that the human brain has in fact evolved (quite considerably) during this period; i.e., the "master controller" of human physiology has evolved significantly in the ~2.5 million years since the human genus first appeared. Thus, the bizarre claim that the physiology of the human digestive system could not have evolved during the same period is easily seen as the fantasy it is.

CLAIM: Expert opinion is that physiological evolution is highly unlikely. From Jones [1992, p.
286]: Mutations that deviate from the norm often interfere with biochemical processes and are rigorously removed by selection. Much of the body is made up of the building blocks of cells, which must fit accurately with one another... Any mutation that changes the shape of such molecules will almost always be at a disadvantage as it reduces their ability to interact with each other or with their substrate. Stabilising selection of this kind can be a very conservative force. Stabilising selection is a powerful agent that leads to genetic homogeneity. To explain how genetic variation is maintained in the face of such widespread censorship by natural selection is one of the main problems of population genetics.


REPLY: The last sentence in the expert opinion quoted indicates, of course, that physiological evolution
happens anyway. How this happens is precisely one of the interesting challenges spurring current research. The above quote has been used to support the claim that physiological evolution cannot have occurred. Yet genetic variation--and hence physiological evolution--happens anyway. The quote here has simply been twisted and misrepresented.

CLAIM: There is no convincing teleonomy proof that humans have adapted to a diet that includes meat!

REPLY: There is also no convincing teleonomy proof that humans have adapted to a strict fruitarian diet!
The appeal to teleonomy is, thus, a red herring (i.e., a diversion) and nothing more. Teleonomy, the scientific study of adaptations, is sometimes used as a defense of fruitarian evolution. Basically, the teleonomy proof argument is nothing more than a diversion--the fruitarian extremist demands what is implied to be definitive "proof," a teleonomy study or analysis, that "proves" humans have adapted to meat in the diet. Such claims are simply diversions, however, because those who make them do not have teleonomy "proof" that humans are adapted to a fruitarian diet either. In effect, the claim is used (if unconsciously) as a smokescreen to divert attention from the total lack of legitimate scientific evidence to support the fruitarian evolution theories. A few comments on teleonomy are appropriate here. Thornhill [1996, p. 107] provides an introduction to teleonomy, and he defines teleonomy as: ...[T]he study of the purposeful or functional design of living systems and the directional pressures that have designed adaptations during long-term evolution. Focus on evolutionary design and function. Teleonomy is primarily interested in phenotypic design and adaptation as they relate to the selective pressures that drive evolution. The focus is on the evolutionary function of adaptations, rather than the evolutionary origins of adaptations (see Thornhill [1996, p. 108]). The role of genetics in teleonomy studies is unclear from the discussion in Thornhill [1996], as Thornhill appears to regard such information as being of very limited value (pp. 116, 122-124). A relevant point here is that the fruitarian extremists who use the teleonomy argument may also simultaneously demand "proof" in the form of genetic models (which Thornhill, the teleonomist, suggests are of limited value in teleonomy). It seems odd to demand genetic "proof" in the context of a system where it is used only occasionally.

Criticisms and limitations of teleonomy. Teleonomy is also known as the adaptationist program, and
a well-known critique of the subject is provided by Gould and Lewontin [1994]. An interesting paper on the many statistical challenges faced in doing teleonomy studies is Pagel and Harvey [1988]. As will be discussed later in this paper, humans are unique in nature by virtue of certain important features: high intelligence, language, etc. This uniqueness, coupled with the serious statistical problems inherent in cross-species comparisons, presents major statistical and analytical challenges to those who wish to unequivocally "prove" adaptation in humans (vis-a-vis other species) via teleonomy studies. As such, perhaps the best way to view teleonomy is as one analytical tool (out of many tools) available to the researcher, rather than the "best tool" or "only (valid) tool," which is what fruitarian extremists appear to imply in their use of teleonomy as a last-ditch defense of their crank science theories.

CLAIM: There are no examples of animals evolving backwards from a vegetarian to an omnivorous diet!


REPLY: Evolution does not move backwards or forwards. Evolution occurs as the result of
changes over time, the result of selective pressures in the environment (which can include behavior, especially for humans, per discussion in a previous section). The use of the word "backwards" shows the emotional nature of the fruitarian evolution claims. The idea that evolving from a vegetarian diet to an omnivorous (or faunivorous) diet is "backwards" is purely a subjective bias. Nature simply IS; our subjective opinions of the process are irrelevant.

CLAIM: If humans are adapted to eating meat, what exactly are those adaptations?

REPLY: The answer to this question will be apparent in the later sections of this paper. Though a short
answer could be given here, it would remove the element of surprise from some of the following sections. This question is answered and addressed later with relevant details and context.

In summary, those who cling to crank science/science-fiction theories that humans evolved as fruitarians must also cling to incorrect logic and fantasies which state:

That the human digestive system is an evolutionary throwback to some mystical frugivorous ancestor, and,

That the human body, including the brain--but not the digestive system!--has evolved over the last ~2.5 million years.

In other words: Those who cling to false fruitarian evolution theories are claiming that evolution works, but not for the digestive system; the brain can evolve but the stomach cannot. Such bizarre assertions place the fruitarian evolution theory squarely in the realm of science fantasy rather than fact.

Further Evidence Against the Claims of Fruitarian Evolution


Fully upright bipedal posture, lack of arboreal adaptations
Unfortunately for the proponents of fruitarian evolution theories, there is evidence that we did not evolve as pure fruitarians, over and above the extensive fossil record evidence of human consumption of animal foods (fauna).

Quadrupedal vs. bipedal adaptations and tree-climbing. First, humans are fully upright and
bipedal, while apes are quadrupedal (though chimps, bonobos, and gorillas, of course, do have limited bipedal abilities). A fully upright bipedal posture is a major disadvantage in tree-climbing compared to quadrupedal motion--it is easier and safer to navigate trees when one can grasp and hold with four extremities rather than two, as humans must. That is, a fully upright bipedal posture is a disadvantage, overall, in collecting fruit--and hence would reduce survival of the alleged "pure fruitarian" prehistoric humans or proto-humans.

Bipedalism/quadrupedalism and fruit-picking. The hypothesis has been advanced that bipedalism
evolved because it is more efficient to pick small, low-growing fruits [Hunt 1994]. Standing up to pick low-growing fruits allows use of both hands to pick fruit, and may allow more efficient harvesting of small fruit from small trees. However, this hypothesis [Hunt 1994] that bipedalism evolved as a fruit-picking strategy specifically refers to chimps and australopithecines. Even then, it is well-known that the chimp diet is not exclusively fruit; it also includes small but nutritionally significant amounts of fauna (insects, meat), along with other foods such as leaves and pith. Also, chimps retain quadrupedal motion and arboreal capabilities, which they use for efficient fruit collection in larger trees. Hunt [1994] also discusses special muscular adaptations in chimps for vertical climbing (trees) and hanging (from trees). As for Australopithecus, isotope studies indicate that Australopithecus was an omnivore/faunivore, and not a fruitarian. (See Sillen [1992], Lee-Thorp et al. [1994]; also Sillen et al. [1995].) Thus even if bipedalism did evolve initially as a specialized fruit-picking mechanism, the evolution occurred within the framework of an omnivorous/faunivorous diet (or a frugivorous diet that specifically included nutritionally significant amounts of fauna), not a strict fruitarian diet.

Actual fruit availability/dependability fluctuates with location and time. Note that the
hypothesis of Hunt [1994] does not suggest or imply that chimps or Australopithecus had a fruit-only diet. Also note that even in a savanna environment, fruit is not limited to low-growing trees. Consider the major restrictions placed on the survival of our prehistoric ancestors (or even modern-day chimps) if their diet were limited to only fruit picked within 2-3 meters of the ground. Such a restriction would greatly reduce survival, not increase it. A pure fruitarian needs a year-round supply of large amounts of fruit, and the ability to efficiently harvest as much of the available fruit as possible. That is, one needs to be able to efficiently pick most of an entire, tall tree--hence needs effective tree-climbing ability, that is, quadrupedal motion and arboreal adaptations.

Different adaptations, different tradeoffs. In contrast, an omnivore/faunivore for whom fruit is only a part of the diet, and who effectively hunts/harvests other high-calorie foods (e.g., animal foods), can easily afford the loss of overall efficiency in fruit harvesting that fully upright bipedalism (as in humans) involves. This is particularly true if bipedalism enhances the ability to harvest high-calorie foods (fauna) by increasing hunting efficiency. (Hunting efficiency is discussed in a later section.) The set of adaptations of chimps--limited bipedal ability that improves efficiency in picking fruit from small trees, combined with quadrupedal/arboreal adaptations to efficiently harvest fruit from tall trees--appears to be the optimal adaptation set for a primarily frugivorous animal. However, humans clearly do not have all of these important survival adaptations. In regard to humans, Hunt [1994] argues that (p. 198): Accordingly, scavenging, hunting, provisioning and carrying arguments for the origin of bipedalism (Shipman, 1986; Sinclair et al., 1986; Carrier, 1984; Wheeler, 1984; Lovejoy, 1981; Jungers, 1991) are more convincing explanations for the refinement of locomotor bipedalism in Homo erectus, as are heat stress models (Wheeler, 1993).

Adaptations must be interpreted in context. That is, even if the hypothesis of Hunt [1994] is correct, the later improvements in bipedal motion associated with Homo erectus (which evolved approximately 1.7 million years ago) are likely associated with the role of early humans as hunters, and not as fruit-gatherers. Inasmuch as humans are a separate genus from chimps and Australopithecus, the later developments are, of course, more relevant. Aiello [1992, p. 43] points out that: In fact, at walking speeds human bipedalism is significantly more efficient than ape quadrupedalism. Needless to say, a fruitarian ape needs quadrupedal abilities for harvesting fruit from tall trees, and the lack thereof (in humans) suggests that harvesting fruit was much less important than the fruitarian evolution advocates claim. Additionally, note that humans did not evolve the special (orangutan) hand adaptations for tree-climbing (this is discussed further in a later section, with accompanying citations), and we also lack the special muscular adaptations of chimps [Hunt 1994] for arboreal motion. In contrast, fully upright bipedalism made walking--and hunting--far more efficient, at the cost of reduced efficiency (overall) in fruit harvesting.

Body size and tree-climbing. Finally, the body size of humans is arguably too large for a pure fruitarian. Humans are larger than orangutans, the most arboreal of the great apes, and we are larger than chimps. Hunt [1994] observed that larger chimps favored shorter trees, thereby minimizing the energy expended in climbing. The even larger body size of humans (compared to chimps or orangutans) also reduces our efficiency in harvesting fruit growing on the outer canopy of tall trees (where a substantial part of a tree's fruit is located), as our weight restricts us to relatively large branches. (Small branches could break, and falling out of trees certainly does not promote evolutionary survival.) Temerin et al. [1984, p. 225] note that: Arboreal travel is precarious at best, and increasing size makes compensatory demands for stability increasingly important. One response to this is to limit travel speeds. The remarks of Aiello [1992, p. 44] provide further insight: In larger animals, [tree] climbing can be two to three times more costly in energy terms than horizontal locomotion, a relationship that may explain why arboreality is largely confined to smaller primates. The obvious exception to this pattern is the arboreal orang-utan--the largest arboreal mammal with an approximate average body weight of 55 kilograms. It is, however, both highly specialized in its anatomy and fairly lethargic, climbing slowly and deliberately.

In summary, it seems quite odd that humans allegedly evolved to eat a nearly 100% fruit diet, yet our body size may be "too big" for an efficient frugivore, and we lack the major ape adaptations (quadrupedalism; hand and muscle adaptations) needed to efficiently collect fruit. That is, the fruitarian claim that we are adapted to a diet of fruit ignores the real-world needs of animals that eat large percentages of fruit: they must be able to collect that fruit efficiently. The obvious conclusion here is that the claim humans evolved as strict fruitarians is crank science (or, if you prefer, science fiction).

Fruitarian nutrition vs. brain evolution


Finally, the question of whether a fruitarian diet could supply the nutrition needed to drive brain evolution must be considered. All available evidence indicates that the answer is no. Some of the relevant reasons are:

Vitamins, minerals, and fats. Fruit is deficient in vitamin B-12; pure fruit diets may be
deficient in zinc, and even calcium. The EFA (essential fatty acid) balance in sweet fruit, the diet promoted by fruitarian extremists, is not optimal. Needless to say, vitamin B-12 (and mineral) deficiencies do not promote evolutionary survival, and a poor balance of EFAs does not promote brain development either. (Note: B-12, zinc, and EFAs are discussed in a later section herein.)

Evolution/survival on pure fruit diets not possible in nature given fluctuations in fruit availability. Getting enough fruit to survive on, every day of the year, is unlikely even in a
pristine, primary tropical rainforest. Preceding sections have discussed that orangutans apparently cannot get enough fruit in their remote tropical rainforest homes to live on fruit all year. Instead, they are forced to eat large amounts of leaves when fruit is scarce (see Knott [1998], MacKinnon [1974, 1977] as cited in Chapman and Chapman [1990]). If orangutans--quadrupeds with superb tree-climbing skills--cannot get enough fruit to survive on a (pure) fruitarian diet, it seems highly unlikely that humans (who are larger than orangutans, require more food, and are less adapted for tree-climbing) could consistently find enough fruit in a tropical rainforest (or savanna, where fruit is less plentiful), every day of the year, to satisfy their minimum calorie requirements.


By the way, climbing tall trees to pick fruit (orangutans and other primates do not rely on fruit that falls to the ground--apparently inadequate as a sole food source) is hard, dangerous work and greatly increases daily calorie requirements, which further increases the amount of food/fruit required. One wonders if the fruitarian extremists who promote idealistic visions of life in a jungle paradise filled with fruit trees have ever tried to pick more than minimal quantities of wild fruit--especially in a rainforest where the lowest fruit-bearing branches on a tree may be 20-30 or more meters straight up. Finally, no doubt some fruitarian extremists will claim that adequate fruit to support pure fruit diets was in fact available in prehistoric times. This is of course nothing but speculation, without reference to any kind of credible paleobotanical evidence. Even more relevant: where is the evidence of an adequate year-round supply of fruit (adequate to support a fruit-only diet for our prehistoric ancestors) on the African savanna of 1-3 million years ago, in the days of Australopithecus, and subsequently in the early days of the newly evolved Homo genus? The obvious conclusion is that a pure fruitarian diet is not feasible for hunter-gatherers or our prehistoric ancestors, as fluctuations in the fruit supply make survival on a pure fruitarian diet unlikely in the short run, and a virtual impossibility in the long run. If you cannot survive on the diet, it certainly cannot support brain evolution, or any other kind of evolution.

Synopsis and Section Summary


We have seen that:

- Humans are the most encephalized and intelligent species in all of nature.
- The human basal metabolic rate (BMR) is normal for our body size, yet we invest far more energy in our brains than other mammals. This is a paradox: how can we "afford" it? (A rough arithmetic sketch of this energy bookkeeping follows at the end of this summary.)
- The increase in encephalization in humans appears to be compensated for by a comparable decrease in gut size.
- A decrease in gut size with constant body mass and BMR is possible only via an increase in dietary quality (DQ).
- Prior to the development of agriculture, the only feasible/economic way to increase DQ was via consumption of animal foods.
- The available evidence supports the idea that the consumption of animal foods, including meat, was an important factor in encephalization and in the evolution of the human brain. That is, your large brain--which helps you choose your diet (and argue with others about diet)--is apparently, according to the most plausible interpretation of the evidence, the result of the evolutionary legacy of ~2.5 million years of meat consumption by your ancestors.

Much of the material in this section comes from paleoanthropology, and is based on an analysis of the available data. The net effect of so much evidence, all supporting the same conclusion (that animal foods were a driving force in brain evolution), is both powerful and convincing. However, some readers might dissent and remind us that although the evidence is substantial, parts of it may be subject to different interpretations. That, however, is a result of the nature of parts of the available evidence, e.g., the fossil record. In this case, those who dissent are faced with the daunting task of plausibly supporting a picture of how brain evolution could have occurred on a purely plant-food/fruit diet without agriculture.


- Human brain size has decreased 11% in the last 35,000 years, and 8% in the last 10,000--concurrent with decreasing animal food consumption during that time brought about by the advent of agricultural subsistence. This evidence provides some grounds for suggesting that the calorically/nutritionally dense animal tissues that presumably fueled human brain evolution may not be able to be completely/adequately substituted for by the calorically dense plant-based products of agriculture such as grains.
- The claim that humans evolved as fruitarians is an example of crank science; there is no credible scientific evidence to support such theories. No doubt certain fruitarian extremists will denounce the material in this section as "pseudoscience." In light of the fact that such adherents have no credible evidence to support their views, such allegations by fruitarian extremists are simply examples of hypocrisy and, perhaps, intellectual dishonesty.
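To make the brain/gut energy bookkeeping concrete, here is a minimal arithmetic sketch in Python. The organ masses and mass-specific metabolic rates used are rounded, illustrative values chosen in the spirit of the "Expensive Tissue Hypothesis" discussion [Aiello and Wheeler 1995]; treat them as assumptions, not exact data from that paper.

```python
# Minimal sketch of the "expensive tissue" trade-off, using rounded,
# illustrative numbers (assumptions in the spirit of Aiello and Wheeler
# [1995], not exact figures from the literature).

BRAIN_W_PER_KG = 11.2   # assumed mass-specific metabolic rate, brain tissue
GUT_W_PER_KG   = 12.2   # assumed rate for gut (splanchnic) tissue

# Observed vs. "expected for a 65-kg primate" organ masses (illustrative):
brain_observed, brain_expected = 1.3, 0.45   # kg
gut_observed,   gut_expected   = 1.1, 1.9    # kg

extra_brain_cost = (brain_observed - brain_expected) * BRAIN_W_PER_KG
gut_savings      = (gut_expected - gut_observed) * GUT_W_PER_KG

print(f"extra cost of enlarged brain: {extra_brain_cost:.1f} W")
print(f"savings from reduced gut:     {gut_savings:.1f} W")
# The two figures are of the same order of magnitude: a smaller gut
# (feasible only on a higher-quality diet) can "pay" for a larger brain
# while total BMR stays normal for body size.
```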

Comparative "proofs" that ignore intelligence and brain size are dubious. At this point it is
appropriate to once again connect this section and the previous one to the topic of comparative anatomy. Two points are relevant here: First, the previous section regarding the fossil record and the culture/evolution feedback loop-combined with the material of this section--yields a powerful argument that animal foods were, and are, a part of the human evolutionary diet, i.e., are natural. Second, as this section provides evidence that the evolution of the brain may be closely tied to diet, clearly, comparative "proofs" of diet that ignore intelligence and brain size are very dubious indeed. Unfortunately, the major comparative "proofs" do ignore intelligence. We shall see in coming sections how this oversight renders major portions of those "proofs" logically invalid.

PART 5: Limitations on Comparative Dietary Proofs


This section discusses the limitations--structural and logical--inherent in the various comparative "proofs" of particular diets. Such "proofs" typically focus on comparative anatomy and comparative physiology, though they often include other elements as well. The first part of this section provides a list, with explanations, of the various limitations. The second part discusses some real-world examples that illustrate the difficulties in using comparative studies as "proof."

Editorial note: In the following, the term "comparative proofs" refers to the comparative proofs for
particular diets. It is not a general reference to any/all comparative studies.

Logical and Structural Limitations

Comparative "proofs" of diet are subjective and indeterminate; i.e., they do not provide actual proof of diet.
The "proof" in Fit Food for Humanity that claims humans are natural vegetarians was discussed in an earlier section. This "proof" groups all the anthropoid apes into the vegetarian category. However, we now know that the diet of the great apes is not strictly vegetarian, so let's have some fun and consider the following logical exercise.


First, let's modify the "proof" in Fit Food for Humanity as follows. Based on the more recent knowledge now available, replace the collective category for vegetarian apes with two more-well-defined (and accurate) ones: first, with a category consisting solely of the gorilla (a folivore); then second, with a category consisting solely of the chimpanzee (a frugivore/omnivore). The comparative information in Groves [1986] and Aiello and Dean [1990] might be helpful in producing the new comparison tables. This yields two new comparative "proofs" similar to the one in Fit Food for Humanity: one "proof" that shows humans are folivores, and yet another "proof" that humans are frugivores/omnivores. Next, note that Mills (The Comparative Anatomy of Eating) provides a lengthy "proof" that purports to show humans are herbivores. Finally, note that Voegtlin [1975] as cited in Fallon and Enig [1997], provides a comparative anatomy analysis that indicates humans are actually natural carnivores. See Functional and Structural Comparison of Man's Digestive Tract with that of a Dog or Sheep (below) for a comparative anatomy analysis from Voegtlin [1975].

FUNCTIONAL AND STRUCTURAL COMPARISON OF MAN'S DIGESTIVE TRACT WITH THAT OF A DOG AND SHEEP

                            MAN               DOG               SHEEP
TEETH
  Incisors                  Both Jaws         Both Jaws         Lower Jaw Only
  Molars                    Ridged            Ridged            Flat
  Canines                   Small             Large             Absent
JAW
  Movements                 Vertical          Vertical          Rotary
  Function                  Tearing-Crushing  Tearing-Crushing  Grinding
  Mastication               Unimportant       Unimportant       Vital Function
  Rumination                Never             Never             Vital Function
STOMACH
  Capacity                  2 Quarts          2 Quarts          8 1/2 Gallons
  Emptying time             3 Hours           3 Hours           Never Empties
  Interdigestive rest       Yes               Yes               No
  Bacteria present          No                No                Yes-Vital
  Protozoa present          No                No                Yes-Vital
  Gastric acidity           Strong            Strong            Weak
  Cellulose digestion       None              None              70%-Vital
  Digestive activity        Weak              Weak              Vital Function
  Food absorbed from        No                No                Vital Function
COLON & CECUM
  Size of colon             Short-Small       Short-Small       Long-Capacious
  Size of cecum             Tiny              Tiny              Long-Capacious
  Function of cecum         None              None              Vital Function
  Appendix                  Vestigial         Absent            Cecum
  Rectum                    Small             Small             Capacious
  Digestive activity        None              None              Vital Function
  Cellulose digestion       None              None              30%-Vital
  Bacterial flora           Putrefactive      Putrefactive      Fermentative
  Food absorbed from        None              None              Vital Function
  Volume of feces           Small-Firm        Small-Firm        Voluminous
  Gross food in feces       Rare              Rare              Large Amount
GALL BLADDER
  Size                      Well-Developed    Well-Developed    Often Absent
  Function                  Strong            Strong            Weak or Absent
DIGESTIVE ACTIVITY
  From pancreas             Solely            Solely            Partial
  From bacteria             None              None              Partial
  From protozoa             None              None              Partial
  Digestive efficiency      100%              100%              50% or Less
FEEDING HABITS
  Frequency                 Intermittent      Intermittent      Continuous
SURVIVAL WITHOUT
  Stomach                   Possible          Possible          Impossible
  Colon and cecum           Possible          Possible          Impossible
  Microorganisms            Possible          Possible          Impossible
  Plant foods               Possible          Possible          Impossible
  Animal protein            Impossible        Impossible        Possible
RATIO OF BODY LENGTH TO
  Entire digestive tract    1:5               1:7               1:27
  Small intestine           1:4               1:6               1:25

From: The Stone Age Diet, pp. 44-45, Walter L. Voegtlin, M.D., F.A.C.P., 1975, Vantage Press, NY, NY.

Four "proofs"--but what do they prove? Thus, as a result of producing and collecting the "proofs" cited above, we will have 4 different types of comparative "proofs":

- Humans are natural herbivores: 2 examples--Mills and Fit Food for Humanity.
- Humans are folivores: modification (cf. gorillas) of Fit Food for Humanity.
- Humans are frugivores/omnivores: modification (cf. chimps) of Fit Food for Humanity.
- Humans are carnivores: Voegtlin.

Then, we simply note that the above cannot all be true as they are contradictory. This points to the conclusion that comparative "proofs" of diet are subjective and indeterminate--that is, they are not hard proof of anything. It should be mentioned here that some who present comparative "proofs" openly admit their subjective nature (though the point is often not given proper emphasis, for the obvious reasons), whereas others are not so candid.

There is no logical validation of the list comparison process. That is, there is no logical validation for the implicit assumption that comparing two or more species via a list of features constitutes proof that a particular diet is natural or optimal.
Comparative studies can be powerful tools for analysis, but there is so much variation in nature that one cannot logically derive a "should" (i.e., you should eat this diet) from an "is" derived from other species' diets (i.e., that other, allegedly similar species eat the same type of diet). The presence of substantial morphological similarity between two different species does not prove, or even imply, that their diets must be similar. The idea that in nature a "should" can be derived from an "is" (of another species) in such a fashion is a major logical fallacy of the comparative "proofs" for particular diets. Additionally, one other point bears repeating here. Let us pretend that comparison of a list of features somehow constitutes hard "proof" for a specific diet. It then follows that one must ask the relevant questions listed below:

- How are features chosen for a comparison list? Objective, unambiguous guidelines are necessary to select features. What are those guidelines? (What to include in the list? What to measure?)
- Once a feature is selected, how is it to be measured: length, weight, volume, surface area, or some other measure? (How to measure the list features?)
- What is the definition of a "match"? Given features from 2 different species in a comparison list, how "close" do they have to be to be considered a "match"?
- How many matches are required for "proof," given a comparison list of features which contains some number of "matches"? What validation is there for the "magic" number of matches that allegedly constitutes "proof"? (A toy illustration of this arbitrariness is sketched below.)


For the obvious reasons (they are too heavily subjective), the above questions are unanswered. The advocates of comparative "proofs" are assuming that a matching list somehow "proves" their claims. In reality, it proves nothing--a matching list is evidence of possible association or similarity; it helps to define hypotheses, but it does not provide definitive proof of a hypothesis. As Ridley [1983, p. 34] points out: The comparative method is used to look for associations (between the characters of organisms) which natural selection, or some other causal principle, might be expected to give rise to. It can test hypotheses about the cause of adaptive patterns. The result of this test is an association or correlation. The mere association of two characters, A and B, does not tell us whether in nature A was the cause of B, or B of A.

The comparative "proofs" focus only on similarities while ignoring differences.


In legitimate applications of comparative anatomy and/or physiology, the differences--those features that make each species unique--are of central interest. In the comparative "proofs" for diets, the differences are often ignored or rationalized away. (That is, any differences that inhibit promotion of the target diet will be rationalized or ignored.) By ignoring or rationalizing away the differences, the comparative "proofs" of diet render themselves logically invalid. In a discussion of the topic of comparative studies of adaptation, Ridley [1983, p. 8] observes: The most rudimentary kinds of comparative study are those which, after looking through the literature more or less thoroughly, present a list of only the species (or some of them) which conform to the hypothesis... Such a list may usefully demonstrate that a trait is taxonomically widespread. Or it may suggest a hypothesis. But it cannot test one. If the method does not also list the species that do not conform to the hypothesis it is analogous to selecting only the supporting part of the results from a mass of experimental data... The very first requirement, then, for a test of a comparative hypothesis is that the criterion for including a species in the test is independent of whether it supports the hypothesis. This criterion may seem obvious; it may seem that it does not require statement. But it is enough to rule out the entire [comparative] literature (with one exception) from before about 1960. That exception is the French biologist Etienne Rabaud. The basic problem of studying similarities while deliberately ignoring differences is that it constitutes picking-and-choosing the data to conform to the particular hypotheses of interest. Note: the studies of Rabaud (mentioned above) are discussed further, later herein.

The comparative "proofs" ignore the fossil record and evolution. The comparative "proofs" are dated and do not reflect current knowledge of ape diets. The comparative "proofs" assume dietary categories are discrete (distinct), while in nature diets form a continuum.
The above points were covered in previous sections.

The comparative "proofs" ignore the important features that make humans unique in nature.
In particular, these features are generally ignored: Tool usage. Fully opposable thumbs. Fully upright and bipedal posture (note: thumbs and posture are mentioned in a few proofs). Fully developed language and culture. And--most important of all--high intelligence.


As discussed in the previous section, the topic of evolution of the brain vs. diet is usually not covered. Our brain allows us to develop technology (e.g., tools and cooking), which can impact our morphology via the behavior/culture evolutionary feedback loop. We will see later that human intelligence, expressed via tools and behavior, allows humans to easily overcome many of the alleged limits on adaptation that the comparative "proofs" discuss. The "Expensive Tissue Hypothesis" [Aiello and Wheeler 1995] and related research discussed in the previous section suggest that the evolution of our brain and higher human intelligence is in part the result of a diet that included significant amounts of animal foods. The energy budget and life history events of humans are unique, relevant, and should be covered in a legitimate comparative study. The comparative "proofs" ignore these facts.

Comparative "proofs" assume the form/function connection is strict and necessary.


Here the word "necessary" is used in its logical sense; i.e., that function necessarily follows form, and that the form/function linkage is strict. This is a critical assumption underlying the comparative "proofs" that promote specific diets. However, the assumption is incorrect and logically invalid.

Similar functions can be served by dissimilar forms. John McArdle, Ph.D., a scientific advisor to the American Anti-Vivisection Society, notes [McArdle 1996, p. 173]: It is also important to remember that the relation between form (anatomy/physiology) and function (behavior) is not always one to one. Individual anatomical structures can serve one or more functions and similar functions can be served by several forms.

The form/function relationship can be dissimilar even in closely related species. Sussman [1987] comments that (p. 153): When comparing species-specific dietary patterns between different primate species, we often find that these patterns are not dependent upon taxonomic relationships. Closely related species often have very different diets, whereas many unrelated species have convergent dietary patterns. However, we would assume that at some level there is a relationship between dietary patterns, foraging patterns, and anatomical and physiological adaptations.

Overall systemic constraints as important as individual functions/forms. Finally, Zuckerman [1933] reminds us of important systemic constraints (pp. 12-13): Function cannot be stressed more than form. Their separation from each other, and their individual isolation from the intact working organism, are both essential, but to a large extent necessarily incomplete, operations. The same is true of the abstraction of biochemical facts. Form, chemistry, and function are indissolubly united, and it may truly be said that from the taxonomic standpoint all characters of the body, whatever their nature, have a fundamental equivalency... To discuss them on the basis of ex cathedra judgements [i.e., via subjective assessments only] of their relationship to heritage or habitus is to attempt the impossible, and to ignore in greater part the lesson taught by the experimental study of genetics and growth.

Research on the form/function linkage


Some examples of research studies that tried to document a form/function linkage are as follows:

Early research. As mentioned above, the French biologist Etienne Rabaud's work is among the
best early research on evolutionary adaptation, including the form/function linkage. From Ridley [1983, p. 8]: Rabaud was concerned with Cuvierian laws of the relationship between anatomy and niche, such as that herbivores have flat, grinding teeth and carnivores sharp, piercing teeth. Rabaud first looked systematically at these laws in his book of 1911, and later in a longer work on Les phenomenes de convergence en biologie (1925). In 1925 he summarized, systematically, such standard examples of [evolutionary] convergence as adaptations for flying and swimming. Each systematic summary


led to the same conclusion: "one can recognize no necessary relation between the form and the way of life of the organisms considered" (1925, p. 150). The laws were not universal, he concluded, but were based only on facts that had been selected because they conformed... [to the hypotheses].

Gut morphology. Vermes and Weidholz [1930] as cited in Hill [1958] tried to correlate diet
with gut (digestive system) morphology in primates. Hill [1958] describes the results of Vermes and Weidholz (p. 146): The authors attempt to substantiate the correlation between diet and gastrointestinal morphology among a series of Primates; they conclude that the distinctions are not so obvious as in other orders and in most cases do not permit reliable conclusions to be drawn, while in some cases the evidence is contradictory. The above reflects the reality that analysis of the gut is not so simplistic as one might expect from the comparative "proofs." (In later sections, more recent studies--that yield more definitive results than Vermes and Weidholz--will be discussed.)

Mastication. Wood [1994] analyzed the correspondence between diet and masticatory (chewing)
morphology in 18 extant primate taxa. The analysis covered 30 coordinate points on the skull. Wood [1994, abstract] concludes: The results of this study suggest that there is no definitive or straightforward relationship between dietary type and the functional morphology of the masticatory apparatus discernible among the taxa studied, although some encouraging patterns may be observed. Primate dietary behavior is flexible and may not be predicted with certainty on morphological grounds. Once again, efforts were unsuccessful in establishing a strong linkage between form and function in the context of diet and morphology.

More reliable insights by limiting complexity and narrowly defining features to be studied. In contrast to the above studies, if a number of steps are taken to reduce the complexity of the problem, i.e., limit the analysis to a small group of closely related species, study only a small data set of narrowly defined physical features, and limit the statistical analysis to elementary procedures, then one might find evidence of a form/function linkage (in the set of species studied). This describes the approach of Anapol and Lee [1994], whose research analyzed the morphology of the jaws and teeth of 8 species of platyrrhine monkeys from the Surinam rainforest. Their research found statistically significant differences that were associated with diet in some of the measures of tooth, jaw, and masticatory musculature. Examination of the Anapol and Lee [1994] research versus the comparative "proofs" of diet illustrates some important points:

o Anapol and Lee are interested in morphological specializations--differences--because those differences reflect evolutionary adaptations to different diets. That is, Anapol and Lee studied both similarities AND differences.

o In contrast, the comparative "proofs" focus only on similarities and ignore or rationalize away those differences deemed inconvenient to the goal of promoting dietary dogma.

o Also, the research of Anapol and Lee is descriptive; in contrast, the comparative proofs allege that their analysis is prescriptive, i.e., you "should" eat the diet they promote.


The work of Anapol and Lee is a good example of a legitimate application of comparative anatomy; in contrast, the comparative proofs of diet are really misapplications of comparative anatomy. Thus we see that a narrowly focused study, within a closely related group of species, might find possible evidence of a form/function linkage. Of course, in sharp contrast, the comparative "proofs" of diet are overly broad, and their conclusions dubious. As the form/function linkage assumption is such a critical part of the comparative "proofs" of diets, we will next examine additional aspects of the form/function relationship, including specific examples of forms that can serve more than one function.

The same form can serve multiple/different functions.

Canine teeth. The quote by McArdle in the preceding section serves as a good introduction to this topic. Canine teeth are a good example of one form serving different functions. In true carnivores, canine teeth help in cutting the flesh of prey. However, some frugivores find large canines to be helpful in processing fruit. Regarding 3 different species of platyrrhine monkeys, Anapol and Lee [1994, pp. 250-251] note: The primary food resource of Chiropotes satanas consists of young seeds from unripe fruit that are obtained by removing the tough pericarp with the anterior dentition (van Roosmalen et al., 1988; Kinzey and Norconk, 1990; Kinzey, 1992)... The most predominant feature of the dentition of Chiropotes satanas is the exceptionally robust upper and lower canines, also present in both Cebus species studied here... For primates harvesting sclerocarp, robust canines provide an advantage for opening exceptionally hard unripe fruits to gain access to seeds (Kinzey and Norconk, 1990).

Primate teeth. Also see Richard [1985, p. 190] for comments on why frugivorous primates need sturdy
teeth, specifically incisors. Richard [1985, p. 194] also notes that "features of tooth morphology indicate diet habits strongly but not perfectly." Mann [1981] reports that chimpanzee and gorilla teeth are very similar despite significantly different diets.

Teeth of hominids. Mann [1981] also contrasts the sharply different analyses of the diets suggested by the dental systems of early hominids. He cites Jolly [1970, 1972], who argues for a predominantly plant diet (but non-vegetarian; Jolly specifically included vertebrate meat in the diet), versus Szalay [1975], who interprets the same dental evidence as adaptations to hunting and a meat-predominant diet. (See Mann [1981] for the Jolly and Szalay citations.) Garn and Leonard [1989] discuss the form/function connection as it applies to the dentition (teeth) of both modern and prehistoric humans. Their remarks bring the issue into clear focus [Garn and Leonard 1989, pp. 338-339]: [E]ven the [Homo] erectus fossils did not have long, projecting gorilla-like canines or the highly convoluted wrinkled molar surfaces that suggest dependence on roots, shoots, and large leaves. Our dentitions are now at most indicative of an omnivorous but soft diet and they are equally suitable for gnawing at bones, gorging on fruit, or consuming fried rice, boiled wheat, and unleavened cakes. However, and despite their more specialized dentitions, chimpanzees prove to be far more omnivorous in the wild than we had earlier reason to imagine. (3)... We therefore grant the [Homo] erectus fossils a considerable year-round supply of animal flesh, but we have no knowledge of how much vegetation was also dug, pulled, picked, or stripped...


Climatic data and faunal associations indicate that these earlier people of our genus were not browsers. They correspond to the notion of "early man" as hunters, at least in part... Thus we see that a particular form can serve multiple functions. As we will discuss in a later section, the human hand is an excellent example of one form that can serve a myriad of functions.

Subtle changes in form can produce or support significant changes in the function of organs or body systems.
Milton [1993, p. 89] reports: This research further shows that even without major changes in the design of the digestive tract, subtle adjustments in the size of different segments of the gut can help compensate for nutritional problems posed by an animal's dietary choices. The above quote is even more relevant in light of the fact that human (and other primate) guts are "elastic" and their dimensions can change in reaction to dietary changes. This will be explored in a later section. Richard [1995] notes that juvenile apes have well-differentiated cusps in their molars, while adult apes have flat molars that display heavy wear. However, the larger teeth and stronger masticatory muscles in the adults may serve the same purpose as the weaker muscles and differentiated cusps in the juveniles. This is an example of soft-tissue changes compensating for hard-tissue changes, i.e., tooth wear.

The function served by a particular form can vary dramatically according to specific feeding behavior.
We have already discussed examples of this: the platyrrhine monkeys using canine teeth to eat hard fruits, and the discussion of the considerable versatility of human dentition. Additional examples are: The sportive lemur (Lepilemur mustelinus), a very small folivore, engages in coprophagy--it eats its own feces, for a "second-stage" digestion. This 2-stage process may increase the energy value of the lemur's diet [Richard 1995]; see also Hladik and Charles-Dominique [1974] as cited in Chivers and Hladik [1980].

The surface area of the digestive system may be less important than the retention time of food in the gut [Richard 1995]. Milton and Demment [1988] analyzed the mean transit time (retention time) of different types of foods through the digestive systems of chimps and humans. They found that lower-quality foods (high-fiber, low-energy) passed through the system faster than higher-quality (usually low-fiber, high-energy) foods. Reducing the transit time for lower-energy/lower-quality foods is apparently a survival tactic, as it supports increased intake of lower-energy foods (to meet energy requirements) when necessary.

We will see later that failure to consider differences in feeding behavior makes major parts of the comparative proofs of diets invalid.

Analysis of dietary adaptations is non-trivial.


However, some of the comparative "proofs" themselves do border on the trivial. More seriously, though, as Richard [1995] points out, evolutionary adaptations are compromise solutions that reflect the effect of multiple selection pressures, applied simultaneously. For example, a large gut might assist digestion. However, if the gut size increases sharply and in a disproportionate manner, the animal might not be able to outrun predators. Such an animal would quickly go extinct. This illustrates that one cannot look at single organs or subsystems in isolation. Rather, one must look at the whole organism, or at least attempt to do so.


An alternate way of stating the above is that adaptations (morphology) are the solutions to multivariate (multiple-variable) optimization problems, where:

- Many variables are constrained, and
- Time is a variable as well, i.e., the interrelationships can change over time.

(A toy sketch of such a trade-off appears below.)
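The gut-size example above can be sketched as a toy optimization in Python. The benefit and cost functions are invented purely for illustration: a larger gut aids digestion with diminishing returns, but past some point its bulk increasingly hinders escape from predators, so the optimum is a compromise.

```python
# Invented functions, purely to illustrate adaptation as a compromise
# between competing selection pressures (digestion vs. predation risk).
def fitness(gut_size):
    digestion_benefit = gut_size ** 0.5      # diminishing returns
    predation_cost = 0.2 * gut_size ** 2     # escalating cost of bulk
    return digestion_benefit - predation_cost

# Grid search over candidate gut sizes (arbitrary units):
best_fitness, best_size = max(
    (fitness(g / 10.0), g / 10.0) for g in range(1, 40)
)
print(f"compromise optimum gut size: {best_size:.1f} (arbitrary units)")
```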

Diets can temporarily change with habitat in the same species. To illustrate the latter point: a species may evolve under a specific diet, then be forced to change the diet (within the implicit range of adaptation) because of habitat changes driven by climate change. As Richard [1995, p. 190] notes: But environments do change, sometimes quite rapidly, and an animal is not necessarily specialized to eat the foods we see it feeding on now; a species can change its diet (within limits, at least) faster than it can change its teeth.

Staple foods vs. critical but more rare foods. Richard [1995] also notes the difficult problem of determining which selection pressure is more relevant in analyzing adaptation--the consumption of a dietary staple, or a food that is consumed rarely but that is the sole source of specific nutrients, e.g., vitamin B-12.

The above reflects why examination of all the evidence--morphology, the fossil record, and paleoclimate--is important in assessing adaptation.

Phylogenetic (structural) challenges in comparative studies.

The form of digestive features can be affected by non-diet factors. The comparison of animal features using data from individual species that belong to different taxonomic groups presents subtle but important technical (statistical) difficulties in an analysis. As Pagel and Harvey [1988, p. 417] note: [A]ny comparative relationship obtained from an analysis that includes more than one taxon [i.e., phylogenetic category] may be explained, in principle, by any other factors that vary across those taxa... For example, McNab (1986) pointed out that size-corrected metabolic rates of species of mammals are associated with diet. Elgar and Harvey (1987), however, show that McNab's results cannot easily be separated by phylogeny... If the character of interest varies only at a very high taxonomic level it may be extremely difficult to separate the favored explanation for the character from many other explanations associated with phylogenetic differences. The basic problem here is that the level of aggregation [grouping] of the raw data used in the analysis interferes with the statistical independence required for analysis. (On top of which, the possible existence of common evolutionary ancestors for the species being compared may make the analysis even more complicated.) Readers interested in the statistical details should consult Pagel and Harvey [1988], and also Elgar and Harvey [1987] (as cited in Pagel and Harvey [1988]).

Difficulty of untangling cause and effect. The importance of the above as it relates to simplistic
comparative "proofs" of diet is that it raises the question of cause and effect. In the comparative "proofs," are all of the observed differences in features really due to diet, or are some of them simply artifacts of cross-taxa (between-group) comparisons? That is, if the features in question may be impinged on, or shaped, at least in part, by things that affect a range of features in that taxonomic class beyond diet itself, then one must be able to sort out the cause and effect relationships, or valid conclusions cannot be drawn.


Counterexamples to the Paradigm of Comparative Proofs of Diet


As briefly explained above, the comparative proofs of diet are subject to a number of logical and structural limitations. To further illustrate this, let's now examine two animals that illustrate the importance of feeding behavior, and how it can invalidate simplistic comparative proofs that assume the form/function linkage is strict.

Polar Bears: an example of semi-aquatic feeding behavior


The polar bear, Ursus maritimus, physically resembles its relatives, the brown bear and grizzly. Genetically, polar bears evolved from brown bears; see Talbot and Shields [1996], also Zhang and Ryder [1994], for information on the genetic similarities. A limited comparative study of polar bears vs. brown bears might find some evidence that the polar bear is more carnivorous than the brown bear. However, more importantly, without knowledge of the actual--and unusual, for bears--feeding behavior of the polar bear, one might conclude from a comparative study that the polar bear "should" be a predominantly terrestrial animal rather than the semi-aquatic animal it actually is. (The primary evidence for aquatic adaptation by the polar bear is webbed feet; however, one could presume that is an adaptation for walking on slippery ice, rather than swimming.)

How aquatic is the polar bear? Stirling [1988] describes an aquatic stalking procedure commonly used by bears in hunting seals (a major food for them). One type of aquatic stalk is done by swimming underwater, in a stealthy manner, then exploding onto the ice by the seal (often the seal escapes). In another type of stalk, the polar bear lies down flat on the ice and slides along shallow water-filled channels on top of the ice. Stirling also describes how polar bears capture sea birds by diving underneath and biting from below.

Garner et al. [1990] fitted 10 female polar bears with satellite telemetry collars in 1986 and tracked their movements on the Arctic pack ice floes, near Alaska and Siberia, for over a year. They calculated a minimum distance traveled by the bears on the shifting pack ice of the Arctic as ranging from 4,650-6,339 km (2,888-3,937 miles) over time periods ranging from 360-589 days. They note [Garner et al. 1990, p. 224], "[B]ears must move extensively to maintain contact with the seasonally fluctuating pack ice."

Thus we observe that polar bears are indeed semi-aquatic in behavior, able to travel long distances over ice and open water. Yet polar bears have almost none of the adaptations for aquatic life that (carnivorous) fully aquatic mammals have. (See Estes [1989] for a discussion of the adaptations of carnivorous aquatic mammals.) The polar bear illustrates how an animal can lead a semi-aquatic life, with very few aquatic adaptations, simply via adaptive behavior. That is, adaptive behavior allows the polar bear to overcome what appears to be a limit (of sorts) on morphology. We will see later that the comparative proofs of diet make the serious mistake of ignoring adaptive human behavior.

The Giant Panda: a "carnivore" on a bamboo diet


The giant panda, Ailuropoda melanoleuca, is a member of the carnivore order, but the diet of the panda is predominantly plant food. Schaller et al. [1989, pp. 218-219] note that, "Pandas may consume plants other than bamboo, and they also eat meat when available (Hu 1981; Schaller et al. 1985)... F. scabrida bamboo was the principal food of most pandas."


Gittleman [1994] provides a comparative study of the panda. Citing Radinsky [1981], Gittleman notes that an analysis of panda skulls and bone lengths finds they are similar to other carnivore species. The life history traits, however, are significantly different for pandas (vs. other carnivores). Recall that life history traits are ignored by the comparative proofs of diet.

Let's now briefly consider the characteristics of the panda that suggest a coarse, largely herbivorous diet. Raven [1936], as cited in Sheldon [1975], lists the adaptations as: lining of the esophagus, thick-walled stomach, small liver, large gall bladder, and pancreas. Gittleman [1994] and Sheldon [1975] both mention the large molars and masseter muscles of the panda (for mastication). Gittleman [1994, p. 458] also mentions the "enlarged radial sesamoids (the so-called panda's thumb) used for pulling apart leaves."

So, a comparative study of pandas, if done in sufficient depth, might suggest that the panda diet probably includes substantial amounts of coarse vegetation. However, without knowledge of the habitat and actual feeding behavior, such a study would not constitute proof that the actual diet of the panda "should" or "must be" close to 100% bamboo. (The radial sesamoids are slim evidence on which to conclude that the diet is almost purely bamboo; one would be figuratively "hanging by the thumbs," or "hanging by the sesamoids," on such limited evidence.)

The example of the panda illustrates the problem of "resolution levels." A comparative study might suggest that the panda eats considerable vegetation, but such a study does not have the resolution to "prove" that the panda "must" eat a diet that is 100% bamboo vs. a diet that is 75% bamboo + 25% other foods. This is relevant because there are fruitarian extremists who suggest that the bogus comparative "proofs" they promote show the human gut is specialized for a nearly 100% fruit diet (vs., for example, a diet that is 60% fruit + 40% other foods). Such claims are unrealistic, not only because of the oversimplistic nature of the comparisons themselves, as discussed earlier, but also given the low resolution of the comparative "proofs."

Polar Bears and Pandas: Epilogue


The examples of the polar bear and panda illustrate:

- That feeding behavior can overcome what appear to be limits in morphology,
- That comparative studies have limits, i.e., a resolution level, and
- That comparative "proofs" that ignore behavior do so at the risk of being invalid.

Section Summary
Some of the major logical and structural limitations on comparative proofs of diet are:

- Comparative proofs of diet are subjective and indeterminate; i.e., they do not provide actual proof of diet.
- The comparative "proofs" focus only on similarities, and ignore differences.
- The comparative "proofs" ignore the fossil record, evolution, and current knowledge of ape diets.
- The comparative "proofs" assume dietary categories are discrete (distinct), while in nature diets form a continuum.
- The comparative "proofs" ignore the important features that make humans unique in nature.
- Comparative "proofs" assume the form/function connection is strict and necessary. However:
  o The same form can serve multiple, different functions.
  o Subtle changes in form can produce or support significant changes in the function of organs or body systems.
  o The function served by a particular form can vary dramatically, by feeding behavior.
- Analysis of dietary adaptations is non-trivial.


- Some of the differences listed in the various comparative "proofs" may be artifacts of cross-taxa comparisons, rather than being due to differences in diets.
- Polar bears are an example of an animal whose feeding behavior is sharply different from that which would be expected from its morphology.
- Pandas illustrate the low resolution of comparative studies in "proving" diets.

PART 6: What Comparative Anatomy Does & Doesn't Tell Us about Human Diet
Primate Body Size

Link between body size and diet is not strict. To a certain degree, body size may be a predictor of diet in (land) mammals, including primates, as diet and body size are correlated. Note that the relationship is correlation, and not necessarily causation. Contrary to the claims made by some fruitarian extremists, the link between body size and diet is neither strict nor absolute. In primates, body size is, at best, a poor predictor of diets.

Correlation of body size and diet in primates. The concepts behind the claims regarding body size
vs. diet are explained by Richard [1995, p. 184]: Small mammals have high energy requirements per unit body weight, although their total requirements are small. This means that they must eat easily digested foods that can be processed fast; however, these foods do not have to be abundant, since they are not needed in large quantities. Large mammals have lower energy requirements per unit body weight than small ones and can afford to process food more slowly, but their total food requirements are great. This means their food items must be abundant but not necessarily easy to digest. Richard [1995, figure 5.13, p. 185] also provides frequency distributions of body weights, by diet, for a large set of primates. The frequency distributions indicate the order of generally increasing body sizes, by diet, is: insectivores, frugivores that eat insects, frugivores that also eat leaves, and folivores.

Humans are an exception to the rule. Given the frequency distributions in Richard [1995], the
average weight of humans (65 kg) appears to be at or above the upper limit of the range for frugivores eating leaves, and within the folivore range. This would suggest, if the "rule" relating body size and diet were strict, that the most likely diet for a primate the size of a human would be a diet in which leaves are the primary food, and fruit a minor/secondary food. Of course, that is a sharply different diet from the nearly 100% fruit diet advocated by certain extremists who claim that the body size/diet "rule" indicates a strict fruitarian diet for humans. Richard [1995, p. 186] provides further insight into the alleged "rule": Within broad limits, then, body weight predicts the general nature of a primate's diet. Still, there are exceptions to the rule, and body weight is not always a good predictor even of general trends...


[W]hen finer comparisons are made among the species the rule loses its value altogether. Specifically, similarly sized species often have quite different diets.

Specific human features imply dramatic breakthrough in diet. Milton [1987, p. 106] also
comments on the "rule": In contrast [to australopithecines], members of the genus Homo show thinner molar enamel, a dramatic reduction in cheek tooth size, and considerable cranial expansion (Grine 1981; McHenry 1982; S. Ambrose, pers. comm.). In combination, these dental and cranial features, as well as an increase in body size, apparently with no loss of mobility or sociality, strongly imply that early members of the genus Homo made a dramatic breakthrough with respect to diet--a breakthrough that enabled them to circumvent the nutritional constraints imposed on body size increases in the apes. The above quotes from Richard [1995] and Milton [1987] reflect that the body size "rule" is not strict, and that humans are a specific example of a species that has overcome it. Additionally, recall that our preceding section on brain evolution discussed how human energy metabolism is dramatically different from all other primates. We humans expend considerably more energy on our brains--energy that is apparently made available by our reduced gut size, which itself is made possible via a higher-quality diet. By changing the internal energy budget to supply our large brains with energy, humans have also overcome the "rule" on primate body size.

Analysis of Head, Oral Features, and Hands

Introduction: the main claims

Oversimplistic either/or views of carnivorous adaptations. The various comparative "proofs" of vegetarianism often rely heavily on comparisons with true carnivores, e.g., lions, tigers, etc. The basic arguments made are that meat cannot be a "natural" food for humans, because humans don't have the same physical features characteristic of the small group of animals that are considered to be "true" carnivores. However, such arguments are based on an oversimplified and flawed view of adaptation; that is, the underlying assumption is that there is only one--or at least one main set of--physical adaptation(s) consistent with eating meat. Such arguments also ignore the high intelligence and adaptive behavior of humans. Let's examine some of the claims made in Mills' The Comparative Anatomy of Eating, plus a few additional claims made in a recent extremist fruitarian book. (Out of politeness, and to avoid what some might construe as inflammatory, the latter book will not be identified, other than to say that it is a massive plagiarism of the book Raw Eating by Arshavir Ter Hovannessian--a book published in English in Iran in the 1960s.)

Flawed comparisons that pit humans against "true" carnivores. The claims are made that the human body lacks the features of certain carnivores, hence we should not eat meat, as it is not "natural." The differences are as follows; carnivores have, but humans do not have:

- Sharp, pointed teeth and claws for tearing into the flesh of the prey.
- A jaw that moves mostly up and down, with little lateral (side-to-side) movement; i.e., the jaw has a shearing motion.
- A head shape that allows the carnivore to dig into prey animals.

Additionally, carnivores are different from humans (and herbivores) in that carnivores usually swallow their food whole (humans chew their food). Also, carnivores generally do not have starch-digesting enzymes in their saliva, whereas humans do.

Note: Claims regarding the morphology of the gut (digestive system) are discussed in the next section.


In The Comparative Anatomy of Eating, Mills summarizes the above analysis with the statement that: An animal which captures, kills and eats prey must have the physical equipment which makes predation practical and efficient. We shall see that this claim by Mills is incorrect if one interprets it only in morphological terms. Without further delay, let's now examine the above claims.

Logical problems of the comparative proofs


The above physical comparisons are accurate--clearly, humans do not have the jaws, teeth, or claws of a lion. However, to use this information to conclude that humans "cannot" eat meat, or "must have" the same physical traits as other predators to do so, or did not adapt to meat in the diet, is logically invalid and bogus. A short list of errors in reaching the above conclusion is as follows:

- Focusing on purely carnivorous adaptations rather than omnivorous ones. One clarification should be made immediately: This paper does not suggest that humans are true carnivores adapted for a nearly pure-meat diet. Although it may be that humans might be able to do well on such a diet (e.g., the traditional diet of the Inuit), the focus of this paper is to investigate whether meat can be considered a natural part of the human diet--certainly the paleoanthropological evidence supports that view. Thus the focus here is not on the bogus issue "are humans pure carnivores?" but on "are humans faunivores/omnivores?"

- Invalid black-and-white views. The conclusion above (that meat "cannot" be a natural part of the human diet) is based on a simplistic (incomplete/invalid) view of adaptation. That is, the conclusion is based on the implicit assumption that the specific physical adaptations of the lion, tiger, etc., are the ONLY adaptations that can serve their intended function, i.e., meat-eating. Inasmuch as meat is a small but significant part of the diet of chimps, however--who also lack the carnivore adaptations (sharp teeth, claws, etc.)--we observe that the assumption is obviously false.

- Various oversimplistic assumptions. The analysis is simplistic and makes many of the mistakes listed in the preceding section--i.e., it assumes the form/function linkage is strict, fails to recognize that the same form can serve multiple functions, etc.

- Overlooked differences in adaptive behavior. The analysis ignores critical differences in feeding behavior, i.e., the ones relating to the hunting/feeding behavior of omnivorous primates (e.g., the chimp) in the wild, which is well-known to be different from that of lions and tigers. Also, adaptive behavior (enhanced via human intelligence and technology--tools) allows humans to easily overcome many of the limitations of our physical form and morphology.

- Impact of tool use and language on morphology disregarded. The analysis ignores the impact that human intelligence has had on morphology, specifically the evolutionary effect of technology (stone tools and cooking), as well as possible morphological changes to support language--yet another unique human feature.

- Obvious explanations rationalized away as "illegitimate." A simple, summary answer to the question of how humans can hunt animals and eat meat without the physical adaptations of the lion and tiger is the obvious one implicitly ignored and rationalized by the advocates of simplistic comparative "proofs": We don't need sharp teeth, powerful jaws, or claws to capture and butcher animals because we have used (since our inception as a genus ~2.5 million years ago) tools (or technology--stone weapons) for that purpose. Over the eons, evolution itself has adapted our physiologies to the results of this behavior along unique lines, quite regardless of the hue and cry over the "illegitimacy" with which these behaviors/skills are regarded by those extremists promoting the bizarre idea that human dietary behavior should be strictly limited to what we could do "naked, without tools." Technology, driven by our intelligence, supports adaptive behavior that allows us to easily overcome the physical limitations that the comparative "proofs" regard (incorrectly) as being limiting factors. Along similar lines, we don't need the strong bodies of a lion or tiger because we have something much more powerful: high intelligence, which allowed humans to become the most efficient hunters, and the dominant mammalian species, on the planet.

Examining comparative claims about the head, oral features, and hands


Now let's examine, in the subsections that follow, some of the claims of the comparative proofs (regarding the head, hands, oral features, etc.) in light of knowledge of prehistoric diets and evolutionary adaptation.

Cheek Pouches: An Example of Vague Claims


Before considering more serious subjects, one vague claim occasionally made in comparative "proofs" is that humans and herbivores both have cheek pouches, or cheeks whose structure is somehow similar to cheek pouches, whereas carnivores do not. Frankly, the claim that human cheek structure is similar to cheek pouches is vague and hard to evaluate. Note, however, one point is clear in this matter: humans do not have true cheek pouches as certain monkeys do. Richard [1995, pp. 195-196] notes that: A cheek pouch is an oblong sac that protrudes when full from the lower part of the cheek. Food is moved between it and the mouth through a slitlike opening. Richard [1995] also reports that cheek pouches may increase foraging efficiency and may assist in the pre-digestion of starchy foods. However, until those making claims regarding cheek pouches clarify their claims, the importance (if any) of this point, cannot be evaluated. Further, the function of the mouth and oral systems (including cheeks) is not restricted to eating--a point that is covered later, and one with particular relevance for humans given our special adaptations for language. The cheek-pouch claim is included here as an example of how vague some claims made in comparative proofs can be. Extremely vague claims are hard to test and evaluate. Also, one cannot help but get the impression--based on the very vagueness of this claim--that its advocates are simply "reaching" to find any possible explanation, without regard for plausibility, efficiency, supporting evidence, etc.: a feature typical of extremism in general.

Hands vs. Claws


The claim that meat cannot be a natural food for humans because we lack claws ignores the following:

o Simple tool-based technology part of human evolutionary adaptation. As described above, human intelligence supports the creation and use of technology--stone tools in the case of our early human ancestors. Technology--even stone tools--serves the same function as claws, and humans have used such technology since the very inception of the genus. A human equipped with stone tools is a very efficient hunter, and has no need of claws.

o Powerful synergism of versatile human behavior patterns. The human hunting pattern can be seen as an extension of the non-human primate predation pattern. Butynski [1982, p. 427] notes: It seems that complex vocal communication, bipedalism and weapon-use are not essential for primates hunting small vertebrates, including mammals. Nevertheless, when the basic predation pattern of non-human primates is supplemented with these unique capabilities, the complete hominid hunting pattern emerges and with it the ability to efficiently hunt and utilize large mammals (Table 2). Evidence for meat-eating by non-human primates and contemporary human hunter-gatherers indicates that, during the evolution of these three components of the hominid hunting pattern, a 30 to 35-fold increase in the consumption of meat occurred--an increase perhaps already evident more than 2.5 million years ago (Fagan, 1974; King, 1976). Clearly, thanks to technology (even as basic a technology as simple stone tools), humans do not need claws to be effective predators. Instead of claws, we have hands, similar to, yet different from, the hands of the great apes. Humans have fully opposable thumbs, but lack the ape/orangutan adaptations for tree-climbing; the primary adaptations are curved phalanges, curved metacarpals, and increased length of the fourth digit; see Aiello and Dean [1990, pp. 379-385] for further information on this; also see Hunt [1994] for information regarding the special muscular adaptations of chimps for tree-climbing.

o Special tree-climbing adaptations lacking in humans. Note that the fact humans lack the important orangutan hand adaptations for tree-climbing, and chimp muscular adaptations for tree-climbing, directly contradicts the claims of certain fruitarian extremists who allege that humans have evolved for a nearly 100% fruit diet (without the use of tools). Consider that (according to the extremists) we are allegedly adapted to eat a nearly 100% fruit diet, yet we lack the critical adaptations to efficiently climb trees and pick that nearly 100% fruit diet. Something is not right here--specifically, such flawed claims.

o Versatility of the human hand. The human hand is a very versatile system, and its function is not limited to merely picking fruit. The very same hand can caress a lover, feed a person, brandish a weapon, or perform some other function. The application depends on the individual who owns the hand. The fruitarian claim that human hands were designed or evolved primarily for fruit-picking is just silly nonsense.

Stone tools compensate for lesser human physical capabilities via intelligence. On a similar note, the use of stone tools (made by the human hand, driven
by intelligence) to slaughter animals killed for food renders invalid the comparative proof claim (above) that humans "should" have a head shape that allows humans to "dig into" prey animals. Humans dig into prey using stone tools (which, by the way, were razor-sharp) and their versatile hands. Similarly, the (previous) claim by Mills that humans "must have" the same physical features as true carnivores is invalid for the same reasons. In other words, the use of even rudimentary stone tools and a different feeding behavior by humans avoids the selection pressures that gave lions, tigers, and other carnivores those characteristic wedge-shaped heads, as well as the other features considered to be typical of carnivores.

Jaw Joints and Jaw Motion

Oversimplifications in comparative "proofs." One of the claims made in the comparative proofs is that the motion of the jaw joint of a true carnivore is largely a shearing (up-and-down) action, with limited lateral (side-to-side) movement. This is something of an oversimplification. Sicher [1944] reports that certain bears and pandas can and do masticate or chew plant foods. Sicher reports that the jaw joint of the carnivore actually provides a screwing motion, which has both shifting and shearing motions. In cats (and crocodiles) there is very little shifting motion, i.e., the joint motion is mostly shear. However, in pandas the shifting motion is aided by strong masseter muscles, which allow mastication--side-to-side chewing--of plant foods. Similar remarks apply to certain bears. (Although bears are omnivorous and pandas nearly herbivorous, their jaw joint is typical of the order Carnivora.)

Considering detailed features in isolation is misleading. Thus we note that the


characterization of jaw motion as "shear-only" in "carnivores" (members of the order Carnivora) is an oversimplification. This is an example of how considering detailed features in isolation from other features can yield misleading results. It also reminds us that some members of the carnivore order are--surprise--predominantly herbivorous. Finally, the remarks regarding hands vs. claws apply here as well: humans never needed super-powerful jaw muscles to tear up prey animals because we used stone tools to cut them up instead.

Teeth
McArdle [1996, p. 174] reports that the best evidence that humans are omnivores comes from the teeth:


Although evidence on the structure and function of human hands and jaws, behavior, and evolutionary history also either supports an omnivorous diet or fails to support strict vegetarianism, the best evidence comes from our teeth... In archeological sites, broken human molars are most often confused with broken premolars and molars of pigs, a classic omnivore. On the other hand, some herbivores have well-developed incisors that are often mistaken for those of human teeth when found in archeological excavations.

In previous sections, we have discussed how teeth are a good, but not infallible, indicator of diet. McArdle's remarks should be seen in light of the limits on the tooth/diet connection.

Salivary Glands and Saliva


McArdle [1996, p. 174] reports that our salivary glands "indicate that we could be omnivores." Tanner [1981] reports that humans and the great apes share a class of proteins known as Pb and PPb in the saliva. These proteins help protect tooth enamel from decay due to carbohydrates and/or coarse plant foods in the diet. The presence of these proteins may be an artifact of past evolution, as the diet of humans is significantly different from that of the great apes. It may also reflect the status of humans as, figuratively, "eat nearly anything" omnivores/faunivores.

Teeth and Jaw: Structure and Mastication Muscles


Note: These are discussed together, as the same evolutionary pressures apply to both.

Hominid dental system is small relative to apes and has decreased in size over evolutionary time. The masticatory system of the great apes is larger than that of humans [Aiello and Dean 1990]. Garn and Leonard [1989 (see quote in the preceding section)], Milton [1987 (p. 106 quote earlier in this section)], and also Leonard and Robertson [1992] report that the dentition of our ancestors decreased from Australopithecus to Homo erectus, coincident with the development of stone tools and increasing consumption of meat (hence decreasing consumption of coarse vegetable foods). Technology--stone tools--permitted our prehistoric ancestors to hunt, kill, and carve up large animals for transport. All of this was done via tools, obviating the need for the typical carnivore adaptations--strong jaws, sharp teeth, claws, powerful muscles for mastication, etc. Another technology--cooking--may have had a significant impact on human dentition as well.

Potential effect of primitive food processing technology. Brace et al. [1991] present a fascinating hypothesis. Analyzing the evidence from Neanderthal sites in an area referred to as Mousterian, circa 200,000 years ago, they note (p. 46):

[W]e repeat the observation that "The important thing to look to is not so much the food itself but what was done to it before it was eaten" (Brace, 1977:199). If that can be accepted, it should follow that the introduction of nondental food processing techniques should lead to changes in the forces of selection that had previously maintained the dentition.

As their analysis continues, they note the following:

o The cooking of food (meat) in earth ovens can be dated back over 200,000 years.
o The climate of the Mousterian area was cold, and prey animals would freeze solid in winter unless eaten immediately.
o Cooking meat in an earth oven thaws out the frozen meat, and in hot weather may offset (within limits) the effects of meat spoilage.

Brace et al. [1991, p. 47] note:


Meat cooked in such a fashion can become quite tender indeed, and in such condition it requires less chewing to render it swallowable than would be the case if it remained uncooked. In turn, this should represent the relaxation of selection for maintaining teeth at the size level that can be seen throughout the Middle Pleistocene. The appearance of the earth oven in the archaeological record, then, should mark the time at which the dental reduction manifest in the Late Pleistocene had its beginning.

They then tie the further reduction in human dentition to the use of pottery, which allows preparation of soups (on which one can survive even if toothless), and allows fermentation as well.

Universal cultural/technological innovations can reduce/change selection pressures. Hence, Brace et al. [1991] correlate the (evolutionary) reduction in the size of human dentition to universal cultural innovations--cooking and food processing--which promoted survival and relaxed the selection pressures that favor large, robust dentition. (This is an excellent example of the evolutionary behavior/culture feedback loop discussed in a previous section.) The end result of such selection pressure is our modern dentition, which is smaller and weaker than that of our prehistoric ancestors. Milton's comment in Aiello and Wheeler [1995, p. 215] nicely summarizes the situation:

The reduced dentition of early humans indicates that technology had begun to intervene in human dietary behavior, in effect placing a buffer or barrier between human dental morphology and the human gut (and thus selection pressures) and foods consumed.

Appearance of modern human form corresponds with reduced dentition. Brace et al. [1991, p. 48] note that:

The emergence of "modern" human form was uniformly associated with dental reduction. The differences in tooth size that can be seen between the various living human populations, then, were the consequences of different amounts of reduction from the Middle Pleistocene condition. These in turn can then be associated with the different lengths of time that the forces of selection maintaining tooth size have been modified by "modern" Homo sapiens.

The above material supports the view that technology, in the basic form of stone tools and cooking, may have had a significant impact on the form and structure of human dentition--teeth and jaw muscles. (Cooked food is softer, so it relaxes selection pressures that favor stronger jaw muscles.) Accordingly, comparative proofs of diet that fail to consider the impacts of technology and human feeding behavior are incomplete and/or inaccurate.

Other selection pressures on head and oral features: brain size, posture, and language

Selection pressures are multiple and competing rather than solitary. The above material suggests that changes in diet and food processing techniques relaxed selection pressures for "robust" dentition. However, such factors reflect but one set of selection pressures operating on the human mouth. The mouth is used for more than just eating, after all. Other factors that may impact the evolutionary morphology of the head, mouth, and oral systems include: brain size, consistently upright bipedal posture, and--extremely important--language.

Increasing brain size reduces space available for oral features. The increase in brain size, i.e., encephalization, increases the space required for the cranium (brain vault). This, coupled with the slight posture realignments required for bipedalism, may cause subtle but significant changes in the architecture of the head as a whole. The possibility of impacts from these is mentioned by Radinsky as cited in Hiiemae [1984]. Aiello and Dean [1990] are an indirect reference on this issue--see their discussion regarding the hominoid mandible and cranium--and consider that a shift in the center of gravity of the head (caused by increasing brain size and bipedal posture) could increase selection pressures for a smaller (lower-weight) mouth and oral systems.

The influence of language on the oral system. Another important selection pressure acting on the human mouth and oral systems was the development of language. As Milton [1987, p. 106] notes:

An innovation such as language could help to coordinate foraging activities and thereby greatly enhance foraging efficiency (see, e.g., Lancaster 1968, 1975).

Inasmuch as language greatly increased foraging efficiency, it increased survival and was an important selection pressure. Hiiemae [1984] points out that fully developed language requires a very flexible oral system--jaws, tongue, esophagus, etc. Thus it is no surprise that the human oral system emphasizes flexibility and is quite different from that of carnivores like cats or crocodiles, which have no need of language, and which lack tools and must use sharp teeth and claws instead. The human oral system reflects the evolutionary selection pressures on humans (and not the pressures acting on cats or crocodiles). Further, thanks to technology (tools and cooking), the human dentition was buffered to a certain extent from the diet's underlying content. That is, simple technology enables us to transform foods, as necessary, to better fit our dentition--and per Garn and Leonard [1989] and McArdle [1996], humans have the dentition of an omnivore that eats soft foods.

Selection pressures on humans due to language are unique. Hiiemae [1984] analyzes the process of chewing and swallowing in mammals, and finds that the process is identical--except in humans. There is a significant difference in the sequence of events in humans, and Hiiemae [1984] describes the difference as (p. 227) "a reflection of a fundamental change." Hiiemae [1984, p. 278] notes:

Man is the only mammal in which "communication" has become a dominant oropharyngeal activity. [Oropharyngeal refers to the area between the soft palate and the epiglottis.] Is it not possible that the single most important change in the evolution of the jaw apparatus in hominids has been the development of speech? To go further: many mammals can call (coyotes can "sing") but "speech" involves the exactly patterned modulation of the basic note emitted from the larynx. That patterning is produced by a change in the shape of the air space in the oral cavity and by use of a series of "stops" which involve the tongue, teeth, and lips. [These considerations address] a much more fundamental and wide-ranging question--the relationship between the evolution of speech and the changes in both the anatomy and functional relationships of the structures developed to process food.

Design tradeoffs in evolution of the human oral system due to speech/language. The above suggests that the development of that unique human feature--language--may have required changes in morphology. Shea [1992] reports that the evolution of speech was associated with changes in the skull base and pharynx, both of which indirectly impact the jaw and dental systems. The uniqueness of the human oral system is explained by Lieberman [1992, pp. 134-135]:

The human supralaryngeal airway differs from that of any other adult mammal... Air, liquids and solid food all use a common pathway through the pharynx. Humans are more liable than any other land animals to choke when they eat, because food can fall into the larynx and obstruct the pathway into the lungs... Human adults are less efficient at chewing because the roof of the mouth and the lower jaw have been reduced compared with non-human primates and archaic hominids. This reduced palate and mandible crowd our teeth and may lead to infection because of impaction--a potentially fatal condition before modern medicine.


These deficiencies in the mouth of an adult human are offset by the increased phonetic range of the supralaryngeal airway.

Cziko [1995, p. 182] reiterates the same information as given above from Lieberman [1992], and adds the cogent comments:

But if the design of the human throat and mouth is far from optimal for eating and breathing, it is superbly suited for producing speech sounds... We thus see an interesting trade-off in the evolution of the throat and mouth, with safety and efficiency in eating and breathing sacrificed to a significant extent for the sake of speaking.

Linkage of language and brain development in evolution of the human oral system. Deacon [1992] notes the seamless integration of language into human nature, and suggests this indicates that language originated long ago. Cziko [1995, pp. 183, 185] mentions the likely language/brain evolution linkage:

The study of our vocal tract also provides hints concerning the evolution of our brain. Obviously, the throat and mouth would not have evolved the way they did to facilitate language production and comprehension while compromising eating and respiration if the brain had not been capable of producing and comprehending language. The importance of language and the advantages it provides us in communicating, coordinating our activities, and thinking [we "think" in our native language] suggests that, in addition to being a product of our evolution, it also played a large part in shaping our evolution, particularly that of the brain.

The above quote suggests that the evolution of the mouth/oral systems was closely related to the evolution of the brain, via the feature of language. A diagram illustrating the various selection pressures acting on the mouth and oral systems is given below.

[Diagram not reproduced: selection pressures acting on the mouth and oral systems.]

Human oral system forged as an evolutionary compromise. Thus we observe that dietary changes and food processing are not the only selection pressures acting on the head, mouth, and oral systems. Our jaw, mouth, and throat are the result of a compromise between multiple selection pressures acting on those structures:

o The structural impact of food pre-processing (stone tools, cooking),
o The oral system flexibility required to support language, and
o The subtle changes required to support increased encephalization and upright bipedal posture.

Thus we see that the "big picture" here is not so trivial or simplistic as the claims of the comparative proofs that "humans cannot eat meat because they lack claws, powerful jaws, and the sharp canine teeth of lions." It is a pity that people cling to such simplistic claims--real life is more complicated, but also far more interesting!

To summarize:


o Humans do not have claws, razor-sharp teeth, or the other adaptations found in carnivores because from the very inception of the genus Homo we have used technology--at first, and at the least, in the form of stone tools--to serve the same functions as claws, sharp teeth, etc. This buffered humans from the selection pressures associated with the development of the features associated with carnivores.

o Additional selection pressures on the human head came from encephalization (increasing brain size), bipedal posture, and the development of language (which required a flexible oral system).

o The typical claims made in comparative proofs of diet--that humans are not adapted to meat because of a lack of claws, fangs, etc.--are fallacious, and based on the assumption that only one adaptation is possible for a given function. In reality, from the earliest beginnings, humans have employed adaptive behavior and technology to overcome the limits of morphology.

Overview of Gut (Digestive System) Morphology in Primates and Humans


Introduction
After the various claims of the comparative proofs that "humans can't eat meat because they lack claws, fangs, etc.," the next set of claims made concerns the digestive system. Typically, a number of features of the digestive system are compared: stomach (type, relative capacity, etc.), length of small intestine, and so on. (See The Comparative Anatomy of Eating, by Mills, for a longer list.)

Diet categories in comparative proofs are typically narrow. The comparisons made are relatively simplistic, and the thinking process involved is usually rather narrow: herbivores vs. carnivores vs. humans (with omnivores sometimes added to the list). In referring to narrow thinking here, we mean that the comparative proofs define their dietary categories in a narrow way, and ignore the fact that in nature, diets are usually not strict (i.e., "pure" diets are rare). The comparison of digestive-system features comprises the second major component of many comparative proofs of diet.

Other shortcomings of typical comparative proofs. Needless to say, the comparative proofs suffer from many of the limitations identified in previous sections--subjective list construction, assuming the form/function linkage is strict, ignoring the reality of modern knowledge of ape diets, the resolution problem (analysis may be on too gross a level), and so on.

Preview of this section. Instead of listing the numerous claims of Mills and the other proofs, and
analyzing them directly for possible shortcomings, this paper takes a different approach. First, we'll review the work of Milton [1987], as it nicely illustrates that the human gut is (nearly) unique--even when compared with the great apes. Then we'll review an excellent series of papers: Chivers and Hladik [1980, 1984], Martin et al. [1985], and MacLarnon et al. [1986], which are the most authoritative analyses of gut morphology done to date. Also, Sussman [1987] and Hladik et al. [1999] are discussed, for they provide additional insight into the major papers on gut morphology. Finally, the different definitions of the term "omnivore" are explored, including how those differences have been exploited in an intellectually dishonest way (in my opinion) by a fruitarian extremist.

Humans are unique, and the human gut is (nearly) unique


Since we will be comparing the gut (digestive system) of humans with other animals, especially primates, it is appropriate to begin with an illustration that provides a comparison.

[Illustration not reproduced: comparison of gut morphology across species.]

Milton [1987, p. 102] notes:

When compared to those of most other mammals, the relative proportions of the human gut are unusual (my calculations, using data from Chivers and Hladik [1980], and Hladik [1967]).

Human gut small compared to apes. Observing that human gut proportions are different from those found in carnivores, herbivores, swine (an omnivore), and even most other primates, including the anthropoid apes, Milton [1987, p. 101] notes that "...the size of the human gut relative to body mass is small in comparison with other anthropoids (R.D. Martin, pers. comm.)." Milton [1987] includes a table (3.2, p. 99) that compares the relative volumes of the different parts of the gut for selected hominid species. The table shows the stomach at 10-24% of total gut volume in humans, versus 17-20% in orangs and chimps. The small intestine is 56-67% of total gut volume in humans, versus 23-28% in orangs and chimps. And the colon is 17-23% of total gut volume in humans, versus 52-54% in orangs and chimps. The percentages quoted in the preceding sentences are unscaled, i.e., not scaled for inter-specific differences in body size. Despite this, the figures are useful for comparing patterns of gut proportions, and the general pattern is clear: humans have "large" small intestines, while chimps and orangs have "large" colons.

Additionally, Milton [1987] discusses two primates whose gut proportions appear to roughly match those of humans:

o The Capuchin monkey (Cebus species), which has a high-quality diet of sweet fruits, oily seeds, and (40-50% by feeding time) animal foods--invertebrates (insects) and small vertebrates [Milton 1987, pp. 102-103].
o The savanna baboon (Papio papio), a selective feeder who searches for high-quality, nutritious foods. [Caution--this remark is based on only one specimen measured for gut proportions.]
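As a rough illustration of the contrast in these proportions, here is a minimal Python sketch using the ranges just quoted; collapsing each range to its midpoint is a simplification of mine, not Milton's:

```python
# Gut-compartment shares of total gut volume, from Milton [1987, table 3.2],
# with each quoted range collapsed to its midpoint (my simplification).
gut_share = {
    "human":       {"stomach": 17.0, "small intestine": 61.5, "colon": 20.0},
    "orang/chimp": {"stomach": 18.5, "small intestine": 25.5, "colon": 53.0},
}

for species, parts in gut_share.items():
    dominant = max(parts, key=parts.get)  # compartment with the largest share
    print(f"{species}: dominated by the {dominant} ({parts[dominant]}%)")
# human: dominated by the small intestine (61.5%)
# orang/chimp: dominated by the colon (53.0%)
```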

Gut characteristics reflect dietary quality. Like humans, Capuchin monkeys and savanna baboons make extensive use of their hands for pre-processing of food items. Milton concludes that the similarity in gut proportions reflects adaptation to high-quality diets [Milton 1987, p. 103]:

Rather, it appears to represent similar adaptive trends in gut morphology in response to diets made up of unusually high-quality dietary items that are capable of being digested and absorbed primarily in the small intestine.

The plasticity (or elasticity) of the human gut--that is, how the proportions can change to accommodate temporary fluctuations in diet--is discussed in Milton [1987]. (The topic will be addressed later here.) She predicts that further studies will (continue to) show that the human gut is dominated by the small intestine, with high variability in colon size due to temporary changes in diet.

Quantitative Analysis of Gut Morphology in Primates and Humans


Introduction
A series of excellent papers--Chivers and Hladik [1980, 1984]; Martin, Chivers, MacLarnon, and Hladik [1985] (hereafter Martin et al. [1985]); and MacLarnon, Martin, Chivers, and Hladik [1986] (hereafter MacLarnon et al. [1986])--provides two separate and different quantitative analyses of gut morphology, with cross-reference to diet. Together, these two detailed analyses provide a sharp contrast to the simplistic analyses one typically finds in the comparative "proofs" of diet. The papers of Sussman [1987] and Hladik et al. [1999] are also discussed below, for the supplementary insights they provide.

Note: The above papers are lengthy and provide considerable detailed, technical information. Only a summary of the major points in each analysis is provided here. Those with a specific interest in gut morphology can learn much from the above papers. And frankly, if you do read the papers cited above, you will realize just how simplistic the analyses presented in the typical comparative "proofs" of diet really are.

The research of Chivers and Hladik [1980, 1984]


The two papers of Chivers and Hladik [1980, 1984] provide the first of the two different analyses of gut morphology that will be discussed here. To begin with a brief summary, the major points of the approach in these two papers are:

GI tracts from numerous species were analyzed. Gastrointestinal (GI) tracts were analyzed from 180 individuals of 78 different species; 112 of the individuals in the analysis were primates.

Morphology points to 3 basic dietary groupings. Differences in morphology reflect adaptation to different diets. Foods can be classed into 3 groups, by structure and composition:


o Animal matter, including invertebrates and vertebrates. Such foods are easily digested, and a short, simple gut is adequate for such foods [faunivory].
o Fruits--the reproductive parts of plants, including seeds and tubers. Fruits are high-sugar foods, and tubers are high-starch foods that are converted into sugar in digestion; such foods are readily digested and absorbed in the intestines [frugivory].
o Leaves--including the structural parts of plants, i.e., grasses, stems, bark, and gums. These foods require fermentation in a large stomach or in the large intestine/colon [folivory].

Dietary categories reflect a continuum, not sharp divisions. The categories of dietary adaptation described above--faunivory, frugivory, folivory--are not discrete (distinct or black-and-white) categories, but reflect a continuum that stretches from faunivory (animal foods: hard to catch but easy to digest) to folivory (leaves: easy to find but hard to digest), with frugivory an intermediate form (fruit is available, but often only in limited quantities).

Typical gut of faunivores. Chivers and Hladik [1980] describe the typical gut pattern for a faunivore (p. 338):

The basic pattern of gut structure among faunivores consists of a simple globular stomach, tortuous [containing many twists/bends] small intestine, short conical caecum, and simple smooth-walled colon.

Folivore gut characteristics. Regarding folivores, the long-chain carbohydrates found in leaves and structural plant parts require bacterial decomposition (fermentation) for digestion and assimilation. There are two primary adaptations for fermentation: chambers in the fore-gut (stomach) or in the mid-gut (caecum and colon).

Frugivore guts are variable. Regarding frugivores, Chivers and Hladik [1980, p. 340] note that:

This group contains most primates, but none of them subsist entirely on fruit. All frugivores supplement their diets with varying amounts of insects and/or leaves, but have no distinctive structural specialization in the gut, although its morphology may show considerable variation between species.

Contrast the above real-world information with the position of certain fruitarian extremists who claim that a nearly 100% fruit diet is optimal/most "natural" for humans. In actuality, it appears there is no such thing in nature as a 100%-fruit frugivorous primate.

Faunivory and folivory are the endpoints, with frugivory intermediate. Pure faunivory and pure folivory reflect the two extremes of adaptation. However, as diets are a continuum, Chivers and Hladik argue that putting all intermediate diets in the (too broad) category of omnivores is inappropriate. They note [Chivers and Hladik 1984, p. 216]:

No mammal mixes large quantities of both animal matter and leaves in its diet without including fruit. Since faunivory and folivory seem to represent contrasting and incompatible [morphological] adaptations, the quantity of fruit in such a mixed diet is always considerable.

The objection of Chivers and Hladik to the term omnivore appears to reflect a different view of the definition of the term than is used by many other authors. This will be discussed later in this section.

Determining/measuring relevant gut characteristics is non-trivial. Chivers and Hladik [1980] includes a discussion of the major challenges involved in measuring the relevant gut parameters--volume, surface area, etc. Additional challenges include: the gut itself is elastic (i.e., it may stretch when you try to measure it), consists of irregular shapes which may be complicated by folds or specialized structures, and so on. In the real world, choosing what parameters to measure, and how to measure them, in the area of gut morphology is decidedly non-trivial. The simplistic comparative proofs of diet don't bother to tell you such details.

Basic raw data collected. For the subject animals, the sizes of the stomach, large intestine, and small intestine were all measured in terms of surface area, weight, and volume. Data on the overall size of the animal--length and weight--were recorded as well. Chivers and Hladik [1980] includes a discussion of the importance and reliability of the various types of measures--surface area, weight, volume--and concludes that volume is the least reliable measure because of confounding factors--i.e., it is hard to distinguish and compare the volume of a fermenting chamber in a folivore versus the volume of absorbing regions in a faunivore.

Gut differentiation coefficients calculated. Given that the primary gut morphology specializations related to diet range from a gastrointestinal (GI) tract dominated by the small intestine in faunivores, to a GI tract dominated by fermentation chambers in the stomach and/or cecum or large intestine (colon) in folivores, Chivers and Hladik define and calculate the coefficient of gut differentiation as:

coefficient of gut differentiation = (measure of stomach + caecum + colon) / (measure of small intestine)

where "measure" is one of weight, volume, or surface area. In other words, the numerator in the equation above represents a measure of the structures that tend to dominate in more folivorous species. The coefficient of gut differentiation compares the numerator against (divides it by) the denominator, which is a measure of the feature (small intestine) that dominates in species with more faunivorous (animal-based) diets. The resulting coefficient provides a simple (though crude) measure of gut morphology specialization. Coefficients of gut differentiation were calculated for each species in the study, and the results plotted by dietary specialization. In regard to the classification of animals by dietary specialization, Chivers and Hladik [1980, p. 377] note:

While recognizing the special significance of the gross dietary categories to which each species can usually be assigned, particularly the most specialized forms, we have tried to avoid any implication that a classification into faunivores, frugivores, and folivores reflects exclusive diets.
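To make the arithmetic concrete, here is a minimal Python sketch of the coefficient; the function name and the sample measurements are hypothetical illustrations of mine, not values from Chivers and Hladik:

```python
def gut_differentiation_coefficient(stomach, caecum, colon, small_intestine):
    """Coefficient of gut differentiation [Chivers and Hladik 1980]:
    (stomach + caecum + colon) / small intestine, where all four terms
    use the same measure (weight, volume, or surface area).
    High values suggest folivory; low values suggest faunivory."""
    return (stomach + caecum + colon) / small_intestine

# Hypothetical surface areas (illustrative values only):
print(gut_differentiation_coefficient(400, 50, 350, 2000))    # 0.4: faunivore-like
print(gut_differentiation_coefficient(900, 600, 1500, 1000))  # 3.0: folivore-like
```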

Size scaling to compare species adjusted according to body length. To be able to compare gastrointestinal tracts across species, the raw data were adjusted for body size--in this case, body length. (Note: depending on the source of data, body length might not be as accurate an indicator as body weight--which scales better with the body's metabolic demands. The later research papers of Martin et al. [1985] and MacLarnon et al. [1986], which we'll examine shortly and which are extensions of the research described in Chivers and Hladik [1980, 1984], do use body weight to adjust for size differences between species.)

Index of gut specialization. Incorporating this scaling adjustment, the GI tracts studied were analyzed for the three different dietary classes (faunivore, frugivore, folivore) to produce a numeric index--an index of gut specialization. (For details on the methodology used to derive the index, see Chivers and Hladik [1980, 1984].) The index of gut specialization is an attempt to produce an index that reflects the apparent morphological specialization of the animal in question. That is, it is intended to indicate what the animal's morphology suggests the animal's diet might be.

Dietary index: what the animal actually eats. Having developed the index of gut specialization, which suggests what the animal might eat, Chivers and Hladik then turn their attention to the issue of what the animals actually do eat. As the three main dietary specializations are leaves, fruit, and fauna, they use a triangular plot whose 3 vertices represent 100% faunivory, 100% frugivory, and 100% folivory. Use of a triangular plot provides a nice 2-dimensional representation of the 3 dimensions required. Each animal is plotted in the triangle according to its actual diet. An update of the original graph [Chivers and Hladik 1980], from Chivers [1992], is given below.

[Figure not reproduced: triangular plot of actual diets, with vertices at 100% faunivory, 100% frugivory, and 100% folivory; from Chivers (1992).]

Chivers and Hladik assert that overlaying an x-axis on the triangular plot provides the best univariate (i.e., one number per animal) summary description of actual diets. Note that because the plot is triangular and a projection of a 3-dimensional plot, the scales are more complicated than in a standard rectangular plot. In this case, the x-axis is defined as:

x = (% leaves) - (% animals)

The value of the x-axis for each point in the triangular plot of dietary characteristics is used as the dietary index, an approximate indicator of the actual diet of the animal. Note also that the three percentages (fruit, leaves, animal matter) are not independent, as they must add up to 100%.
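A small Python sketch of the dietary index follows; the function name and the sample diets are hypothetical, but the formula is the one just given:

```python
def dietary_index(pct_fruit, pct_leaves, pct_animal):
    """Dietary index from the triangular diet plot: x = %leaves - %animals.
    Runs from -100 (pure faunivory) through 0 toward +100 (pure folivory);
    the three percentages are not independent and must sum to 100."""
    assert abs(pct_fruit + pct_leaves + pct_animal - 100) < 1e-9
    return pct_leaves - pct_animal

# Hypothetical diets (fruit, leaves, animal matter, in %):
print(dietary_index(60, 30, 10))  # 20  -> leans folivorous
print(dietary_index(50, 5, 45))   # -40 -> leans faunivorous
```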


Chivers and Hladik then compare the dietary index values (which indicate actual diet) against gut specialization index values (which indicate what diet would be suggested by gut characteristics) for 8 selected species. The results of the comparison are "good" (i.e., some similarities between the indices of diet and gut specialization). This provides some supporting evidence for the analysis.

A footnote on Chivers and Hladik [1980, 1984]: Human gut morphology


Sussman [1987] describes the analysis of the gut of 6 human cadavers using the measures defined in Chivers and Hladik [1980, 1984]. Analysis of the human gut data using the coefficient of gut differentiation (a measure of gut specialization) placed humans in the frugivore range, along the margin with the faunivore category. However, analysis of the same data using the index of gut specialization (yet another measure of gut morphological specialization) placed humans squarely in the faunivore range.

Note that the frugivore classification above came from using the coefficient of gut differentiation, which is an intermediate result in Chivers and Hladik [1980, 1984], hence presumably less desirable (from a certain analytical viewpoint) than the (faunivore) classification achieved using the end result of Chivers and Hladik [1980, 1984], i.e., the index of gut specialization. Also recall that the term frugivore does not mean or imply that a diet of nearly 100% sweet fruit (as advocated by some fruitarians) is appropriate; all frugivorous primates eat at least some quantity of animal foods, even if only insects. Thus the result that humans appeared to be frugivores by one measure and faunivores by another suggests a natural diet for humans that includes both animal foods and fruits.

The research of Martin et al. [1985]

Improved methodology. The research of Martin et al. [1985] is very important, for it provides a second, different analysis of the (same) data analyzed previously in Chivers and Hladik [1980, 1984]. The analysis of Martin et al. [1985] includes a number of important features/improvements:

o Data for 6 humans, modern Homo sapiens, are included in the analysis.
o All comparisons are done with respect to body weight. Weight is more commonly used in allometric analysis/adjustments than is body length. (Chivers and Hladik [1980, 1984] used the cube of body length for their analysis.) Another advantage of performing analysis based on body weight is that it is closely related to the metabolic energy requirements of each animal (Kleiber's Law again).
o The research uses gastrointestinal (GI) quotients (explained below), similar to encephalization quotients.
o In Chivers and Hladik [1980, 1984], the animals were grouped into dietary categories on an a priori basis prior to quantitative analysis. One can argue that such an a priori classification is inappropriate and introduces bias. In contrast, Martin et al. [1985] allow the data itself to determine the dietary groupings, based on an analysis of gastrointestinal quotients (alone). The groupings are then analyzed for dietary commonality. (In other words, the data themselves here determine the dietary groupings--a critically important point.)
o Major axis line-fitting is used in preference to linear regression, as it may have advantages when both dependent and independent variables are measured with errors.

The major points from Martin et al. [1985], in summary, are:

Size scaling adjustments done by body weight rather than length. The raw data are allometrically adjusted (scaled) for body weight. Martin et al. [1985, p. 66] note:

For instance it is now well established that basal metabolic rate scales to body weight in mammals with an exponent value of 0.75 ["Kleiber's law" (Kleiber 1961; Hemmingsen, 1950, 1960; Schmidt-Nielsen, 1972)], and there is new evidence to indicate that active metabolic rate (i.e., total metabolic turnover in a standard time) also scales to body size with a comparable exponent value (Mace and Harvey, 1982). Hence, one might expect any organ in the body that is directly concerned with metabolic turnover to scale to body size in accordance with Kleiber's law.

GI quotients for 4 different digestive system components (stomach, small intestine, cecum, and colon) for each animal in the study are calculated by dividing the component's actual size by its "expected" size (the latter projected from body weight using Kleiber's Law); the size of each digestive component was measured as surface area. A GI quotient greater than 1 means the size is larger than expected; a value less than 1 indicates a size smaller than expected.

Human GI quotient pattern typical of faunivores. Human GI quotients are considerably lower than predicted/expected for all 4 digestive system components measured. Martin et al. [1985, p. 72] report that:

Calculation of gut quotient values has particular interest in the case of the four average surface areas of the gut compartments determined for six Homo sapiens. It can be seen from figs. 1-4 that man has values of less than one for all four gut compartments, most notably with respect to the cecum: GQ = 0.31; IQ = 0.76; CQ = 0.16; LQ = 0.58

[In the above, GQ is the quotient for the stomach, IQ for the small intestine, CQ for the cecum, and LQ for the colon.] This is a pattern shared with a number of animals relying heavily on animal food ["faunivores" (Chivers and Hladik, 1980)].
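The quotient idea can be sketched in Python as follows. The power-law form follows the Kleiber-style scaling described above, but the scaling constant and the sample numbers below are hypothetical placeholders, not the regression actually fitted by Martin et al.:

```python
def expected_size(body_wt_kg, a, b=0.75):
    """Size predicted from body weight via a power law (Kleiber-style
    scaling): expected = a * W**b. The constant a would come from a
    line fitted across species; here it is a placeholder."""
    return a * body_wt_kg ** b

def gi_quotient(observed, body_wt_kg, a, b=0.75):
    """Quotient = observed size / expected size. Values > 1 mean the
    compartment is larger than expected for body size; < 1, smaller."""
    return observed / expected_size(body_wt_kg, a, b)

# Hypothetical example: a 60-kg animal whose measured cecum surface area
# is about half of what the (assumed) scaling line predicts.
print(gi_quotient(observed=107.8, body_wt_kg=60, a=10.0))  # ~0.5
```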

Meaningful dietary groupings based on statistical analysis of GI quotients. A dendrogram, or "tree" diagram, based on statistical analysis of the GI quotients for the different animal species in the study was derived in order to determine meaningful dietary groupings according to similarity of GI tracts. The dendrogram for the study can be found in Figure 11, p. 81 of Martin et al. [1985], and includes Homo sapiens. Humans fall into group A2 in the dendrogram, about which Martin et al. [1985, p. 82] comment:

Group A can be characterized as containing numerous mammalian species (primates and nonprimates) that include at least some animal food in their diets. Again, there is a separation into two subcategories (A1, A2), the second of which contains most of the mammalian carnivores and only two primate species--Cebus capucinus and Homo sapiens.

Thus the result of the advanced statistical analysis in Martin et al. [1985] is that humans fall into the faunivore--meat-eater--class, yet again. Note also that the Capuchin monkey, Cebus capucinus, is in the same statistical grouping as humans, thereby confirming the remarks in Milton [1987], discussed earlier in this section, that the human and Capuchin monkey gut dimensions are similar.

The research of MacLarnon et al. [1986]

Refinement needed in analytical techniques used in earlier study. The research of MacLarnon et al. [1986] provides an extension and analytical refinement of Martin et al. [1985]. The analytical techniques needed further refinement because:

o Clustering technique is labile. The dendrogram clustering technique used in Martin et al. [1985] was found to be labile, i.e., small changes in the data set analyzed could cause substantial changes in the cluster pattern produced.


o Non-symmetric transformation used in earlier study. The anti-logarithmic transformation applied in computing GI-quotient values in Martin et al. [1985] was non-symmetric and tended to overemphasize the importance of GI components that were larger than expected.
o The multidimensional scaling (MDS: a dimension-reducing data analysis method) plots produced in Martin et al. [1985] were crowded and difficult to interpret.

MacLarnon et al. [1986] addressed the above issues by performing additional analyses of the Martin et al. [1985] data set:

o The data were reanalyzed in their original, untransformed logarithmic form. This created another problem, however, as it forced the exclusion (from the analysis) of those animals that lack a caecum (i.e., some faunivores). [The exclusion was necessary because the logarithm of zero is undefined.]
o Primates were analyzed as a separate group, for comparison with the other analyses.
o The MDS was repeated using the original logarithmic data.
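A minimal Python sketch of the log-quotient MDS idea might look like the following; the quotient values are invented, and sklearn's stock MDS stands in for whatever algorithm the authors actually used:

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical quotient matrix (rows: species; columns: stomach, small
# intestine, cecum, colon). Values are illustrative only.
q = np.array([
    [0.31, 0.76, 0.16, 0.58],
    [1.90, 0.60, 2.50, 1.80],
    [0.30, 0.90, 0.10, 0.50],
    [1.10, 1.00, 0.90, 1.20],
])

# The log transform is symmetric: a compartment twice the expected size
# (q = 2) and one half of it (q = 0.5) sit equally far from 0, whereas raw
# quotients overweight the larger-than-expected side. Any species with a
# zero-sized caecum must be dropped first, since log(0) is undefined.
log_q = np.log(q)

# 2-D multidimensional scaling on the log quotients; each species becomes
# one (x, y) point, and nearby points have similar gut proportions.
coords = MDS(n_components=2, random_state=0).fit_transform(log_q)
print(coords)
```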

The results of MacLarnon et al. [1986] can be summarized as follows.

Primates-only analysis. In the dendrogram clustering using the transformed data [Martin et al. 1985], humans (Homo sapiens, #42 in the analyses) and Capuchin monkeys (Cebus capucinus, #16 in the analyses) appear together in a group that consists of frugivores and frugivore-insectivores. In the dendrogram clustering using the untransformed (logarithmic) data [MacLarnon et al. 1986], humans and the Capuchin monkey occur together, as an apparent outlier in the clustering. In the MDS analysis using logarithmic data, humans and Capuchin monkeys appear as outliers. In the anti-logarithmic data MDS, humans and Capuchin monkeys appear on the edge of the central cluster of primate species.

MDS analysis of a more complete data set: humans grouped (again) with faunivores. When a more complete data set (primates plus other mammals, excluding those faunivores with no caecum) was analyzed, humans and Capuchin monkeys were grouped with non-primate faunivores in the MDS logarithmic data analysis. This is clearly illustrated in Figure 5 of MacLarnon et al. [1986, p. 302], where humans and the Capuchin monkey appear in the faunivore group. Note that Martin et al. [1985] report similar results using untransformed anti-logarithmic data, though the results are not as clear as in the logarithmic data set.

Wild vs. captive specimens. Further analysis showed that including data on captive primates had only a small impact on the results. The primary effect was to rearrange the frugivores and frugivore-insectivores, which are the most labile species in the analysis.

Conclusions. MacLarnon et al. [1986] conclude that:


o The use of logarithmic quotients is preferable to the use of anti-logarithmic quotients in MDS analyses.
o MDS analysis techniques are more robust (for the subject data set) than dendrogram-based clustering techniques.

Human GI tract shows possible faunivore adaptations. From MacLarnon et al. [1986, p. 297]:


...[T]his being the case, the new evidence from the approach using logarithmic quotient values (Fig. 1, 3 and 5) is particularly interesting in that it suggests a marked departure of Cebus [Capuchin monkey] and Homo [humans] from the typical pattern of primates lacking any special adaptation for folivory...in the direction of faunivorous non-primate mammals.... 5. Use of logarithmic quotient values for clustering purposes suggests that Cebus and Homo possess gastrointestinal tracts that have become adapted in parallel to those of faunivorous mammals, with notable reduction in size of caecum relative to body size. Nevertheless, because of the artificiality of most modern human diets, it cannot be concluded with confidence that the small human sample examined to date reflects any "natural" adaptation for a particular kind of diet. The results obtained so far are suggestive but by no means conclusive.

Thus the research of MacLarnon et al. [1986] suggests, but is not (by itself) conclusive proof, that the human GI tract is adapted for the consumption of animal foods.
Assessing Martin et al. [1985], MacLarnon et al. [1986], and related studies

Further study still needed to confirm results. Although very impressive in their scope and sophistication, the results of Martin et al. [1985] and MacLarnon et al. [1986] that relate to humans specifically are based on limited data (6 individuals). Further study is appropriate before sufficient evidence is available to definitively class humans as faunivores or frugivores based solely on gut morphology. Some of the reasons for caution regarding the study results are as follows:

o Small sample size. Some species used in the study (although there were 180 individuals total from 78 different species) have only a single specimen--i.e., a small sample size for those species.

o Gut dimensions can vary in response to current diet. The gut dimensions of animals can vary significantly between wild and captive animals (of the same species, of course). Gut dimensions can change quickly (in captivity or in the wild) in response to changes in dietary quality. For information on this topic, consult Hladik [1967] as cited in Chivers and Hladik [1980]; also the following sources cited in Milton [1987]: Gentle and Savory [1975]; Gross, Wang, and Wunder [in press per citation]; Koong et al. [1982]; Miller [1975]; Moss [1972]; and Murray, Tulloch, and Winter [1977].

o Knowledge of human gut variability is limited. More information is needed on how human gut dimensions vary with diet. Milton [1987, p. 101] notes:

The size of the present-day human small intestine could be an ancient or a relatively recent trait. Indeed, it is not known whether all modern human populations show such gut proportions.

o Gut structure and gut transit times. The analyses of Martin et al. [1985] and MacLarnon et al. [1986] are limited to measures of structure, and do not include gut transit times. Note, however, that data on gut transit times for wild animals will likely be difficult to obtain.

o Different analytical techniques, different results: the confusion factor. The papers discussed so far in this section provide a range of different analytical techniques, and the results for humans are not consistent across the different techniques. This can be confusing to readers, as one part of the analysis suggests humans are probably faunivores, while another suggests humans might be frugivores.

We have presented the results of these papers in some detail here, so the reader can get a feel for the overall thrust of the analyses. The basic result appears to be that the anatomy of the human GI tract shows what appear to be adaptations for faunivory (consumption of animal foods), regardless of whether humans fall into the faunivore or frugivore class. This leads us to the next paper on gut morphology to be discussed here.


The note of Hladik et al. [1999]


A recent note: Hladik, Chivers, and Pasquet [1999] (referred to hereafter as Hladik et al. [1999]) provides further information on gut morphology. The Hladik et al. [1999] paper must be understood in the context of the 4 major papers that preceded it, as well as the fact that it is a comment on the Expensive Tissue Hypothesis of Aiello and Wheeler [1995], which was discussed in a previous section. The context is important because a fruitarian extremist has already quoted parts of Hladik et al. [1999] out of context, and--in my opinion--misrepresented the meaning thereof.

Humans are frugivores by one measure, faunivores by another. Let's look at some relevant quotes from the note. Hladik et al. [1999] includes a version of a figure from Chivers and Hladik [1980], with the comment [Hladik et al. 1999, p. 695]:

Some of these data are reported here (fig 1), combined with one of the original figures [Chivers and Hladik 1980] illustrating the reduced axis [line fit] for three major dietary tendencies of primates and other mammals (folivore/frugivore/faunivore). The human specimens fall, as expected, on the main axis for frugivores, a gross category corresponding to fruit and seed eaters (or omnivores, when occasional meat eating is practiced).

Examination of figure 1 from Hladik et al. [1999] suggests that it is a version of figure 26 from Chivers and Hladik [1980]. The figure shows the relationship between body size (horizontal axis) and area of absorptive mucosa (vertical axis). Hladik et al. [1999] report that the human specimens fall into the frugivore range. However, Sussman [1987] analyzed 6 human cadavers and found that their intestinal surface areas fell into the frugivore range using one measure, but the faunivore range using another measure (specifically: figure 26 from Chivers and Hladik [1980]; see also figure 9.8 in Sussman [1987, p. 176]). Thus, once again, the classification of humans by gut morphology is not consistent: humans appear to be frugivores per Hladik et al. [1999], but faunivores by a similar (perhaps the same kind of) measure per Sussman [1987]. If one considers frugivore as a gross category that includes omnivores, then humans might indeed fall into the category, as the sum total of current evidence suggests that humans (and Capuchin monkeys) are (figuratively) where the faunivore and frugivore classes "meet."

Gut surface areas might not support Expensive Tissue Hypothesis. From Hladik et al. [1999, pp. 696-697]:

A specialized carnivorous adaptation in humans that would correspond to a minimized gut size is obviously not supported by our data (fig. 1). The large variations in human diets (Hladik and Simmen 1996) are probably allowed by our gut morphology as unspecialized "frugivores," a flexibility allowing Pygmies, Inuit, and several other populations, present and past, to feed extensively on animal matter...

The first sentence above, re: carnivorous adaptation, must be understood in context: as a comment on the Expensive Tissue Hypothesis. It claims only that there is no major change in gut surface areas of the kind the Expensive Tissue Hypothesis suggests. It does not mean there is absolutely no adaptation to faunivory: the major adaptation to faunivory in humans was previously identified as a reduction in size of the caecum and colon, per Martin et al. [1985] and MacLarnon et al. [1986]. The above quote does not contradict the 1985 and 1986 papers.

Humans fail on raw, ape-style frugivore diets, but thrive on faunivore diets. The second part of the above quote is very interesting. Let's consider the raw-food, ape-style frugivore diets that various dietary advocates claim are "optimal" for humans. The fruitarian diet is a diet of predominantly raw fruit, with some leaves and seeds. It is a strict vegetarian frugivore diet. However, extensive anecdotal evidence (the only evidence available) indicates that the fruitarian diet is a massive failure. There are very few long-term success stories on fruitarian diets, and many of the claims of success are not credible.

If we consider an arguably more natural frugivore diet, one of raw fruit and raw meat/animal foods, we have the instinctive eating diet, or anopsology. That diet does have a few credible long-term success stories, but only very few. As well, the few long-term success stories known to this writer are of individuals who consume significant quantities of (raw) animal foods (i.e., in caloric terms they are probably closer to faunivore than frugivore). The point here is the obvious irony: humans might be frugivores, but in fact humans fare poorly and do not succeed, long-term, on ape-style frugivore diets. In sharp contrast, the example of the Inuit shows that humans can succeed and even thrive, long-term, on a diet similar to a faunivore diet. This, of course, brings us right back to the key question: should we class humans as frugivores or faunivores?

Insects (or comparable animal foods) a part of natural human diet. Ultimately, Hladik et al. [1999] directly acknowledge that natural human nutritional requirements call for some animal foods (from p. 697, emphasis below is mine):

There is no doubt our species needs a rich diet to cover large energy expenses, but it requires relatively no richer a diet than many Cebidae and Cercopithecidae feeding on sweet fruits complimented by the protein and fat of a large proportion of insects.

Postscript: humans are where faunivore and frugivore "meet." The Hladik et al. [1999] note
reminds us that gut morphology alone probably won't settle the issue of whether humans should be classed as frugivores or faunivores. Indeed, it might not be possible to unequivocally "prove" humans are in one class but not the other. One point is clear, though: gut morphology suggests that humans are where the classes of faunivores and frugivores "meet," i.e., it suggests that animal foods (even if just insects) are a part of the natural human diet.

On the term "omnivore," and misuse of quotes


Definition of "omnivore" a critical point
Out-of-context quotes from the writings of D.J. Chivers are sometimes used by fruitarian extremists in support of their claims that humans are not omnivores, but are instead adapted to a strict vegetarian/fruit-based diet. From the previous material in this section, recall that the research of Chivers and associates suggests that human gut morphology is similar to that of faunivores, meat-eaters (which suggests that animal foods are part of the natural human diet), and does not support the claim that humans are adapted for a nearly 100% fruit diet, or even a 100% vegetarian diet. One major factor in understanding the quotes from Chivers regarding omnivores is that he uses the word in a more precise and different way than the most common usage. To understand this point, let's now examine the different definitions of the term "omnivore."


Omnivore: the common definition. The most common usage of the term omnivore is to indicate an animal that includes foods from more than one trophic level in its diet, which is usually interpreted to mean an animal that eats both plant and animal foods. From Milton [1987, p. 93]:

Humans are generally regarded as omnivores (Fischler 1981; Harding 1981). By definition, an omnivore is any animal that takes food from more than one trophic level. Most mammals are in fact omnivorous (Landry 1970; Morris and Rogers 1983a, 1983b)...

Milton [1987] goes on to note that the term omnivore is vague, as there is substantial variability in the foods omnivores eat, and the term is not linked to gut morphology. Thus saying a mammal is an omnivore tells one little about the actual diet of the animal.

Omnivore: Chivers uses the term differently. Let's review some of the relevant quotes. From Chivers [1992, pp. 60-61]:

[F]or anatomical and physiological reasons, no mammal can exploit large amounts of both animal matter and leaves, the widely used term "omnivore" is singularly inappropriate, even for primates. Humans might reasonably be called omnivores, however, as a result of food processing and cookery... Because of cooking and food processing, humans can be described as the only omnivores, but we shall see that their [human] gut dimensions are those of a faunivore.

"Omnivore" a vague term lacking in relevance for GI tract functions. A further relevant quote
is from Chivers and Langer [1994, p. 4]; emphasis below is mine: The concept of omnivory is weakened by the anatomical and physiological difficulties of digesting significant quantities of animal matter and fruit and leaves... animal matter is swamped in a large gut, and foliage cannot be digested in a small gut. A compromise is not really feasible... Humans are only omnivorous thanks to food processing and cookery; their guts have the dimensions of a (faunivore) carnivore but the taeniae, haustra and semi-lunar folds are characteristic of folivores. Among the so-called omnivores, most eat either mainly fruit and animal matter (if smaller) or fruit and foliage (if larger) but not all three. Thus we note that Chivers appears to define an omnivore as a general feeder with a gut morphology that supports a diet that includes significant amounts of all three types of foods: fruits, leaves, and animal matter. Such a gut morphology is not found in mammals, hence the term is indeed inappropriate for mammals.

Contradictory claims about omnivores: which is correct? Thus we have what appear to be contradictory statements: most mammals are omnivores; no mammal is an omnivore. Which is correct? The answer is that both are correct, because they use different definitions of the term "omnivore." Chivers' criticism of the common definition of the term "omnivore" is relevant: it would be better (more precise) to use terms that are linked to gut morphology: folivore, frugivore, faunivore. However, that does not mean that those who use the common definition are making incorrect or invalid statements. Recall that a definition is simply a convention that people follow. While it is desirable that definitions possess analytical rigor, it is not a requirement that they do so. Hence the meaning of a statement like "chimps are omnivores" or "humans are omnivores" is clear, i.e., the natural diet of humans and chimps includes both animal and plant foods.

A fruitarian extremist has used the difference in definitions of the term "omnivore" to suggest that statements like "chimps are omnivores" are incorrect and irrelevant. Because the meaning of such statements is clear (even to those who support Chivers' remarks), it is my opinion that the fruitarian extremist is engaging here in a blatantly intellectually dishonest word game, in an effort to distract attention from the well-known fact that animal foods are a significant (even if small) part of the natural diet of many primates.


More examples of out-of-context quoting by dietary extremists. Those who use (some or all of) the above quotes from Chivers' writings in support of the fallacious claim that humans evolved on a strict vegetarian/fruit diet often neglect to quote the following, from one of the same source articles, e.g., Chivers [1992, pp. 60, 64]:

Exclusive frugivory is practically impossible, because certain essential amino acids and other nutrients are found only in leaves or in animal matter... Humans are on the inner edge of the faunivore [meat-eater] cluster, showing the distinctive adaptations of their guts for meat-eating, or for other rapidly digested foods, in contrast to the frugivorous apes (and monkeys).

The first part of the above quote indicates how unlikely the bogus claims that humans evolved as fruitarians (on very low-protein diets) really are. The second part points out that human gut morphology is more similar to that of faunivores than to that of the fruit-eating frugivorous apes. Finally, readers should be aware that a more recent paper, Chivers [1998], includes quotes similar to the above. This is mentioned in the event the fruitarian extremist in question here might decide to "update" their argument by utilizing more recent quotes from Chivers than the ones above.

Misuse of other quotes


Speaking of intellectual dishonesty and misuse of quotes, another quote on gut morphology that is sometimes misused by fruitarian extremists is from Elliot and Barclay-Smith [1904], as cited in Stevens and Hume [1995, p. 112]:

There can be little doubt that the human colon is rather of the herbivorous than carnivorous type.

By itself, the above quote may seem straightforward enough. However, now compare it to a more complete version of the same quotation in full context, from Stevens and Hume [1995, p. 112] (italicized emphasis mine below):

They cited a study of 1,000 Egyptian mummies that indicated that their cecum was considerably larger than that of present-day humans. Therefore, in spite of reservations about deducing function from structure, these investigators concluded that, "There can be little doubt that the human colon is rather of the herbivorous than carnivorous type."

As previously mentioned, human gut dimensions can vary with diet. The diet of ancient Egyptians would likely have been much higher in fiber than a modern Western diet of processed foods. See Milton [1987] for more information on this topic. The above questionable use of a quote is yet another example (in my opinion) of intellectual dishonesty by a fruitarian extremist. The moral of the story here is that checking the references can be critical, particularly where only a few references have been (perhaps selectively) quoted.

Section summary and synopsis

Although by comparative anatomy analysis (alone) the issue is not yet settled, the results of two different statistical analyses of a "large" data set on gut morphology and diet (i.e., the best available scientific evidence) support the idea that animal foods are a natural part of the human diet. That is:

Humans are faunivores, or frugivores adapted to a diet that includes significant amounts of animal foods. The morphology of the human gut does not correspond to that expected for a nearly 100%-fruit frugivore, as claimed by various fruitarian extremists.


Finally, the simplistic analyses of gut morphology found in the various comparative proofs of diet are (badly) outdated.

PART 7: Insights about Human Nutrition & Digestion from Comparative Physiology
Preface. The objective of this section is to investigate the topic of comparative physiology for any insights
it might provide regarding the natural diet of humanity. Also of interest will be information that can be gleaned about the relative efficiencies and inefficiencies in the availability and absorption of certain key nutrients depending on their source in the diet (i.e., plant or animal foods). To a certain degree, this topic is already discussed in the article Metabolic Evidence of Human Adaptation to Increased Carnivory on this site. This section serves to summarize the main points of that article, and to provide further information and discussion (which in some cases is extensive), as appropriate. Due to the complexity of the issues, some of the subsections below are relatively detailed (and lengthy). This is necessary to establish a basis for discussing the implications physiology has regarding diet. If you find that a particular subsection is not of interest (or too technical, though I have tried to make the material as accessible as possible), you might want to advance to the next subsection.

Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism

Vitamin B-12: Rhetoric and Reality

Introduction: standard information

Vitamin B-12 (cobalamin) is an essential nutrient required for the synthesis of blood cells and the myelin sheath of the central nervous system. Only tiny amounts are needed to sustain health, and a healthy body is efficient at recycling B-12. Additionally, the body maintains a sizable store of the vitamin. The recommended dietary allowance/reference daily intake (RDA/RDI) for vitamin B-12 is as follows, from NRC [1989, p. 162]:

Adults. A dietary intake of 1 mcg daily can be expected to sustain average normal adults. To allow for biological variation and the maintenance of normal serum concentrations and substantial body stores, the vitamin B-12 RDA for adults is set at 2.0 mcg.


Vitamin B-12 is made only by bacteria; it is not synthesized by plants or animals. The very limited (usually only trace) amount of B-12 in plants comes from uptake of the vitamin from the soil, and from surface contamination with B-12 producing bacteria. (This is discussed in detail below.) Animals concentrate B-12 from the food they eat, and, in the case of folivores, biologically active B-12 may be produced by bacteria in the fermenting chambers in the digestive system. The end result of this is that plant foods provide little (if any) B-12, and animal foods are the only reliable food sources for B-12. Because plant foods are deficient in B-12, the use of B-12 supplements (or fortified foods, e.g., nutritional yeast) is recommended for vegans by most nutrition experts. Of course, some extremists disagree, and one must seriously question whether such "experts" (or "diet gurus") are putting dietary dogma ahead of the health and welfare of those who follow their advice.


The primate connection

B-12 is also an essential nutrient for non-human primates. From Hamilton and Busse [1978, p. 763]:

Many captive primate species enter into hypovitaminosis B-12 [deficiency] when maintained on vegetarian diets (Hauser and Beard 1969, Oxnard 1964, 1966, 1967, Siddons 1974, Siddons and Jacob 1975)... Vitamin B-12 is the least readily available vitamin to omnivorous primates... Deficiency diseases have not been identified for any wild primate population (Kerr 1972, Wolf 1972).

The fact that wild primates avoid B-12 deficiency suggests that their natural diet provides adequate B-12. Inasmuch as all primates eat insects, and some insects contain B-12 (see Wakayama et al. [1984]), this suggests insects as a possible B-12 source for some primates, along with production by fermentation bacteria as a possible source for folivorous primates (e.g., gorillas, whose consumption of insects is very small when compared to their intake of plant foods).

B-12 recycling and interactions

To a great extent, B-12 is recycled from liver bile in the digestive system. This is one reason why vitamin B-12 deficiency is rare among vegans, even those who do not use supplements or supplemented foods. The recycling is summarized by Herbert [1994, p. 1217S]:

The enterohepatic circulation of vitamin B-12 is very important in vitamin B-12 economy and homeostasis (27). Nonvegetarians normally eat ~2-6 mcg of vitamin B-12/d and excrete from their liver into the intestine via their bile 5-10 mcg of vitamin B-12/d. If they have no gastric, pancreatic, or small bowel dysfunction interfering with reabsorption, their bodies reabsorb ~3-5 mcg of bile vitamin B-12/d. Because of this, an efficient enterohepatic circulation keeps the adult vegan, who eats very little vitamin B-12, from developing B-12 deficiency disease for 20-30 y (27)... Unlike the vegetarian whose absorption machinery is normal, the person whose absorption machinery is damaged by a defect in gastric secretion, by a defect in pancreatic secretion, or by a defect in the gut that produces intestinal malabsorption will develop vitamin B-12 deficiency in 1-3 y because these absorption defects block not only absorption of food vitamin B-12, but reabsorption of vitamin B-12 excreted into the intestinal tract in the bile (2,6).
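To get a feel for why depletion takes decades when reabsorption is intact, here is a toy depletion calculation. It is only a sketch: the ~3000 mcg body store is an assumed round figure (commonly cited for adult B-12 stores, but not part of Herbert's quote above), and the daily net-loss values are illustrative, not from the source.

```python
# Toy depletion model for the 20-30 y vs. 1-3 y figures quoted above.
# ASSUMPTION: ~3000 mcg total body store (a commonly cited round figure,
# not taken from Herbert's quote); daily net losses are illustrative.
BODY_STORE_MCG = 3000.0

def years_to_depletion(net_daily_loss_mcg: float) -> float:
    """Years until an assumed body store is exhausted at a given net daily loss."""
    return BODY_STORE_MCG / net_daily_loss_mcg / 365.0

# Intact enterohepatic reabsorption: only a small fraction of a mcg lost/day.
print(years_to_depletion(0.3))  # ~27 years, within the quoted 20-30 y range
# Damaged absorption machinery: bile B-12 is lost along with dietary B-12.
print(years_to_depletion(3.0))  # ~2.7 years, matching the quoted 1-3 y figure
```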

Reduction in stomach acid promotes B-12 deficiency. A reduction in gastric (stomach) acid is associated with the development of bacterial colonies in the stomach that produce analogues of vitamin B-12, which can accelerate or promote B-12 deficiency. From Herbert et al. [1984, p. 164]:

As pernicious anemia develops, the first loss usually is of gastric acid. Figure 3 (from Drasar and Hill, 23) shows that the achlorhydric stomach [one unable to produce hydrochloric acid] is usually heavily colonized with enteric bacteria. The increased colonies of enteric bacteria in the achlorhydric stomach and small intestine of the pernicious anemia patient may produce analogue which may in three ways accelerate the development of B-12 deficiency.

The loss of gastric acid may also occur in iron deficiency. The iron in plant foods is of much lower bioavailability than the iron in animal foods, and the common grain-based vegan diet contains antinutrient factors that may inhibit iron absorption (discussed later in this section). Vegetarians, especially vegans, are therefore at higher risk of iron deficiency. From Herbert [1994, p. 1215S]:

...iron deficiency is twice as common in vegetarians as in omnivores (3)... Prolonged iron deficiency damages the gastric mucosa and promotes atrophic gastritis and gastric atrophy, including loss of gastric acid and I.F. [intrinsic factor] secretion, and therefore diminished vitamin B-12 absorption (3, 4, 19). This would cause vitamin B-12 deficiency in twice as many vegetarians as omnivores (3, 4, 19).

Does strict fruitarianism accelerate B-12 deficiency?


Note that loss of gastric acid, whether caused by an iron deficiency or other cause, may promote B-12 deficiency. In this regard, the anecdotal experience of some fruitarians may be relevant. (Please note that the following remarks regarding fruitarianism are based on anecdotal evidence, as that is all that is available; there are no published, peer-reviewed studies on fruitarians on this topic.)

Possible reduction in stomach acid in long-term fruitarians. Among the very few people who
manage to follow the fruitarian diet with at least a limited degree of success for longer than a year (i.e., manage to stay on the diet, without extensive binges or cheating--both of which are quite common in fruitarianism), some report that when they try to eat other food again (i.e., foods other than fruits), they are unable to digest it. Food may be eaten and pass through the digestive system, coming out as stools, yet appearing almost exactly as it did when consumed. Typically this effect is observed with protein foods or cooked food, but it may occur with other foods as well. (Surprisingly, the effect may also occur with fruit, if one stays on the diet long enough.) The above phenomenon (which this writer personally experienced after abandoning fruitarianism and returning to a more normal vegetarian diet) suggests a possible deficiency of gastric acid caused by the fruitarian diet. (In time, with a more normal diet, the digestive system appears to recover.) If so, it suggests that fruitarian diets, if they decrease gastric acid, may actively promote B-12 deficiency over and above any effect due to lack of B-12 in the food. (This is a hypothesis at present, of course, and would need research to validate it).

Other negative symptoms of fruitarianism. Many who attempt strict fruitarian diets report other
negative symptoms--intermittent fatigue, lassitude, loss of libido, sugar metabolism problems (diabetes-like symptoms: excess urination, thirst, mood swings, etc.), as well as emaciation. (Note: thirst on a fruitarian diet may seem counterintuitive, since the level of fluid intake is so high. However, the diuretic effect of some fruits [such as citrus], if used as staples, can negate that, depending on the case.) The possibility of fruitarianism increasing the potential for a B-12 deficiency raises the question whether the fatigue, loss of libido, and lassitude that some fruitarians experience are possibly related to B-12 deficiency, or alternatively (or both), to the sugar metabolism/emaciation-starvation effects of the diet.

Clinical B-12 deficiency rare. Doubtless some fruitarian advocates will challenge the above and ask: where are the fruitarians with clinical signs of B-12 deficiency? They are hard to find, for the following reasons:

o Fruitarians are quite rare, and ones who are strict are even rarer.

o Eating-disorder behavior, especially binge-eating, is quite common in fruitarianism. A 1997 survey (unpublished, conducted by Craig Woods Schiemann) of members of SF-LiFE, a raw-food support group in San Francisco, found that over 50% of members surveyed reported problems with binges and cravings. (Side note: thanks to Craig for the survey information.) Indeed, even some of the extremists who promote fruit as the ideal diet admit that they binge-eat and cannot strictly follow their allegedly "ideal" diet. Binge-eating, of course, may help prevent B-12 deficiency if foods containing B-12 are consumed in binges or as "exceptions" (also commonly known as "cheating," if done in secret).

o Most fruitarians oppose conventional medicine and refuse to take blood tests, i.e., refuse to have their serum B-12 levels measured. The lack of B-12 tests makes it impossible to validate the B-12 status of fruitarians.

Finally, there are a few sensible, credible fruitarians (though more loosely defined) who specifically include small amounts of animal products in the diet to satisfy B-12 requirements.


Vitamin B-12 deficiency in natural hygienists


Dong and Scott [1982] took blood samples at the 1979 annual convention of the American Natural Hygiene Society (ANHS), and tested the samples for serum B-12 levels and other parameters of interest. A total of 83 volunteers provided blood samples. Each individual in the study provided detailed dietary information via a survey form, which asked about the individual's consumption of animal foods (including eggs and dairy), and also asked for a typical daily diet.

Description of natural hygiene diet, and diets of those in survey. For readers unfamiliar with the term "natural hygiene," in this context the classical definition refers to a predominantly raw diet of fruits, vegetables, nuts, and seeds. Note that the data were collected in 1979, at which point the ANHS had a long history of strongly emphasizing raw plant foods. Since then, the ANHS has revised its position. While still emphasizing plant foods (veganism), the stress previously placed on raw foods has been de-emphasized, and adding cooked starches (grains, legumes, tubers, squashes) to the diet is now recommended. Based on the dietary data provided in the survey form, subjects were classified according to their diet: vegans were defined as those who consumed no animal foods; lacto-vegetarians as those who consumed dairy as their only animal food; lacto-ovo-vegetarians as those who consumed dairy and eggs; and semi-vegetarians as those who consumed animal flesh foods two or fewer times per month. It should be noted here that because the subjects were recruited at an ANHS convention, and the ANHS emphasized raw foods at the time, at least some of the subjects in the vegan category presumably were raw or predominantly raw vegans.

Serum B-12 levels of vegan natural hygienists below lower limit of normal range. Dong and Scott [1982, pp. 214-215] report:

Among subjects who did not supplement their diets with B-12 or multiple vitamin tablets, 92% of the vegans, 64% of the lacto-vegetarians, 47% of the lacto-ovo-vegetarians and 29% of the semi-vegetarians had serum B-12 levels less than 200 pg/ml [the lower limit of the normal range]. Mean serum B-12 levels of the dietary groups increased with increasing dietary sources of B-12... Some cases of mild macrocytosis [enlarged red blood cells, an early sign of megaloblastic anemia] were seen among the vegetarians, but fewer than expected... The data indicates that increasing diversity of animal products consumed increased the serum B-12 level.

Dong and Scott [1982, p. 210] report the average serum B-12 levels found as shown below. In the following, note that the normal serum B-12 range is 200-900 pg/ml.

Mean Serum B-12 Levels (pg/ml) Observed at 1979 ANHS Annual Convention, from Dong and Scott [1982]

DIETARY CATEGORY          Males   Females
Vegan                      120      110
Lacto-vegetarian           200      180
Lacto-ovo-vegetarian       190      260
Semi-vegetarian            360      240
Non-vegetarian             360      660

Note that the B-12 serum levels observed for the vegans are in the range regarded as deficient [Dong and Scott 1982, pp. 213-214].
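For readers who want to check individual values against these thresholds, here is a minimal sketch (the helper function is hypothetical, not from Dong and Scott) applying the 200-900 pg/ml normal range used throughout this section:

```python
# Hypothetical helper illustrating the serum B-12 thresholds used in this
# section: < 200 pg/ml is below the normal range; 200-900 pg/ml is normal.
NORMAL_RANGE_PG_ML = (200, 900)

def classify_serum_b12(level_pg_ml: float) -> str:
    """Classify a serum B-12 level (pg/ml) against the 200-900 normal range."""
    low, high = NORMAL_RANGE_PG_ML
    if level_pg_ml < low:
        return "below normal range (possible deficiency)"
    if level_pg_ml <= high:
        return "within normal range"
    return "above normal range"

# Mean values for the ANHS vegan group from Dong and Scott [1982]:
for label, level in [("vegan males", 120), ("vegan females", 110)]:
    print(label, level, "->", classify_serum_b12(level))
```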

Vitamin B-12 levels in "living foods" and other vegan/vegetarian diets

B-12 in living foods (raw vegan) diets. Rauma et al. [1995] provide data on the vitamin B-12 status of living-fooders, i.e., individuals who follow the raw vegan "living foods" diet as taught by Ann Wigmore. Such a diet emphasizes raw sprouts, fermented foods, blended/liquefied foods (raw soups), dehydrated foods, and wheatgrass juice. There were two components to their study: a cross-sectional component and a longitudinal study.

Technical note concerning measurement units in the Rauma et al. [1995] study. The study under discussion here reported results in pmol/L, rather than the pg/ml used by the other studies cited in this section. However, there are reasons to believe that the units for the cross-sectional portion of the study are actually pg/ml, rather than pmol/L. Because of this, the numbers below (which come from the cross-sectional part of the study) are listed here as pg/ml. The rationale is as follows.

First, as the concern here is a potential inadvertent mix-up between the two different measurement units, pmol/L (picomoles per liter) and pg/ml (picograms per milliliter), note that the relationship between the two units of measure is:

1 pmol/L B-12 = ~1.35 pg/ml

where ~ indicates "approximately." This follows from the definition of moles, and simple algebra with the units, using the molecular weight of B-12. (If the unit "mole" is unfamiliar to you, check an elementary chemistry text, e.g., Oxtoby and Nachtrieb [1986, pp. 21-23], for an explanation.) Note that the term vitamin B-12 does not refer to just one specific molecule; instead, the term is applied to a number of biologically active compounds (though not the biologically inactive B-12 "analogues"), whose approximate average molecular weight is 1350 [Schneider and Stroinski 1987, pp. 56-59]. These slight variations in molecular weight between different active B-12 compounds account for the approximate nature of the conversion ratio between pmol/L and pg/ml.

Second, Rauma et al. [1995] report that one test kit had a lower reference limit of 150 pmol/L, which after conversion is approximately 200 pg/ml, the standard. However, the kit that was used for the cross-sectional study (and the numbers below) supposedly had a lower reference limit of 200 pmol/L, which after conversion is 270 pg/ml--a number that is non-standard and appears to make no sense. Further adding to the uncertainty here is that Rauma et al. cite the names of two different manufacturers for the second B-12 assay kit, but email correspondence with the technical support departments of both named companies reveals that neither company ever made B-12 assay kits.

Although the units issue is confusing, the weight of the evidence seems to point to the units for the cross-sectional study portion as being pg/ml, and that is how they are reported here. As this website article is being published, an inquiry is pending with the authors of the cited study. When the issue is resolved, this section will be updated as appropriate. Until then, please recognize that there is some uncertainty regarding the measurement units for this study [Rauma et al. 1995].
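To make the conversion concrete, here is a minimal sketch of the unit arithmetic, assuming the ~1350 g/mol average molecular weight cited above:

```python
# Minimal sketch of the pmol/L -> pg/ml conversion described above,
# assuming the ~1350 g/mol average molecular weight for active B-12
# compounds cited from Schneider and Stroinski [1987].
B12_MOLECULAR_WEIGHT = 1350.0  # g/mol, approximate average

def pmol_per_l_to_pg_per_ml(pmol_per_l: float) -> float:
    # 1 pmol/L = (mol. wt.) pg/L = (mol. wt. / 1000) pg/ml
    return pmol_per_l * B12_MOLECULAR_WEIGHT / 1000.0

print(pmol_per_l_to_pg_per_ml(150))  # ~202.5, i.e., ~200 pg/ml (standard limit)
print(pmol_per_l_to_pg_per_ml(200))  # ~270 pg/ml, the non-standard value
```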

111

The cross-sectional analysis compared the serum B-12 levels of 21 living-fooders, each paired with a control (standard Western diet, i.e., non-vegetarian) matched by sex, age, and other factors. The cross-sectional study found that the control group (standard Western diet) had significantly higher serum B-12 levels (311 pg/ml) than the living-food vegans (193 pg/ml). Note that the average B-12 level reported for living-fooders, 193 pg/ml, is close to the lower limit of the normal range: 200 pg/ml. However, Rauma et al. [1995] report that 57% of the living-foods vegans had B-12 levels below 200 pg/ml.

The longitudinal study was conducted over a 2-year period, using 9 individuals who reportedly
followed the living-foods vegan diet long-term. They note [Rauma et al. 1995, pp. 2513, 2514]: The longitudinal study revealed a decrease in serum vitamin B-12 concentrations with time in six of nine subjects, indicating that the supply of vitamin B-12 from the "living food diet" is inadequate to maintain the serum vitamin B-12 concentration... [T]he occurrence of low serum vitamin B-12 concentrations in over 50% of the long-term adherents of this diet warrants further study of the possible health risks involved.

B-12 levels in long-term vegans. Bar-Sella et al. [1990] examined the serum B-12 levels of 36 strict vegans who had followed the diet long-term (5-35 years). The vegans had an average serum B-12 level of 164 pg/ml; the control group (standard Western diet) had an average serum B-12 level of 400 pg/ml. The difference between the two groups was significant at the 0.001 level. Twenty-six of the 36 vegans (72.2%) had B-12 levels below 200 pg/ml, the lower limit of the normal range [Bar-Sella et al. 1990, p. 310]. None of the vegans displayed hematological defects (i.e., signs of anemia); however, 4 vegans with B-12 levels below 100 pg/ml had neurological symptoms. Three of the four were followed, and all showed clinical improvement in symptoms after intramuscular (injected) vitamin B-12 supplementation.

Crane et al. [1998] studied the vitamin B-12 (cobalamin) status of two vegan families. The individuals studied had been vegans for 2-21 years. Their study is noteworthy for two reasons. First, it was very thorough and included tests for levels of homocysteine, methylmalonate, and several other factors involved in B-12 metabolism. Second, it discusses the use of oral B-12 supplements, and recommends that B-12 supplements be chewed, rather than swallowed whole, for best absorption. Crane et al. [1998, pp. 87, 88] note:

Results showed that if the data of the serum CBL [cobalamin, vitamin B-12] and urinary MMA [methylmalonic acid] are combined, all nine of the [study] subjects had chemical evidence of insufficient CBL [i.e., deficiency]... Evidence, which we have reported elsewhere (Crane et al., 1994) indicates that over 80 per cent of those people who have been vegans for two years or more are deficient in CBL...

Only one person in the study showed possible symptoms of deficiency, which were alleviated by oral B-12 therapy. However, five of the test subjects showed mild signs of anemia.

B-12 levels in lacto-vegetarians. In contrast to the low B-12 levels reported in vegans, B-12 levels in lacto-vegetarians appear to be closer to normal. Dong and Scott [1982], discussed above, tested lacto-vegetarians; see the table above for the serum B-12 values observed. Solberg et al. [1998] analyzed the plasma B-12 levels of 63 long-time Norwegian lacto-vegetarians, and found no significant differences when compared to a control group (standard Western diet). An interesting side point in the Solberg et al. [1998] study is that lacto-vegetarians who took B-12 supplements had plasma B-12 levels equal to those of the lacto-vegetarians who did not take supplements. [Note: this should not be interpreted as suggesting that oral cobalamin supplementation is ineffective. Quite the contrary: an excellent recent study, Kuzminski et al. [1998], found oral cobalamin therapy to be more effective than intramuscular (injected) cobalamin therapy.]

Negative review paper critical of studies pointing to deficient B-12 status in vegetarians found to be inaccurate and outdated

Immerman [1981], a study of vitamin B-12 status in vegetarians, is subtitled "A critical review," and attempts to review the clinical studies of vitamin B-12 deficiencies in vegetarians published before 1981. Immerman's approach is to check each study against a list of criteria. His list had 7 criteria, of which the first 5 were considered essential for a study to be credible. After a detailed review of the studies, Immerman concludes [1981, p. 47], "When judged by these criteria, most of the studies which have found inadequate B-12 status in lactovegetarians and vegans are unconvincing." However, a retrospective and analytical review of Immerman [1981] shows that his assessment criteria are logically flawed, and there may be bias present in his review. A summary of the flaws in Immerman's review follows.

Review criteria admitted only long-term deficiencies (a logical fallacy). Criteria 2 and 4 of Immerman require low serum levels of B-12, plus the presence of (i.e., a diagnosis that includes) anemia and/or subacute combined degeneration (SCD). The problem with such narrow criteria is that the symptoms specified are serious long-term symptoms only; see Herbert [1994] for a discussion of the staging of B-12 deficiencies. For example, requiring a diagnosis of subacute combined degeneration means that milder neurological impairments are ignored (mild B-12 deficiency might not produce anemia or SCD). Hence, it appears that Immerman's requirements can be met only by serious, advanced cases, and he may be rejecting perfectly good data--a major statistical and logical fallacy that renders his conclusions invalid.

Review criteria required non-standard therapy for B-12 deficiency. Criterion 5 of Immerman requires that the B-12 symptoms be reversed by oral administration of 1 mcg of B-12 per day. To those who are not acquainted with standard treatment protocols for B-12 deficiency, this may seem reasonable. However, those acquainted with the issues involved in cobalamin therapy will immediately recognize criterion 5 as unrealistic and deceptive. Since the late 1950s, the standard method of treatment for B-12 deficiency in the U.S. has been via intramuscular injections (see Hathcock and Troendle [1991, p. 96] and Lederle [1991]), and not via oral cobalamin as Immerman appears to require. Indeed, Immerman appears to be so rigid in his evaluation of the studies that he objects to one study that administered 5 mcg B-12 (and one unit of blood) to a patient.

Review criteria called for symptom relief on an inadequate dose of B-12. Such rigidity is actually deceptive because the absorption rate for oral cobalamin in pernicious anemia is approximately 1.2% [Berlin et al. 1968, as cited in Lederle 1991]. Hathcock and Troendle [1991] report that oral doses of 80-150 mcg per day are helpful but do not restore cobalamin serum levels to normal. Lederle [1991] notes that some early studies reported poor results from oral doses of 100-250 mcg of cobalamin per day. Oral doses of 300 to 1,000 mcg per day have proven effective for treatment of pernicious anemia; see Hathcock and Troendle [1991] for citation of 5 relevant studies. Now let us consider Immerman's criterion 5--that is, strictly requiring an oral dose of 1 mcg per day--in light of the following facts:

o The B-12 (cobalamin) RDA is ~2 mcg/day.

o The approximate absorption rate for oral cobalamin in pernicious anemia cases is 1.2%.

o More recent studies indicate that daily oral doses of 5-20 mcg/day (i.e., higher than the dose specified by Immerman) without intrinsic factor may be ineffective; see Lederle [1991] for two relevant citations.

o Standard treatment (U.S.) for B-12 deficiency has been via injections rather than oral cobalamin.

The review criterion of 1 mcg oral B-12 per day is not supported by the reference cited in the review. Immerman cites Chanarin [1969, pp. 40-63, 708-714] as the reference for criterion 5. However, Chanarin [1969] states that 1 mcg B-12 per day is insufficient (p. 57), recommends oral administration of 5-10 mcg B-12 per day in cases of deficiency (p. 712), and cites two studies--Schloesser and Schilling [1963], and Winawer et al. [1967]--which reported no or suboptimal improvement on oral doses of 1 mcg B-12/day. Thus it appears that Chanarin [1969] does not support Immerman's criterion 5.

We conclude, then, that Immerman's criterion 5 requires symptomatic relief from what appears to be an insufficient dosage of cobalamin, administered in a non-standard way (orally vs. injections). This leads to the conclusion that Immerman's criterion 5 is unrealistic and deceptive, and thus is a logical fallacy that renders his conclusions invalid.
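A back-of-the-envelope calculation makes the inadequacy of criterion 5 plain. This is only a sketch using the figures cited above (1.2% absorption in pernicious anemia, ~2 mcg/day RDA):

```python
# Rough sketch using only the figures cited above: how much of a 1 mcg
# oral dose would a pernicious-anemia patient actually absorb, vs. the RDA?
ORAL_DOSE_MCG = 1.0      # Immerman's criterion 5 dose
ABSORPTION_RATE = 0.012  # ~1.2% oral absorption in pernicious anemia
RDA_MCG = 2.0            # adult RDA per NRC [1989]

absorbed = ORAL_DOSE_MCG * ABSORPTION_RATE
print(f"Absorbed: ~{absorbed:.3f} mcg/day vs. RDA of {RDA_MCG} mcg/day")
# ~0.012 mcg/day absorbed--under 1% of the RDA--even before considering
# that injections, not oral dosing, were the standard therapy at the time.
```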

Rationalizations and bias in review? A review of Immerman [1981] suggests he grasps at every possible rationalization in an attempt to avoid facing the simple reality that if inadequate amounts of vitamin B-12 are ingested, a deficiency may result. Some may interpret such rationalizations as suggesting possible bias. While it is certainly possible for a vegetarian to be deficient in B-12 due to factors unrelated to diet, trying to rationalize away all B-12 deficiencies in vegetarians via non-diet reasons makes the issue of bias relevant here. Immerman also did not address one of the most important issues regarding older B-12 studies: the assay methods used and their reliability (i.e., some of the older assay methods are unreliable). For more on this point, see the subsection "B-12 in spirulina and other plant foods" just below.

A review of Immerman's "critical review" finds it outdated and invalid. Immerman's "critical review" does show that some of the older B-12 studies are not as convincing as more recent studies; i.e., in light of newer knowledge, the shortcomings of the older studies are readily apparent. Similarly, a retrospective review of Immerman [1981] finds it to be seriously flawed, logically invalid, and potentially biased. If any reader thinks it unfair to judge Immerman [1981] using more recent knowledge, please recognize that this is also what Immerman himself was attempting to do when assessing the earlier B-12 studies. However, note that Immerman's review is invalidated not solely or simply on grounds of more recent knowledge. The criticisms made here of criterion 5 were relevant at the time of Immerman's review article; i.e., Immerman's study was logically flawed at the time it was published in 1981. [Note: At the risk of belaboring the point, I have discussed Immerman [1981] at length because it is occasionally cited by raw/veg*n diet advocates who wish to discount the published studies showing B-12 deficiencies in veg*ns, and/or to de-emphasize the importance of B-12 in veg*n diets.]

Steer clear of dietary extremists who rationalize B-12 deficiency concerns. The above evidence
of apparent vitamin B-12 deficiency in vegans (both raw and conventional) should serve as a warning to fruitarian/veg*n extremists and their followers who claim adequate vitamin B-12 can be obtained by eating unwashed plant foods or from intestinal bacterial synthesis (the latter topic is discussed later herein).



B-12 in spirulina and other plant foods


Microbial assays for B-12 are unreliable. A common misconception in vegan circles is that fermented foods and spirulina contain B-12. This claim may, at times, be supported by lab tests for B-12 based on the USP (U.S. Pharmacopeia) assay methods. Unfortunately, as explained in Herbert et al. [1984] and Herbert [1988], the USP assay method for B-12 is unreliable. The assay measures total corrinoids--that is, true B-12 plus analogues (forms of B-12 that are not metabolically active in the body)--and the analogues have the potential to block the absorption of true B-12 by occupying B-12 receptor sites. A preferred, reliable test that can differentiate between true B-12 and the inactive analogues is differential radioassay. This assay problem must be considered in evaluating "old" studies on B-12.

Spirulina and tempeh contain mostly analogues of B-12. Herbert [1988] reports that tests on tempeh, a fermented soy product, and spirulina revealed that they contained almost no true B-12; i.e., the "B-12" they contained (per the USP assay test) was predominantly analogues. Herbert [1988, p. 857] reports:

We suspect that people taking spirulina as a source of vitamin B-12 may get vitamin B-12 deficiency quicker because the analogues in the product block human mammalian cell metabolism in culture [i.e., in the lab] and we suspect that they will also do this in the living human.

The presence of analogues, rather than true B-12, in fermented foods makes them unreliable sources for B-12.

Effects of cooking and processing on B-12 in foods

Available information not well-controlled enough to provide definitive answers. Recognizing that some vegans advocate raw-food diets, the question of the effect of cooking and processing on B-12 levels is relevant. The information on this topic (available to this writer) is less clear than desired. That is, for a reliable comparison to be made, tests must be made on both raw and cooked samples from the same base lot of raw foods, and a reliable assay method must be used. Comparing B-12 levels, raw vs. cooked, via standard nutritional tables is not an optimal method, as it may not be clear whether the above conditions for comparison were met, and/or whether reliable assay methods were used in the table analysis. The limited evidence available, though, suggests that cooking reduces B-12 levels, though the exact extent is, unfortunately, unclear.

Herbert [1984] reports that dehydration at 200C (392F) for 6 days reduced B-12 levels by approximately one-third. Note that 200C is well above normal boiling/steaming temperatures. Herbert [1984] also reports that boiling in alkali destroys 85% of the corrinoid content, per the L. leichmannii (USP, bacterial assay) test. This result is difficult to interpret, as typical cooking practices do not involve boiling in alkali, and the USP test, as discussed above, is not a reliable measure of true B-12.

Banerjee and Chatterjea [1963] report wide variation in B-12 losses in cooking various types of fish, meat, and milk. B-12 losses, when they occurred, ranged from 23.7-96.4%. However, one fish species in their study showed no loss of B-12, and samples from three fish species (and also goat liver) showed increases in B-12 levels from cooking. (The authors suggest the increase may be caused by cooking increasing the B-12 level in the extraction solvent.) That the measured B-12 level for some species increased with cooking, of course, raises questions regarding the overall reliability of their experimental methods and results. Also, their study measured B-12 via microbial assay (Euglena gracilis var. bacillaris), which today is regarded as a less reliable assay method (it does not, or may not, distinguish analogue forms from true B-12). Banerjee and Chatterjea [1963] also tested 20 types of plant foods, and found no B-12 activity in any of the raw plant foods.

Heyssel et al. [1966] report that liver boiled in water for 5 minutes lost only 8% of its vitamin B-12 content (vs. raw liver), while muscle meat broiled at 171C (340F) for 45 minutes lost 27% of vitamin B-12 (vs. raw meat). [Note: The paper mentions 340F in one place and 340C in another. Given that 340C = 644F (an unusually high cooking temperature), that figure is probably a typo, and the correct figure is most likely 340F (171C).] They note that some of the B-12 they record as lost might actually be contained in the drippings (liquids discarded) in cooking. Their paper also used a Euglena gracilis assay for B-12.

The possibility that plant foods might contain some B-12 will be discussed later herein. At that time we will note that there is little or no data on the effect of cooking on B-12 levels in plant foods.

Is biologically active B-12 produced by intestinal bacteria?

Claims of intestinal B-12 production may be based on insufficient evidence. Albert et al. [1980] is sometimes cited as evidence that B-12-producing bacteria can exist in the small intestine. Sometimes explicit claims are made, e.g., that intestinal bacteria allegedly can produce adequate B-12. Baker [1981] and Nutrition Reviews [1980] are related citations that comment on Albert et al. [1980]. However, a careful reading of Albert et al. [1980] shows that it used bacteriological assays, which are of lower reliability, to measure B-12 levels. Specifically, the most accurate bacteriological assay they used is Ochromonas malhamensis. Note that Ochromonas is the most accurate bacterial assay method for B-12; however, even it may report values for some analogues as part of its "B-12" results [Schneider and Stroinski 1987, Tables 3-2, 5-3 to 5-5, pp. 56-57, 119-123]. Herbert and Das [1994, p. 405] apparently regard all the bacterial assay methods as being less reliable than differential radioassay; also see Herbert et al. [1984] and Herbert [1988] for related information.

Additionally, the data obtained in Albert et al. [1980] come from isolated bacterial cultures. Therefore, it is unclear whether the bacteria would produce similar amounts of B-12 under the conditions present in the intestines. This point is discussed in Albert et al. [1980], but is sometimes ignored by dietary advocates with an ideological interest in minimizing the requirement for B-12 in the diet. The bottom line of Albert et al. [1980] is that it shows certain intestinal bacteria might produce B-12, but it is unclear whether, or how much, might be produced (and absorbed) under actual conditions in the small intestine. Langley [1995, p. 74] summarizes the situation nicely:

In some people, B-12 producing bacteria certainly exist in the small intestine where the vitamin manufactured can, in theory at least, be absorbed. Exactly what contribution this makes to the daily B-12 intake of vegans remains to be clarified.

Also recall the discussion above (from Herbert [1984]) regarding the achlorhydric stomach being colonized by bacteria that produce abundant analogues of B-12. Analogues (which block uptake of true B-12) are a major concern whenever one discusses the possibility of B-12 being produced in the small intestine.

Direct coprophagy: a reliable (vegan?) B-12 source


Note: this section may be considered to be in poor taste--both figuratively and literally--by some readers. It is included here for completeness, and in the event certain (extremist) fruitarian/veg*ns might be interested in experimenting with a vegan (?) source of vitamin B-12 that is truly radical in character.

B-12 produced in, but cannot be absorbed from, the human colon. The human colon contains
bacteria that produce vitamin B-12, and fecal matter is a rich source of B-12. This raises the question of whether B-12 can be absorbed from the colon. From Herbert [1988, p. 852]: In one of the less appetizing but more brilliant experiments in the field of vitamin B-12 metabolism in the 50s, Sheila Callendar (7) in England delineated that colon bacteria make large amounts of vitamin B-12. Although the bacterial vitamin B-12 is not absorbed through the colon, it is active for humans. Callendar studied vegan volunteers who had vitamin B-12 deficiency characterized by classic megaloblastic anemia. She collected 24-h stools, made water extracts of them, and fed the extract to the patients, thereby curing their vitamin B-12 deficiency. This experiment demonstrated clearly that 1) colon bacteria of vegans make enough vitamin B-12 to cure vitamin B-12 deficiency, 2) the vitamin B-12 is not absorbed through the colon wall, and 3) if given by mouth, it is absorbed primarily in the small bowel.


Herbert et al. [1984] collected the 24-hour fecal output from 6 men. They found that the (24-hour) total fecal output contained ~100 mcg of total corrinoids, of which only ~5 mcg was true B-12 (the remainder being analogues). (Note: see Mozafar [1994] for a table of B-12 levels in manure, feces, soil, sludge, etc.) Given this, the work of Callendar mentioned above could be taken to suggest that the true B-12 in the feces (if reingested and passed back through the small bowel) would be absorbed, despite the substantial amount of analogues present.

Any takers? Further, the daily output of ~5 mcg of true B-12 versus the RDA/RDI of 1-2 mcg suggests that direct coprophagy (i.e., reingestion of feces) at a level of 20-40% of output would meet requirements for B-12. Might this qualify as the only truly reliable, vegan (?) source of B-12? Will coprophagy be the next fad among certain fruitarian extremists? (Obligatory warning: coprophagy, and the handling of feces, is unsafe and increases the risk of transmission of parasites and diseases. Coprophagy is not recommended.)
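For those who want to verify the 20-40% figure, the arithmetic is a one-liner (a sketch using only the numbers above):

```python
# Sketch of the arithmetic above: what fraction of the ~5 mcg/day of true
# B-12 in total fecal output would cover the 1-2 mcg RDA/RDI?
TRUE_B12_IN_DAILY_OUTPUT_MCG = 5.0  # from Herbert et al. [1984]
RDA_RANGE_MCG = (1.0, 2.0)

for rda in RDA_RANGE_MCG:
    fraction = rda / TRUE_B12_IN_DAILY_OUTPUT_MCG
    print(f"RDA {rda} mcg -> {fraction:.0%} of daily output")
# Prints 20% and 40%, matching the range given in the text.
```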



Indirect coprophagy: plant foods fertilized with feces/manure

The consumption of plant foods grown in soil fertilized with human manure (occasionally called "night soil") might also provide adequate B-12. Herbert [1988, pp. 852, 854] notes:

The more frequent source of vitamin B-12 in association with plant food is external contamination with bacteria, often of fecal origin... The fact that stool vitamin B-12 can be important in human vitamin B-12 economy was delineated by James Halsted (11) working with Iranian vegans who did not get vitamin B-12 deficiency... Halsted went to Iran and found that they grew their vegetables in night soil (human manure). The vegetables were eaten without being carefully washed and the amount of retained vitamin B-12 from the manure-rich soil was adequate to prevent vitamin B-12 deficiency.

Note that the first part of the quote above--that B-12 on plant foods is primarily present as contamination--may be outdated, per the more recent research of Mozafar [1994]. The paper of Mozafar is sometimes cited by raw/vegan advocates as "proof" that a raw/vegan diet, if grown in the "right" (manured) soil, can provide adequate B-12. Therefore, let's now take a closer look at this research.

Mozafar [1994] grew soybeans, barley, and spinach on three kinds of soil: unenriched soil (control), soil fertilized with raw, dried cow dung, and soil enriched with vitamin B-12. The discussion of Mozafar [1994] in this paper omits the results from soil enriched with B-12, as that is not an economically feasible agricultural practice at present. Vitamin B-12 levels were measured using "radioisotope dilution" [Mozafar 1994, p. 307], also called protein-binding, an assay claimed to have high resolution in distinguishing between true B-12 and analogues.

Plants may absorb B-12 from the soil. Mozafar [1994, pp. 307, 309-310] claims:
[M]ost, if not all, of the B-12 in the plant may be in the free form... Considering the reports that plant roots and leaves can absorb the relatively large molecules of B-12 from nutrient solutions and transport them to other plant parts (Mozafar and Oertli 1992a) and the belief that plants cannot synthesize this vitamin (Friedrich, 1987; Lehninger, 1977; Smith, 1960), it seems that the observed increase in the concentration of B-12 in barley seeds and spinach leaves fertilized with cow dung is mostly (if not fully) due to the uptake of this vitamin by the roots from the soil and not due to superficial contamination or an increased synthesis within the plant.


The claim that the B-12 in the above experiment was mostly true B-12--and absorbed and taken up by the plants--is surprising in light of the known presence of B-12 analogues in many foods (see the papers by Herbert for discussions on this point). It also raises an important issue: if the assay techniques used by Mozafar did not also measure vitamin B-12/cobalamin analogues, then we really don't know if analogues are present, which could (potentially) interfere with absorption of vitamin B-12 from plant foods. A later paper by Mozafar--i.e., Mozafar [1997, p. 51]--reports that very little is known about the absorption of B-12 analogues by plants. Note also that the second part of the above quote contradicts the claim by Herbert that B-12 is usually found primarily externally in plants. (The claim by Herbert may be outdated/incorrect; see Mozafar and Oertli [1992] for discussion of their research that shows soybean roots are able to absorb B-12, and the vitamin can be transported within the plant.)

B-12 found in tested plant foods. Mozafar found that the levels of B-12 in barley and spinach grown in untreated (control) soil versus soil treated with cow dung were significantly different. However, the B-12 levels of soybeans were not significantly different for the two soil types. The B-12 measurements in Mozafar [1994] are frankly hard to interpret as-is. That is, the measurement unit utilized--nanograms of B-12 per gram of plant food, dry weight (equivalent to mcg per kg)--is not meaningful to most readers. To help readers understand the Mozafar results, part of the results are used below to estimate the amount of each plant food (by itself) needed to get 2 mcg of B-12 per day. The last 2 columns in the table below are the most important--they show the estimated weight (in kilograms and in pounds) of raw plant foods required to satisfy daily B-12 requirements.

B-12 Data from Mozafar [1994] and Estimated Weights of Plant Foods Required to Supply 2 Mcg of B-12

                           Mcg B-12/kg   Amount of food needed for 2 mcg/day of B-12
FOOD SOURCE                 (dry wt.)    Dry wt. (kg)   Wet wt. (kg)   Sprouted* (kg)   Sprouted* (lb)
Soybean, control               1.6           1.25           1.37           4.78            10.52
Soybean, manured               2.9           0.69           0.75           2.64             5.81
Barley kernels, control        2.6           0.77           0.85           1.70             3.74
Barley kernels, manured        9.1           0.22           0.24           0.49             1.07
Spinach, control               6.9           0.29           3.44           3.44             7.57
Spinach, manured              17.8           0.11           1.33           1.33             2.94

*NOTE: Sprouting applies to the figures for soybeans and barley; spinach is eaten as-is.
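For readers who wish to recompute the dry-weight column, the sketch below derives it directly from the mcg/kg figures in the table. The wet-weight and sprouted figures depend on moisture and sprouting factors not reproduced here, so only dry weights are computed; the soil entry (an extra added for comparison) anticipates the geophagy discussion below.

```python
# Sketch reproducing the table's dry-weight estimates from Mozafar's [1994]
# mcg-per-kg (dry weight) figures. The daily target is the 2 mcg RDA.
DAILY_B12_TARGET_MCG = 2.0

mcg_per_kg_dry = {
    "Soybean, control": 1.6,
    "Soybean, manured": 2.9,
    "Barley kernels, control": 2.6,
    "Barley kernels, manured": 9.1,
    "Spinach, control": 6.9,
    "Spinach, manured": 17.8,
    "Soil (for comparison)": 14.0,  # Mozafar's soil figure, discussed below
}

for food, mcg_per_kg in mcg_per_kg_dry.items():
    dry_kg = DAILY_B12_TARGET_MCG / mcg_per_kg
    print(f"{food}: {dry_kg:.2f} kg dry weight per day")
# E.g., "Soybean, control: 1.25 kg" and "Soil: 0.14 kg" (~143 g),
# matching the figures given in the text.
```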
Analysis of Mozafar [1994] results
On the surface, the results of Mozafar [1994] appear to indicate that one may be able to get adequate B-12 from a diet of raw plant food. However, let's take a closer look at the numbers.

Soybeans. Soybeans in their crude form are inedible, and need sprouting or cooking to be rendered edible. Many raw-fooders would challenge the idea that even raw, sprouted soybeans are truly edible, however, as the flavor is--how to say it--"intensely awful." At any rate, one must eat 2.64-4.78 kg (5.81-10.52 pounds) of soybean sprouts per day to meet B-12 requirements. It is very difficult, if not impossible, to eat that much bulk on a regular basis. The crude beans (0.75-1.37 kg) can be cooked, and the bulk is more manageable, but it is not clear how much extra one would need to eat to adjust for B-12 losses in cooking.

Barley kernels. Barley kernels in their crude form are inedible. While not 100% certain, it appears Mozafar tested unhulled barley, which has a tough hull (with sharp edges when chewed) that makes eating it raw very difficult, even when sprouted. The amounts of barley required are lower than for soybeans; in sprouted form, the estimate is 0.49-1.70 kg, or 1.07-3.74 pounds. However, the reality of the extremely tough hull on barley makes eating even 0.49 kg (1.07 pounds) per day a difficult and unpleasant task. The 0.24-0.85 kg (0.53-1.87 pounds) of crude barley kernels, if hulled and cooked, is easy to consume. Again, the problem is that one needs accurate data on B-12 losses in cooking to know how much extra to eat to compensate for the loss of B-12 in cooking.

Spinach. Here 1.33-3.44 kg (2.94-7.57 pounds) are required. The amounts are within the range of possibility for a person to eat, though it would be difficult (bulk--multiple meals required). The problem, of course, is that a reliable supply of manured spinach is not available to most people, and the weight required of control (unmanured) spinach--3.44 kg, or 7.57 pounds--is far too high; i.e., yet another bulk problem. It would be extremely difficult to eat that weight of greens in a day.

Comments on the Mozafar [1994] research

Research needs to be repeated and verified. The Mozafar paper is the first paper to report B-12 levels above trace amounts in common plant foods. As it is but one study, the research needs to be validated--i.e., it should be repeated and extended by others. (Repetition of the research would provide important confirmation.) It would be interesting to include root crops, fruit, and wheat in any future research along these lines; indeed, the extension of Mozafar's research to woody plants like tree and bush crops could be very important. Additional comments:

B-12 levels in soil highly variable. The table in Mozafar [1994, Table 3, p. 309] indicates that B-12 levels in soil can vary widely. Accordingly, it would be unwise to assume that a particular soil will supply levels of B-12 to plants that are adequate for human nutritional needs.

B-12 losses in shipping and processing? Mozafar analyzed the plant foods at the location where they were grown, just after harvest. Most consumers buy their foods from markets, and the food has been processed and shipped. This raises the question of whether, and how much, post-harvest processing and shipping can change the B-12 levels in foods. Similar questions apply to types of cooking and food processing.

Legal risks from use of raw manure. The study used raw animal manure; owing to the liability risks (E. coli) of raw manure, the research should be repeated with composted manure. The use of raw, dry cow dung as a soil amendment is uncommon nowadays in farming, as it can introduce weed seeds.


The use of manure is not universal, even in organic gardening; it is not the case that "organic = manured" or even "organic = composted." Some organic farmers prefer to simply use heavy mulches, with no manure or compost.

Some vegan purists reject the use of animal manure; the term used for such gardening is "veganic." Inasmuch as some vegans condemn the use of manure in farming, the information that the only possible "natural" way to get B-12 in the usual plant foods is via the use of manure--an animal product--as fertilizer provides yet another contradiction/irony in veganism. Of course, some might reply that human manure can be used instead. Indeed it can, but if cow dung is abhorrent to vegan purists because it is an "animal product," then is human manure an "animal product" as well? What about human manure from meat-eaters?

Per Mozafar, sewage sludge is very high in B-12, but organic food advocates oppose the use of sewage sludge because of possible heavy-metal contamination. (The underlying point here is that sewage sludge might be used--in very small quantities, as a compost additive--to inoculate soils with B-12.)

Raw human feces, aka "night soil," is high in B-12. However, its use in farming raises the risk of parasites, E. coli, and potential liability lawsuits aimed at farmers. It is not a feasible soil additive for farms at present.

The Mozafar [1994] paper is considered controversial by some veg*n nutritionists. For comments, see Craig [1997] as cited in Mozafar [1997], and Mozafar [1997] for the reply.

Possible exaggerations by raw/veg*n advocates. Given that Mozafar [1994] is only one (limited) study, and that it did not test wild foods, fruits, nuts, roots, or tubers, to claim that it "proves" plant foods are adequate sources of B-12 would be an exaggeration. (Mozafar makes no such claims; the claim is occasionally made by raw vegan advocates.)

Geophagy: another source for B-12?


Warning: for some, this may be in poor taste too. (But remember our goal here: to be as thorough as possible on the topic.) Chimps and other apes have been observed engaging in geophagy, i.e., eating dirt, though the predominant hypothesis is that chimp geophagy is done to ingest clay that absorbs excess plant tannins in the GI tract (and not as a B-12 source). Mozafar [1994] reports 14 mcg B-12 per kg of soil. If we assume this is all true B-12 and no analogues (highly unlikely), it implies a daily intake of ~143 g (0.31 pounds) of soil to satisfy B-12 requirements. Such an intake is obviously not feasible. Note that Mozafar lists B-12 values for soil from other studies as well (Table 3, p. 309, Mozafar [1994]). However, many of the studies cited are old, and thus the assay methods used should be checked. Mozafar reports that soil is one of the richest sources of B-12, but that there is no information available on the level of analogues vs. true B-12 in soil.

Synopsis of impact of Mozafar research


The Mozafar research provides a limited confirmation of the earlier research by Halsted (as reported in Herbert [1988]). That is, if you eat enough plant foods grown in manured soil, you might get adequate B-12. However, inasmuch as most of us buy our foods from markets, it would be unsafe to assume that a typical raw/vegan diet provides adequate B-12. Accordingly, the use of B-12 supplements (and/or supplemented foods) by vegans is still appropriate.



Feasible B-12 sources in evolution/pre-agricultural times
Both geophagy and, more rarely, coprophagy, are practiced on a limited scale by chimps; see Goodall [1986] for relevant discussion. It is known (from toothwear studies) that prehistoric humans display significant toothwear from "grit" or soil. No one knows how much grit prehistoric humans consumed, or how much might have been inadvertent (versus intentional), though the figure given above of 143 g/day of soil that would be required to achieve one's daily B-12 requirement is probably far more grit than anyone--even a less-than-fastidious prehistoric human--could tolerate. This suggests that geophagy was not a significant B-12 source for prehistoric humans.

Prior to agriculture, there were no domesticated crops or herds of domesticated animals. Consequently, there was no deliberate manuring of food crops--which is an agricultural practice, not a hunter-gatherer practice. Of course, some wild plants will--by chance or other reasons--receive limited amounts of wild animal manure. However, there is insufficient evidence to support a claim that such incidental, occasional manuring (of wild plants) would provide a reliable source of adequate B-12 (particularly in light of the extremely high bulk of unmanured plant foods required to meet B-12 requirements).

Potentially feasible plant B-12 sources--legumes and grains--not available to hunter-gatherers. Note that Mozafar found higher levels of B-12 in soybeans (a legume) and barley (a grain) than in spinach. Prior to the development of agriculture, such items (legumes, grains) would have been available only in very limited quantities to hunter-gatherer tribes. Further, prior to the development of cooking technology (stone ovens and/or pottery), it would not have been possible to cook (or sprout) significant quantities of such foods. (Think of the difficult task of roasting, or sprouting--without implements or containers--a pile of loose, small seeds.) Pottery is also required for cooking most vegetables other than tubers or hard squashes. That means early humans, if they were strictly vegan, would have needed to eat huge amounts of raw greens per day to satisfy B-12 requirements (and even larger amounts to meet minimal calorie requirements; see The Calorie Paradox of Raw Veganism for more details). The seasonal limits on the availability of greens (and other plant foods) in temperate climates, combined with the large amounts required, further disqualify greens as a feasible B-12 source.

Fauna (animal foods) the consistent B-12 source in evolution. Accordingly, the only
reliable/feasible sources of B-12, pre-agriculture, were coprophagy and the consumption of animal products. There is ample evidence of consumption of animal products in pre-agricultural times, but no evidence that coprophagy was ever a common practice. That leaves animal products (including insects) as the only reliable B-12 source in pre-agricultural times, and by implication suggests (if further evidence were needed beyond that already existing about prehistoric peoples) that there never existed any strictly vegan hunter-gatherers, as B-12 is an essential nutrient. (This also suggests that strict fruitarian/vegan diets were never a factor in human evolution.)

Vitamin B-12 and mercury


A not-very-clear claim occasionally made in fruitarian circles is that vitamin B-12 deficiency is caused not by a lack of vitamin B-12, but by the cobalt in B-12 undergoing oxidation due to heavy-metal action, specifically inorganic mercury from dental fillings (amalgam). However, a review of the limited information on the "theory" raises serious questions regarding its validity. Some of these reservations are:

Apparently not a peer-reviewed theory. The theory reportedly appears in an article in a periodical called Heavy Metal Bulletin (1995, not in the reference list for this paper). A search of several university libraries failed to find the periodical, suggesting an incorrect title, or, perhaps, that the periodical is not a scientific, peer-reviewed journal. Note that the comments here are based on a poorly written article (by a fruitarian extremist) that discusses the theory. (The article included what is claimed to be the reference list from the original article.)

Defective reference list. A check of the articles allegedly in the reference list for the original theory article finds that at least 4 articles were either cited incorrectly or apparently do not exist. The net effect of such irregularities is to raise serious questions regarding the paper's validity.

Evidence from non-human primates raises doubts about the theory. The fact that wild-born primates (who have no silver-mercury amalgams in their mouths) become B-12 deficient when fed vegetarian diets in captivity (references cited above) suggests that B-12 deficiency is real, while the "B-12/mercury hypothesis" is, at best, unproven.

Pardridge [1976] showed that inorganic mercury interfered with brain uptake of certain amino acids, but his research did not test trans-cobalamin, methionine, or homocysteine--the proteins/amino acids involved in B-12 metabolism in the brain.

Selenium/mercury association (in the brain) confirmed. Friberg and Mottett [1989], in their review article, suggest, and Bjorkman et al. [1995] confirm, a statistical relationship between inorganic mercury in the brain and selenium (not cobalt). This does not disprove the cobalt hypothesis, but it provides an alternate, proven hypothesis (whereas the cobalt hypothesis is unproven). Additionally, selenium is far more common than cobalt, which raises the question of whether it might prevent potential B-12/mercury oxidation by "out-competing" the cobalt for any inorganic mercury present.

Mercury chelated by tryptophan, a common amino acid. Pardridge [1976] reports that inorganic mercury is chelated by tryptophan, a common amino acid. This suggests that high-protein diets might provide limited protection from inorganic mercury uptake. (Ironically, the fruitarian extremist touting the vague B-12/mercury hypothesis advocates very low-protein diets.)

Kanazawa and Herbert [1982] report that the B-12 in the human brain is nearly all true B-12; there are only trace amounts of B-12 analogues. This raises the interesting question of whether a mercury/B-12 compound could cross the blood-brain barrier, or whether it would be rejected as an analogue. Note that this information is based on a very small sample, and must be interpreted with caution.

Is there enough mercury to "starve" the brain of B-12, long-term? Given the information in the three preceding paragraphs, one wonders whether dental fillings could provide, over a long period of time, enough mercury to react with all the available tryptophan, selenium, and cobalt. Given that tryptophan and selenium are relatively "common" (at least in comparison to cobalt), and that the amounts of mercury released by dental amalgam are extremely small, such a hypothesis seems dubious and would require credible evidence published in a peer-reviewed scientific journal.

Implication: rely only on legitimate scientific information on B-12. For a legitimate scientific introduction to cobalamin metabolism, readers are encouraged to consult Tefferi and Pruthi [1994], which was consulted in the preparation of these remarks. An additional legitimate resource regarding B-12 deficiency and its biochemical basis is Cooper and Rosenblatt [1987].

In summary, the above points raise serious doubts about the (vague) mercury/B-12 hypothesis. Until such a theory is clarified, experimentally tested, and published in a credible scientific journal, it should be regarded as a speculative hypothesis.

Attempts to reverse burden of proof. Finally, this exercise also illustrates one of the bad habits of
the raw/veg*n movement: adopt an unproven theory or claim, then aggressively demand that others disprove it. Such an approach is an attempt to reverse the burden of proof, and the purveyors of such untested hypotheses should not be so easily allowed to get away with such assaults against reason.

A note on Victor Herbert


Victor Herbert, whose research on B-12 has been cited here, is a controversial figure in alternative health circles, given that he lobbies against alternative medicine and supports government restrictions on alternative health. He promotes the allopathic "party line," i.e., drugs and surgery. Because of this, Victor Herbert the politician is very unpopular in alternative health/diet circles. Unfortunately, many in such circles cannot distinguish between Victor Herbert the politician and Victor Herbert the research scientist. Herbert is one of the top researchers in the B-12 area and has a lengthy list of publications in scientific/professional journals. That record--extensive publication in refereed scientific journals--is a record that the anti-Victor Herbert lobby cannot match. Accordingly, I encourage you to evaluate his research on its technical merits, and not to be swayed by emotional or political sentiments.


Vitamin B-12: summary/conclusion

Vitamin B-12 is an essential vitamin for which the only reliable sources are fauna (animal foods) and coprophagy (should you freely choose to engage in it, which is most definitely not recommended). There is no evidence to support the idea that coprophagy is a natural human behavior, while there is extensive evidence that humans have consumed animal foods since the inception of the (human) species. Indirect coprophagy--eating plants fertilized with raw excrement--might provide adequate B-12, although the supporting evidence for this is very limited at present. As well, the use of raw excrement in farming poses risks--of parasites and disease for the consumer--and liability lawsuits aimed at the farmer. Indirect coprophagy is not a feasible option at present. The nutritional requirement for B-12, coupled with the inability to assimilate B-12 at the site where it is synthesized by bacteria in the human body (the colon), and the fact that plant foods are not reliable sources of B-12, provides evidence of long consumption of--and evolutionary adaptation to--animal foods in the human diet.

Protein Digestion: Plant vs. Animal Sources


Essential and conditionally essential amino acids
Young and Pellett [1994, p. 1204S] provide a good summary of protein requirements: The requirement for dietary protein consists of two components: 1) the requirement for nutritionally indispensable amino acids (histidine, isoleucine, leucine, lysine, methionine, phenylalanine, threonine, tryptophan, and valine) under all conditions and for conditionally indispensable amino acids (cysteine, tyrosine, taurine, glycine, arginine, glutamine, proline) under specific physiological and pathological conditions, and 2) the requirement for nonspecific nitrogen for the synthesis of the nutritionally dispensable amino acids (aspartic acid, asparagine, glutamic acid, alanine, serine) and other physiologically important nitrogen-containing compounds such as nucleic acids, creatine, and porphyrins.

Daily recommendations
The FAO/WHO/UNU (Food and Agriculture Organization, World Health Organization, United Nations University) recommend a daily average intake of protein for adults of 0.76 g per kg of body weight. On a related note, Young and Pellett [1994, p. 1205S] observe: However, there is increasing evidence that the current international FAO/WHO/UNU (11) and national (10) requirement estimates for most indispensable amino acids in adults are far too low (9, 23, 24). Several groups (25-27) are seeking to further substantiate this evidence.
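To make the arithmetic concrete, here is a minimal sketch in Python; the function name and the 70-kg example are illustrative assumptions, while the 0.76 g/kg figure is the FAO/WHO/UNU recommendation quoted above:

    # Daily protein intake per the FAO/WHO/UNU recommendation cited above.
    # The function name and the 70-kg example are illustrative only.
    def daily_protein_g(body_weight_kg, g_per_kg=0.76):
        return body_weight_kg * g_per_kg

    print(daily_protein_g(70))  # a 70-kg adult: 53.2 g of protein per day

Keep in mind that, per the quote above, such estimates may be too low for some of the indispensable amino acids.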

Digestibility
A number of procedures have been established to measure the digestibility of proteins; for an example of protein digestibility scoring methods and lists of the protein scores of a wide variety of foods, see FAO [1970]. Through the use of such scoring methods, one can compare the digestibility of the protein in various foods. Young and Pellett note that the level and quality of protein in some individual plant foods make them inadequate as sole protein sources. They make the interesting observation that [Young and Pellett 1994, p. 1210S]: In general, the digestibility of vegetable proteins in their natural form is lower than that of animal proteins... Boiling in water generally improves protein quality, whereas toasting or dry heating reduces protein quality.


Note that the finding above--that we digest certain conservatively cooked protein foods more effectively than raw protein foods--challenges the claims of some raw diet advocates that raw is always "better" than cooked. Young and Pellett [1994] provide a table (table 10, p. 1209S) showing the digestibility of various protein sources. Meat, dairy, and eggs are 94-97% digestible, the much-maligned standard American diet is 96% digestible, but grains and beans are "only" 78-85% digestible. One is tempted to conclude that this is due to adaptation to animal foods in the diet over evolutionary time. However, that is not necessarily the case. Animal foods are in general more easily digested than plant foods, for structural reasons: plant foods come encased in cellulose cell walls, which are hard to penetrate and digest. Ruminant animals such as cows, fed animal products in their feed, have no difficulty digesting the animal protein, even though it is arguably "unnatural" for them to eat animal foods.

Protein sources in prehistoric times


Earlier sections of this paper discuss the role of animal foods in the diet during evolution. A question that arises is whether, in theory, plant foods could take the place of animal foods as a protein source in such diets. Clearly, plant foods, if available in sufficient quantity, can satisfy protein requirements. The problem, of course, is availability. Prior to the development of containers, plant foods were difficult to collect and store in quantity. Without suitable storage technology (even crude technology), plant foods will not last long (because of spoilage and insect attack). Even "tough" plant foods like nuts and acorns are subject to insect and rodent attack unless they are kept in protective storage. The preceding suggests that plant foods were available in season, but not necessarily on a year-round basis (in temperate climates), during evolution. This in turn suggests (but does not prove) that the consumption of animal foods was necessary to satisfy protein (and calorie) requirements in prehistoric times.

Protein in raw vegan diet circles


Protein is a topic of considerable interest in raw vegan circles. One can find fruitarian extremists promoting crank theories that protein (other than the small amount found in sweet fruits) is "toxic" in the sense that the metabolic by-products of protein metabolism are toxic, and will harm you. Such theories are based on a pathological fear of "toxins" and an amazing ignorance of the reality that the human body is well-equipped to process the metabolic by-products of protein metabolism. Further, such theories are sometimes promoted in hateful and dishonest ways (in my opinion and experience). See the article, "Is Protein Toxic?" (not yet available) for a discussion of such theories.

Taurine, a conditionally essential amino acid


Introduction. Taurine is one of the most common sulfur-based amino acids found in nature. Huxtable [1992, p. 101] provides a good introduction to the topic:
2-Aminoethane sulfonic acid, or taurine, is a phylogenetically ancient compound with a disjunct distribution in the biosphere. It is present in high concentration in algae (159, 649, 748) and in the animal kingdom, including insects and arthropods, but is generally absent or present in traces in the bacterial and plant kingdoms. In many animals, including mammals, it is one of the most abundant of the low-molecularweight organic constituents. A 70-kg human contains up to 70 g of taurine. One is not stumbling into the abyss of teleology in thinking that a compound conserved so strongly and present in such high amounts is exhibiting functions that are advantageous to the life forms containing it.

Distribution of taurine in plant/animal kingdoms


Huxtable [1992] reports that taurine is found at high concentrations in the animal kingdom, but is missing or present in trace amounts in other kingdoms. He notes (p. 105): In the plant kingdom, taurine occurs in traces, averaging ~0.01 mcmol/g fresh wt of green tissue according to one report (424). This is <1% of the content of the most abundant free amino acids. Taurine occurs in red algae, but not brown or green algae. Taurine has been reported at levels up to 0.046 mcmol/g, wet weight, in a few plant foods (nuts). Taurine is also present, at high levels, in insects. [Huxtable 1992].


Huxtable [1992] warns of the difficulty in assaying plant foods for taurine content. Due to the precision with which taurine must be measured, the taurine content of plant foods given by an analysis may be incorrect due to contamination of the sample with insects, insect parts, or animal droppings. As Huxtable says (p. 105): One wet thumb print contains 1 nmol of taurine (236). This amount may be compared, for example, with the analytical range of 0.1-0.3 nmol employed by the high-performance liquid chromatographic method used in one recent paper reporting [taurine] concentrations in plants (600). Note that reference #600 above is Pasantes-Morales [1989] as cited in Huxtable [1992]. Huxtable summarizes that taurine concentrations in plants, when taurine is present at all, are measured in nmol (billionths of a mole), whereas in animal foods taurine is present in micromoles (millionths of a mole)--a difference of three orders of magnitude, i.e., a factor of one thousand.

Taurine: selected functions and interactions

Taurine is a very important amino acid involved in a large number of metabolic processes.
Huxtable provides a lengthy list of the biological functions provided by taurine [1992, Table 1, p. 102]. Taurine is important in the visual pathways, the brain and nervous system, and cardiac function, and it is a conjugator of bile acids. Another important function of taurine is as a detoxifier. Gaull [1986, p. 123] notes: Retinol [vitamin A] in excess amounts, i.e., unbound to retinol-binding protein, can act as a poison. When the long-term lymphoid cell lines are exposed to 10 mcM retinol, their viability decreases strikingly over a 90-minute period [18]. Addition of zinc improves the viability slightly. Further addition of taurine protects the cells even more. If a combination of zinc and taurine is added, there is a striking protective effect... Note that the above suggests that taurine and zinc, both found in animal foods, provide protection from excess vitamin A--a vitamin found in full form only in animal foods. This is an interesting synergism, to say the least. Yet another zinc/taurine interaction is mentioned by Huxtable [1992, p. 129]: Zinc is another metal ion with which taurine interacts. Zinc deficiency leads to increased excretion of taurine (277). Inasmuch as zinc is a mineral in relatively low supply (in terms of quantities and/or bioavailability) in raw/vegan diets, the above raises interesting questions of the possibility of yet another zinc/taurine synergism.

Is taurine essential?

Taurine is a conditionally essential amino acid in adult humans; it is an essential amino acid for
infants (mother's milk contains taurine). Gaull [1982, p. 90] discusses the need for taurine: The finding of a dietary requirement for taurine in the human infant is consistent with the negligible activity of cysteinesulfinic acid decarboxylase present both in fetal and in mature human liver (4)... In adult man only 1% of an oral load of L-cysteine was recovered as increased urinary excretion of taurine, giving further evidence that mature human beings also have a relatively limited ability to synthesize taurine and may be largely dependent on dietary taurine (22). The rat, in striking contrast, has considerable ability to convert dietary cysteine to taurine (17). See also Gaull [1977] as cited in Gaull [1982, 1986] for more information on taurine synthesis in humans vs. other animals.

Taurine levels in vegans


Laidlaw et al. [1988] studied the taurine levels (in urine and plasma) of a group of vegans (staff members of a Seventh Day Adventist college) compared to a control group (non-vegans, standard American diet).

Taurine levels lower in vegans. Laidlaw et al. report [1988, pp. 661-663]:


The results of the present study indicate that the plasma taurine concentrations in the vegans were significantly reduced to 78% of control values. Urinary taurine was reduced in the vegans to only 29% of control values... These findings suggest there may be a nutritional need for taurine and that plasma levels and urinary excretion fall with chronically low taurine intakes. Possibly the diet of the vegans was low in metabolic substrates or in cofactors for taurine synthesis... Although taurine is synthesized in humans, the current study suggests that the rate of synthesis is inadequate to maintain normal plasma taurine concentrations in the presence of chronically low taurine intakes. It is possible that a higher cysteine intake could increase taurine synthesis in the vegans... Long-term adherence to a strict vegetarian diet may lead to clinical manifestations of taurine deficiency. Citing Sturman et al. [1984], Laidlaw et al. [1988] report that neonatal primates developed abnormal eye function after being fed a taurine-free diet.

Section summary and synopsis

Taurine is plentiful in nature, except in the plant kingdom, and is required in the metabolism of all mammals. Cats, obligate carnivores, have lost their ability to synthesize taurine, which implies a long evolutionary dependence on foods that contain taurine. In humans, the ability to synthesize taurine is apparently limited. Animals that are more herbivorous (e.g., the rat) have a greater ability to synthesize taurine than humans do (they synthesize taurine with three times greater efficiency; see Gaull [1986]). The inefficient status of human taurine synthesis (compared to more herbivorous animals) suggests a long dependence on a diet that includes taurine, i.e., a diet that includes fauna.

Vitamin A and Beta-Carotene


The primary emphasis of this section is on the low conversion rates of beta-carotene to vitamin A (i.e., low bioavailability) and their possible implications regarding humanity's natural diet.

Introduction
Biesalski [1997, p. 571] provides an excellent introduction to the subject of vitamin A: The term vitamin A is employed generically for all derivatives of beta-ionone (other than the carotenoids) that possess the biological activity of all trans retinol. Retinal [note slightly different spelling] is active in vision and is also an intermediate in the conversion of retinol to retinoic acid. Retinoic acid acts like a hormone, regulating cell differentiation, growth, and embryonic development. Animals raised on retinoic acid as their sole source of vitamin A can grow normally, but they become blind, because retinoic acid cannot be converted to retinol... Vitamin A from animal sources is ingested as dietary retinyl esters which are hydrolyzed to retinol by the lipases of the intestinal lumen... Human plasma contains about 2 mcmol/L retinol bound to retinol binding protein (RBP) and 5-10 nmol/L retinoic acid presumably bound to albumin (Blomhoff et al. 1990). Vitamin A is stored in the liver, and released into the blood attached to RBP, as mentioned in the above quote.

Units of measurement. Discussions of vitamin A and beta-carotene use two different measures: IU
(international units) and RE (retinol equivalents). The conversion equivalents are:


1 IU = 0.3 mcg retinol = 0.6 mcg beta-carotene = 1.2 mcg other carotenoids

1 RE = 1 mcg retinol = 6 mcg beta-carotene = 12 mcg other carotenoids

We will see later that the above measures are controversial because they imply a conversion efficiency that is higher than commonly encountered.
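To illustrate how the equivalences work in practice, here is a minimal Python sketch (the function names are ours) that converts dietary amounts in mcg to RE and IU using the factors above; as just noted, the implied conversion efficiencies are considered optimistic:

    # Convert vitamin A sources (in mcg) to retinol equivalents (RE) and
    # international units (IU) using the standard factors quoted above.
    # Per the text, these factors imply a conversion efficiency higher
    # than commonly encountered, so treat the results as upper bounds.
    def retinol_equivalents(retinol=0.0, beta_carotene=0.0, other_carotenoids=0.0):
        return retinol / 1.0 + beta_carotene / 6.0 + other_carotenoids / 12.0

    def international_units(retinol=0.0, beta_carotene=0.0, other_carotenoids=0.0):
        return retinol / 0.3 + beta_carotene / 0.6 + other_carotenoids / 1.2

    print(retinol_equivalents(beta_carotene=6000))  # 6000 mcg beta-carotene = 1000 RE
    print(international_units(beta_carotene=6000))  # ... or 10000 IU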

Beta-carotene: an introduction
Wang [1994, pp. 314-315] provides an introduction to the topic of beta-carotene: Beta-C is a symmetrical molecule with several conjugated double bonds on its polyene chain... Vitamin A activity is an important function of beta-C. Beta-C also has a variety of functions including enhanced immune response [2], antioxidant function [3,4], enhancement of gap junction communication [5,6], induction of carcinogen-metabolizing enzymes [7], and photoprotection [8]. Beta-C is one of the most important food components for which there is strong evidence for an anticancer role from epidemiological studies in human populations [9-11] and studies in animal model systems [12]...

It should be pointed out that many experimental studies on the anticancer activity of beta-C have been confounded by the poor absorption and low tissue levels of carotenoids in the rodent models used for such studies.

Beta-carotene metabolism

The metabolism of beta-carotene is very complex and is not fully understood. Wang [1994]
reports that beta-carotene is converted to retinoic acid via two mechanisms: central cleavage and excentric cleavage. Wang et al. [1992] demonstrated the production of retinoic acid (RA) and retinol from beta-carotene in human intestinal homogenates (in vitro). They noted (p. 303): The production of RA and retinol in our human intestinal incubations is much lower than that reported by Napoli and Race in rat tissues (10), and only trace amounts of retinol were formed during both beta-carotene and beta-apocarotenal incubations. Note: Wang et al. [1992] used intestinal homogenates in their research; see Parker [1996] for concerns on the reliability of results from cell homogenates. Note also that rats (natural herbivores) produce more RA and retinol than humans, while cats (obligate carnivores) are unable to synthesize RA or retinol from beta-carotene at all. That is, humans fall "in between" a carnivore and an herbivore in this capability. This is in line with the idea that humans are faunivores/omnivores adapted to a diet that includes some fauna (hence a diet that includes some fully formed vitamin A).

The full significance of the discovery of the cleavage of beta-carotene to retinol in the intestines is not clear, as the post-absorptive metabolism (sites, degree) is unknown [Parker 1996,
abstract]. Yet another confounding factor is that some ingested beta-carotene and vitamin A may be destroyed in the gut, precluding their use (from Goodman et al. [1966], Blomstrand and Werner [1967], both as cited in de Pee and West [1996]).

Bioavailability of beta-carotene
Solomons and Bulux [1997, p. S45] discuss the relative bioavailability of vitamin A from beta-carotene: Herbivores have lesser hepatic [liver] reserves of vitamin A than do carnivores when both meet their energy requirements. Human vegetarians show a somewhat parallel relationship in terms of retinol status when compared to omnivores.... In most of the literature's intervention studies, moreover, there was no improvement in indicators of retinol or vitamin A-nutriture status when a plant source of carotene was fed continuously over periods from months to years.

Bioavailability is highly variable. Brown et al. [1989] compared the efficiency of absorption of beta-carotene from supplements, carrots, broccoli, and tomato juice. They found that the supplements and carrots increased plasma carotenoids, but not broccoli or tomato juice. They also found wide differences in absorption rates (carotenoid utilization) among individual subjects. A similar study by de Pee et al. [1995] in Indonesia found (p. 75): There is little evidence to support the general assumption that dietary carotenoids can improve vitamin A status. Solomons and Bulux [1997] apparently concur; they note that (p. 45): The findings...suggest that provitamin A compounds are converted less efficiently than the currently accepted factors of 1:6 and 1:12 would suggest. Parker [1996] discusses the many difficulties involved in measuring beta-carotene absorption. He notes that many studies do not take into account the intestinal conversion of beta-carotene to retinoids other than vitamin A (retinol). Parker [1996] cites the study by van Vliet et al. [1995] as an exception, which found an absorption efficiency of 11% for beta-carotene supplements as measured in plasma [blood]. As most studies show higher absorption rates for supplements than for beta-carotene in food, this would suggest an actual absorption rate well below 11% for food items. (Note that the result mentioned is from only one study, and one should be cautious in making inferences from a single study.)

Poor design of beta-carotene conversion studies. de Pee and West [1996], a review article,
summarizes the studies on the conversion of beta-carotene to vitamin A (p. S38): Many experimental studies indicating a positive effect of fruits and vegetables [on vitamin A status] can be criticized for their poor experimental design while recent experimental studies have found no effect of vegetables on vitamin A status. Thus, it is too early to draw firm conclusions about the role of carotene-rich fruits and vegetables in overcoming vitamin A deficiency. Solomons and Bulux [1997] note that beta-carotene serves as a "sunscreen" for plants and as an antioxidant, and that it may serve similar functions in those who consume the plant [Jukes 1992 and Krinsky 1992, both as cited in Solomons and Bulux 1997]. This would indicate competing uses for beta-carotene--between its potential use as a carotenoid vs. cleaving it to create vitamin A.

The bioavailability of beta-carotene is influenced by a number of factors. These are discussed in de Pee and West [1996], de Pee et al. [1995], and Parker [1996]. We will consider only 3 factors of interest here:

Heating/conservative cooking increases the bioavailability of beta-carotene in carrots [de Pee et al. 1995; Parker 1996]. This is significant because it conflicts with the idea that raw foods are always best, an idea promoted by raw vegan advocates.

Eating vegetables with fat can increase carotene absorption, as beta-carotene is fat-soluble (a lipid); see Parker [1996, pp. 543-544] for a detailed discussion on this point. This is of interest because there are fruitarian extremists who promote the crank science/science fiction claim that all (or nearly all) fat is "toxic." (Apparently they have never heard of the concept of essential fatty acids, or refuse to acknowledge it.)

Genetics. From de Pee and West [1996, p. S49]: [T]here is evidence that some people have a genetic defect which renders them unable to convert beta-carotene to retinol [Blomstrand & Werner, 1967; McLaren & Zekian 1971]. Obviously, someone who cannot convert beta-carotene to retinol must rely on animal food sources for all vitamin A. Such a genetic defect raises the interesting question of whether such a genetic change increases, or decreases, evolutionary survival rates.

Vitamin A and beta-carotene: synopsis


The complexity of vitamin A/beta-carotene metabolism, coupled with the apparently inefficient conversion of beta-carotene to active vitamin A, makes it difficult to reach firm conclusions. However, the number of vegans who use neither animal products nor supplements, yet do not display symptoms of vitamin A deficiency, suggests that conversion of beta-carotene to vitamin A, though inefficient and sluggish, is adequate to prevent deficiency. At the same time, however, the low efficiency of that conversion (when compared to that of the rat, an herbivore) suggests that humans show evidence of preferential use of, and some degree of evolutionary adaptation to, preformed vitamin A-containing diets, i.e., those including animal foods.

MINERALS (IRON AND ZINC)


Preface. This section will address the issue of the low bioavailability of certain minerals in the vegan diet. The narrower the diet, the greater the risk of deficiency--a point of real concern for some raw vegans. By way of introduction, Freeland-Graves [1988, p. 861] notes: The fact that vegetarian populations have no gross mineral deficiencies suggests that the mineral status of most vegetarians is probably adequate. Note that most vegetarian populations are not strictly vegan, and particularly not raw vegan; thus, as mentioned, the narrower the diet, the more likely the bioavailability issue might come into play.

Iron (Fe)
Introduction
Iron is an essential nutrient. Iron occurs in two forms in foods: heme ("organic") and non-heme ("inorganic"). (Note that the terms organic and inorganic are in quotes because the usage is figurative rather than literal.) NRC [1989] suggests a daily RDA/RDI of 15 mg/day for adults. Hurrell [1997] provides a good overview for iron (p. S4): The body requires iron (Fe) for the synthesis of the oxygen transport proteins haemoglobin and myoglobin, and for the formation of haem enzymes, and other Fe-containing enzymes, which participate in electron transfer and oxidation-reduction reactions... The body regulates iron homeostasis by controlling absorption and not by modifying excretion as with most other metals; absorption is increased during deficiency and decreased when erythropoiesis [red blood cell production] is depressed. The body has a limited capacity to excrete Fe, and that in excess of needs is stored... All non-haem food Fe is assumed to enter a common Fe pool in the gut lumen (Cook et al, 1972), from where absorption is strongly influenced by the presence of other food components [inhibitors]... As little or no Fe is excreted, absorption is synonymous with bioavailability. In general, heme iron (found in significant quantities only in animal foods) is absorbed much more readily than the non-heme iron from plant foods. In animal foods, the iron is approximately 40% heme [NRC 1989, p. 198]. However, Hurrell [1997], citing MacPhail et al. [1985], reports that in lean meat, heme iron may vary from 30-70% of total iron. The remainder of the iron in animal foods, and effectively all of the iron in plant foods, is non-heme. The NRC reports [1989] that the absorption of non-heme iron may vary by a factor as high as ten with the presence/absence of inhibitors or enhancers of iron absorption.
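As a rough illustration of why the heme vs. non-heme distinction matters, consider the back-of-envelope Python sketch below. The absorption rates used are illustrative placeholders only (heme iron absorption is commonly cited at roughly 15-35%, non-heme at roughly 2-20%, reflecting the factor-of-ten variation noted above); the function name and example figures are ours:

    # Back-of-envelope estimate of absorbed iron from a food, given total
    # iron, the fraction present as heme, and assumed absorption rates.
    # The default rates are illustrative placeholders, not measured values.
    def absorbed_iron_mg(total_fe_mg, heme_fraction, heme_rate=0.25, nonheme_rate=0.05):
        heme_fe = total_fe_mg * heme_fraction
        nonheme_fe = total_fe_mg * (1.0 - heme_fraction)
        return heme_fe * heme_rate + nonheme_fe * nonheme_rate

    print(absorbed_iron_mg(3.0, 0.40))  # 3 mg Fe, ~40% heme (meat): ~0.39 mg absorbed
    print(absorbed_iron_mg(3.0, 0.00))  # 3 mg Fe, all non-heme (plant): ~0.15 mg absorbed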

Factors that inhibit iron absorption


Let's look at the factors that inhibit iron absorption, as follows.

Phytates/phytic acid--myo-inositol hexaphosphate--is common in grains and legumes, and sharply reduces the bioavailability of iron. The phytate content of foods is reduced, but not eliminated, by milling and some common processing techniques. Of interest to raw vegans, Hurrell [1997, p. S5] notes:


Some traditional processes such as fermentation, germination, and even soaking can activate phytases in cereal grains or flours, which then degrade phytic acid and improve Fe absorption (Sandberg and Svanberg, 1991).

Polyphenols--including phenolic acids, tannins, and flavonoids--are common in tea, coffee, chocolate, and wine.

Calcium (Ca)--inhibits iron absorption when given in inorganic (supplement) form, or in dairy.
The effect of Ca depends on the composition of the meal, and the level of Ca it contains. Mixed or complex meals (i.e., meals of several food items) generally do not inhibit Fe absorption. In contrast, Ca may inhibit Fe absorption in mono-meals (the practice of mono-eating is recommended by some raw vegans), depending on the food eaten.

Fiber. Freeland-Graves [1988], citing Bindra et al. [1986], reports that dietary fiber might inhibit Fe absorption. Hurrell [1997], citing Rossander et al. [1992], reports that fiber has little effect on Fe absorption. It appears that the effect of fiber on Fe absorption has not yet been resolved.

Protein. Proteins can inhibit or enhance iron absorption depending on the type of protein. Proteins are converted into peptides in digestion, and these can bind Fe. Legume proteins (including soy), milk casein, and egg albumin can bind iron.

Nuts. Nuts can sharply inhibit the absorption of iron in a meal; the inhibition can be overcome, however, by the addition of vitamin C to the meal (Craig [1994]).

Factors that enhance iron absorption


Muscle tissue. This brings us back to protein again. In contrast to plant protein, the presence of
animal muscle (meat) sharply increases the absorption of Fe. This may be due to the level of cysteine-based proteins in meat.

Ascorbic acid (vitamin C) and other organic acids. A good absorption enhancer in supplement form and also in the form of fruits and vegetables. Hurrell [1997], citing Hallberg et al. [1989], reports that at sufficiently high concentrations, the iron enhancement effect of ascorbic acid can overcome the inhibiting effect of phytic acid in grains. Craig [1994] reports that other organic acids (citric, malic, tartaric, and lactic) can also enhance iron absorption.

Iron deficiency in veg*ns


Freeland-Graves [1988, p. 861] notes: Dwyer et al (28) reported mild Fe deficiency in 25% of preschool vegetarian children despite normal intakes of Fe. In adult lacto-ovo-vegetarians, Bindra and Gibson (30) also found a high prevalence of Fe deficiency with normal dietary Fe intakes. In new vegans (32) low Fe stores as measured by serum ferritin were found in 27% of the females, and 10% had values <10 mcg/L. Low serum ferritin has also been observed in lacto-ovo-vegetarian students (33). The above quote indicates that at least some veg*ns are low or deficient in iron. Craig [1994] cites four studies that found vegetarians had lower serum ferritin levels, and reduced iron stores, when compared to omnivores. In contrast to the preceding, Craig [1994] cites four other studies that did not find differences in iron levels, comparing vegetarians against non-vegetarians. Craig [1994] concludes that a balanced veg*n diet can supply adequate iron. However, Craig suggests a diet that includes grains, nuts, and seeds, as well as fresh fruits and vegetables. The idea is that the vitamin C in the fresh fruits and vegetables will counteract the iron inhibitors in the grains and seeds, hence make the iron (in the grains and seeds) available for absorption. From an evolutionary perspective, such a diet is of limited relevance: grains are available in quantity only via agriculture, and nuts and seeds were available only seasonally prior to the development of containers and food storage techniques.

From iron deficiency to iron overload: the distribution of hereditary hemochromatosis


(Acknowledgment: This topic was introduced to me via a posting made by Gary Ditta to the Paleolithic Diet email list, paleodiet@maelstrom.stjohns.edu, on 4/30/98.)

Introduction and incidence. Hereditary hemochromatosis (HH) is a genetic "iron overload" disease, "the most common inherited metabolic disease among the white [European descent] population worldwide" (from Bonkovsky et al. [1996], abstract). In HH, the body absorbs excess iron and stores it in the major organs (heart, pancreas, liver, etc.). This can eventually cause cirrhosis of the liver, diabetes mellitus, and cardiac complications [Niederau et al. 1994]. HH is chronic and progressive, and is treated by phlebotomy--blood-letting. The estimated incidence of HH varies from study to study. Bonkovsky et al. [1996] and Feder et al. [1996] report a gene frequency of 10%. Citing a variety of studies, Niederau et al. [1994] report a gene frequency of 2.3-10.7%, a frequency of heterozygotes (those with just one copy of the gene) of 4.3-19.1%, and a frequency of homozygotes (those having two copies of the gene) of 0.074-1.16%. The full disease occurs only in affected homozygotes, though a small number of heterozygotes may exhibit abnormal iron metabolism.

Hemochromatosis and diet: a hypothesis


The high incidence of HH has some potential implications regarding diet. The gene appears to reduce survival, yet it has become relatively common and widespread among those of European (Celtic) ancestry. Consider the following hypothetical analysis.

For 99% of the time since the inception of the species, humans lived as hunter-gatherers, eating a diet that includes animal foods rich in easily absorbed iron. Under such circumstances, the gene responsible for HH would not survive for long, as the hunter-gatherer diet is rich in animal foods and heme iron, and the HH gene (genetic mutation) would sharply reduce survival of those unlucky enough to have it.

In the most recent 1% of our history, humans developed agriculture, stopped being hunter-gatherers, and switched from a diet rich in easily absorbed iron to a diet based on grains--low in bioavailable iron and high in phytates (iron inhibitors). Under the new circumstances, a genetic mutation that increases iron absorption (e.g., the gene associated with HH) would occur in an environment where it actually enhances survival. Under those circumstances, such a gene would both survive and spread.

Modern times arrive, and bring large-scale agriculture and agricultural technology, industrialization, and greatly increased wealth. Over a very short period of time, the meat/animal food content of the diet increases substantially from what it was 100 years ago (if less than what it was prior to the development of agriculture). However, those with the HH gene who eat a diet rich in animal foods are now at increased risk of disease, because their bodies absorb "too much" iron from the recent dietary change that allows them to eat far more meat than their grandparents could afford. That is, their bodies have partially adapted (via the HH gene) to a diet in which iron is of very low bioavailability.

The above hypothesis explains the incidence of HH, and also may serve as evidence, in selected populations, of partial, limited adaptation to the "new diet"--i.e., the high-carbohydrate, grain diets provided by agriculture. Contrast the above--possible evidence of limited genetic adaptation to dietary changes in the approximately 10,000 years since agriculture began--with the unsupported claims made by certain raw/veg*n advocates that humans could not, and did not, adapt to a diet that includes animal foods, after ~2.5 million years. Reminder: the above is a hypothesis, for discussion.

Hemochromatosis: supplementary notes

Iron overload diseases in Africa. Iron overload is also common in Ethiopia and the Bantu tribe of
South Africa. Hemochromatosis is not limited to hereditary transmission; it can occur any time iron is consumed in great excess. Hennigar et al. [1979] report that iron overload in the Bantu tribe is due to consuming large amounts of a local alcoholic beverage that is brewed in iron pots. Wapnir [1990, p. 101] mentions that iron overload in Ethiopia and the Bantu is due to soil type and use of iron pots and utensils.

Heme iron receptors in the human intestine

Heme and non-heme iron receptors. Both heme iron and non-heme iron are absorbed via intestinal
receptors, though the absorption pathways are different. Heme iron is absorbed via specific intestinal receptors (i.e., the receptor function is limited to absorption of heme iron). The heme iron receptors are in the brush borders of the intestinal mucosa, with the duodenum having the most receptors. For confirmation of the existence of the heme receptors, see Tenhunen et al. [1980], Grasbeck et al. [1979], Conrad et al. [1967], and, for a related study on pig intestine, see Grasbeck et al. [1982]. After the initial absorption process has taken place, the iron is removed from the heme, and enters the non-heme iron pool [Wapnir 1991, p. 106]. The human intestine also has receptors for non-heme iron, but there appear to be multiple receptors, and the transfer proteins involved are general and may also absorb some metals other than iron. See Davidson and Lonnerdahl [1989] for discussion of one of the types of non-heme receptor, and its involvement in the absorption of manganese. As well, Davidson and Lonnerdahl cite other studies that propose the subject receptor is also involved in absorption of zinc. Wapnir [1991, pp. 106-107] provides an overview of the various transfer proteins involved in absorption of iron and other metals.

Heme iron receptors: an adaptation to animal foods in the diet. The information that the human
intestine has receptors specifically for absorption of heme iron is significant. Heme iron is found almost exclusively in animal foods. Plant foods may contain heme iron as well, as cytochrome c, a protein found in plants, reportedly contains a heme group. However, the level of heme iron in plants is extremely low, and not nutritionally significant. (Most of the standard references, such as NRC [1989, p. 198], report that all of the iron in plants is in non-heme form). Given that plant foods contain effectively no heme iron, and that heme iron is found only in animal foods, the presence of intestinal receptors in humans that are specific to heme iron is strong evidence of evolutionary, physiological adaptation to animal foods in the diet.

Heme iron: a quote out of context. The following quote has been used to suggest that plant foods
contain significant amounts of heme iron; from Wardlaw [1999, p. 513]: The heme iron in the meat, poultry, fish, dry beans, eggs, and nuts group is especially well absorbed. However, examination of Wardlaw [1999] reveals the following. The quote is taken out of context; it is part of the caption for Figure 14-4 on page 513, which depicts the standard food pyramid, which of course groups food sources together instead of treating individual food items individually. Wardlaw [1999, p. 507] provides definitions of heme and non-heme iron. In those definitions, heme iron is identified as coming from animal tissues (implicitly, only from animal tissues).

Taken out of context, the above quote from Wardlaw [1999] could be used to suggest that beans and nuts are good sources of heme iron. A closer examination of the material, in context, reveals otherwise. In the quote, Wardlaw is discussing a food group, for which 4 out of 6 members are animal foods. Being animal foods, those 4 foods contain heme iron, and hence, the group, considered as a whole, provides heme iron, so long as one eats a balance of foods from the group. More precisely, eating a balance of foods from a group means that one eats, over an appropriate period of time, some of every food type in the group. Such a strategy would insure the consumption of heme iron from the animal foods in the group. Finally, note that eating a balance of foods from the food groups is "standard" advice in nutrition, and this context should be taken into account when interpreting statements based on such sources.


Iron: synopsis
The existence of receptors in the human intestine that are specific for the absorption of heme iron, which is found in nutritionally significant amounts only in animal foods, coupled with the evidence of the low bioavailability of iron in plant foods, provides evidence of evolutionary, physiological adaptation to a diet that includes animal foods.

Zinc (Zn)
Zinc is an essential nutrient involved in many enzyme pathways in the body. The human body
maintains a small pool of zinc (mostly in bone and muscle), with a fast turnover rate (deficiency symptoms appear quickly in test animals). The (U.S.) RDA/RDI for zinc is 15 mg/day for adult men, 12 mg/day for adult women [NRC 1989]. Sandstrom [1997] summarizes the functions of zinc, and how homeostasis is maintained (p. S67): Its biochemical function is an essential component of a large number of zinc dependent enzymes participating in the synthesis and degradation of carbohydrates, lipids, protein and nucleic acids... At low and excessive intakes changes in urinary and skin losses contribute to maintain homeostasis. The primary dietary sources for zinc are animal foods and, to a lesser extent, grains. The bioavailability of zinc in animal products is higher than in grains [Inglett 1983, as cited in NRC 1989].

Zinc bioavailability (inhibitors) and interactions

Zinc may be low in vegan diets. Freeland-Graves [1988] summarizes the issue of zinc concisely (p.
859): Vegetarian diets have the potential to be limited in Zn because foods that are considered to be the best sources of this mineral, such as meats, poultry, and seafood, are excluded (1). In addition, these diets contain copious quantities of fiber, phytate, and oxalate, compounds that have been found to bind to minerals and reduce bioavailability. Comments on zinc inhibitors and interactions are as follows.

Phytic acid is a strong inhibitor, and phytate is present at "high" levels in many grains.
Sandstrom [1997], citing Rossander et al. [1992], reports that zinc absorption is usually below 15% from meals that are high in phytate. The inhibiting effect of phytic acid in grains is counteracted by including animal proteins in the meal [Sandstrom et al. 1980, as cited in Sandstrom 1997].

Oxalate in high-fiber diets may reduce zinc bioavailability [Kelsay et al. 1983, as cited in Freeland-Graves 1988]. This could be a real concern for raw vegans.

Soy protein may reduce zinc bioavailability [Freeland-Graves 1988].

Large amounts of non-heme iron in a veg*n diet may reduce Zn bioavailability [Solomons and Jacob 1981, as cited in Freeland-Graves 1988].

Calcium and phytate in combination. The combination of high calcium intake plus high phytate intake may decrease Zn bioavailability [O'Dell 1984, as cited in Freeland-Graves 1988].

Zinc deficiency in veg*ns

Low zinc levels in vegans. Freeland-Graves [1988, pp. 859-860] notes:


In a study of 79 vegetarians, our laboratory (9) observed that only female vegans had low dietary levels of Zn. These low levels were attributed to heavy reliance on fruits, salads, and vegetables, foods that are poor sources of Zn... Levels of Zn in salivary sediment were significantly lower in vegetarians than in nonvegetarians, and vegans exhibited the lowest levels. The first remark above, about reliance on fruits and vegetables, is relevant to those raw vegans, mostly fruitarians, who avoid grains/sprouts. It suggests that such individuals are--not surprisingly--at higher risk of a Zn deficiency. Many fruitarians report (anecdotal evidence) a "light" or "euphoric" mental feeling that actually may be a symptom of zinc deficiency (a feeling often mistaken for a "spiritual" feeling). Some advocates of fruitarianism claim the "euphoric" mental feeling is a benefit of the diet. Additional relevant anecdotal evidence is that some raw vegans and fruitarians report loss of libido, which may also be a symptom of zinc deficiency. The possibility that one of the most hyped effects of the fruitarian diet, the allegedly "euphoric" mental state, may be a Zn deficiency is of course a hypothesis. However, it is a plausible hypothesis (based on the available scientific and anecdotal data), and deserves further research. Side note. As a former fruitarian who has personally experienced the "light" or "euphoric" mental effect of a fruit diet, and who has also done some spiritual work, I can corroborate that, at least for me, the "light" or "euphoric" feeling of a fruit diet is not a "spiritual" feeling. Rather, the effect of the fruit diet is more like a mild drug or sugar high--an airy, apathetic feeling (lassitude).

Zinc: synopsis

That the bioavailability of Zn is higher in animal foods than plant foods may suggest that humans are better adapted to the digestion and assimilation of animal foods (i.e., in contrast, many plant foods contain zinc inhibitors). Also, specific symptoms that are commonly reported by fruitarians (anecdotal evidence), i.e., the "light" mental feeling and loss of libido, suggest the possibility that zinc deficiency may be widespread among fruitarians. This is an area where research is needed.

Essential Fatty Acids (EFAs) (1 OF 2)


Introduction
Fatty acids serve a number of important functions. As specified in Nettleton [1995]: Some are essential nutrients, e.g., linoleic acid. Some, particularly short-chain fatty acids, are digested and provide energy (calories). Some--certain long-chain fatty acids--are components of cells and membranes.

The primary fatty acids of interest here are, first, linoleic acid (an n-6 acid) and alpha-linolenic acid (an n-3 acid). Second, given the presence of suitable enzymes, linoleic and/or alpha-linolenic acid can be converted to arachidonic acid (AA), eicosapentaenoic acid (EPA), docosahexaenoic acid (DHA), eicosanoids, and other important metabolic compounds. DHA is very important in the central nervous system; Jumpsen and Clandinin [1995, p. 24] report: Approximately one third of fatty acids with ethanolamine and serine phosphoglycerides in the cerebral cortex of humans, monkeys, and rats is docosahexaenoic acid (O'Brien and Sampson, 1965; Svennerholm, 1968; Neuringer and Connor, 1989).


Can synthesis of EPA and DHA from precursors maintain levels equal to obtaining them in the diet preformed? As we proceed, the central question that will be addressed here is how the
conversion of precursors such as alpha-linolenic acid (from plant foods) to EPA and DHA compares to obtaining these crucial fatty acids preformed directly from animal foods. During evolution, humans would have been dependent on obtaining EPA and DHA primarily from animal sources [Eaton et al. 1998]. Given that DHA is particularly important in growth and development of the brain, which tripled in size during human evolution, the question of whether obtaining EPA/DHA primarily through synthesis from precursors is efficient enough to maintain optimum levels is of prime interest.

Understanding the terminology for EFAs


A number of systems and abbreviations are used to denote the various fatty acids of interest here, and these can at times be confusing. This section serves to introduce the relevant systems. The two main groups or families of polyunsaturated fatty acids of interest here are the omega-3 and omega-6 families, which are written as n-3 or w-3, and n-6 or w-6. The different families of fatty acids have different chemical structures. They all consist of chains of carbon atoms with a methyl group (CH3) at one end of the chain, and an acid or carboxyl group at the other end (HO-C=O). A standard system has been established for describing the various fatty acids. An example will illustrate the principles involved in the nomenclature.

18:2n-6 Linoleic Acid (LA): The first number specifies the number of carbon atoms in the chain; here it
is 18. The second number, 2, tells us how many of the carbon atom bonds in the chain are double bonds. The last number, 6 in this case, specifies the position number of the first carbon double bond from the methyl end of the chain. Putting the above together, letting "C" be a carbon atom, "-" a single bond, and "=" a double bond, we can crudely depict linoleic acid as: (CH3)-C-C-C-C-C=C-C-C=C-C-C-C-C-C-C-C-(HO-C=O)
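For readers who like to see the notation unpacked mechanically, here is a minimal Python sketch (the function name is ours, for illustration) that parses the shorthand just described:

    # Parse the fatty-acid shorthand described above, e.g. "18:2n-6":
    # 18 carbons, 2 double bonds, first double bond at carbon 6 counting
    # from the methyl (CH3) end of the chain.
    def parse_fatty_acid(code):
        carbons, rest = code.split(":")
        double_bonds, first_bond = rest.split("n-")
        return {"carbons": int(carbons),
                "double_bonds": int(double_bonds),
                "first_double_bond_from_methyl_end": int(first_bond)}

    print(parse_fatty_acid("18:2n-6"))  # linoleic acid (LA)
    print(parse_fatty_acid("22:6n-3"))  # docosahexaenoic acid (DHA)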

Saturated vs. unsaturated fats. A fatty acid is saturated if it contains the maximum number of
hydrogen atoms possible, i.e., it has no double bonds between adjacent carbon atoms. A fatty acid is unsaturated if it is not saturated, that is, if it has one or more double bonds between adjacent carbon atoms. Both linoleic and alpha-linolenic acids contain double carbon atom bonds, hence are unsaturated. (Note that the depiction of linoleic acid above is a summary and does not show all the hydrogen atoms.)

Abbreviations: an "alphabet soup" of terms. A sometimes-confusing array of numbers,


abbreviations, and names are used for the various fatty acids and related terms. The following table is provided as an aid in understanding, and as a quick reference for fatty acids of interest here. The term "precursor" below refers to which fatty acid (FA) is a precursor in synthesizing another fatty acid. The precursors shown below are the ones of interest here, and are not necessarily the only precursors, and in some cases are actually intermediate precursors.

IMPORTANT OMEGA-3 (n-3) AND OMEGA-6 (n-6) POLYUNSATURATED FATTY ACIDS

Numeric Designation   Fatty Acid Name                         Abbreviation   Synthesis Precursor

OMEGA-3 FAMILY
18:3n-3               Linolenic acid (alpha-linolenic acid)   LNA or ALA     [n-3 pathway]
20:5n-3               Eicosapentaenoic acid                   EPA            LNA/ALA
22:6n-3               Docosahexaenoic acid                    DHA            EPA

OMEGA-6 FAMILY
18:2n-6               Linoleic acid                           LA             [n-6 pathway]
20:4n-6               Arachidonic acid                        AA             LA
22:5n-6               Docosapentaenoic acid                   DPA            AA

The table above is based on information from Jumpsen and Clandinin [1995], and Nettleton [1995]. Note that EPA and DHA are synthesized in a pathway that starts with LNA (all three of which are n-3 FAs), while AA and DPA are synthesized in a pathway that starts with LA (all n-6 FAs). The primary focus here will be on the n-3 family and its longer-chain derivatives, EPA and DHA.

Additional terms of interest.


Short-chain: a FA with 8 or fewer carbon atoms. Medium-chain: a FA with 9 to 12 carbon atoms. Long-chain: a FA with more than 12 carbon atoms.

Requirements for EFAs

Linoleic acid: a minimum of 1 to 3+% of calories. Only small amounts of linoleic acid (an n-6 acid)
are required. An RDA/RDI has not been formally adopted; however, linoleic acid at 1-2% of calories will prevent deficiency. NRC [1989] recommends a minimum intake of linoleic acid, for adults, of 3-6 g/day. The comments of Jumpsen and Clandinin [1995, p. 29] are appropriate here: ...[B]ecause competition exists among the fatty acids for desaturating enzymes (Brenner, 1981), a level of at least 3% of energy should be met by n-6 fatty acids (FAO, 1977)... Uauy et al. (1989) recently suggested that the recommendation of 3.0% of total energy is adequate to prevent clinical signs of deficiency but may be insufficient to ensure functional and biochemical normalcy. FAO [1995], a joint publication with the United Nations World Health Organization (WHO), recommends linoleic acid consumption in the range of 4-10% of total energy. They specifically recommend consumption of linoleic acid at the 10% level when total intake of saturated fatty acids is high. Note that the FAO/WHO-recommended consumption level is higher than the suggested minimums.

Alpha-linolenic acid recommendation: 0.5% of calories. In reference to alpha-linolenic acid (an n-3 acid), Jumpsen and Clandinin [1995, p. 21] note: ...[A] dietary requirement for 18:3n-3 [alpha-linolenic] has not been clearly established, a growing body of evidence indicates that this series of fatty acids is also essential (Lamptey and Walker, 1976; Leprohon-Greenwood and Anderson, 1986; Bourre et al., 1990a). In Canada, nutrition recommendations have established the dietary requirement level for 18:3n-3 as 0.5% of energy (Nutrition Recommendations, Health and Welfare Canada, 1990).
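Since these recommendations are expressed as percentages of energy, a quick conversion to grams per day may be helpful. The minimal Python sketch below uses the standard ~9 kcal per gram of fat; the 2,000-kcal daily intake and the function name are illustrative assumptions:

    # Convert a fatty-acid recommendation given as a percent of energy
    # into grams per day, using ~9 kcal per gram of fat. The 2000-kcal
    # daily intake is an illustrative assumption.
    def fat_grams_per_day(percent_of_energy, daily_kcal=2000, kcal_per_g=9.0):
        return daily_kcal * (percent_of_energy / 100.0) / kcal_per_g

    print(fat_grams_per_day(3.0))  # linoleic acid at 3% of energy: ~6.7 g/day
    print(fat_grams_per_day(0.5))  # alpha-linolenic acid at 0.5%: ~1.1 g/day

Note that the ~6.7 g/day figure for linoleic acid at 3% of energy is consistent with the NRC minimum of 3-6 g/day quoted above.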


Neuringer et al. [1984] demonstrated n-3 fatty-acid deficiencies in animals, and Bourre et al. [1989b] showed that a diet deficient in n-3 oils affected enzyme activity and learning abilities in rats [Neuringer, Bourre, as cited in Jumpsen and Clandinin 1995]. The preceding lend credence to the hypothesis that alpha-linolenic acid is an essential nutrient. Note that certain fruitarian extremists make the science-fiction claim that all, or nearly all, fats (i.e., fats above the levels found in sweet fruits) are "toxic." To date the extremists have not presented any credible evidence to support the idea that, say, avocados are "toxic" because of their fat/oil content.

Essential Fatty Acids (EFAs) (CONT., 2 OF 2)


Bioavailability of EFAs: plant vs. animal sources

Diet is the only source of n-3 fatty acids. Nettleton [1995] points out that plants can convert linoleic acid
(n-6) to alpha-linolenic acid (n-3), but humans cannot. Hence, diet is the only source for n-3 fatty acids for humans. In general, plant foods provide linoleic (n-6) and alpha-linolenic acid (n-3), while animal foods provide DHA and other preformed long-chain n-3 fatty acids, such as eicosapentaenoic acid (20:5, EPA). EPA and DHA are common in aquatic animals.

THE PROBLEM: Are human enzymes that synthesize DHA and EPA from precursors adequate as a sole source?

Negative evidence suggesting DHA/EPA-synthesizing enzymes may be inadequate for complete nutrition. Although the issue of whether alpha-linolenic acid is essential for
humans is still controversial, it is clear that DHA, which can be produced by converting alpha-linolenic acid, is essential in infants (mother's milk contains ready-made DHA). Although the human body will synthesize such long-chain fatty acids from precursors in the diet when they are not directly available, the rates of synthesis generally do not support the levels obtained when these fatty acids are consumed directly in the diet. This is particularly critical in infancy, as human milk contains preformed DHA and other long-chain essential fatty acids, while plant-food-based formulas do not (unless they have been supplemented). Let's now review the research linking levels of EFAs in body tissues with EFA consumption via diet and/or supplementation.

Animal studies on impact of EFAs in diet: chickens. Anderson et al. [1990] studied the
effects of n-3 fatty acid (FA) supplementation in newly hatched chickens with an n-3 FA deficiency. Levels of DHA were measured in the brains and retinas of deficient chicks fed 4 diets containing different levels of n-6 FA. The chicks fed a diet that included DHA and EPA recovered--that is, the levels of DHA/EPA returned to normal after 3 weeks. The researchers concluded that linolenic acid as the sole source of n-3 fatty acids is inadequate, and supplementation with DHA is advisable. Anderson and Connor [1994] provide additional insight into this research.

EFA studies on rats. Woods et al. [1996] tested the hypothesis that the "right" ratio of LA/LNA
(LA = linoleic acid, an n-6 fatty acid; LNA = alpha-linolenic acid, n-3) in infant formula could support the same levels of DHA production as result from breast-feeding (in rats). A series of milk formulations with LA/LNA ratios of 10:1, 1:1, and 1:12 were tested on infant rats, and the resulting levels of DHA in the rats' brains were measured. Woods et al. [1996, pp. 691, 693] conclude that:

This suggests that there is no level of LNA that is capable of supporting the [levels of] neural DHA found in the dam-reared [breast-fed by mother rats] rat pups...


If it is accepted that human essential fatty acid metabolism is not more efficient than that of the rat, it follows that infant formulas like those in our study would not support proper human nervous system accretion of LC-PUFAs [long-chain polyunsaturated fatty acids]. Woods et al. [1996] do suggest that LNA might assist DHA synthesis in the brain, but not the retina, of the rats.

EFA studies on monkeys. Lin et al. [1994] studied the effect of various n-3 fatty-acid levels in
the diet on the retinas of rhesus monkeys. Monkeys on deficient diets, when fed a diet that included DHA, recovered to near-normal levels, except for certain lipids. In the n-3 FA-deficient monkeys, docosapentaenoic acid (DPA, 22:5n-6, a long-chain derivative in the n-6 FA family) partially replaced DHA in the monkeys' retinas. However, Lin et al. [1994, abstract] describe the replacement as "functionally incomplete," suggesting that the resultant DPA levels were too low, and/or did not provide equivalent functional support for the monkeys' visual systems (in comparison to the monkeys whose diet included DHA).

EFA levels in "living foods" adherents (raw vegans). Agren et al. [1995] studied the EFA
levels of vegans following the living foods diet, a vegan diet that emphasizes sprouted and fermented raw or "living" foods. The study began with 12 people who claimed to be following the living foods diet, but two were excluded because they were found not to be actually following the diet (it is a difficult and idealistic diet that few can follow successfully, in the long-term), and another subject was excluded who displayed fatty-acid levels that were 4-5 standard deviations from the means observed (again suggesting probable non-compliance with the claimed diet). Comparison of the EFA levels in erythrocytes, platelets, and serum lipids in living foods raw vegans vs. controls (standard Western diet) showed that the EPA levels in the raw vegans were only 29-36%, and DHA levels 49-52%, of the analogous levels in the control group. Additionally, the n-6/n-3 ratio in the raw vegans (5.55) was about double the ratio observed in the controls (2.70). (Note: Agren et al. [1995, Table 2, p. 367] report n-3/n-6 ratios; the n-6/n-3 ratios in the preceding sentence were obtained by inverting the ratios reported in Agren et al. [1995] for erythrocyte totals.)

EFA levels in conventional vegans. Krajcovicova-Kudlackova et al. [1997] analyzed EFA levels in blood plasma, comparing several groups: vegans, vegetarians (whether lacto- or lacto-ovo-vegetarian was not specified), semi-vegetarians (those who ate fish), and a control group following the standard Western diet. The vegan group had significantly lower levels of both EPA (62.5%) and DHA (67%) compared to the control group, while the levels of EPA and DHA were not significantly different between vegetarians and the control group (computed from Krajcovicova-Kudlackova et al. [1997, Table 2, p. 367]). N-6/n-3 ratios were found to be 19.48 for vegans, 14.71 for vegetarians, and 13.07 for the control group. Note that the n-6/n-3 ratios for vegans and vegetarians are significantly different (P = 0.001) from that of the control (standard Western diet) group. Although the difference between the vegetarian and control groups is statistically significant, in real-world terms the size of the difference might not be very meaningful, as 14.71 and 13.07 are "close." The corresponding difference for vegans, whose ratio is 19.48, is large enough that the result is likely to be meaningful in real-world terms as well.

Impact of DHA supplements on EFA levels in vegetarians. Conquer and Holub [1996]
studied the effect of DHA supplementation on the EFA levels in phospholipids, in vegetarians. (Phospholipids are compounds of lipids with alcohols, which also have a phosphate residue [Ansell 1973]. Lecithin is an example of a phospholipid [Strickland 1973]. Phospholipids are present in blood platelets, serum, and elsewhere in the human body.)


Two volunteer groups of vegetarians consumed either capsules providing 1.62 g/day of DHA, or (control group) a similar quantity of corn oil capsules (placebo, no DHA content). The DHA in the capsules was from algae rather than animal sources. It was found that consumption of DHA capsules had a dramatic effect on DHA levels in phospholipids (increases of 225% in blood platelet phospholipids and 246% in serum phospholipids) and on EPA levels (increases of 176% in platelet and 117% in serum phospholipids). Additionally, the consumption of DHA capsules significantly lowered the n-6/n-3 ratio after 3 weeks (from 8.9 to 3.3); ratios were 3.1-3.3 for those taking DHA, 9.5-9.9 for those on the corn oil placebo. The consumption of DHA also improved a number of biomarkers associated with heart disease (cholesterol, triglycerides; see the study for details).

EFA levels in human milk. Sanders and Reddy [1992] compared the levels of EFAs in breast
milk from women who followed vegan, vegetarian, and standard Western diets (the latter serving as control group). Levels of DHA were found to be lowest in the milk from vegan mothers (37.8% of control group level), highest in the standard diet, with (lacto-)vegetarians in the middle (81% of control group level). The study authors note [Sanders and Reddy 1992, p. S71]: The proportion of DHA in erythrocyte total lipids of infants breast-fed by vegans was 1.9% compared with 3.7% in infants fed a [cow's] milk formula containing butterfat as the sole source of fat and 6.2% in infants breast-fed by omnivores [standard Western diet] at 14 weeks postpartum. That DHA levels in infants were lower in those who were breast-fed by vegan mothers than in those fed the cow's milk formula is somewhat surprising in light of the data that the vegan human milk contained more DHA as a percentage of energy. This may be caused by (relatively) high levels of linoleic acid (n-6) in the vegan breast milk suppressing synthesis of DHA. Sanders and Reddy [1992, Table IV, p. S74] report n-6/n-3 ratios of: 16.4 for vegans, 13.0 for vegetarians, 13.5 for control. (Note: Sanders and Reddy report that the ratio is higher for vegans, but the statistical significance thereof is not clear in their paper.) For a related paper, see Reddy et al. [1994].

EFA levels in tissues of infants. Farquharson et al. [1992] analyzed the brain tissues of infants
who died of crib death, also referred to as "sudden infant death syndrome" (SIDS). They found significantly higher levels of DHA and DPA in the cerebral cortex tissues of breast-fed infants vs. those fed formulas (that were not supplemented with DHA; breast milk contains DHA). They note that the higher linoleic acid content of the formulas might inhibit synthesis of DHA.

Carlson et al. [1986] compared the EFA levels in blood cell phospholipids of preterm infants, for those receiving human milk vs. those fed with formula. Feeding with preterm human milk increased phospholipid DHA levels, while feeding with formula caused a decline.

Salem et al. [1996] provides proof that infants are able to synthesize, in vivo, essential fatty acids from precursors. Also noted, however, is that the level of DHA synthesis is inadequate to meet needs in early life.

Connor et al. [1996] studied the effect on DHA levels of supplementing the diet of pregnant women with fish oil capsules (high in DHA and EPA) in the latter stages of pregnancy. They found that DHA levels were higher in the blood of infants whose mothers consumed the fish oils. In the supplemented infants, DHA levels were 35.2% higher in red blood cells, and 45.5% higher in plasma, vs. the non-supplemented controls.

EFAs and infant development. Gibson et al. [1996] discusses a number of studies that show
improved visual function in breast-fed infants vs. those receiving formulas. The authors relate this to the presence of DHA in breast milk, and the lack (or lower levels) thereof in formulas. Also see Hoffman et al. [1993] and Uauy et al. [1996] in this regard.


Low conversion rates of LNA to DHA. Nettleton [1995] reports that alpha-linolenic acid
(LNA) can be converted to EPA and DHA, but the efficiency of the conversion is low [Dyerberg, Bang, and Aagard 1980; Sanders and Younger 1981; as cited in Nettleton 1995]. Emken et al. [1994], as cited in Conquer and Holub [1996], report a conversion rate of LNA to DHA of ~5% in adults. Salem et al. [1996] suggest (Table 1, p. 50; see also p. 51) a minimum conversion rate of ~1% from LNA to DHA in infants (0.9 mg of DHA produced from 100 mg precursor).
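These conversion figures lend themselves to a rough back-of-the-envelope calculation. The Python sketch below is illustrative only; the 200 mg/day DHA figure is a hypothetical target chosen for the example, not a recommendation drawn from the papers cited:

    # How much LNA would be needed to yield a given amount of DHA,
    # at the conversion efficiencies reported above?
    ADULT_RATE = 0.05    # ~5% LNA->DHA in adults (Emken et al. [1994])
    INFANT_RATE = 0.009  # ~0.9% in infants (Salem et al. [1996])

    TARGET_DHA_MG = 200  # hypothetical daily DHA amount, for illustration only

    for label, rate in [("adult (~5%)", ADULT_RATE), ("infant (~0.9%)", INFANT_RATE)]:
        lna_needed_g = TARGET_DHA_MG / rate / 1000.0
        print(f"{label}: {lna_needed_g:.1f} g LNA needed for {TARGET_DHA_MG} mg DHA")

At the ~5% adult conversion rate, obtaining 200 mg of DHA would require on the order of 4 g of LNA; at the ~1% infant rate, over 20 g--illustrating why the low efficiency of conversion matters in practice.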

Flaxseed oil a suboptimal source for n-3 acids. Flaxseed oil is used by some vegans as a source of n-3 oils. However, Kelley et al. [1993] fed volunteers a diet in which 6.3% of calories were from flaxseed oil. (Given the cost of flaxseed oil, this is a large amount and would be expensive.) They observed a statistically significant increase in EPA levels in peripheral blood mononuclear cells (PBMNC) in those receiving flaxseed oil. EPA levels in serum were not affected by flaxseed oil. Regarding DHA levels, no increases (in PBMNC or serum) were seen from flaxseed oil supplementation.

Gerster [1998] is a review article that looks into the issue of whether adults can adequately convert ALA to EPA and DHA. The tables in Gerster [1998, pp. 165-166, 168] list a number of studies that showed a pattern similar to the one described above. That is, flaxseed oil supplementation may increase EPA, but not DHA.

o Studies that found no changes in DHA levels from flaxseed oil supplementation: Dyerberg et al. [1980], Singer et al. [1986], Kelley et al. [1993], Cunnane et al. [1995], Mantzioris et al. [1994], Layne et al. [1996].
o Studies that reported DHA decreased with flaxseed oil supplementation: Sanders and Roshanai [1983], Allman et al. [1995].
o Studies that reported DHA increased with flaxseed oil supplementation: Kestin et al. [1990], and Sanders and Younger [1981]. Note that the latter study found an increase in DHA for vegan subjects only, in plasma glycerides, but not in platelets (they found no differences in DHA in the non-vegan control group).

The weight of the studies above suggests that flaxseed oil does not provide efficient support for DHA synthesis.

Preformed DHA may be necessary for optimum levels. Gerster also makes the point that fish oils are superior sources for EPA and DHA, and concludes [1998, pp. 159-160]:

These findings indicate that future attention will have to focus on the adequate provision of DHA which can reliably be achieved only with the supply of the preformed long-chain metabolite.

For another perspective, Nettleton reports [1995, pp. 33-34]:

The fact that LNA is less effective than EPA and DHA in enriching tissues with n-3 FA [fatty acids] means that foods rich in LNA, vegetables and seed oils will be less satisfactory sources of n-3 FA for human health than seafoods or other animal foods enriched with EPA and DHA. That is not to say, though, that oils such as canola and soy are not useful sources of n-3 FA... Finally, consuming oils with LNA helps reduce the amount of linoleic acid we consume. In rats, the ratio of n-3 to n-6 fatty acids consumed is more effective than the absolute amount of n-3 FA in inhibiting eicosanoid synthesis from arachidonic acid (Boudreau et al. 1991). It may be the same for people also...


Tissues that possess the enzymes needed to convert one form of n-3 FA to another will get by with available precursors, but others may require particular n-3 FA preformed. Dietary sources providing both EPA and DHA would appear to have advantages over those supplying only LNA.

Effects of balance of fatty acids. Langley [1995] reports that typical vegan diets have a high intake of linoleic acid, which can suppress the production of DHA; i.e., the typical vegan diet has a poor balance of the EFAs. The effect of the balance of fatty acids is also discussed in a number of the studies cited above.

The remarks above concerning fatty acid balance are cogent, as a number of companies strongly promote (and some vegan "diet gurus" recommend) flaxseed and/or hempseed oils as supplements to vegans to "make sure" they are getting the right balance of fatty acids in their diet. One must wonder how "natural" vegan diets are, if exotic, expensive oils are "required" to provide the right balance of linoleic and alpha-linolenic fatty acids in a vegan diet. If vegan diets are truly "natural," one would intuitively expect it to be relatively easy, as a vegan, to get the optimal balance of fatty acids.

Leaving aside the criterion of naturalness, however, if (typical) vegan diets are held instead simply to be the "optimum," one would not expect the balance of fatty acids in a typical vegan diet to potentially suppress the synthesis of critically important EFAs, like DHA. The studies discussed above, however, raise serious questions on this point. In addition, it appears that despite flaxseed oil supplementation, for example, it may still be difficult to achieve the proper level of DHA.

The studies above also show a strong link between DHA in the diet and DHA levels in body tissues, with vegans having very low levels of DHA when compared to those whose diet includes preformed DHA. This raises the question (discussed below) of whether the levels of DHA and EPA synthesis are optimal or merely adequate to prevent deficiency symptoms.

Positive evidence suggesting DHA/EPA-synthesizing enzymes may be adequate for complete nutrition. Okuyama [1992] presents limited evidence that the desaturation-elongation
enzyme activity is adequate to supply 20-carbon fatty acids like EPA and DHA. Okuyama [1992, p. 170] reports: Finally, it should be emphasized that strict vegetarians ingest only linoleate and alpha linoleate, and usually do not develop essential fatty acid deficiency (8). All these data suggest that humans are no exceptions among mammals in the metabolism of linoleate and alpha linoleate, although the desaturation-elongation activities may be relatively lower in most humans probably due to the negative feedback control of the enzyme systems by large amounts of highly unsaturated fatty acids in their diets (9).

DHA and EPA synthesis: minimally sufficient, or optimal? The obvious question that arises here
is whether the desaturation-elongation process provides adequate DHA for optimal health, or merely enough to prevent deficiency symptoms. As pointed out earlier in this paper in Part 4 ("Intelligence, Evolution of the Human Brain, and Diet"), recent research on the evolution of the human brain and encephalization has shown a drop in human brain size of 11% in the last 35,000 years and 8% in the last 10,000, a timeframe which coincides with steadily decreasing amounts of animal food in the human diet. Although correlation alone is of course not causation, this evidence strongly suggests further research is needed in the area of human requirements for the preformed long-chain fatty acids important to brain and central nervous system development.

EFAs: synopsis
The scientific evidence available to date is somewhat unclear. Synthesis of EPA and DHA (from plant source oils) may be adequate in some individual adults; whether that is true for the general (adult) population is unclear. The apparent difficulty in getting a favorable fatty acid "balance" on a vegan diet; the sporadic efficiency of conversion of LNA to EPA/DHA; and the recent evidence of decreasing human brain size appear to support the idea that humans have adapted to diets that include preformed EPA/DHA (i.e., animal foods). Beyond this, however, the real question is to what degree humans may be dependent on obtaining these preformed n-3 fatty acids from the diet not simply to prevent potential deficiency but for optimal functioning. Conclusions are at this point equivocal and more research is needed in this area.

Bitter Taste Rejection: A Marker for Dietary Trophic Levels


This section is based on the excellent paper by Glendinning [1994]. The primary interest here is in how bitter taste rejection thresholds vary according to dietary trophic level (reflecting possible evolutionary adaptation). However, the Glendinning article also provides hard information that addresses some of the myths about bitter taste promoted in fruitarian/raw circles, and that material will be examined here as well.

Fruitarian/raw vegan myth: bitter = toxic


One of the enduring myths of fruitarianism/raw veganism is that any food that tastes bitter absolutely must be toxic. Glendinning [1994, pp. 1217-1218] notes (with citations) that all naturally occurring toxins taste bitter to humans at some concentration level, i.e.:

Proposition A: Natural toxin implies bitter taste.

However, the claim that is promoted in raw circles is the logical converse, i.e., that:

Proposition B: Bitter taste implies natural toxin.

This section will focus on proposition B.

Lack of proof for the claim that bitter implies toxic. Glendinning [1994] defines the bitter rejection response as the aversive reaction that occurs when one eats bitter foods. He then asks whether the claim that bitter implies toxic is valid, and reports that (p. 1217):

The bitter rejection response has not been evaluated rigorously in terms of its value in avoiding poisons. For the bitter rejection response to be an effective poison detector, there should be a predictable relationship between the threshold concentration for bitterness and that for toxicity. If the bitter threshold is substantially higher or lower than the toxicity threshold for many compounds, the effectiveness of the bitter rejection response as a poison detector would be uncertain. To the author's knowledge, no investigator has explicitly tested for such a relationship across a series of bitter compounds.

Evidence that the claim "bitter implies toxic" is false. The hypothesis is then presented that perhaps the bitter taste threshold co-varies with the level of toxicity of the bitter compounds, i.e., that proposition B above (the fruitarian/raw claim) may be true. Glendinning then goes on to summarize the evidence that indicates the claim is false:

o Genetic variation. The bitter taste threshold in mice varies according to the autosomal gene Soa. In particular, the bitter taste sensitivity of mice is not a function of the toxicity of the item being tasted.
o Inter-strain and inter-species studies with mice found wide variation in bitter taste sensitivity, and no relationship between bitter taste rejection and toxicity levels of the items tested.
o A comparison of the bitter taste threshold with the toxic level of the same compounds, in both mice and humans, found no relationship between the bitter taste and toxicity levels. (See figure 2, p. 1220 of Glendinning [1994] for the comparison plots--very interesting!)


o Several mammalian species have been observed freely feeding on toxic plants, apparently indifferent to the toxicity. The inference here is that the toxins in these plants failed to produce the bitter rejection response.

Interested readers are strongly encouraged to read the original article for the citations and details. Glendinning [1994, p. 1218] concludes:

Taken together, the above studies indicate that bitter taste sensitivity does not accurately mirror the body's systemic reactivity to naturally occurring or synthetic [toxic] compounds in several species of mammal.

Fallacious "proofs" that "bitter implies toxic" as cited in raw circles. The reality that there is no legitimate scientific proof for the fruitarian/raw claim provides yet another example of advocates' claims failing to meet the logical requirement of burden of proof. Instead, the claim may be presented with bogus "proofs" such as the following.

A simple "proof" that one encounters is the claim, "Your taste buds evolved to reject bitter because toxic items are bitter." The preceding seems attractive, but it is actually a claim that proposition A somehow "proves" its own converse, proposition B. This is logically invalid.

A more extensive crank science "proof" that one encounters is when the fruitarian/raw "expert" references a phytochemical database, picks out a toxic chemical (in this case, a bitter chemical would likely be chosen) that is present in the food, and then announces that the food contains toxins; therefore there is "scientific proof" that the (bitter) food must be "toxic."

Space does not permit a full exploration here of the massive logical fallacies in the above crank science "proof." However, a short reply to the above is that practically every chemical is toxic in a large enough dose. If one is going to examine toxins in foods, one must consider dosages (harmful dosage of chemical vs. amount in food) and bioavailability as well. The crank science approach (common among fruitarian extremists) of equating the presence in a food of a chemical that is toxic (in isolation, and in large doses) as "proof" the food is "toxic" is intellectually dishonest, and borders on pathological fear-mongering.

Bitter taste as trophic marker / Tolerance to bitter as a dietary adaptation. Turning our attention from fruitarian myths to more serious matters, Glendinning [1994] presents the hypothesis that the bitter taste might serve as a trophic marker. The argument is based on evolutionary adaptation. Inasmuch as plants contain many (unavoidable) bitter constituents, it could be suggested that animals with a plant-based diet would be likely to develop some degree of tolerance for the bitter taste, i.e., they will develop some degree of adaptation to it, and hence have higher bitter taste rejection thresholds. Similarly, those animals whose diet almost never includes the bitter taste, i.e., carnivores, would rarely encounter the bitter taste and have a low tolerance for it, hence very low bitter rejection thresholds. Omnivores would fall in between the two extremes in bitter taste tolerance. Glendinning [1994] hypothesizes that the bitter rejection thresholds would follow the order, from lowest to highest threshold (p. 1219):

o Carnivores
o Omnivores
o Herbivores--grazers (i.e., grass feeders)
o Herbivores--browsers (i.e., forb, shrub, and tree feeders)

Comparison of bitter rejection thresholds in mammals. Glendinning then examined the bitter
rejection thresholds of 30 mammal species to the chemical quinine hydrochloride (QHCL), using data from published studies. QHCL was chosen because it is widely studied, and sensitivity to QHCL has been found in a number of studies (see Glendinning for citations) to be correlated to sensitivity to many other bitter compounds. Next, Glendinning computed average QHCL taste thresholds by trophic group. The numbers he obtained [Glendinning 1994, pp. 1222, 1225] are as follows:


TROPHIC GROUP              QHCL TASTE THRESHOLD
Carnivores                 2.1 x 10^-5 M
Humans                     3.0 x 10^-5 M
Omnivores                  3.0 x 10^-4 M
Herbivores--grazers        6.7 x 10^-4 M
Herbivores--browsers       3.0 x 10^-3 M

Note: in the above table, M = moles/liter. The results agree with Glendinning's hypothesized ordering of bitter rejection thresholds by dietary trophic level. Also of interest is that the human QHCL threshold lies between the values observed for carnivores and omnivores, and is closest to the average carnivore value.

The significance of Glendinning [1994] is that the study provides an actual analysis of bitter taste tolerance data for a wide variety of species, and reaches a result suggesting (though of course not proving by itself) that humans may be carnivores/omnivores. This coincides with the extensive evidence already given previously that humans are faunivores.

No doubt certain fruitarian extremists will react to the above and claim that humans reject the bitter taste because our natural food is (nearly exclusively) sweet fruit. However, as discussed in previous sections, there is no legitimate scientific evidence for the claim that humans evolved as fruitarians (whether strictly so or not). Additionally, many wild fruits are quite bitter, and food shortages or competition for food would likely make the consumption of bitter fruit necessary at times--implying that "fruitarian humans" should have some tolerance for bitter. If humans were strict fruitarians, one would expect our QHCL tolerance level to be similar to the chimpanzee's. However, the QHCL tolerance level for chimps is 1.0 x 10^-4 M [Glendinning 1994, p. 1226], a whole order of magnitude larger than the figure for humans. Finally, many of the same fruitarian extremists claim (at times with intense emotion) that bitter foods must be toxic, the very claim that Glendinning neatly assesses and discredits.
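To make the spread of these thresholds concrete, here is a minimal Python sketch comparing each group's tolerance to the carnivore average, using the values above (plus the chimpanzee figure from Glendinning [1994, p. 1226]):

    # QHCL bitter-taste rejection thresholds (moles/liter), from Glendinning [1994].
    thresholds = {
        "carnivores": 2.1e-5,
        "humans": 3.0e-5,
        "chimpanzees": 1.0e-4,
        "omnivores": 3.0e-4,
        "herbivores (grazers)": 6.7e-4,
        "herbivores (browsers)": 3.0e-3,
    }

    baseline = thresholds["carnivores"]
    for group, molar in thresholds.items():
        # A higher threshold means more bitterness is tolerated before rejection.
        print(f"{group:22s} {molar:.1e} M  ({molar / baseline:5.1f}x carnivore average)")

The output makes the point in the text plain: humans tolerate only about 1.4 times the carnivore average, while chimpanzees tolerate nearly 5 times it, and browsing herbivores over 140 times it.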

Insulin Resistance: The Carnivore Connection Hypothesis of Miller and Colagiuri [1994]
THE PROBLEM: Incidence of adult-onset diabetes varies depending on recency of a population's exposure to agriculture.
This section is based on Miller and Colagiuri [1994]. The carnivore connection hypothesis they elucidate provides an explanation, based on evolution, of the following phenomenon:

o Populations that have a relatively "long" history of agriculture, e.g., those of European descent, have a relatively low incidence of NIDDM, that is, non-insulin-dependent diabetes mellitus, also known as adult-onset diabetes.

o Populations that adopted agriculture more recently or, in other words, discontinued the evolutionary hunter-gatherer diet more recently, have relatively higher incidence rates of NIDDM. In some former hunter-gatherer populations, incidence levels of NIDDM range from epidemic levels (Nauruans; the Pima tribe of the U.S.) to rates that are "only" several times that of Europeans.

THE HYPOTHESIS: Overview


The carnivore connection is a hypothesis that explains the above in terms of insulin resistance and partial adaptation to the high-carbohydrate diets introduced via agriculture. The major points of the hypothesis are as follows (from Miller and Colagiuri [1994, p. 1280 (abstract)]):

Our primate ancestors ate a high-carbohydrate diet and the brain and reproductive tissues evolved a specific requirement for glucose as a source of fuel... [T]he Ice ages which dominated the last two million years of human evolution brought a low-carbohydrate, high-protein diet. Certain metabolic adaptations were therefore necessary to accommodate the low glucose intake. Studies in both humans and experimental animals indicate the adaptive (phenotypic) response to low-carbohydrate intake is insulin resistance... We propose that the low-carbohydrate carnivorous diet would have disadvantaged reproduction in insulin-sensitive individuals and positively selected for individuals with insulin resistance. Natural selection would therefore result in a high proportion of people with genetically-determined insulin resistance.

Note here that insulin resistance is a disadvantage on the high-carbohydrate diets provided by agriculture, as the metabolism of sugar (and starch) requires far more insulin than the metabolism of fats or proteins. That is, an agricultural diet requires more insulin than a hunter-gatherer diet. Hence, because of a relatively longer history of agriculture, selection pressure for insulin resistance was released sooner in European populations--resulting in a lower incidence of NIDDM.

[Figure omitted: schematic illustration of the carnivore connection hypothesis. Reproduced courtesy of Springer-Verlag (Springer Science Online).]

Evidence for the hypothesis


Miller and Colagiuri [1994, p. 1281] comment:

Our hypothesis hinges on four lines of evidence.

1. that during the last two million years of evolution, humans were primarily carnivorous, i.e., flesh-eating hunters consuming a low-carbohydrate, high-protein diet
2. that a low-carbohydrate, high-protein diet requires profound insulin resistance to maintain glucose homeostasis, particularly during reproduction
3. that genetic differences in insulin resistance and predisposition to NIDDM can be explained by differences in exposure to carbohydrate during the past 10,000 years
4. that changes in the quality of carbohydrate can explain the recent epidemic of NIDDM in susceptible populations.

The beauty of the carnivore connection hypothesis of Miller and Colagiuri is that it provides an elegant, logical (and plausible) explanation for the increase in NIDDM that occurs when former hunter-gatherer societies adopt the high-carbohydrate diets of modern agriculture. At present, it remains a hypothesis, but is included here to stimulate thought on these issues, and to introduce readers to the powerful insight that evolutionary/Paleo research can provide into problems in modern nutrition and health.

Related studies on insulin resistance

The "thrifty genotype" hypothesis of Neel [1962, 1982]. The earliest paper to examine the genetic aspects of insulin resistance as it relates to diabetes mellitus is Neel [1962]. Neel hypothesized that the diabetic genotype was in a sense a "thrifty" genotype that encouraged the efficient utilization of large amounts of food. Such a feature might convey survival advantages in an environment in which both feast and famine were common (though, obviously, not at the same time). The apparent overproduction of insulin in modern times by certain individuals might be a remnant from evolutionary times, when it may have increased adipose fat storage to tide people over during lean times.

Neel [1982] revisits the 1962 research, and notes that subsequent research invalidated several details of his 1962 paper. Neel's later paper provides three different approaches which might explain or support the original hypothesis of the "thrifty genotype" (as Neel's hypothesis is widely known). Two of the approaches revolve around having non-functional or insensitive insulin receptors, and the third approach suggests differences in insulin sensitivity between glucose and lipid pathways in each individual.

The "not-so-thrifty genotype" hypothesis of Reaven [1998a]. Reaven [1998a] hypothesizes that
the underlying function of insulin resistance is not to increase fat storage as Neel suggests, but to spare muscle tissue from proteolysis, i.e., to preserve muscle tissue in periods of famine. Reaven [1998a] claims (and provides one citation in support) that there is evidence that insulin-controlled glucose disposal is genetically determined. Reaven's hypothesis is called the "not-so-thrifty genotype." Ozanne and Hales [1998] challenge Reaven [1998a], and note that the evidence is weak in support of claims that NIDDM has a genetic basis. They also note that Reaven's approach suggests that individuals with NIDDM are sensitive to the anti-proteolytic action of insulin, while simultaneously being resistant to the glucose-disposal effects of insulin (analogous to the third approach of Neel [1982]).

Both hypotheses (Neel, Reaven) are based on a dubious assumption. Cordain et al. [1998]
criticize the approaches of both Neel and Reaven, as both are based on the assumption that famine was common in evolutionary times. However, Cordain et al. cite extensive evidence that starvation was/is less common in prehistoric peoples and among modern hunter-gatherers when compared to subsistence agriculturalists. Instead of assuming periodic famine, Cordain et al. [1998] argue for insulin resistance on the basis described in Miller and Colagiuri [1994] (as outlined above on this page). (Also see Reaven [1998b] for additional comments.)

Insulin Resistance: The Carnivore Connection Hypothesis (Part 2)


Syndrome X and an alternative vegetarian hypothesis on insulin resistance

Description of syndrome X. The term "syndrome X" has two meanings in medicine. One usage of the term applies to patients displaying angina and electrocardiographic evidence of heart disease without accompanying evidence of coronary artery disease. The term "microvascular angina" is slowly replacing "syndrome X" for this disorder [Reaven 1994]. This particular usage of the term has been described first so that we can focus on the meaning of the term "syndrome X" of interest here; from Reaven [1994, p. 13]:

Indeed, there is substantial evidence [3, 5-8] that the combination of insulin resistance and compensatory hyperinsulinaemia predisposes individuals to develop a high plasma triglyceride (TG) and a low high-density lipoprotein (HDL) cholesterol concentration, high blood pressure, and coronary heart disease (CHD). In 1988, I suggested that this cluster of abnormalities constituted an important clinical syndrome, designated as Syndrome X [3].

Reaven [1994] proposes that the term syndrome X be expanded to include, as possible symptoms, hyperuricaemia (increased serum uric acid concentration), the presence of smaller low-density lipoprotein (LDL) particles, and other conditions. The reason for discussing syndrome X here is two-fold. First, the term may be relevant to some (but certainly not all) of the health problems that occur when former hunter-gatherers switch from their traditional diets to the high-carbohydrate diets of agriculture. Readers should be aware that other diagnoses (including the diagnosis of "good health") may be relevant, and it is not suggested that all former hunter-gatherers suffer from syndrome X.

Alternative vegetarian hypothesis for insulin resistance? The second reason to discuss syndrome
X is that a recent hypothesis, Provonsha [1998], might possibly be promoted by raw/veg*n advocates as a vegetarian alternative to the insulin resistance hypotheses discussed above. Provonsha's hypothesis claims (in opposition to implications of the above hypotheses) that meat consumption causes insulin resistance and related syndrome X conditions. A brief summary of the major parts of the hypothesis of Provonsha is as follows.

Simplistic binary logic is a basic tenet of the hypothesis. Biochemical processes can be classified in a binary manner: anabolic (associated with food intake) and catabolic (associated with fasting and recovery from injury). Implicitly, Provonsha appears to indicate that anabolic = good, and catabolic = bad, in discussing processes. The consumption of animal tissue (protein) results in the production of the hormones glucagon and cortisol, whose action opposes that of insulin. (Provonsha classifies insulin as anabolic, and glucagon and cortisol as catabolic.)

Meat protein = "catabolic"; allegedly causes syndrome X. Meat protein is "catabolic" and will likely activate catabolic processes in the human body, causing insulin resistance and promoting syndrome X.

There are numerous problems with Provonsha's arguments. First, Provonsha tries to class all biochemical processes as either anabolic or catabolic, and he expresses amazement that both processes can be present in the body at once; from Provonsha [1998, p. 120]: "It has always amazed me that 'protein' activates both insulin and glucagon, hormones with completely opposite functions (Unger, 1974)."

Binary logic of hypothesis is an inadequate model of reality. It appears that Provonsha believes that the human body should conform to a simplistic binary model: anabolic or catabolic, where the "or" is exclusive. However, the human body is a very complex system, with anabolic and catabolic processes underway simultaneously, and in balance. Black-and-white reasoning similar to what Provonsha appears to suggest can also be found in the simplistic approaches of certain raw vegan extremists. Suffice it to say that such reasoning does not reflect well on Provonsha's hypothesis.

Second, Provonsha argues against glucagon and cortisol, yet his criticisms of them are unconvincing. He argues that both raise blood sugar, and claims that cortisol increases insulin resistance and promotes obesity (he cites three references, the titles of two of which relate to obesity). His criticism of glucagon similarly discusses its role in promoting obesity, although he provides only one citation--a study of diabetic rats--that may indirectly support this claim. The basic problem with his arguments here is that since studies on the direct effects of cortisol and/or glucagon are not cited (with the possible exception of the rat study above), his argument is essentially a narrow "this might happen if we look at specific processes in isolation" [not a quote from Provonsha]. Such arguments are frankly not very compelling, as the human body is a complex, interrelated system, with catabolic and anabolic processes active simultaneously. Further, glucagon and cortisol, and catabolic processes in general, are necessary and are part of the balanced homeostasis required to support life.

Fallacy that meat = high-fat consumption recited. Also, Provonsha engages, on p. 122, in the
common fallacy of equating meat consumption with high-fat consumption--demonstrating ignorance of paleolithic diet research that has analyzed wild animal meats, which were part of humanity's evolutionary diet, and found them to be mostly very low-fat (muscle meats) or, at the least, low in saturated fat (organ meats), and that lean meats do exist. (The character of wild game is discussed in a later section.)

Unsupported claims in hypothesis. Third, Provonsha claims (p. 124), without evidence or citations,
that if one eats meat, one will absorb "untold" chemicals that are catabolic and can disrupt human body processes. Because he provides no citations in support of the preceding claims, Provonsha's claims that eating meat might disrupt body processes appear to be speculative. Provonsha also criticizes the fact that glucagon (produced by eating animal tissue) inhibits pancreatic enzyme production. However, he fails to mention that many grains and legumes--common plant foods, and the major caloric basis for most (conventional) vegan diets--contain significant amounts of enzyme inhibitors as well. (For more on antinutritional components in grains/legumes, see The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance.)

Real-world dietary trials directly contradict the hypothesis. Provonsha's hypothesis that meat
protein causes insulin resistance ultimately fails the test of real-world dietary trials. While it is true that eating fat with carbohydrate may lead to lower glucose and higher insulin levels (see, e.g., Collier et al. [1988]), it does not follow that meat causes chronic hyperinsulinemia or syndrome X. To the contrary, two studies on this very issue provide evidence that dramatically contradicts the core of Provonsha's hypothesis.

Results of trial on nearly 100% animal food (seafood) diet. O'Dea and Spargo [1982] investigated the plasma insulin and glucose responses in 12 full-blooded Australian Aborigines both before and after 2 weeks on a diet of nearly 100% animal flesh (seafood). Prior to the trial, the diet of the Aborigines was, on an energy (caloric) basis, high in carbohydrate (40-50%), high in fat (40-50%), and low in protein (10% or less). The seafood diet observed for 2 weeks was, on an energy basis, 70-75% protein, 20-25% fat, and less than 5% carbohydrate. If Provonsha's hypothesis regarding animal protein causing insulin resistance were true, one would expect an increase in insulin resistance from such a high-protein diet. However, the opposite was observed; from O'Dea and Spargo [1982, pp. 496-497]:

Together these findings suggest an improvement in glucose utilization and insulin sensitivity after the high protein-low carbohydrate diet... The mechanisms by which low carbohydrate diets which are high in protein preserve glucose tolerance while those high in fat do not, is probably related to the gluconeogenic potential of high protein diets. Elevated glucose levels in response to the ingestion of protein would promote hepatic gluconeogenesis from amino acids entering the liver from the splanchnic circulation. In this way a low carbohydrate diet which was high in protein could maintain the necessary glucose supply to the body whereas one high in fat could not.
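To make the macronutrient figures concrete, here is a minimal Python sketch converting the seafood diet's energy percentages into approximate gram amounts (the 2,000 kcal/day intake is an assumption for illustration; the factors of 4, 9, and 4 kcal/g for protein, fat, and carbohydrate are the standard Atwater values):

    # Approximate gram amounts for the seafood diet of O'Dea and Spargo [1982],
    # assuming a 2,000 kcal/day intake (illustrative only).
    KCAL_PER_DAY = 2000
    ATWATER = {"protein": 4, "fat": 9, "carbohydrate": 4}  # kcal per gram

    # Midpoints of the energy ranges reported in the text.
    energy_share = {"protein": 0.725, "fat": 0.225, "carbohydrate": 0.05}

    for nutrient, share in energy_share.items():
        grams = KCAL_PER_DAY * share / ATWATER[nutrient]
        print(f"{nutrient:13s} ~{share:.0%} of energy = ~{grams:.0f} g/day")

On these assumptions, the diet works out to roughly 360 g of protein, 50 g of fat, and 25 g of carbohydrate per day--an extreme protein load by modern standards, which makes the observed improvement in insulin sensitivity all the more striking.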


Reversion to high-protein, meat-based diet improves lipid and carbohydrate metabolism in Westernized, diabetic Australian Aborigines. O'Dea [1984] presents the results of a study on 10
diabetic and 4 non-diabetic full-blooded Australian Aborigines. They were tested both before and after living for 7 weeks in the traditional (hunter-gatherer) manner, in northwestern Australia. The 7 weeks consisted of 1.5 weeks of travel, 2 weeks at a coastal location living on mostly seafood, and 3.5 weeks inland living on land animals with some wild plant foods. During the first 1.5 weeks (travel time), the diet was 90% animal foods on an energy (caloric) basis. The coastal diet composition was nearly all animal foods, mostly seafood with some birds and kangaroo, and was approximately 80% protein, 20% fat, and less than 5% carbohydrate. The diet at the inland location was 64% animal foods (by energy) and 36% plant foods, the macronutrient content of which was 54% protein, 13% fat, and 33% carbohydrate.

Significant improvements noted in both glucose tolerance and insulin response. Once again, if
Provonsha's hypothesis that animal protein causes or aggravates insulin resistance were true, one would expect the above animal-protein-based diets to greatly aggravate the condition of the diabetics in the study. However, the opposite happened: the condition of the diabetics improved. From O'Dea [1984, pp. 601, 602]: The major finding in this study was the marked improvement in glucose tolerance in 10 diabetic Aborigines after a 7-wk reversion to traditional hunter-gatherer lifestyle. There were two components to this improvement: a striking fall in the basal (fasting) glucose concentration and a less marked, but nevertheless significant, improvement in glucose removal after oral glucose... The three most striking metabolic changes that occurred in this study, namely the reductions in fasting glucose, insulin, and triglyceride concentrations to normal or near-normal levels, were certainly interrelated. Note that the insulin response to glucose also improved [O'Dea 1984, pp. 596, 599]. The above studies that contradict Provonsha's hypothesis, plus the other problems as noted previously, suggest the hypothesis has little merit, and raw/veg*n advocates should not try to use it as a vegetarian alternative explanation to the more plausible carnivore-connection hypothesis for insulin resistance or syndrome X.

Insulin resistance and fruitarianism

Diabetes-like symptoms. On adopting a fruitarian diet, anecdotal evidence (the only evidence available) indicates that many individuals experience diabetes-like symptoms, specifically some of the following: excess urination, frequent thirst, fatigue, mood swings, intermittent blurred vision, pains in extremities (hands, feet), etc. The standard fruitarian "party line" explanation for such symptoms is that they are detox and will go away once you are "pure" enough. In reality, the excess urination and frequent thirst usually persist, so long as one is consuming large amounts of sweet juicy fruit. (Consider the physics involved: large amounts of water are ingested, in the form of sweet juicy fruits, and approximately the same amount of water must be excreted as well.) A typical response in the face of this ongoing experience is to (mentally) adjust to the symptoms; that is, one may begin to consider it normal or healthy to urinate several times per hour, have frequent mood swings, experience intermittent fatigue (which may be considered as evidence of "detox"), and so on.
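A rough mass-balance sketch in Python illustrates the physics (the energy density and water content figures are rough illustrative assumptions for typical sweet fruit, not values from a cited source):

    # Rough water intake on a fruit-only diet, under illustrative assumptions.
    KCAL_PER_DAY = 2000
    FRUIT_KCAL_PER_100G = 60     # assumed typical energy density of sweet fruit
    FRUIT_WATER_FRACTION = 0.85  # assumed typical water content of sweet fruit

    fruit_g = KCAL_PER_DAY / FRUIT_KCAL_PER_100G * 100
    water_liters = fruit_g * FRUIT_WATER_FRACTION / 1000

    print(f"~{fruit_g / 1000:.1f} kg fruit/day, carrying ~{water_liters:.1f} L of water")

Under these assumptions, meeting 2,000 kcal/day from sweet fruit alone means eating on the order of 3 kg of fruit and ingesting (and therefore excreting) close to 3 liters of water daily, before any drinking water is counted.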

(Note regarding thirst symptoms: Thirst on a fruitarian diet may seem counterintuitive, since the level of fluid intake is so high via all the fruit. However, some fruits [citrus, for example] are diuretic and increase urination. When such fruits are used as staples, this can lead to thirst and/or the need to drink more water, at least in my own former experience as a fruitarian. Or instead, one may become thirsty but suppress the urge to drink, out of weariness with urinating every few minutes. This syndrome may be partially responsible for why some fruitarians "don't drink water," aside from just the high water content of the diet.)

The carnivore connection hypothesis may explain these symptoms as well; i.e., the individual is consuming grossly excessive carbohydrate (sugar) given his or her level of insulin resistance. Of course other factors may also be involved (e.g., insulin inhibitors in the diet, plus other nutritional factors: zinc, B-12, taurine, insufficient calories, etc.), but the hypothesis that the "ideal" fruitarian diet is beyond the range of genetic adaptation (i.e., decidedly unnatural) for many individuals is not only tantalizing, but highly plausible as well. The wide incidence of diabetes-like symptoms among fruitarians may be (additional, circumstantial) evidence that although our prehistoric ancestors certainly consumed fruit (when available), strict fruitarian diets were never an important factor in human evolution.

Rationalizations by fruitarians. No doubt the defenders of fruitarianism will claim that a high
incidence of NIDDM in former (recent) hunter-gatherer populations is all the fault of the grains/legumes in their new diet, and that there would be no NIDDM if the former hunter-gatherers were to eat the "ideal" fruitarian diet. But where is the evidence (even including anecdotal evidence) for such a claim? More to the point, the extensive anecdotal evidence surrounding fruitarianism is that it is a massive failure, in the long run, and the diabetes-like symptoms (which may be due to excess sugar consumption) are common among those who try to subsist on a diet of mostly sweet fruit. Further, such evidence is found among people of European descent, i.e., those who have had a longer time to adapt to higher carbohydrate diets.

Comparative Physiology: Overall Synopsis


Some of the physiological evidence that humans are adapted to a diet that includes substantial animal products (fauna; i.e., we are faunivores) is:

o Heme iron receptor sites. Our intestines contain receptor sites specifically for the absorption of heme iron, which is found in nutritionally significant quantities only in animal foods. This is strong evidence of evolutionary physiological adaptation to animal foods in the human diet.

o B-12 considerations. Humans need vitamin B-12, but all current evidence suggests that plant foods were not a reliable, year-round source during human evolution. Geophagy and coprophagy are not plausible sources, leaving animal foods (including insects) as the sole reliable, plausible source.

o Taurine synthesis. Relative efficiency of synthesis: the synthesis of taurine is much less efficient in humans than in herbivorous animals.

o Beta-carotene to vitamin A conversion. Relative efficiency of conversion: the conversion of beta-carotene to vitamin A is much less efficient in humans than in herbivorous animals.

o Sufficiency and balance of EFAs. Common, staple plant foods generally do not contain the right "balance" of EFAs, and production of EPA and DHA from plant-source fats may be inefficient. It's hard to understand why--if humans really are natural vegans--the "optimal" balance of EFAs is apparently so difficult to achieve with plant foods.

o Bioavailability issues. Relative efficiency of digestion/bioavailability: although animal foods are generally easier for any mammal to digest than plant foods for structural reasons (e.g., cell wall considerations), the fact that many staple plant foods contain high levels of factors that inhibit the human digestive process suggests a long evolutionary dependence on animal foods as major nutrient sources. Examples of relative bioavailability are as follows.
  o Iron in animal foods is more bioavailable than in plant foods.
  o Zinc is more bioavailable in animal foods than in plant foods.
  o Animal protein is digested more efficiently than plant protein.

o Bitter taste thresholds. Analysis of bitter taste thresholds by Glendinning [1994] shows that the human bitter taste threshold is in the same range as faunivores'.


Taken individually, many of the above points are equivocal. When considered collectively, however, they strongly point to animal foods having an important role in the human diet during evolution. Also, two important hypotheses relating diet and evolution were discussed here:

o The incidence of hereditary hemochromatosis, a relatively common (in certain populations) "iron overload" disease, may be an example of a partial genetic adaptation that promotes survival on the high-carbohydrate, lower-animal-food diets of agriculture, by increasing iron absorption.

o The carnivore-connection hypothesis of Miller and Colagiuri explains the high incidence of NIDDM in former (and only recently Westernized) hunter-gatherer populations as being due to insulin resistance; i.e., their insulin resistance level has not yet begun to adapt to the high-carbohydrate diets of agriculture.

Specific concerns for fruitarians. Additionally, specific hypotheses regarding fruitarianism were presented:

o Heightened B-12 risk. Strict fruitarianism might accelerate vitamin B-12 deficiency by decreasing production of gastric acid. This may be a low-risk issue, as it is very rare for anyone to strictly follow a fruitarian diet long-term; i.e., "cheating" and binge-eating are common on the diet.

o Low zinc and feelings of "euphoria." Zinc deficiency is a plausible potential explanation for the "euphoric" mental feeling reported by some fruitarians (and also an explanation for the loss of libido reported by some).

o Diabetes-like symptoms. The carnivore-connection hypothesis of Miller and Colagiuri might explain the high incidence of diabetes-like symptoms among fruitarians, and the extremely high failure rate among those who try the diet. It seems plausible, given the predominant picture presented by the anecdotal record, that most people are not genetically adapted to a diet in which (approximately) 75+% of calories come from sugar, a simple carbohydrate that requires insulin for metabolism.

PART 8: Further Issues in the Debate over Omnivorous vs. Vegetarian Diets
Preface and overview. This section examines some of the related or ancillary claims, i.e., claims other
than those based strictly on comparative anatomy/physiology, that are sometimes included in the comparative "proofs" for specific diets. Inasmuch as there is a wide variety of such claims, and such ancillary claims are often lengthy, this section is primarily an overview or summary response to such claims.


This section first briefly considers two bogus fruitarian claims. Then, as an important prerequisite to discussing the use or interpretation of clinical data in ancillary claims, we examine hunter-gatherer diets to see if a healthy omnivore/faunivore diet exists. Then we examine some common logical fallacies in the interpretation of clinical and epidemiological data. After that, we examine the topic of instinct versus intelligence, and finally finish by addressing a long list of bogus claims about instinct made by a fruitarian extremist.

Two Fruitarian Claims

Anti-protein theories: prime examples of crank science

Overview of anti-protein theories. These theories were mentioned in earlier sections, and only a brief summary is presented here. The basic thrust of such theories is that:

o Actual protein/amino acid requirements are allegedly much lower than the national and international standards suggest. (Note: See the earlier section on protein digestion for pointers to published scientific papers that claim the opposite is true, i.e., that the standards are too low.)

o "Excess" protein, i.e., protein above the very low level in fruits, will create metabolic by-products as a result of digestion. These metabolic by-products are allegedly harmful in anything above very tiny amounts.

Ergo, one could summarize the above as "protein is toxic" in figurative terms, although in literal terms, one might summarize the theory as "protein is toxic in the sense that consuming anything above minimal requirements causes the production of metabolic by-products that are toxic and harmful."

Defects of anti-protein theories. The problems with such theories are that:

o The calculation of protein requirements in such theories may be dubious--it may fail to include a safety margin, or assume that the individual is sedentary, and so on.

o The greatest flaw of such theories is that they are based on massive ignorance of the reality that the human body is very well-equipped to dispose of the metabolic by-products of protein digestion. That is, the theories assume the metabolic products are necessarily harmful without regard for the bioavailability of the metabolic by-products (i.e., the levels actually produced and absorbed vs. known harmful levels) and the ability of our detox systems (liver, other organs) to handle the by-products.

o Such theories are implicitly centered around an intensely paranoid, pathological fear of any/all possible "toxins" in food (with no regard for the body's normal tolerances for, and abilities to safely dispose of, them). Such intense fear of toxins can easily become obsessive. Needless to say, pathological, obsessive fear is not the basis for a healthy diet or dietary philosophy; it can be, however, the basis for an eating disorder (a mental illness).

See the article, "Is Protein Toxic?" on this site (not yet available) for an in-depth analysis of anti-protein theories; also see the article Fruit is Not Like Mother's Milk for background information relevant to the theories.

The theories mentioned above are prime examples of crank science. They might look impressive
to the layperson, as they may include detailed calculations and citations to scientific journals. However, a close look at the logic and other details will reveal them to be crank science: bogus and fallacious. It should be mentioned that even conventional veg*ns might find the above site articles of interest, for they illuminate the crank science and bad logic that pervade the (vegan) fruitarian movement.


Raw irony. Another example of "raw irony" should be noted here. Some of the staunchest advocates of
the anti-protein theories report that they apparently have not managed to succeed, long-term, on the "ideal," ultra-low protein fruitarian diets that they advocate. It seems their message is, "Do as I say, not as I do."

Fruitarian straw argument: a pristine preindustrial world


Inasmuch as paleoanthropology and evolution provide powerful scientific evidence that humans are not natural veg*ns/fruitarians but are instead natural omnivores/faunivores, some raw/veg*n advocates have resorted to straw arguments. The straw argument is to assert that those discussing the evolutionary diet of humans are claiming that humans who follow an evolutionary diet will be disease-free, and that prehistoric life was some kind of Eden.

Of course, such assertions are distortions and exaggerations. No credible researcher of evolutionary diet would suggest that following such a diet will guarantee perfect health, or that prehistoric life was Edenic. In sharp contrast to the straw claims of an Eden, the fossil record points to high mortality rates from what could be described as occupational hazards (hunting, accidents, predators, and other kinds of violence) in prehistoric times. Disease (primarily acute) was also a major mortality factor.

The use of such a straw argument provides yet another example of "raw irony," for it has become commonplace in certain fruitarian circles to claim that the fruitarian diet provides "paradise health." As mentioned in previous sections, extensive anecdotal evidence (all that is available) indicates that adherence to a strict (vegan) fruitarian diet in the long run often leads to serious physical and/or mental ill health.

Let us now turn our attention to more important matters in the following sections.

Hunter-Gatherers: Examples of Healthy Omnivores


A major point often overlooked by dietary advocates eager to condemn all omnivore diets in order to "sell" their favorite (raw/veg*n) diet is that healthy omnivore/faunivore diets do in fact exist. (Such condemnations are based primarily on clinical research that, by default, normally utilizes the SAD/SWD as the "control" omnivore diet.) Besides the apparent anomaly that occasional individuals appear to thrive on the deservedly maligned SAD, hunter-gatherers (who have by now all but disappeared or assimilated into Western lifestyles) provide examples of healthy omnivore/faunivore diets.

HEALTH AND DISEASE IN HUNTER-GATHERERS


Environmental factors

Malnutrition and starvation. Dunn [1968], a paper on health and disease in hunter-gatherers,
makes a number of relevant points. Dunn (p. 233) reports "patent (and perhaps even borderline) malnutrition [in hunter-gatherers] is rare." Dunn also reports that hunter-gatherers are often better nourished than nearby agriculturalists or urban residents. Dunn also notes (p. 223), "Starvation occurs infrequently." Agriculturalists who depend on a few crops for the bulk of their diet are more susceptible to starvation (e.g., caused by crop failure from climatic variation, insect attack, plant diseases) than hunter-gatherers who have a wider dietary base.

Occupational hazards of hunter-gatherer lifestyles. Dunn (pp. 224-225) also mentions that:

4. Accidental and traumatic death rates vary greatly among hunter-gatherer populations...

7. Ample evidence is available that "social mortality" has been and is significant in the population equation for any hunting and gathering society.

In the above, social mortality refers to warfare, cannibalism, sacrifice, infanticide, etc. These are rare or non-existent today, though they may have been significant factors among certain hunter-gatherer populations in the not-too-distant past.

Parasites and infectious diseases. On the topic of parasites, Dunn notes (p. 225):
8. Parasitic and infectious disease rates of prevalence and incidence are related to ecosystem diversity and complexity. Although many of these diseases contribute substantially to mortality, no simple, single generalization is possible for all hunter-gatherers.

Chronic and degenerative diseases

Chronic diseases rare in hunter-gatherer societies. Dunn (pp. 223-224) reports that:

3. Chronic diseases, especially those associated with old age, are relatively infrequent... Although life expectancies of hunter-gatherers are low by modern European or American standards, they compare favorably with expectancies for displaced hunter-gatherers, many subsistence agriculturalists, and impoverished urbanized people of the tropics today (Ackerknecht, 1948; Billington, 1960; Duguid, 1963; Dunn, MS-i; Maingard, 1937; Polunin, 1953). Few hunter-gatherers survive long enough to develop cardiovascular disease or cancer, major causes of mortality in America and Europe today. These diseases do occur infrequently, however, and preclinical manifestations of cardiovascular disorder may be detected in such populations (Mann et al., 1962). Occasional writers have claimed that certain primitive populations are "cancer-free" or "heart disease-free," but sound evidence to support such contentions is lacking, and evidence to the contrary has been steadily accumulating.

Note: the term "MS-i" is as it appears in Dunn [1968]. No explanation is given for it there; possibly it might stand for "manuscript 1."

How "hunter-gatherer" is defined is an important point when determining disease incidence. (note regarding Mann [1962] reference cited in Dunn [1968] above) The comments
from Dunn [1968] above have been cited at enough length to provide the context that no human population is completely disease-free. However, at the same time, there are some important qualifications that need to be clarified regarding groups sometimes cited as "hunter-gatherers" who in actuality do not meet the relevant definition. The paper of Mann et al. [1962] referred to just above, which studied African Pygmies in what was then known as the Congo, is a case in point. The dietary survey component of Mann's study could not be completed due to political unrest, so dietary information had to be obtained from other, perhaps less reliable sources. Mann's paper claims both that the Pygmy diet was mostly wild foods, and that the second and third most common foods in the diet were sweet potatoes and rice, both of which are cultivated foods. The Pygmies also consumed palm oil and salt obtained via trade. By current research standards, of course, all of these foods are well outside the definition of a hunter-gatherer diet. The paper also claims that the Pygmy aboriginal culture has been preserved, while, simultaneously, the Pygmies are oppressed and used as laborers by nearby Bantu tribes. Further, the claims of heart disease were based solely on ECG (electrocardiogram) readings, which Mann et al. [1962] report are an unreliable diagnostic tool for Africans (p. 367). Finally, the abnormal

155

ECG readings that were a basis for the claim of possible heart disease should be interpreted in light of the serum cholesterol readings obtained for the Pygmies: 101 for adult men, 111 for women. These are extraordinarily low readings. Such inconsistencies in the Mann et al. [1962] paper raise considerable doubt concerning their finding of signs of possible heart disease, and also whether the Pygmies tested were following their traditional lifestyle. The lesson here, once again, is to check sources, particularly older ones, to see how well the research standards match currently accepted definitions and criteria.

Humans are not exempt from the reality of diseases. No doubt certain dietary extremists will seize on quotations like the one above regarding the lack of complete freedom from disease in hunter-gatherers, and try to use it against the hunter-gatherer diet. Notwithstanding the above qualifications regarding the care needed in assessing (reputed) hunter-gatherers, a couple of obvious points need to be remembered regarding any population:

o Wild animals can and do suffer from degenerative diseases. McCullagh [1972] discusses the occurrence of atherosclerosis in wild African elephants; Goodall [1986] discusses the incidence of arthritis-like conditions (also colds and pneumonia, among other disorders) among the wild chimps of Gombe. Of course, wild animals can and do suffer from a wide range of non-degenerative diseases (hoof-and-mouth, Lyme disease, bubonic plague, polio, etc.).

o No diet can guarantee "paradise health," and claims that a particular diet does are simply "dietary snake oil." The idea that a particular diet can guarantee "paradise health" (a bogus claim made by some fruitarian extremists) or prevent any/all diseases is nothing but a fairy tale. More specifically, such false claims are also "dietary snake oil" when used by dishonest dietary advocates to "sell" their "ideal" diet. In the real world of nature, diseases exist, and despite the considerably enhanced probability of better health by following a better diet, there are no guarantees in life.

The point of the above is straightforward: wild animals can and do die from diseases, and there is no reason to assume humans are different in this matter. However--the absence of total immunity aside--the incidence rate of chronic and degenerative diseases is in fact very low in hunter-gatherer societies.

Incidence of specific diseases. Although hunter-gatherers are not immune to chronic diseases, the available evidence indicates that such diseases are relatively rare in hunter-gatherer groups who follow their traditional lifestyles. However, the incidence of such diseases increases sharply once these societies begin to assimilate Western culture and lifestyle. Comments on specific diseases are as follows.

o Cancer. As Eaton et al. [1994, p. 361] note:

Medical anthropologists have found little cancer in their studies of technologically primitive people, and paleopathologists believe that the prevalence of malignancy was low in the past, even when differences in population age structure are taken into account (Rowling, 1961; Hildes and Schaefer, 1984; Micozzi, 1991).

Eaton et al. [1994] also analyzed the factors involved in women's reproductive cancers and developed a model that indicates that up to the age of 60, the risk of breast cancer in Western women is 100 times the risk level for preagricultural (e.g., hunter-gatherer) women.

o Cancer in Africa. The famous medical missionary, Dr. Albert Schweitzer, writing in Berglas [1957], reports the following [Berglas 1957, preface]:

On my arrival in Gabon, in 1913, I was astonished to encounter no cases of cancer. I saw none among the natives two hundred miles from the coast. I can not, of course, say positively there was no cancer at all, but, like other frontier doctors, I can only say that if any cases existed they must have been quite rare.

Williams [1908] reports that cancer is extremely rare among Australian aborigines and in the aboriginal peoples of Africa and also North America. (Note the date of the citation: 1908, a time when there were far more hunter-gatherers than there are today.)

o Cancer among the Inuit. Stefansson [1960] describes the search by George B. Leavitt, a physician on a whaling ship, for cancer among the Inuit of Canada and Alaska. It took him 49 years, from 1884 to the first confirmed case in 1933, to find cancer. (Stefansson [1960] describes a possible but unconfirmed case of cancer in 1900, and Eaton et al. [1988] describe cancer in a 500-year-old Inuit mummy.) Schaefer [1981] reports that breast cancer was virtually unknown among the Inuit in earlier times, but was one of the most common forms of malignancy by 1976.

o Cardiovascular disease. Moodie [1981] reports on evidence of hypertension among Australian aborigines from 1926 to 1975. The data support an association between increasing Westernization and hypertension, but there are some inconsistencies. However, citing additional, more recent data, Moodie reports further evidence of increasing hypertensive disease among Aborigines. Moodie [1981] also reports that prior to the 1960s, arteriosclerosis and ischemic heart disease were rare among the Australian aborigines. Schaefer [1981] reports that hypertension and coronary heart disease are extremely rare among the less-acculturated Inuit, but are increasing markedly among the acculturated groups. A 1958 survey of Alaskan natives found no hypertension; however, a 1969 survey found that native Alaskan women showed so-called normal levels of hypertension (i.e., comparable to Western women).

o Diabetes. Moodie [1981] reports that diabetes was rare in Aboriginal communities prior to the 1970s; nowadays the prevalence of diabetes is more than 10% in some Aboriginal communities. Schaefer [1981] reports there are no cases of diabetes among the Inuit who still live the traditional lifestyle. However, cases are now being reported among the acculturated Inuit of the Mackenzie delta area (Canada). Diabetes is also increasing in incidence among the Inuit of Alaska and Greenland.

The above observations indicate that chronic degenerative diseases are rare when traditional hunter-gatherer lifestyles are followed, but that the incidence of disease increases as Westernization occurs. This suggests that, at least from the viewpoint of chronic degenerative diseases, the hunter-gatherer lifestyle and diet were quite healthy.

Biomarkers
The biomarkers for hunter-gatherers following traditional lifestyles reflect a high level of health and physical fitness. Eaton et al. [1988] report that the aerobic fitness of male hunter-gatherers is in the superior-to-excellent range. The triceps skinfold measurements for young males range from 4.6-5.5 mm in hunter-gatherers, compared to 11.2 mm for Westerners. Diabetes prevalence is 1.2-1.9% in hunter-gatherers versus 3-10% for Westerners. Finally, serum cholesterol levels for hunter-gatherers are astonishingly low by modern standards--in the 101-146 mg/dl range. (See Eaton et al. [1988, pp. 742-745] for the tables the preceding data come from.)

Health status of the Aborigines of Australia: O'Dea [1991]


O'Dea [1991] provides additional insight into the health of Australian aborigines, pre-Westernization, when they followed their traditional hunter-gatherer lifestyle. [Table 1 from O'Dea 1991, p. 234: not reproduced in this text version.]

Recognizing that we are all the descendants of hunter-gatherers, the negative impact on modern hunter-gatherers when they adopt Western diets and lifestyles may provide insight into factors behind the health problems that afflict those of us whose hunter-gatherer ancestry is more recent. The standard Western diet is dramatically different from the hunter-gatherer evolutionary diet. [Summary comparison table from O'Dea 1991, p. 237: not reproduced in this text version.]

The hunter-gatherer diet of the Aborigines was low in fat (lean meat from wild animals) and in simple carbohydrates like sugar. Further, considerable energy was required to obtain such foods (i.e., those high in fat or sugar) in the wild, especially in the arid outback of Australia. In contrast, the diet of Westernized Aborigines is high in fat from domesticated animals and in simple carbohydrates (refined sugars, e.g., sucrose, fructose from corn syrup). Also, little effort is required to obtain such foods in today's modern society--just go to the store and buy them (much easier than hunting for your meal).

Impact of Western diet on Aborigines. Once Aborigines switch to a Western diet and lifestyle, they frequently develop the problems common on Western diets: adult-onset diabetes, coronary heart disease (CHD), hyperinsulinemia, high blood pressure, obesity, etc. The incidence of such diseases increases as the degree of Westernization increases in Aborigine communities. As an example, O'Dea [1991, p. 239] observes that:

In the 20-50 year age group, the prevalence of diabetes is ten times higher in Aborigines than in Australians of European ancestry. Although fewer data are available, the prevalence of CHD and hypertension is also considerably higher in Aborigines (Wise et al. 1976; Bastian 1979), and hypertriglyceridemia [high triglycerides] is a striking feature of the lipid profile in all Westernized Aboriginal communities in which it has been measured (O'Dea et al. 1980, 1982, 1988, 1990; O'Dea 1984).

In summary, despite implicit assumptions to the contrary by many raw/veg*n diet advocates, healthy omnivore/faunivore diets do in fact exist, and the diets of hunter-gatherers pre-Westernization are good examples thereof.

Which Omnivore Diet? The "Omnivorism = Western Diet" Fallacy


Fallacy prevalent among scientifically oriented vegans as well as extremists. Perhaps the most
grievous of all fallacies promoted by extremist and conventional vegans alike is the implicit equation of Western diets with omnivorism in general. Ironically, this nearly omnipresent fallacy (in the vegan community) seems to be widespread even among those for whom clinical research is the "gold standard," and among whom proper epidemiological protocols are fine points of discussion. Likely this is because--among such advocates--other avenues of research, such as evolutionary/anthropological studies, tend to be given short shrift or denigrated in comparison to clinical studies. Thus important knowledge with a crucial bearing on the interpretation of clinical studies--knowledge that would be obvious to someone versed in hunter-gatherer research, for instance--can be completely missed, as it is in the perpetuation of this fallacy.

Clinical studies implicitly use those eating Western diets as the "omnivorous control" group. Typically, study after study or "reams of evidence" are cited as showing that veg*n diets are
superior to "omnivorous" diets. But in fact, what the cited studies really indicate is simply that veg*n diets are superior to the deservedly maligned standard American/Western diet (SAD or SWD) with respect to certain degenerative diseases. That is, when not otherwise mentioned, by default those eating the SWD (or variants thereof) implicitly serve as the "control" group to represent "omnivores" in such studies.

Omnivorous hunter-gatherer diets differ from the SAD/SWD in two important respects. Yet
just as typically, no mention is made of the fact that Western diets are considerably different from omnivorous hunter-gatherer diets--for example, in terms of the overall proportions and composition of foodstuffs besides meat. Equally indiscriminate, there seems to be virtually zero awareness (since the subject is rarely if ever mentioned) of the considerable differences in the type of meat itself in such diets: that is, domesticated or feedlot meat in the typical Western diet vs. the wild game of hunter-gatherer diets. Divergences in overall composition between Western vs. hunter-gatherer diets are covered in the section just below. The disparities between domesticated meats vs. wild game are discussed in the subsequent section, "Logical Fallacies and the Misinterpretation of Research," along with other issues that relate to the "Omnivorism = Western Diet" fallacy.

Hunter-Gatherer Diets vs. Western Diets: Wild vs. Domesticated/Cultivated Foods


An important point to be noted concerning hunter-gatherer diets is that they are composed of wild rather than cultivated foods. Hunter-gatherer diets also generally exclude grains, legumes, and dairy, as they are the products of agriculture--something that hunter-gatherers do not practice.

The hunter-gatherer diet is generally composed of:

o Wild fruits, nuts, greens, tubers--raw and/or cooked.
o The lean meat and organs of wild animals.
o Insects (wild, though some tribes, e.g., the Ache of Paraguay, practice semi-cultivation of insects; see Clastres [1972]).
o Other foods, e.g., honey. (Note that wild honey usually includes the bee brood--bees in the form of larvae; reportedly it is the best-tasting part--at least that is what a friend who is a former instinctive eater relates.)

Most modern Western diets, in sharp contrast, consist of:

o Domesticated grains (wheat, corn, rice), tubers, and legumes (which present the problems of gluten, phytates, and other antinutrients, and a high glycemic index for some tubers/refined grains).
o Meat from domesticated animals (high in saturated fat).
o Dairy (lactose intolerance, high in saturated fat).
o Cultivated fruits (bred for high sugar content; high glycemic index for some fruits).
o Hybrid vegetables.
o Manufactured and/or highly processed foods:
  - Hydrogenated oils (synthetic fats, very difficult for the body to process).
  - Heavily processed foods (additives, artificial ingredients).
  - Large amounts of refined sugar (candy, cookies, other sweets).

Comparison of the above lists with the earlier table (herein) from O'Dea [1991] shows the dramatic differences between a potentially healthy omnivore/faunivore diet, i.e., the hunter-gatherer diet, and a diet generally regarded as unhealthy: the SAD/SWD.

The differences between wild and cultivated fruits are discussed in the article, Wild vs. Cultivated Fruit, available on this site. The substantial differences between the meat of wild and domesticated animals are discussed in "Disparities between Wild Game and Domesticated Meat in Fat Distribution and Composition," on this site (not yet available); additional information is provided in Speth and Spielmann [1983] and Naughton et al. [1986].

Eaton and Konner [1985] discuss the differences in protein and fat content for a Paleolithic diet of wild foods versus the SAD diet of cultivated and processed foods; see Eaton and Konner [1985, table 5, p. 288]. Their data estimate a Paleolithic diet as containing 34% protein and 21% fat by calories, while the SAD is 12% protein and 42% fat. The table also shows a ratio of polyunsaturated to saturated fat of 1.41 for a Paleolithic diet versus 0.44 for the SAD. Hence, one might characterize a diet of feedlot meat (SAD) as high-fat, medium-protein, versus a diet of lean meat from wild animals (Paleo) as low-fat, high-protein.
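To make the macronutrient contrast concrete, here is a minimal sketch in Python that simply tabulates the Eaton and Konner [1985, table 5] figures cited above. Note that the carbohydrate figures, computed as the caloric remainder, are my own rough simplification, not numbers taken from the table.

    # Percent of calories (protein, fat) and polyunsaturated:saturated (P:S)
    # fat ratio, as cited above from Eaton and Konner [1985, table 5].
    diets = {
        "Paleolithic (wild foods)":   {"protein_pct": 34, "fat_pct": 21, "ps_ratio": 1.41},
        "SAD (cultivated/processed)": {"protein_pct": 12, "fat_pct": 42, "ps_ratio": 0.44},
    }

    for name, d in diets.items():
        # Rough simplification: treat carbohydrate as the caloric remainder.
        carb_pct = 100 - d["protein_pct"] - d["fat_pct"]
        print(f"{name}: protein {d['protein_pct']}%, fat {d['fat_pct']}%, "
              f"carbohydrate ~{carb_pct}%, P:S ratio {d['ps_ratio']}")

The output makes the characterization at the end of the paragraph visible at a glance: the Paleolithic profile is low-fat and high-protein, the SAD profile the reverse.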

Logical Fallacies and the Misinterpretation of Research


CLINICAL STUDIES, ANIMAL EXPERIMENT STUDIES, AND EPIDEMIOLOGICAL STUDIES

Introduction
Some of the more recent and more advanced comparative "proofs" of diet may cite as supportive evidence clinical studies, epidemiological studies such as the Cornell China Project, and/or animal experiment studies. A large number and variety of studies may sometimes be cited in comparative "proofs." However, sheer number of studies cited means little if the interpretations drawn from study results are based on fallacious logic and/or misinterpretation.

Common logical fallacies. Typically, such studies are presented as allegedly indicating that veg*n diets
are healthier than omnivore diets. However, nearly all of such claims are actually logical fallacies. A short summary of such commonly encountered fallacies is as follows.


Results from studies using domesticated or feedlot meat cannot be generalized to all omnivore diets. Results of experiments in which test subjects eat domesticated/feedlot meats are often interpreted--usually unknowingly, but at the least implicitly--as reflecting the results of all meats and/or all omnivore diets. However, the composition of domesticated/feedlot meat is very different from the composition of the lean wild animal meat (in terms of both overall fat level and the types of fats contained) that was a part of humanity's evolutionary diet. (Not yet available: "Disparities between Wild Game and Domesticated Meat in Fat Distribution and Composition" explores these differences in detail.)

Eaton et al. [1996, p. 1735] note that the level of fat in domesticated meat is on average five times higher than in wild game. Just as tellingly, the proportion of saturated fat as a percentage of total fat is five to six times higher in domesticated meat. And there are other significant differences as well. Eaton and Konner [1985, p. 285] point out that EPA, an essential omega-3 fatty acid, constitutes approximately 4% of total fat in wild game, while in domesticated beef there is almost none. Eaton et al. [1996, p. 1736] further note that game meats are also high in other long-chain fatty acids such as arachidonic acid [AA] and docosahexaenoic acid [DHA], with the result that the meats consumed in hunter-gatherer diets produce high plasma levels of these lipids compared to Westerners, and help to maintain a dietary omega-6 to omega-3 fatty acid ratio ranging from 1:1 to 4:1--far below the 11:1 ratio typical of Westernized countries. The significance of this difference, as discussed by Eaton, is that the high omega-6 to omega-3 ratio prevailing in Western countries may be linked with the promotion of cancer.
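As a purely hypothetical illustration of the ratio arithmetic above: the gram intakes below are invented placeholders; only the target ratio ranges (1:1 to 4:1 vs. roughly 11:1) come from Eaton et al. [1996].

    def n6_n3_ratio(omega6_g: float, omega3_g: float) -> float:
        """Dietary omega-6 to omega-3 fatty acid ratio."""
        return omega6_g / omega3_g

    # Hypothetical wild-game-based day: moderate n-6, substantial n-3.
    print(round(n6_n3_ratio(omega6_g=6.0, omega3_g=3.0), 1))    # 2.0 -> within 1:1 to 4:1
    # Hypothetical Western day: n-6 heavy, n-3 poor.
    print(round(n6_n3_ratio(omega6_g=16.5, omega3_g=1.5), 1))   # 11.0 -> ~11:1

The point of the toy numbers is simply that the ratio is driven as much by the large omega-6 load of the Western pattern as by its low omega-3 content.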

Logically invalid extrapolation from SAD/SWD diets to all omnivore diets. Likewise, clinical studies comparing the SAD (standard American diet) or SWD (standard Western diet) to veg*n diets often assume that the poor results on the SAD/SWD imply all omnivore/faunivore diets would give similar results. That is, bad results on the SAD/SWD are cited as "proof" that virtually all omnivore/faunivore diets are "bad." There is no logical basis for this; it is an example of "unsupportable extrapolation."

Results of epidemiological studies may be exaggerated. The importance and relevance of epidemiological data, especially the China Project, may be exaggerated by dietary advocates. This will be discussed in greater depth in a later section.

Let us now begin our summary exploration of these topics.

How reliable are animal studies that use domesticated/feedlot meat?


Due to the significant differences in composition of wild vs. domesticated meats, animal studies that use meat from feedlot animals cannot be used to reliably project the results of all possible omnivore diets. That is, the results of such studies are not representative of hunter-gatherer diets, or, for that matter, any other omnivore diet that excludes domesticated/feedlot meat. In other words, to project from animal studies using domesticated meats to all omnivore/faunivore diets is a logical fallacy--a fallacy one may find commonly practiced by raw/veg*n diet advocates. Some additional remarks on this issue are:

Hunter-gatherers generally consume all (or nearly all) of the animals they harvest.
(O'Dea [1991] discusses this point regarding the Aborigines of Australia.) Preference is given to internal fatty organs over the lean muscle meats. In contrast, test animals in experiments may be fed nothing but marbled muscle meats. (The nutritional composition of organs is itself different from the composition of muscle meat, not to mention marbled, domesticated/feedlot muscle meats.)

In some tests, animals are put on a diet that may be far beyond their range of natural diets, e.g., putting a natural vegetarian (like a rabbit) on a diet of, say, cooked bacon. As
the gut morphology of a rabbit is not adapted to a pure meat diet, the results of such tests must be interpreted with caution.


In some tests, the animals used have been domesticated extensively, e.g., "nude" mice
that are considered desirable because they readily show responses to drugs. Obviously, the results of tests on animals bred to "yield results" must be interpreted carefully. The issues of reliability and projection in cross-species comparisons also deserve consideration; see Freedman [1988] for a discussion of the (technical) statistical issues involved in developing cancer models based on data from mice.

The above points suggest the following conclusions:

Studies using domesticated or feedlot meat are not representative of all omnivore diets. Due to the many differences in the composition of wild vs. domesticated meats, the results
of animal studies based on the latter do not necessarily reflect results one might get on a hunter-gatherer type of diet, or any other omnivore diet that excludes feedlot meats. Obviously, however, animal studies that use feedlot meat have relevance for the SAD/SWD, as such diets generally include feedlot meat.

Thus we observe that extrapolating from animal studies using domesticated/feedlot meats to all omnivore/faunivore diets is an unsupportable extrapolation and a logical fallacy.

Clinical studies based on the SAD/SWD diet

Narrow claims: Conventional veg*n vs. SAD/SWD. One does not have to look far to find
raw/veg*n advocates citing clinical studies that show the negative health effects of the SAD/SWD diet. The more savvy/honest advocates are specific in their claims, and state something like, "Study X shows the (conventional) vegan diet promotes better health than does the SAD/SWD." Such precision in language is good; however, the fact that healthy omnivore/faunivore diets do in fact exist should also be mentioned, at least occasionally, for completeness and for honesty. (It is rare to find a raw or veg*n advocate who bothers to mention that healthy omnivore diets do exist.)

Fallacious claim: One type of veg*n diet vs. all omnivore/faunivore diets. The less savvy/honest
raw/veg*n diet advocates point to clinical studies of veg*ns vs. SAD/SWD consumers, and then commit the massive logical fallacy of saying that such evidence proves or indicates that the veg*n diet tested is better than all omnivore diets. It is a logical fallacy to assume that test results for one type of vegan diet versus one type of omnivore diet (usually the SAD/SWD) indicate that the tested vegan diet (or, even worse, all vegan diets) is better than all omnivore diets. The SAD/SWD is only one of a large variety of possible omnivore diets. For example, all of the diets in the China Project (discussed in the next section) are omnivore diets, and are generally different from the SAD/SWD. Obviously, omnivore diets vary considerably--just as vegan diets do.

Unconscious double standard. One criticism veg*n advocates make concerning some of the clinical
studies that show negative health effects on veg*n diets is that such studies may use non-representative veg*n diets, like macrobiotics, as their sampling base. Along the same lines, note that the "conventional" vegan diet (one that makes use of grains, legumes, etc., usually cooked) is radically different from the 100% raw fruitarian diet. Further, 100% raw fruitarian diets have a dismal record of failure in the long run. The usual long-run result of 100% raw fruitarian diets (per anecdotal evidence) is ill health. Assume that a long-term clinical study of fruitarians existed, and it showed the negative health results so common in anecdotal reports. Would it be fair to extrapolate from the fruitarian diet and condemn all vegan diets? Not really--that would be a logical fallacy. In a similar manner, raw and/or conventional veg*n diet advocates who use clinical studies based on the SAD/SWD diet to condemn all omnivore/faunivore diets are engaging in a logical fallacy.

Examples of citing clinical studies based on the SAD/SWD diet


As examples, let's look at two of the studies cited by Ted Altar in his paper, What Might Be the "Natural" Diet for Human Beings. [Note: See reference list for link to paper on Internet]. Altar cites Friedman et al. [1964] as proof that red blood cells become "sticky and sludge" after a meal of animal flesh. Armstrong et al. [1981] is cited as evidence that anti-inflammatory hormones and sex hormones increase after eating animal flesh. Altar then claims these "immediate body reactions to meat" suggest that meat is not in accord with our evolutionary diet.

Study cited does not compare vegetarian vs. non-vegetarian diets. However, the above
suggestion does not follow from the studies cited by Altar. Both studies used the SWD diet (Armstrong et al. [1981] was done in Australia). Friedman et al. [1964] compared two groups of volunteers, all eating the SAD diet, one group consisting of "personality type A" individuals, the other of "personality type B." Both groups were fed a meal of ~1,900 calories, 67% fat (bacon, eggs, butter, milk, cream). Despite the massive dose of fat, sludging was observed in 10 of the 12 members of group A, but in only 3 of group B. Thus Friedman et al. [1964] is focused on comparing two personality groups, A and B, given identical non-vegetarian diets. If a conclusion is to be drawn, the results suggest personality may correlate with the body's ability to handle ingested fat. More important, the Friedman et al. [1964] study is not a vegetarian-vs.-SAD comparison study; it did not attempt to compare animal vs. plant fats, and so on. Also, the study used large amounts of feedlot meat and dairy from domesticated animals. Hence the results cannot be projected onto hunter-gatherer diets, other omnivore diets, or even veg*n diets (as none of the subjects were veg*ns, and the foods tested were non-veg*n).

SAD used incorrectly to impute results for all omnivore diets. As for Armstrong et al. [1981], the
study compared levels of reproductive hormones in vegetarian and non-vegetarian Seventh-Day Adventist women in Australia. It is appropriate to let Armstrong et al. speak for themselves [1981, pp. 761, 765]:

The dietary data were such as to permit only qualitative assessment of total dietary fat intake... In accordance with our original hypothesis, the vegetarians had a lower excretion of estrogens than did the non-vegetarians. The difference was small but significant and largely due to a lower excretion of E3 [estriol, a type of estrogen]... The E3 ratio was less in vegetarians than in non-vegetarians, which is the opposite of what might have been expected from the evidence of other studies relating E3 ratios to risk of breast cancer... The total urinary estrogen values were lower in the vegetarians than in the non-vegetarians, the difference being mainly due to the lower E3 excretion.

Armstrong et al. then suggest that the lower estrogen excretion in vegetarians may be due to metabolic differences in estrogen pathways, or to lower estrogen production in vegetarians. Note the remark above implying that the E3 ratio in vegetarians might be associated with an increased risk of breast cancer. Suddenly, the lower estrogen levels of the vegetarian diet are not as attractive as before. Inasmuch as the Armstrong et al. [1981] study uses the SWD--in this case, the standard Australian diet--as comparison, once again the results cannot be projected to all omnivore diets.

Claims are not supported by the studies. Thus we observe that Altar's claim that these "immediate
body reactions to meat" are suggestive of meat not being in accord with our evolution is fallacious. The evidence presented by Altar is based on feedlot meats in the context of the SWD diet--foods that are far removed from our evolutionary diet.


Please note here that I have used Altar's paper as an example because it cites only a few studies, and it is possible to examine those in detail. With a large number of citations, it is difficult to check them all. It is not my intention here to "pick on" Ted Altar.

Drawbacks to Relying Exclusively on Clinical Studies of Diet


Exclusive reliance on clinical studies is narrow-minded and discounts other important evidence
There is a tendency among most conventional veg*n dietary advocates to rely heavily and/or exclusively on clinical studies in discussing the health effects of diets. This is unfortunate and, quite frankly, narrow-minded, as it ignores the other important scientific information currently available, e.g., the evidence of evolution and anthropology. (Of course, some formal dietary studies are ecological [epidemiological] and are based on the aggregation of large amounts of data. However, such studies require follow-up clinical studies to confirm hypotheses suggested by the ecological data. So, one ends up right back [again] with clinical studies.) One example of the information to be gained from evolutionary and/or "paleo"-type studies is the data available from research on hunter-gatherers regarding the health effects of their omnivorous diets (which are much different from the various versions of the SAD/SWD omnivorous diet utilized in most clinical research), which was discussed earlier here in Part 8. Another area of research, one closely related to evolutionary studies, is the newly emerging biomolecular study of genes and their effects. These offer the promise of not simply new information but also a new way of thinking about dietary problems.

Genetic studies offer a new paradigm for giving insights into optimal diets for INDIVIDUALS, rather than blanket recommendations based on clinical studies of GROUPS. With the mapping of the entirety of the human genome currently in progress, insights will be
(and are already beginning to be) gained into the actual purpose and design of the human organism's functioning at the level of genes/molecular biology (the most fundamental "physiological" level possible). In conjunction with paleoanthropological insights regarding evolutionary adaptation, this understanding of the body's actual genetic design--when applied to nutritional questions--stands to give us a newly detailed understanding of the consequences of various eating patterns at the lowest nuts-and-bolts level. Eventually this will result in the ability to ascertain the actual step-by-step mechanisms governing the consequences of how foods are handled in the most unmistakable, fundamental way, and how their operation may vary from one individual to another.

Because they are based on studies of groups, clinical trials are vulnerable to yet-to-be-discovered confounding factors based on individual (genetic) differences. By definition, of course, these confounding factors can unfortunately be discovered only after the fact. Nevertheless, clinical studies are based on the assumption that with tightly-enough-controlled protocols, eventually all confounding variables can be eliminated. The desired objective is to be able to make dietary recommendations, based on the outcomes of such trials, that can be applied more or less equally to all individuals fitting the parameters controlled for. In itself, this is certainly logical. However, controlling for all--or progressively more and more--confounding variables may eventually show that beyond a certain level of commonality, individual uniqueness becomes more important.

Indeed, in Part 7 of this paper, we saw how there may be significant, telling differences between individuals at the genetic level in regard to the problems of insulin resistance and hereditary hemochromatosis, depending on one's evolutionary heritage. It's important to note here, also, that these types of genetic differences are not "genetic disorders"--they are normal (polymorphic) variations in genes that can occur between human populations or individuals depending on their evolutionary background. (Also see The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance for a few other examples of genetic differences that may play a role in diet and disease.)

Genetics should greatly individualize the study of diet. One outcome of the unique insights to be
had from the new paradigm of evolutionary and genetic studies should be the potential to greatly individualize the study of diet. This is something that clinical studies by their nature--as studies of groups based on statistical averages, rather than of individuals based on the interaction of actual (and unique) genetic mechanisms--are not as readily geared to explore. Of course, clinical studies may still be required to test hypotheses about the interaction of genes with diet (by including controls for genetic differences between individuals) if such effects cannot be ascertained from genetic research at the physiological level alone. However, ultimately, the consequence of increasing genetic insights into diet will be the ability to evaluate which diets, or dietary patterns, work best for different individuals. To some extent, this places limits on the currently prevailing paradigm of clinical studies based on the statistically averaged results of groups. In so doing, it highlights the tendency of dietary idealists to use clinical studies as fodder for blanket recommendations meant to apply equally to everyone, or that assume one type of "ideal" diet can, should, or will work optimally for everyone.
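The masking effect just described can be illustrated with a toy simulation (all numbers invented): if two genetic subgroups respond in opposite directions to the same diet, a conventional group-averaged trial can report "no effect" even though the effect is large for every individual.

    # Minimal sketch (hypothetical genotypes "A" and "B", invented effect sizes)
    # of how group averaging can hide strong individual-level effects.
    import random

    random.seed(0)

    def response(genotype: str) -> float:
        """Change in some risk marker on the test diet (hypothetical units)."""
        base = 10.0 if genotype == "A" else -10.0   # opposite true effects
        return base + random.gauss(0, 2)            # individual noise

    subjects = ["A"] * 50 + ["B"] * 50              # equal mix of genotypes
    changes = [response(g) for g in subjects]

    print("group mean change:", round(sum(changes) / len(changes), 2))  # near 0
    print("genotype A mean:",
          round(sum(c for c, g in zip(changes, subjects) if g == "A") / 50, 2))
    print("genotype B mean:",
          round(sum(c for c, g in zip(changes, subjects) if g == "B") / 50, 2))

The group mean lands near zero while each genotype shows a change of about ten units--exactly the kind of individual variation that blanket recommendations from group studies can miss.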

Common sense and anecdotal evidence deserve consideration in certain circumstances


Relying exclusively on clinical studies also ignores anecdotal evidence. Although anecdotal evidence is indeed unreliable, at times it is the only evidence available, and it may be useful under some circumstances--when no published studies are available on the topic of interest, or when clinical data are skimpy or equivocal. Again, similar to what was noted above in regard to genetic issues, such circumstances may well depend on individual differences that escape the notice of clinical trials.

Potential self-selection effects in long-term adherents of special diets. An often unaddressed issue when ruling out anecdotal evidence in the case of vegetarian and/or vegan diets (whether raw or conventionally vegan) is the problem of self-selection bias that can exist in some of the more formal studies. Even by the acknowledgment of those in the veg*n community interested in scientific assessments, the ranks of long-term veg*ns are comprised of the few left from a continuing stream of people trying such diets but later dropping out. (See archives of the Sci-Veg internet listgroup for past discussions of problems in arriving at reliable demographic or other more mundane statistics about veg*ns in the general population due to sampling problems of this sort. One relevant thread runs from 6/17/98 to 6/29/98, on the topic "Survey Results"--search for the phrase "stock-vs-flow" to locate the most relevant passages on self-selection bias.) This can be a potentially large source of bias in long-term longitudinal studies that follow a chosen group over time.
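A toy "stock-vs-flow" simulation (all parameters invented) may help make the self-selection effect concrete: if those who respond poorly tend to drop out, the surviving long-term adherents look healthier than the full population of everyone who ever tried the diet.

    # Hypothetical survivorship-bias sketch; the response distribution and
    # dropout rule are invented for illustration only.
    import random

    random.seed(1)

    N = 10_000
    # Each person's standardized health response on the diet (hypothetical).
    responses = [random.gauss(0, 1) for _ in range(N)]
    # Invented dropout rule: stay with probability 0.8 if doing well, 0.2 if not.
    stayers = [r for r in responses if random.random() < 0.2 + 0.6 * (r > 0)]

    print("mean response, everyone who tried the diet:",
          round(sum(responses) / len(responses), 2))   # ~0.0
    print("mean response, long-term adherents only:   ",
          round(sum(stayers) / len(stayers), 2))       # clearly positive

Studying only the "stayers" thus overstates how well the diet works for a randomly chosen person, which is precisely the bias at issue in long-term studies of voluntary adherents.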

Dropouts from veg*n diets that may be due to "failure to thrive" (FTT) go largely unaccounted for. (Note: See the later section in Part 9 on failure to thrive for a brief discussion
of this syndrome.) The problems of social pressures that keep people from adhering, over the long term, to diets that differ significantly from cultural norms are much discussed and often cited as the dominant factor responsible for dropouts; and such pressures can certainly be acknowledged as looming large among the potential reasons. (This is actually a problem with any diet significantly different from the norm, and not unique to veg*nism.) However, at the same time, a continuing blind spot in the veg*n community seems to be a refusal to seriously consider the possibility that some people simply may not do their best health-wise on a veg*n diet no matter what manipulations of it are tried. We include here among such people normal individuals without congenital enzymatic deficiencies or other such syndromes. (This is mentioned because citing such syndromes, or carping over nitpicking-type details, is often the standard reply given in the veg*n community--in response to counterexamples--as to why the failures are held not to be indicative of possible problems with the diet itself.) Here, basic common sense should suggest the possibility that those who do the best on veg*n diets may be the ones who would tend to predominate among the ranks of long-term adherents. (Again, this could be expected to apply not just to veg*nism but to any special diet significantly different from a culture's norm.)

Anecdotal reports are of potential value in assessing "failure to thrive," given that longitudinal studies are of those voluntarily veg*n. Thus, long-term studies of self-selected veg*n populations (rather than studies that randomly select individuals from the general population) may not be reliable in impartially assessing the full range of possible physical effects of veg*n diets that might occur in a more random sample of individuals. Consequently, when long-term studies of people who remain on vegan or vegetarian diets (or any diet, for that matter) for fairly significant periods of time are composed of those who consciously chose the diet themselves, self-selection effects cannot logically be ruled out. (This, of course, is one reason why random or other sampling techniques designed to eliminate bias are so important in setting up target and control groups.)

Moral ostracism as a masking effect obscuring awareness of "failure to thrive."


Additionally, one of the thorniest problems that, in the opinion of this writer, continues to face the study of veg*n diets--given that they are often ideologically/morally based (in addition to whatever scientific merit they may also have)--is that those who "fail" or otherwise "abandon" the diet normally face significant ostracism from their former peers. Given this situation, there is justifiable merit in considering the value of anecdotal reports as potential indicators of what the full spectrum of physical outcomes on veg*n diets over the long-term may actually be, which would include equal attention to those who abandon diets. (Note: This caveat obviously does not apply to studies [usually much shorter-term, of course] composed of randomly selected individuals to make up a pool of test subjects.)

Most who abandon veg*n diets do not come forward publicly without encouragement. This problem of morally based social disapproval from other adherents of
veg*n diets is an additional factor beyond just the self-selection effect that compounds matters, making it very difficult procedurally for those wanting to study the situation to get an accurate sampling of the vegan dropout population to check the incidence of possible negative long-term problems. Often such former adherents understandably do not want to come forward very publicly or make known any "failure" they may have experienced, for obvious reasons (personal attacks, shame, being stigmatized by former associates, etc.). Rather than seeking to understand to what degree FTT might exist, most often one can sense an almost palpable eagerness on the part of staunch adherents to explain away such problems or assign them into one or the other kind of irrelevance where the actual adequacy of the diet itself may be concerned. Given this prevailing atmosphere, it generally takes a proactively supportive environment before people will willingly and openly discuss any problems they may be having without withholding information.

Examples of FTT. Within the rawist vegan community (a population that has yet to be
scientifically studied) it is only very recently that anecdotal reports of the physically based problems that are one aspect of what, apparently, leads to the low long-term adherence rate have become more commonplace. This is primarily due to the emergence of communication groups and other "safe," "support" forums that have been created in recent years for people on such diets (some on the Internet). As mentioned above, without such proactive encouragement, it is unlikely those in the mainstream of vegetarianism will ever hear about more than a dribble of such cases. (See The Psychology of Idealistic Diets on this site for an in-depth discussion of one pool of such anecdotal reports from the Natural Hygiene movement--a well-known subculture of vegans, some of whom eat raw vegan diets, others a more conventional-style vegan diet. Also see our Dietary Problems in the Real World page for first-person accounts of problems some individuals have experienced on various forms of such diets, some of which relate to possible FTT.)


"Cheating" and dietary "exceptions" as a potential confounding variable to vegan research. It would be remiss at this juncture not to mention another important and potential
"confounding variable" that faces those who would study the effects of diets such as veg*nism (particularly strict veganism) that restrict or eliminate an entire food class (animal foods) the body has been genetically programmed to expect by evolution. Anyone who has ongoing access to personal conversations with a wide range of individuals practicing vegan diets, and who has had the chance to gain their confidence and cross-examine them in a friendly, sympathetic way, and is honest, will tell you that "cheating" on vegan diets (making occasional "exceptions" or eating foods not strictly "allowed") is not that unheard of, depending on the individual. Not that any given individual(s) may not be a perfect adherent. However, in some instances these dietary "exceptions" can be fairly regular and significant (anywhere from weekly to monthly "exceptions," perhaps) such that they add up over time. Despite the best of intentions, then, some individuals find themselves craving non-veg*n foods and cannot stick as faithfully to the diet as they might wish.

Role of dietary "exceptions" a particular concern with vegan research when deficiency questions are at issue. This is (or ought to be) a significant issue given that vegan
research is often concerned with potential deficiencies (as opposed to problems of excess). Being unable to track or account for behavioral "exceptions" thus represents a serious obstacle to robust experimental design when attempting to evaluate the adequacy of strict vegan diets (for a range of individuals) over the long term. If the diet is supposed to be strict, but the behavior may not be, and you do not have a way of rigorously checking for it, then your research methodology has problems if such noncompliance is implicated in preventing deficiencies. Certainly the issue of the reliability of self-reports given by test subjects about food consumption in interviews or on questionnaires is a potential concern with any dietary research. However, with veganism it assumes special significance given the restrictive nature of the diet--in the sense that animal foods are eliminated, yet these are the very foods (usually dairy or eggs) partaken of in "exceptions" to the diet. Where deficiencies may be at issue, driblet amounts can sometimes be significant, in that it may not take a huge amount of a food to have an effect vis-a-vis deficiency. For vegan research, therefore, this possibility represents an important potential source of confounding error to address. Unfortunately, this is not a problem that appears to be openly acknowledged or discussed much among those relying entirely on clinical research about vegan diets. Here again, therefore, anecdotal reports are not without their value in getting some idea of the importance that unmeasured "exceptions" may have, or the role they may play, for individuals following diets that are nominally vegan but perhaps not always completely so in reality.
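To illustrate the dilution effect just described, here is a toy simulation; the exception rate and deficiency probabilities are invented placeholders, not measured values. A nominally vegan cohort containing unrecorded "exception-makers" shows a lower observed deficiency rate than the truly strict subgroup alone.

    # Hypothetical compliance-confounding sketch (all rates invented).
    import random

    random.seed(2)

    N = 1_000
    # Assume 30% of self-reported vegans make regular small exceptions.
    makes_exceptions = [random.random() < 0.30 for _ in range(N)]
    # Hypothetical deficiency risks: higher when the diet really is strict.
    P_DEFICIENT_STRICT = 0.40
    P_DEFICIENT_WITH_EXCEPTIONS = 0.05

    deficient = [random.random() < (P_DEFICIENT_WITH_EXCEPTIONS if e
                                    else P_DEFICIENT_STRICT)
                 for e in makes_exceptions]

    print("observed deficiency rate, whole 'vegan' cohort:",
          round(sum(deficient) / N, 2))
    strict = [d for d, e in zip(deficient, makes_exceptions) if not e]
    print("deficiency rate among the truly strict:       ",
          round(sum(strict) / len(strict), 2))

Under these made-up numbers, the cohort-wide rate understates the risk faced by fully strict adherents--which is why unmeasured "exceptions" matter when deficiency is the question.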

Hateful approach of certain fruitarian extremists. In contrast to the reliance on clinical studies by
conventional vegans, the approach of many raw veg*n advocates differs. Some quote clinical studies very selectively, ignoring any that challenge their dogma (the usual rationalization is that it is from a "mixed" diet, and hence non-representative of results one might achieve on a raw vegan diet), while aggressively promoting any studies that appear to support their dogma (even if the results are for "mixed" diets). A few fruitarian extremists reject (effectively) all clinical studies and science, and certain fruitarian crank science promoters selectively reject scientific research that challenges their dogma, on the hateful grounds that (the particular) science in question is the product of the minds of people who eat cooked foods (i.e., is "cooked science"), hence is wrong, worthless and/or cannot be trusted. The analogy to racism is obvious here; recall Hitler denouncing "Jewish science" if the analogy is not clear to you.

Other factors in evaluating clinical studies


Some of the limitations and problems inherent in clinical studies that may be relevant depending on the situation are:

Statistical deficiencies. Some studies are based on small samples or short-term experiments (when long-term experiments are needed), and/or the statistical analysis is deficient (e.g., no control group, failure to measure covariates, uncontrolled confounding by external factors, errors in the choice of methods used for data analysis, etc.). (A back-of-envelope power calculation at the end of this subsection illustrates the small-samples point.)

Creative interpretation possible. One can usually find a (single) study that supports a particular viewpoint, or interpret the results from a study in a way that does--even if the view is known to be false.

Potential for bias. Some studies are paid for by special-interest groups, and the objectivity of such studies may be dubious. One wonders if there may be some bias in the research.

Thus one should be cautious in interpreting the results of clinical studies. It is good to use multiple studies, review papers (those that look at or analyze the overall data from a wide range of studies), or standard reference books whenever possible, to avoid the problems of relying on only one study.
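To make the "small samples" deficiency concrete, here is a standard back-of-envelope sample-size calculation (normal approximation, two-sided alpha = 0.05, power = 0.80, two equal groups). The effect sizes below are generic benchmarks, not values from any study discussed here.

    # Approximate subjects needed per group to detect a standardized mean
    # difference (Cohen's d) in a two-sample comparison; normal approximation.
    from statistics import NormalDist

    def n_per_group(effect_size: float, alpha: float = 0.05,
                    power: float = 0.80) -> float:
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        return 2 * ((z_alpha + z_beta) / effect_size) ** 2

    for d in (0.2, 0.5, 0.8):   # small, medium, large effects
        print(f"d = {d}: ~{n_per_group(d):.0f} subjects per group")

Small dietary effects (d around 0.2) require roughly 400 subjects per group; many dietary studies enroll far fewer, so null results from such studies are weak evidence of no effect.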

In summary

Clinical studies are a tool, to be used in an appropriate manner. Like any other tool, they can be used incorrectly. Finally, no number of clinical studies based on SAD/SWD data can overcome the logical fallacy (common in raw/veg*n circles) of claiming that results from such studies (SAD/SWD data) apply to all possible omnivore/faunivore diets.

The Cornell China Project: Authoritative Proof, or Misinterpretation by Dietary Advocates?


EXAMINING THE VEGAN CLAIMS

Introduction to the China Project
The terms "China Study" and/or "China Project" will be used here to reflect the research published in Junshi et al. [1990] (and in related research papers, although due to the large volume of such material the discussion here is limited to a select few papers). This is a large ecological study of the diet and lifestyle of adults aged 35-64 years, in 65 counties in China. The ecological data were collected in 1983-1984, and included information on diet, smoking, consumption of alcohol, as well as analysis of urine and blood samples. The ecological data from 1983-1984 were aggregated on a county level, and supplemented with county data from a nationwide mortality survey in 1973-1975 as well as select demographic information from the Population Atlas of China. The size and scope of the China Study are impressive. As a result, some dietary advocates have aggressively promoted the China Study as "proof" that vegan diets are optimal or best. However, a closer look at the study reveals important limitations that impact the reliability, usefulness, and interpretation of the study results. Many dietary advocates are quick to cite the China Study without discussing the limitations inherent in such a study.

Limitations of the China Study


Let us now briefly examine some of the limitations of the China Study and its results. Quotes from the principal (China Study) authors are used liberally below, so you can learn about the limitations from the study authors themselves.

Level of aggregation of the study data yields, at most, 65 observations (data points) for analysis. The data in the China Study are aggregated at the county level. The result is that for most health/disease/dietary factors to be analyzed, there are 65 (or fewer) observations (data points). This is important for two reasons:

o The study is often described as authoritative and reliable--characteristics that are usually associated with "large" data sets. When one learns there are only 65 observations (and hundreds of variables), it suddenly seems far less authoritative. Note that the term "large" is relative; for simple analysis of very few variables, 65 data points may be adequate, but for sophisticated models involving several variables, hundreds (or even thousands) of data points may be appropriate.

o The limit of 65 observations places limits on the number of variables that can be analyzed simultaneously (via multivariate--that is, multiple-variable--techniques).

Side note (statistical; can skip if you wish): Even a simple technique like some of the regression methods based on splines may be seriously limited on a data set of only 65 points. Spline methods are becoming increasingly important because, unlike traditional regression techniques, they do not assume the functional form of a relationship between variables. (That is, they do not assume ahead of time what particular mathematical relationship may exist between one variable and another.) Such limits are quite frustrating on a data set that includes hundreds of variables. Campbell [1989] appears to acknowledge this (p. 2):

Although uniquely comprehensive... it is not yet clear how satisfactory the analysis of multiple factor effects will be upon disease risk, given the limited number of survey counties (65). More complete evaluations of the virtually unlimited interactions between different disease causes may have to await the addition of still more dietary and lifestyle studies of disease mortality rates.
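The 65-observation limit is easy to demonstrate with synthetic data (this is not an analysis of the actual China Study data): fitting ordinary least squares to pure noise shows how apparent fit inflates as the number of predictors approaches the number of counties.

    # Synthetic demonstration of overfitting with only 65 observations.
    import numpy as np

    rng = np.random.default_rng(0)
    n_obs = 65                          # counties in the China Study
    y = rng.normal(size=n_obs)          # outcome: pure noise, no real signal

    for n_vars in (5, 20, 40, 60):
        X = rng.normal(size=(n_obs, n_vars))            # noise predictors
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares
        resid = y - X @ beta
        ss_res = (resid ** 2).sum()
        ss_tot = ((y - y.mean()) ** 2).sum()
        print(f"{n_vars} noise predictors: R^2 = {1 - ss_res / ss_tot:.2f}")

R-squared climbs toward 1.0 as the predictor count approaches 65 despite there being zero real signal--hence the limits on multivariate modeling that Campbell acknowledges above.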

Limits on the use of geographical correlations, the primary data of the China Study monograph (Junshi et al. [1990]). The China Study monograph [Junshi et al. 1990] provides
geographic correlations, which are of limited direct use. Although it is possible to develop statistical models in which the dependent variable is a correlation, models constructed using the underlying variables from the relevant correlation may be far more meaningful and useful. Peto, writing in Junshi et al. [1990, pp. 76-77], notes:

Although geographic variation in particular disease rates can provide clear evidence that, in areas where those rates are high, that disease is largely avoidable, they may provide frustratingly unclear evidence as to exactly how it can be avoided... An even more striking example of the limitations of geographic correlations is that oesophageal cancer in China has no clear geographic association with smoking, and has a significantly (P < 0.01) negative association with daily alcohol intake.

Peto and Doll [1981], as cited by Peto in Junshi et al. [1990], also remind us that attempts to separate causality from confounding factors in geographical data via the technique of multiple regression are often unsuccessful (or, my comment--misleading, which is even worse).

The China Study report lists only 6 statistically significant correlations between meat-eating and disease mortality. Further, 4 of the correlations are negative, which indicates that the mortality rate for that disease decreased as meat consumption increased. The two diseases that had positive correlations with meat consumption were schistosomiasis (a parasitic disease) and pneumoconiosis and dust disease. Thus, the direct evidence of the study is hardly the condemnation of meat consumption that veg*n dietary advocates may claim it to be. It should be noted here that correlation is a measure only of linear relationships, and other analytical methods may yield different results. Despite the possibility of the existence of more complicated statistical relationships, it seems quite odd, given the interpretations of the study made by veg*n dietary advocates, that meat intake generally did not correlate with disease mortality. (See table 5033, pp. 634-635 of Junshi et al. [1990].)
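As a quick illustration of the linearity point, the short Python sketch below (purely synthetic data, unrelated to the study's variables) constructs a strong but U-shaped relationship whose Pearson correlation is nonetheless near zero:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(-1.0, 1.0, size=65)      # 65 points, matching the county count
    y = x ** 2 + 0.05 * rng.normal(size=65)  # y strongly (but nonlinearly) driven by x

    r = np.corrcoef(x, y)[0, 1]
    print(f"Pearson r = {r:.3f}")  # near zero, despite the strong dependence

A near-zero entry in a table of geographic correlations therefore does not, by itself, rule out a relationship--only a linear one.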

Ecological studies (like the China Study) generate hypotheses; they do not prove them. Campbell and Junshi [1994] concisely state this limitation (p. 1155S):

First, this study is ecological and includes 6,500 individuals residing in 130 villages. Thus according to widely held assumptions, any inferences concerning cause-and-effect relationships should be considered to be hypothetical only, with validation to be provided only by intervention or prospective analytic studies on individuals.

Thus we note that the China Study requires backup clinical studies before inferences are made or conclusions drawn. The main hypothesis of the China Study is that diets composed predominantly of plant foods reduce chronic disease. However, some veg*n advocates go far beyond the main hypothesis of the study, and claim it proves that veg*n diets are "better" than all omnivore diets. Further, such claims may be made without supporting clinical studies, and without regard for the actual range of diets included in the study. (The latter point is discussed later herein.)

The ecological fallacy, and its impact on ecological inference. Freedman [1999, p. 1] provides a brief overview of the ecological fallacy (boldface emphasis below is mine):

In 19th century Europe, suicide rates were higher in countries that were more heavily Protestant, the inference being that suicide was promoted by the social conditions of Protestantism (Durkheim 1897). A contemporary example is provided by Carroll (1975): death rates from breast cancer are higher in countries where fat is a larger component of the diet, the idea being that fat intake causes breast cancer. These are "ecological inferences," that is, inferences about individual behavior drawn from data about groups... The ecological fallacy consists in thinking that relationships observed for groups necessarily hold for individuals: if countries with more Protestants have higher suicide rates, then Protestants must be more likely to commit suicide; if countries with more fat in the diet have higher rates of breast cancer, then women who eat fatty foods must be more likely to get breast cancer. These inferences may be correct, but are only weakly supported by the aggregate data. ...However, it is all too easy to draw incorrect conclusions from aggregate data.... For example, recent studies of individual-level data cast serious doubt on the link between breast cancer and fat intake (Holmes et al. 1999).
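A small simulation makes the fallacy tangible. In the hypothetical Python sketch below (all numbers invented), 65 "counties" show a strongly positive correlation between their group averages, even though within every county the individual-level relationship runs in the opposite direction:

    import numpy as np

    rng = np.random.default_rng(2)
    county_means = np.linspace(0.0, 10.0, 65)  # 65 hypothetical counties

    xs, ys = [], []
    for m in county_means:
        x = m + rng.normal(size=100)                  # individuals in the county
        y = m - 0.8 * (x - m) + rng.normal(size=100)  # within-county slope is negative
        xs.append(x)
        ys.append(y)

    x_bar = np.array([x.mean() for x in xs])
    y_bar = np.array([y.mean() for y in ys])

    print("county-level r :", round(float(np.corrcoef(x_bar, y_bar)[0, 1]), 2))  # ~ +1.0
    print("within-county r:", round(float(np.corrcoef(xs[0], ys[0])[0, 1]), 2))  # ~ -0.6

An analyst who saw only the county-level averages would infer a positive association; every individual-level data set says the opposite.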

Eliminating group-level confounding has no necessary relation to individual-level confounding. Greenland and Robins [1994a] provide an interesting and insightful statistical critique of ecologic studies, furnishing examples that demonstrate how ecologic and individual-level studies are subject to different forms of confounding and errors. In conclusion, they state [Greenland and Robins 1994a, p. 749] that "...conditions that guarantee no confounding in an ecologic study are logically independent of the conditions that guarantee no confounding in an individual-level study..." That, of course, sharply limits the relevance and reliability of efforts to use ecological data to make inferences about individuals. They also discuss how the techniques commonly used for "adjusting" ecological data for covariates (other variables) might not reflect the true underlying conditions, due to non-additivity, non-linear relationships, and other factors (i.e., the common practice of a linear adjustment may be inappropriate and/or inadequate for many covariates). Further, they point out that having a large number of regions in an analysis does not guarantee a randomization of the underlying relationship (if any) between covariates and regions.

Averaging at the aggregate level prevents assessment of individual-level relationships. Commenting on Greenland and Robins [1994a], Piantadosi [1994, p. 763] makes the cogent remarks:

It is impossible to study ecologic analyses very carefully and come away with much confidence in them. Averaging is a process that necessitates some loss of information... When analyzing such aggregated data, we not only lose all ability to extend inferences reliably to less aggregated data but we even lose the ability to estimate the direction and magnitude of bias... I encourage epidemiologists to understand the deficiencies of group-level analyses and limit them to the role of hypothesis generation.

Readers with some knowledge of statistics will find the above papers, plus the related papers of Greenland and Robins [1994b] and Cohen [1994], to be of interest, and to provide a good introduction to the statistical limitations (and problems) that are relevant when one tries to make ecological inferences. This underscores the earlier point (above) that the China Study merely generates hypotheses; it does not prove them.

Results of a study done in one culture or society do not necessarily apply to other cultures or societies. (They potentially can, but they also might not.) Campbell and Junshi [1994] make this point (p. 1155S):

A second general comment on comparing residents of rural China with residents of highly industrialized societies is that it may not be appropriate to extrapolate diet-disease relationships across cultures.

Campbell and Junshi [1994] then go on to briefly discuss the linkage between diet and health, and how the incidence of chronic disease changes when groups change diets. In that regard, the discussion in the previous section of chronic health problems among the Aborigines of Australia, after adopting high-carbohydrate diets, is relevant. Wenxun et al. [1990] provide actual examples of differences in Chinese and Western disease patterns (p. 1033):

An examination of the correlations among CVD [cardiovascular disease] mortality rates, erythrocyte [red blood cell] fatty acids, blood lipid indices, and selected environmental factors indicates that some relationships reported for Western populations were absent in China whereas other relationships were observed. For example, there was no correlation between county means [averages] of plasma cholesterol and county mortality rates for any of three CVD types. Thus, factors other than total dietary lipid or plasma cholesterol may be important in explaining the geographic distribution of CVD mortality rates within China.

One probably will not find a consensus on the reasons why extrapolation of study results from one culture to another is unreliable. However, a number of probable factors include: differences in income and lifestyle; differences in level of industrialization, causing differences in exercise levels and exposure to chemicals; minor but significant differences in genetics; differences in food preparation practices; and so on.

Lack of actual income data for the survey participants is a serious flaw. It makes adjustment of the data for the effect of income less reliable. Ideally, one would like to adjust data for income in statistical analyses, as income is presumably a good proxy variable for the degree of Westernization. (Many of the degenerative diseases common in Western societies are, in effect, diseases of affluence; hence adjusting the data for actual income may be important in statistical analysis.) However, the China Study has no direct measures of the income of the participants. Instead, the study includes demographic data that could be utilized to yield estimates of county average, per-capita income. However, such an estimate is a poor substitute for actual income data on the study participants. Also, per-capita income may be skewed by the presence of a few very high-income individuals living in an otherwise (very) poor county.
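The skew problem is simple arithmetic. The toy Python example below (entirely hypothetical income figures) shows how a single wealthy household can pull the per-capita (mean) income of a poor county far away from what the typical (median) resident earns:

    # 99 households with a low income, plus one very high-income household.
    incomes = [300] * 99 + [500_000]

    mean_income = sum(incomes) / len(incomes)
    median_income = sorted(incomes)[len(incomes) // 2]

    print(f"per-capita (mean) income: {mean_income:,.0f}")   # 5,297
    print(f"median income:            {median_income:,.0f}") # 300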

Attempts to use the China Study to prove that all omnivore diets are bad are yet another logical fallacy. Ultimately, attempts to claim that the China Study "proves" all omnivore/faunivore diets are bad fail as yet another logical fallacy. Basically, none of the county diets in the China Study were vegan diets, and none were evolutionary diets (and, by the way, none were the SAD/SWD diet). Most were high-carbohydrate, grain-centered diets (though one county reported high consumption of both meat and dairy--reminder: dairy was never a part of humanity's evolutionary diet). Campbell, writing in Junshi et al. [1990], reports (p. 63):

The national mean [average] percentage energy intake obtained from animal foods was observed to be 5.7%, with a range of 0.1-59.4%.

Thus we observe that extrapolations to strict vegan or evolutionary diets (or even the SAD diet) go beyond the range of the China Study data, and hence such projections are statistically less reliable. Also, as none of the China Study diets were evolutionary diets, and the meat consumed came from domesticated rather than wild animals, the results from such (Chinese) diets cannot be extrapolated to evolutionary diets (i.e., yet another logical fallacy).
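Why extrapolation beyond the observed range is statistically hazardous can be shown generically. In the Python sketch below (synthetic data, unrelated to the actual study variables), two models that fit the sampled range almost equally well diverge sharply--and both miss the true value--once predictions are made far outside that range:

    import numpy as np

    rng = np.random.default_rng(3)
    x = np.linspace(1.0, 10.0, 65)               # the observed range of the data
    y = np.log(x) + 0.05 * rng.normal(size=65)   # true curve plus mild noise

    lin = np.polyfit(x, y, 1)    # straight-line fit to the observed range
    quad = np.polyfit(x, y, 2)   # quadratic fit to the same range

    for x_new in (5.0, 30.0):    # inside vs. far outside the observed range
        print(f"x = {x_new:4.1f}: linear -> {np.polyval(lin, x_new):6.2f}, "
              f"quadratic -> {np.polyval(quad, x_new):6.2f}, "
              f"true -> {np.log(x_new):6.2f}")

Inside the sampled range the two fits agree closely with the truth; outside it, neither can be trusted--which is precisely the situation when the 0.1-59.4% animal-food range of the China Study is used to make claims about diets outside that range.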

Cancer, veg*n diets, and the China Study


Veg*n dietary advocates sometimes cite the China Study as suggesting that veg*n diets may provide increased protection from cancer when compared to "omnivore diets." Along these lines, the following points are of interest.

Recent research in Australia found that a high-starch Chinese diet did not reduce risk of colon cancer. The paper of Muir et al. [1998] describes a randomized, crossover dietary intervention study in which 12 people (of European descent) in Australia followed, for 3 weeks each, a high-starch diet similar to the diet of low-income communities in China, and later the SAD (standard Australian diet, in this context). The Chinese-style diet lowered serum cholesterol and fecal pH. However, for all other fecal markers tested, the results from the Chinese diet were worse than for the SAD. Muir et al. conclude [1998, p. 372]:

These results suggest that consumption of high-starch diet alone is insufficient to reduce the risk of developing colon cancer.

As the above reflects the results of only one study, the usual cautions on interpretation apply.

Collaborative analysis of 5 large studies finds no difference in death rates from cancer, vegetarian vs. non-vegetarian. A recent collaborative analysis of 8,300 deaths
among 76,000 people in 5 prospective studies [Key et al. 1998] compared death rates of vegetarians and non-vegetarians for a number of diseases. The studies involved included large proportions of vegetarians. Key et al. [1998] found that vegetarians were less likely to die of ischemic heart disease than non-vegetarians, but there were no differences in the death rates for a number of types of cancer: stomach, large bowel (colon), lung, breast, and prostate.


Note: In the five studies analyzed by Key et al. [1998], four of the studies defined vegetarians as people who did not eat any meat or fish, with non-vegetarians being all other people (i.e., this would include those who did eat dairy and eggs in their diets). The other study in the analysis classed as vegetarians those who claimed to be vegetarians; a follow-up on that study 5 years later found that only 66% of the self-identified vegetarians ate meat or fish less than one time per month. (This suggests some misclassification error in that study or, as the follow-up was 5 years later, it may reflect people who abandoned the veg*n diet.) The above results suggest that those who claim veg*n (or Chinese-style high-carbohydrate) diets may provide protection from cancer may be premature in their assessments.

In Summary

The China Project is often cited in an inappropriate manner by veg*n dietary advocates. It does not "prove" vegan diets are the "best" diet. Strict vegan diets, hunter-gatherer (evolutionary) diets, and even SAD/SWD diets are not in the set of diets in the China Project; i.e., they are outside the range of the data. Claims by dietary advocates that the China Study "proves" all omnivore diets are bad and (some) vegan diets are better are a logical fallacy. It would be better if the (interesting) results of the China Project were not misinterpreted or misrepresented by the "popular" health media or by dietary advocates.

Instinct vs. Intelligence in Diet: Where is the Line?


This section briefly addresses the interaction between instinct and intelligence, and then discusses a long list of claims (about instinct) made by raw-vegan/fruitarian extremists.

Introduction: The difficulty of distinguishing between instinct and intelligence


One claim often made by those advocating comparative proofs of vegetarianism is that eating plant foods is instinctive, while eating animal foods is non-instinctive or unnatural. In reality, the question of whether particular actions are due to instinct or intelligence is controversial and a source of major disagreement. The problem lies in "confounding"--that is, in how to separate instinct from intelligence in the arguably "unnatural" modern world we live in. There appear to be no hard answers to the questions here, although many raw/veg*n advocates claim otherwise.

Individual intelligence can override instinctive restraints. A good introduction to the topic is provided by Itzkhoff [1985, p. 171]:

All our difficulties, as well as all our possibilities have come about because the sapient brain overrode the last restraints, indeed the directiveness, that instinct gives to animals. In the animal world, intelligence is guided by instinct to achieve clear-cut survival needs... Man, the generalized intelligent ape, has a brain that establishes the rules of the game, almost irrespective of the individual's bodily or even grossly survivalistic needs... What remains to man after these basic conservational restraints of instinct are extinguished is a super-intelligence that filters all major categories of human behavior through the cortex. He is a thinking animal with a complex brain, a supremely energized mammalian brain that must now control, direct, guide his behavior. The old passions, energies, and drives no longer have built-in censors.

A number of raw/veg*n advocates openly criticize intelligence because it allows you to override your instinct--which, the advocates claim (with virtually no credible proof or logic to support them), is that of a veg*n animal. The argument is that intelligence allows you to eat the "wrong" foods (where "wrong" is usually equivalent to whatever the dietary advocate dislikes), and these "wrong" food choices often get institutionalized into culture. Some of the more extreme raw dietary advocates (fruitarians, mostly) bitterly and hatefully denounce cultural eating patterns, and culture in general, because of this.

Eating Animal Foods, Part 1: Instinct or intelligence?


The major question to address here is whether eating meat or animal foods is instinctive or not. A number of raw/veg*n advocates allege that everyone is repulsed by the act of killing another animal and eating its flesh. It should be noted that no real proof is ever offered to back up such claims; what proof--other than the advocate's personal feelings--is there on this subject? Let's consider the claims that people are "naturally repulsed" by the act of killing and eating other animals. Such claims are clearly contradicted by the following:

No veg*n (hunter-)gatherer societies. Revulsion against the killing of animals is apparently (for the most part) absent in hunter-gatherer societies, none of whom are veg*n, and all of whom rely on animal foods for part of their diet. Note also that humans have been hunter-gatherers for 99% of the time since the human genus Homo appeared. We are all the descendants of (non-veg*n) hunter-gatherers.

Sport (and survival) hunting. Claims of universal revulsion are contradicted by the considerable popularity of sport hunting in many countries and societies. The claims are also contradicted by the millions who hunt for food (including fishing) to ensure their survival.

The meat-animal death connection is clear to many consumers. In many countries, animals are sold alive (and killed at home), or they are killed (in the full view of the consumer) at meat markets. The animal death/meat connection is crystal clear to millions (and probably billions) of meat-eaters.

Remark regarding the two preceding points: Although many of these people could presumably be veg*ns, they choose to eat animal foods. So much for the claim of universal revulsion at killing/eating animals.

Chimps prize animal flesh. Claims of biologically inherent revulsion may be contradicted by
our evolutionary cousins, the chimps, who eat--and greatly prize (when it is available)--animal flesh. For an interesting account of a previously inexperienced modern individual's experiments with killing a chicken and immediately thereafter eating the fresh meat--raw--see Zephyr [1997].

So, next time a veg*n dietary advocate claims that our instincts prevent humans from killing and eating animals, ask them for credible proof of their claim(s). The advocate may simply be projecting his/her personal moral and emotional preferences onto others. However irritating--or enjoyable--the preceding analysis may be to you, it still does not directly answer the question of whether eating meat is instinctive or not, of course. Obviously, if eating meat (or veggies) is absolutely necessary for survival (survival is certainly an instinct), then it is, by definition, instinctive. However, in today's modern society, with a huge variety of foods available, how often is it absolutely necessary to eat any one food or food type (as substitutes are usually available)?

Eating Animal Foods, Part 2: An evolutionary view


Another approach to answering the question is to consider our evolutionary history, and to note that, since the very inception of the human (Homo) genus, ~2.5 million years ago, the human diet has included meat, and our metabolic and morphological makeup appear to reflect varying degrees of adaptation to animal foods in the diet. Thus one can argue that eating animal foods is instinctive, because it is natural behavior--behavior that we have followed long enough that evolutionary adaptation has taken place.

The association of increasing brain size with animal food consumption. For many readers, the preceding paragraph is a convincing argument. However, other readers will quickly ask: what about the confounding effects of intelligence? Here the expensive tissue hypothesis of Aiello and Wheeler [1995], and related research, is relevant. Recall that the major point of the expensive tissue hypothesis is that the human brain increased in size (and our intelligence increased) via brain evolution fueled by a switch to a diet that included significantly increased amounts of meat, which allowed our gut (digestive system) to shrink, thereby freeing metabolic energy to support the increase in brain size. This hypothesis, and the related research discussed in section 4 herein, suggests that the consumption of meat and the evolution of intelligence are closely interrelated. To summarize, the evidence of evolution is as follows.

Eating some animal products (e.g., the lean meat of wild animals) is natural, because humans have adapted to the behavior by evolution.

Eating animal foods (may have) fueled brain evolution, and hence fueled the increase in human intelligence.

Given the above information, the obvious answer to the question, "Is meat-eating instinctive or driven by intelligence?" is that both apply. That is, one can argue that eating (wild) animal products is both instinctive and intelligent, for humans, from the evolutionary point of view.

Individual intelligence makes the final decision. On the other hand, at the individual level,
intelligence may motivate some people to be raw/veg*ns, and to avoid animal products. That is an example of the power of human intelligence, acting at the individual or personal level.

Eating Animal Foods, Part 3: Morality and naturalism


It is appropriate to remind readers of some important points here.

Survival is amoral. Evolution is driven largely by survival--an amoral concept. The moral arguments for vegetarianism are irrelevant in an evolutionary context.

Individual intelligence implies individual choice. The overriding power of human intelligence allows us to choose to be raw/veg*n, or to follow any other diet that appeals to us. You are not required to eat meat or animal foods.

Instinct is not an argument for the SAD/SWD diet. Even though one can argue that eating animal foods was instinctive and intelligent in the context of evolution, readers are reminded that modern domesticated/feedlot meat is dissimilar in composition to wild animal meats, and diets high in feedlot meats are known to be relatively unhealthy. That is, the instinct and intelligence arguments do not support a diet of domesticated meats, or the SAD/SWD diet.

Moral dietary decisions are an exercise in individual intelligence. If your decision to be a veg*n is based on moral grounds, then the entire instinct vs. intelligence argument is of little or no relevance to you. Your morality, as an exercise of your intelligence, can override other considerations, if you so choose.

Veg*n "instinct" is part of false naturalism claims. The primary relevance, in this context, of the claim that veg*n diets are instinctive is that it is part of the false naturalism claims made by many raw/veg*n dietary advocates. The research available on this site provides evidence that raw/veg*n diets are not the natural diet of humanity but are, at best (in terms of concerns having to do with naturalism), a restriction of such diets.


Examining Fruitarian Claims about Instinct in Food Selection


Introduction. This section catalogs and addresses a long set of claims about instinct made by a few
fruitarian extremists. The claims are summarized and paraphrased here, to respect copyrights. The material below includes supplementary information on primates (hunting habits, fauna consumption, and use of medicinal herbs); the hunting skills of prehistoric humans; child/infant food preferences; and other topics of potential interest to rawists and/or veg*ns. If you are short on time, you might skim the material below, reading only the items of interest, or if you prefer, skip to the next section.

CLAIM: Humans are limited to a narrow diet (nearly 100% fruit) by our genetic code.
Instinct is a function of genetic code.

REPLY: Extensive real-world evidence indicates humans thrive on a broad spectrum of diets. Morphology and physiology are functions of DNA, and the existence of structural limits is implied.
However, the obvious (and overwhelming) evidence of individuals thriving on a wide variety of diets--ranging all the way from hunter-gatherer to conventional veg*n--indicates that humans can succeed on a wide array of diets. (We are not limited to ~100% fruit diets, and fruitarian diets have a dismal record of failure in the long term.) One must wonder whether humans are really limited to "narrow" diets, or whether the extremist making the claim simply limits themselves emotionally to their own narrow view. One further point here: intelligence plays a role in morphology and physiology, both via brain evolution and via the culture/evolution feedback loop discussed in an earlier section.

CLAIM: The "instinct" to hunt and kill animals is not found in every human, hence it cannot be an instinct. Example: almost all domestic cats still have an instinct to hunt.

REPLY: The above has already been addressed to a certain extent (in the instinct vs. intelligence discussion), so we'll only make a few additional points here.

The claim is logically invalid. In some hunter-gatherer societies, the males hunt and the females gather plant foods. Application of the "logic" of the above claim would lead to the result that eating plant foods is not instinctive either.

Many behaviors that are described as (allegedly) "instinctive" are actually learned behavior patterns. Cats (including domestic cats) and other predators are usually taught to hunt by their mothers. (Obviously many human behaviors that are common across all cultures are learned behavior patterns as well.)

Calorie deficiency and loss of libido: are raw vegan diets instinctive? Speaking of instinct, one wonders if it is instinctive to eat a diet that fails to provide adequate calories. (Many raw vegans are emaciated from insufficient caloric intake.) What about a diet that lacks sufficient vitamin B-12? Or a diet that (per numerous anecdotal reports) frequently causes loss of libido? (Raw vegan diets, yet again.) As reproduction is tied to libido, those who lose libido will, in the long run, be selected out of the evolutionary gene pool.

CLAIM: Children instinctively choose sweet foods, like fruit. Parents bribe their children to get
them to eat (repulsive!) animal foods.

REPLY:

1. Sweetness is a sign of carbohydrates--calories/energy--in food, and there is reason to believe that, for increased evolutionary survival, the desire for sweet may be instinctive or evolutionary (but see points 4 and 5 below). But sweetness is not the only naturally attractive taste characteristic in foods that appeals to children (point 4 below).

2. One should not discuss sugar without mentioning that fruit is of limited and/or sporadic availability in most environments, even the rainforest. The evidence of orangutans (Knott [1998], Mackinnon [1974, 1977] as cited in Chapman and Chapman [1990]) suggests that fruit is not always available, even in remote, pristine rainforest habitats.

3. Sugar consumption may impact opioid receptors in the brain; see Blass ([1987], a heavily-referenced review article) for a discussion of this topic. Note also that market acceptance is an important factor in the breeding programs for modern fruits, and sweetness (high sugar content) is an important factor in such breeding programs.

4. Research on infant food preferences contradicts some fruitarian claims. Story et al. [1987] review the research of Clara Davis done in the 1920s and 1930s. Davis gave infants a free choice of a wide array of foods on a tray. Story et al. [1987] report the favorite foods chosen by the infants (p. 104):

Bone marrow was the largest single source of calories (27%) for one infant, whereas milk provided the bulk of calories for the other two (19% and 39%). All three infants shared a low preference for all 10 vegetables, as well as for pineapple, peaches, liver, kidney, ocean fish, and sea salt. These foods constituted less than 10% of the total energy intake. Davis observed that the infants ate much more fruit, meat, eggs, and fat than pediatricians typically advised.

The results of Davis obviously confirm some preference for sweet foods. However, the results of Davis also indicate additional taste preferences, and discredit the claim that children are repulsed by animal foods; after all, bone marrow was the favorite food of one infant and prized by the other two infants as well. (Once again, it can be seen that the real limitations here exist in the highly emotional, narrow thinking processes of the extremist rather than in nature.) It should be clearly noted that the results of Davis are based on small samples. Despite this, they are intriguing, as they contradict most so-called "wisdom" about children's food preferences.

5. First reaction to sugar can be negative. If the response to sugar is truly instinctive, and instinct is universal per a previous claim, then one would expect those who live without sugar to react in a very positive way when they finally get to (first) taste it. The quote below from Stefansson [1960, p. 86] suggests that this is not necessarily the case (italicized emphasis mine, in the following):

The Copper Eskimos, so named because many of their weapons and tools were of native copper, had never dealt with any traders before 1910. They did not even know tea, used no salt, and lived exclusively on flesh foods, eating roots and such only in time of famine. In 1910, they for the first time tasted sugar, given them by the first trader to reach Coronation Gulf, Joseph Bernard. They disliked it.

CLAIM: Eating meat is a learned behavior, and not instinctive. Humans have been eating meat for "only" 2 million years.

REPLY: This has already been discussed extensively earlier in this paper.

CLAIM: Some humans are disgusted at the thought of eating meat. How could that happen to a true carnivore?

REPLY: (Sarcasm) Some humans are disgusted at the thought of eating durian (a smelly tropical fruit revered by some, hated by others). How could that happen to a true frugivore?


More seriously, the suggestion that we are true carnivores is a straw argument. Instead, the evidence presented here supports the claim that humans are natural omnivores/faunivores, not total carnivores. (Although humans might be able to survive--and survive well--as carnivores; e.g., the Inuit.) As for the disgust factor, given the role of animal foods in evolution and in hunter-gatherer societies, is not such disgust merely the result of one's own (a) moral/religious views, or (b) self-conditioning with veg*n dogma?

CLAIM: True carnivores often eat (only) their prey's internal organs and leave the muscle for
the vultures. Why don't human meat-eaters behave this way?

REPLY: Animal organs are widely consumed in many cultures, and by hunter-gatherer groups (see O'Dea [1991] for a discussion of this regarding the Aborigines of Australia). Humans are different from lions in a number of ways: we are a different species; we are intelligent and use tools (technology); and our feeding behavior is different--it is an extension of the primate pattern (see Butynski [1982] for further information on this point).

The next set of claims deals with form, function, and primates.

CLAIM: Primates have adaptations exclusively for fruit eating--vision, hands, etc. True
carnivores usually do not have adaptations for fruit-eating.

REPLY: Here yet another straw argument comparing humans to true carnivores is raised. However, on the subject of fruit-eating adaptations, as discussed in an earlier section, chimps and orangs have special adaptations for tree-climbing, which are very handy for a frugivore. Humans lack these important (frugivore) adaptations for efficient fruit-collecting. The idea that our vision and hands are exclusively for fruit-eating is utterly ridiculous--such claims were examined (and found lacking) in earlier sections herein (e.g., a single form can support multiple functions; adaptation is not limited to one specific form; etc.).

CLAIM: True carnivores hunt by smell alone; they don't need technology. REPLY: Repeating the straw argument, "Humans are not like true carnivores," does not make it relevant. Chimps, and (omnivorous, not carnivorous) primates in general, hunt by sight and not
by smell. (This point is discussed further later in this section.) Humans are a special kind of primate; we are not lions or tigers.

CLAIM: The great apes, except for chimps, are strict vegetarians. REPLY: This claim is simply in error and is typically based on outdated research. Modern
research from roughly 1970 onward has revealed that the apes other than chimps also eat a modicum of insects and/or other animal foods in their diet. See the earlier section on ape diets for details.

CLAIM: Hunting by chimps is not instinctive, because:

1. Female chimps do not hunt.
2. The males kill prey only by knocking it to the ground.
3. Chimps have smaller bodies than humans, hence need concentrated foods to support higher metabolism.
4. Chimps get sick from eating meat and must resort to "toxic" herbs.

REPLY: The above is a good example of the sort of misinformation and half-truths frequently dispensed
by fruitarian extremists. Let's look at the above claims one at a time.

1. In fact, female chimps do hunt, and males generally share their kills with females. The paper of Galdikas and Teleki [1981] provides the best answer to this claim (pp. 241, 245, 247):

Long-term research at Gombe National Park indicates that chimpanzees, more than any other nonhuman primate in existence today, practice an incipient form of labor division; one age/sex class, adult males, focuses on exploiting certain [food] resources, especially vertebrates, to the benefit of other members of the social unit, including individuals of other age/sex classes (Teleki 1973a, Wrangham 1975)... The specialization is not absolute: chimpanzee females occasionally hunt game and males certainly spend considerable time collecting insects... Further since female chimpanzees benefit from male predatory activities through the extensive sharing of meat within a community, while males benefit little if at all from female collecting activities because insects are rarely shared among adults, it is possible that the annual intake of fauna is greater among chimpanzee females than among males (McGrew 1979)... Modern field studies are demonstrating that monkeys, apes and humans are, with some exceptions, basically omnivorous mammals that share many adaptive responses to resource availability (see Harding and Teleki 1980).

2. Male chimps are not limited to knocking prey to the ground; they often use the method of flailing to kill their prey. See van Lawick Goodall [1973] and Butynski [1982] (table 2, p. 425) for a summary discussion of chimp hunting methods.

3. The claim about body size was examined and found wanting in an earlier section herein. By the way, to claim that any primate smaller than humans needs more concentrated food would suggest that orangs, bonobos, and gibbons probably all need to eat meat also.

4. Medicinal herb use by chimps. The claim that chimps get sick and use herbs has some truth in it. However, herb use does not constitute proof that meat-eating by chimps is somehow non-instinctive. The claim (#4) made above implicitly includes two myths promoted by raw fooders. First, the false myth that wild animals who eat their "natural" diet never get sick; this myth was assessed earlier, and a short discussion of the topic is given in the article, Selected Myths of Raw Foods. The second myth is that herbs (and spices and meat and anything else the fruitarian extremist dislikes) are "toxic." The field of self-medication (using herbs) by animals, known as zoopharmacognosy, is discussed in Huffman [1997], which provides an overview of the topic. Self-medication in primates is primarily used to control parasites and gastrointestinal disturbances. Huffman [1997] also discusses possible self-medication by bears. The obvious logical fallacy of the extremists is to assume (without any credible scientific proof) that such herb usage is unnatural. Inasmuch as some raw fruitarian extremists are quick to label almost everything (except fruit, and perhaps a few greens) as "toxic" and "bad," the following quote from Huffman [1997, pp. 171-172] confronts such extremist thinking with the reality that primates have the general ability to handle (detox) a wide range of plant compounds:

Among primatologists a major focus of concern about plant secondary compounds in the diet has been on how and why primates can cope with their presence (Glander, 1975, 1982; Hladik, 1977a,b; Janzen, 1978; McKey, 1978; Milton, 1979; Oates et al., 1977, 1980; Wrangham and Waterman, 1981). An extreme case in point is the golden bamboo lemur (Hapalemur aureus) of Madagascar, which is noted to consume over 12 times the lethal adult dose of cyanide in a day, without ill effect (Glander et al., 1989). The cyanide comes from the tips of a bamboo species, Cephalostachyum sp. (Gramineae), consumed as part of the lemur's daily diet at certain times of the year (Glander et al., 1989). In this case, the cyanide is thought to be detoxified mostly in the liver (Westly, 1980).

CLAIM: Meat-eating by humans cannot be instinctive, because humans don't eat the specific monkeys that chimps hunt.

REPLY: The above claim reflects truly amazing ignorance of chimp (and human) hunting behavior. It suggests that the only meat chimps eat comes from monkeys. This is both false and absurd. Hamilton and Busse [1978, p. 764] note:

Predation upon mammals by chimpanzees and baboons is usually opportunistic, i.e. prey animals are encountered at close range and are captured quickly with a high probability of success, a typical scavenge-hunting tactic.

Van Lawick Goodall [1973] provides a list of prey killed by the chimps of Gombe; it includes many vertebrates that are not monkeys. Teleki [1981, table 9.3, p. 314] reports that of the identified mammalian prey killed by Gombe chimps, 68% were primates and 32% were non-primates. Having ascertained that the basic premise of the claim is false, let's look at reality. Many humans live in temperate climates where there are no monkeys. Why does the extremist suggest that humans should eat only monkey meat, and no other kind of meat? Especially when this is contradicted by the reality of wild chimp diets, and by the fossil record of the prehistoric human diet?

CLAIM: Humans cannot eat meat because we lack fangs, claws, sharp teeth. Human teeth are
good only for eating fruit, and not for tearing flesh or chewing leaves.

REPLY: The above claims were investigated thoroughly in earlier sections and found specious.

CLAIM: Humans are fully upright and bipedal, and this makes us ineffective at hunting.

REPLY: Such a claim might be hilarious if it were intended as humor. Unfortunately, the extremist here is serious. As for hunting, we are all the descendants of hunter-gatherers. There are no veg*n gatherers, and no evidence that any ever existed. If humans were really ineffective and/or incompetent at hunting, then one of the following would be true: humans would be extinct now because of our status as inefficient hunters, or there would be many veg*n gatherer societies (which would also be clearly reflected in the fossil record).

However, we are alive today to discuss the fine points of diet, which indicates that our hunter-gatherer predecessors were effective enough at hunting to pass the critical test of survival of the fittest.

Humans are the only primate to prey on vertebrates larger than self (body size). In reference to the above, Butynski [1982, p. 427] notes:

No primate other than man has been observed to prey upon animals larger than itself... In contrast, man frequently kills mammals many times his size.

Milton [1987] (citing Rodman and McHenry [1980]) reports that bipedalism is more energy-efficient over land than is quadrupedalism. Humans, of course, are the only fully upright, bipedal primate.

Prehistoric humans hunted many species to extinction. The considerable skill of humans at hunting is summarized by Allman [1994, p. 207]:

Our modern human ancestors' hunting abilities are strikingly apparent in the Americas, where Paleoindians hunted to extinction nearly 70% of the species of large mammals on the continent, including mammoths, camels, and giant sloths... The most important factor in our ancestors' hunting prowess, however, was not their new tools, but their psyche. Large, relatively clumsy, and slow of foot, our ancestors could not stalk their prey and swiftly chase them down, like many predators, but instead had to rely on working together as a team to bring down their prey--something at which modern humans excelled.

Note the reference in the above quote to the social and cultural behavior of prehistoric humans hunting in cooperative groups. This is an example of adaptive behavior that exerted selective pressure on human morphology and physiology via evolution. Recall the simplistic comparative "proof" arguments about human dental structure and body structure; such arguments are clearly fallacious because they implicitly assume that humans hunted solo, without technology (e.g., like cats). For additional information on human hunting skills and species extinctions, see Martin [1990].

CLAIM: Due to body size "rules," large mammals (like humans) don't have to eat flesh.

REPLY: The body size rule has been discussed already. The above claim would suggest that lions, tigers, killer whales, and polar bears don't need to eat flesh. Remember too that some very large land carnivores are now extinct: sabertooth tigers, prehistoric lions (about twice the size of modern-day lions), etc. They didn't need to eat flesh either? Are such claims an example of denial of reality, or science fiction?

CLAIM: Cooking was needed because eating raw meat introduced parasites.

REPLY: One can contract certain parasites from any raw food, including raw plant foods. All that is needed is the presence of parasite eggs or bacteria, and these can be carried by water, insects, or birds, are found in animal dung, and so on. If indeed cooking was universally required to neutralize parasites in meat, this implies that cooking would have been universally needed for plant foods as well. Of course, that contradicts the extremist view that humans evolved on a diet of raw fruit.

CLAIM: Humans are not adapted to be omnivores. The writings of D.J. Chivers are then quoted
by the extremist, in a misleading way.

REPLY: This is discussed in an earlier section. In my opinion, to use quotes in a deliberately and grossly
misleading way is intellectually dishonest. (If it is not deliberate, it shows how seriously the extremist's fanaticism filters their perception of what they read, and how it predisposes them to screen out or distort, perhaps subconsciously, what they do not want to hear.)

Epilogue

I hope the preceding set of claims and replies was interesting--at least in part--to you. The mixture of half-truths and twisted "logic" found in the above claims is typical of crank science. Unfortunately, a significant part of "fruitarian science" is composed of these types of distortions.


PART 9: Conclusions: The End, or The Beginning of a New Approach to Your Diet?
Contrary Facts vs. Vegan Dogma: Facing the Honesty Problem
SYNOPSIS OF THE PRIMARY EVIDENCE (CONCLUSIONS)

Humans can be regarded as natural omnivores, so long as one uses the common definition of the term: a natural diet that includes significant amounts of both plant and animal foods. (Humans might not qualify as omnivores if one uses the definition of omnivore advocated by D.J. Chivers and associates, discussed in earlier sections herein.) To use terms that are linked to gut morphology, humans are either faunivores [meat-eaters] or frugivores with specific (evolutionary) adaptations for the consumption of animal foods. This, of course, means that humans are not natural vegetarians. A short summary of some of the evidence supporting this follows (the material below was discussed in depth in earlier sections of this paper).

The fossil record. Approximately 2.5 million years of human omnivory/faunivory are apparent in the record, with genetic adaptation to that diet the inevitable and inescapable outcome of evolution. The supporting evidence here includes isotope analysis of fossils, providing further evidence of the consumption of animal foods.

Comparative anatomy of the human gut. The best scientific evidence available to date on gut morphology--analyzed using two different statistical approaches--shows evidence of adaptations for which the best explanation is the practice of faunivory. (Faunivory as an explanation is also supported by optimal foraging theory in hunter-gatherer tribes.) Further, human gut morphology is not what might be expected for a strict vegetarian/fruit diet.

Comparative physiology (metabolism)

o Intestinal receptors for heme iron. The existence of intestinal receptors for the specific absorption of heme iron is strong evidence of adaptation to animal foods in the diet, as heme iron is found in nutritionally significant amounts only in animal foods (fauna).

o B-12 an essential nutrient. Similarly, the requirement for vitamin B-12 in human nutrition, and the lack of reliable (year-round) plant sources, suggests evolutionary adaptation to animal foods in the human diet.

o Plant foods are poor sources of EFAs. In general, the EFAs in plant foods are in the "wrong" ratio (with the exception of a very few exotic, expensive oils), and the low synthesis rates of EPA, DHA, and other long-chain fatty acids from plant precursors point to plant foods as an "inferior" source of EFAs. This strongly suggests adaptation to foods that include preformed long-chain fatty acids, i.e., fauna.

o Taurine synthesis rate. The low rate of taurine synthesis in humans, compared to that in herbivorous animals, suggests human adaptation to food sources of taurine (fauna) in the human diet.

o Slow conversion of beta-carotene. The sluggish conversion rate of beta-carotene to vitamin A, especially when compared to the conversion rate in herbivorous animals, suggests adaptation to dietary sources of preformed vitamin A (i.e., a diet that includes fauna).

o Plant foods available in evolution were poor zinc and iron sources. The plant foods available during evolution (fruits, vegetative plant parts, nuts, but no grains or legumes) generally provide low amounts of zinc and iron, two essential minerals. These minerals are provided by grains, but grains are products of agriculture (i.e., were not available during evolution), and contain many antinutrients that inhibit mineral absorption. This suggests that the nutritional requirements for iron and zinc were primarily met via animal foods during human evolution.

o Bitter taste threshold as a trophic marker. An analysis of the human bitter taste threshold, when compared to the threshold of other mammals, suggests that our sensitivity to the bitter taste is comparable to that of carnivores/omnivores.

There is no such thing as a veg*n gatherer tribe. And there are no records to indicate that any such tribes ever existed; there is also no evidence of any vegan societies.

The actual diets of all the great apes include some fauna--animal foods. Even the great apes that are closest to being completely vegetarian, gorillas, deliberately consume insects when available. Chimps and bonobos, our closest relatives, hunt and kill vertebrates and eat occasional meat.

Many of the ancillary claims made in comparative "proofs" of veg*n diets are logical fallacies:

o The misinterpretation of animal studies using domesticated or feedlot meats to condemn all omnivore diets.
o The misinterpretation of clinical studies showing negative results for the SAD/SWD as indicating negative results for all omnivore diets.
o The misinterpretation of the results of the China Project to claim it "proves" vegan diets are best and all omnivore diets are bad.

John McArdle, Ph.D.--an anatomist and primatologist, a vegetarian, and scientific advisor to the American Anti-Vivisection Society--summarizes the situation clearly [McArdle 1996, p. 174]:

Humans are classic examples of omnivores in all relevant anatomical traits. There is no basis in anatomy or physiology for the assumption that humans are pre-adapted to the vegetarian diet. For that reason, the best arguments in support of a meat-free diet remain ecological, ethical, and health concerns.

Veg*n diets are not the natural diet of humans


The data available on humanity's evolutionary diet lead to the conclusion that veg*n diets are not the natural diet of humanity, although a veg*n diet that excluded dairy, grains, and legumes could be described as a restriction of the evolutionary diet. The evolutionary or hunter-gatherer diet (discussed in earlier sections) consists of wild plant foods (fruits, nuts, some leaves/stems, starchy tubers--possibly cooked), insects, and the lean meat and organs of wild animals.


Note that grains, legumes, and/or dairy are generally not available to hunter-gatherers; such foods are provided in significant quantities only via agriculture, and have been a significant part of the human diet for only about 10,000 years or less. The extent of human genetic adaptation to such foods is a controversial point, but the majority view is that the genetic adaptation that has taken place in the last 10,000 years is quite limited. (See the discussions earlier herein regarding hereditary hemochromatosis, and the carnivore connection hypothesis.) Similarly, modern processed foods have been with us for only a few generations, and genetic adaptation in such a short period is highly unlikely.

Failure to Thrive (FTT)


Your health is more important than raw/veg*n dietary dogma
There is a phenomenon known as failure to thrive (FTT). This occurs when one carefully and strictly follows a specific, recommended dietary regime--in this case raw/veg*n--but suffers poor or ill health anyway. Such poor health may range all the way from clinically diagnosed nutrient deficiencies to a mild but noticeable general malaise (e.g., loss of libido, lassitude and fatigue, emaciation and weakness). Poor health is not limited to physical symptoms, but may include mental problems as well (e.g., depression). Signs of poor mental health that one may (commonly) observe in the raw/fruitarian and vegan communities (more often in the raw vegan community) include severe obsessions with food and/or dietary "purity," eating-disorder behavior patterns (binge-eating), and, sometimes, incredibly hateful fanaticism (e.g., from some of the fruitarian extremists).

Awareness of FTT obscured by self-censoring due to moral ostracism. Hard statistics are not available on the incidence of FTT. Many if not most of the people with FTT self-censor themselves; that is, they switch to a diet that works for them, and leave the raw/veg*n community. Anecdotal evidence suggests that FTT is very common--indeed the usual result--when a strict fruitarian diet is followed long-term, while it presumably is much less common with conventional (and relatively diverse) cooked vegan diets. (The narrower the diet, the more likely FTT, per anecdotal evidence from raw diets.) Note also the possibility, though, that FTT may be more common among veg*ns in general than one might think--consider the number of former veg*ns around, and the ostracism (discussed earlier) that usually meets anyone speaking out about why they may have abandoned the diet. The ostracism syndrome very commonly--and also very effectively--shuts out awareness of FTT in the raw vegan community. That it may also do so in the wider vegan community is well worth considering, especially given the similar prevalence of reflexive pat answers in response to any mention of examples that could raise the possibility of FTT.

Psychological patterns: Feelings of superiority and recitation of mantras result in blocking of contrary information. Observation of this type of behavior pattern--at its most visible, perhaps, in rawism--suggests a more general rule: habitual recitation of such standardized replies or catechisms has a predictable long-term psychological effect, similar to ostracism, on those who engage in it. It tends to screen out awareness of the topic even as a worthy question open for discussion at all, let alone serious consideration. Finally, both the ostracism and the screening of awareness can be reinforced by a circular, self-fulfilling prophecy that fosters a superiority complex, which feeds on itself and further perpetuates matters. The form it takes is the following: since any failures to thrive must be failures to comply with the recommended dietary guidelines, then those who fail must not be complying, because they are not intelligently following the guidelines like those of us who are succeeding. Thus, those who fail tend to be assigned second-class status as lesser, unworthy information sources. Those who succeed, on the other hand, are obviously the ones doing it right and the ones to be believed. Yet both sources are just as "anecdotal."


Such a sense of arrogance, then, sets up another psychological block to really listening to those who might have contrary information. Thus, in some cases, not only is the information reflexively rationalized, it may not even be fully "heard" to begin with.

Excuses and rationalizations: the "party line" reaction to FTT. The most common response to FTT among raw/veg*n diet advocates is to blame the victim or outside factors; i.e., the "party line" response is excuses and rationalizations. A few of the common excuses raised are as follows.

o Detox--the favorite, and unfalsifiable, rawist excuse. Your problems are all detox--you have not been on the diet long enough to become "pure." The standard, almost religious, excuse is that unless you faithfully follow the "one true religion and science of the raw vegan diet" you will never reach the "promised land" or "afterlife" of "paradise health." As wild and as crazy as this characterization may sound, it is a fairly accurate (albeit sarcastic) depiction of typical extremist claims. Also note that some conventional veg*ns make similar, though less outrageous, claims.

o Deficiencies on the "ideal" diet? It may be a minor deficiency--you need to take supplements; or, in an alternate form: it may be a minor deficiency--be sure to eat enough of certain (quite) specific plant foods.

o Overlooked minor details. You need to fine-tune your diet (more, or yet still more), or change some one or another of a potentially lengthy list of minor details to succeed. (If everything has to be just exactly right, with little margin for error, however, what this shows instead is the narrowness and relative unworkability/impracticability of the diet.)

o If all else fails, blame other factors. The FTT is of course due to outside factors: you didn't get enough rest, you didn't get enough exercise, you are under stress, you need expert advice (which I will happily provide if you will pay my high fee), you lack sufficient faith in the dietary regime or are too impatient, and so on.

Many of the rationalizations and excuses given by raw vegans are discussed in a number of the articles on this site: Idealism vs. Realism in Raw Foods, Assessing Claims and Credibility in Raw and Alternative Diets, and others. Similar excuses, if more sophisticated, are commonly proffered by conventional veg*n "diet gurus" as well.

How often are diet gurus honest and humble? The one thing I have not observed to date is for a (prominent) raw/veg*n advocate to openly admit the obvious: that the raw/veg*n diet is arguably unnatural, and that while such diets may be helpful or beneficial for some people (particularly in the short run), other people may experience FTT in the long run, because the diet might not be suited for them. What I am suggesting raw/veg*n diet advocates do is to openly admit the following:

1. The advocate does not have all the answers in the field of diet/nutrition, and there is no "perfect" or "ideal" diet that applies to everyone.

2. The diet the advocate promotes might not work for everyone (or, more accurately, probably does not work for everyone). If the proposed diet does not work for you, then you should try another diet; after all, the diet must serve you, and not the other way around. Your personal health and well-being are far more important than raw/veg*n dietary dogma or any other idealistic dietary dogma. In particular, your health and well-being are more important than someone else's obsession with eating a (100%) raw vegan diet, or with someone else's idea that "meat is murder," and so on.

For some diet gurus, dietary dogma is more important than the health of their followers. It is my opinion that many raw/veg*n advocates regard dietary dogma as being more important than the health and well-being of their followers. (Not that this is conscious, of course--actions speak louder than words in signaling such behavior.) This opinion is based on seeing many raw diet advocates rationalize the massive failure (in the long term) of the dietary programs they advocate. At some point, one must open up to reality and admit that raw diets can be extremely good short-term diets (at least for some people), but that their record as long-term maintenance diets is very poor.

Diet gurus may blame victims if their diet does not work. However, instead of admitting the obvious, i.e., that their dietary dogma is flawed, many raw diet advocates rationalize and blame the victim, often ignoring the serious health problems their followers face. To a lesser extent, the same attitude--dietary dogma is more important than people--appears to be present in the conventional vegan community as well. I hope that you share my opinion that such circumstances reflect badly on the raw/veg*n "diet gurus" who engage in the above.

No doubt the comments here will be aggressively challenged by raw/veg*n diet advocates, as studies on FTT are quite limited (it's an area where most data are anecdotal), and no doubt the defense will be raised that "(more) studies are needed in this area." This is certainly true; however, any study that assumes FTT can always be fixed while staying within the confines of veg*n diets is a study based on the assumption that vegan diets can work for everyone. That would be logically invalid, as it amounts to assuming in advance the very conclusion a study of vegan FTT should test. Personally, I would love to see the results of long-term studies on vegan FTT; I suspect that raw/veg*n advocates might find the results of such studies rather surprising.

Difficulty of resolving the question with nutritional calculations done "on paper," opinions from nutritionists, or with clinical studies done to date. It appears that the assumption that vegan diets can/will work for everyone is often based on the fact that the diet (assuming it is well-planned) meets nutritional requirements "on paper." Or it may be based on the opinion of vegan nutritionists dealing with vegan patients, though here one must be skeptical given the inherent vested interest of the nutritionist, and the above-mentioned tendency to characterize not just some, but all, problematic outcomes as "failure to comply" (or "didn't implement the diet intelligently or meticulously enough") rather than "failure to thrive." Also, one will occasionally see a pro-vegan nutritionist state that after a year, the majority of people on vegan diets are doing fine. However, accuracy questions aside, as pointed out repeatedly elsewhere on this website, successful short-term results--even for a year or two (a year is a very short period of time in the context of long-term success)--do not necessarily predict long-term success on vegan (particularly raw vegan) diets; to presume they do is naive at best. (If the patient has previously been eating the SAD/SWD diet, then by contrast, almost any sweeping dietary change that adds a large percentage of more natural foods stands to be an improvement in the short term.)

Self-selection effects present a barrier to accurately assessing incidence rates. Alternatively, the assumption that vegan diets will work for everyone may be based on extrapolations from the results of clinical studies done to date. The problem here, however, is that there appear to be no clinical/epidemiological studies performed on long-term vegans that are not composed of self-selected groups. (See comments on FTT in the earlier section, Drawbacks to Relying Exclusively on Clinical Studies of Diet, for a discussion of how the self-selection effect, by default, "censors" data that might be obtained from any failures on vegan diets.) This may continue to be a problem for the obvious reason that suitable vegan populations to study are small to begin with, and long-term veganism involves a major lifestyle change. To avoid self-selection effects and accurately assess incidence rates, randomly selected test subjects would have to maintain far-reaching dietary changes for the lengthy time periods required (at least a few years, as a starting point) to arrive at meaningful results. At present this represents a major hurdle to scientifically assessing incidence rates of FTT.

Note: As of approximately 1997/1998, vegan advocate Michael Klaper, M.D. had begun attempting to put together a study looking at both successful and non-successful vegans, which appears to be the first-ever of
its kind. See http://www.vegsource.com/klaper/study.htm for additional information. Although not a longitudinal study, it does aim to compare groups who have been on various vegetarian diets--including vegans who have been successful and ex-vegans who have not--with the objective of uncovering the factors and potential physiological mechanisms behind why some individuals thrive on a vegan diet and others do not.
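
To see why this "censoring" matters quantitatively, here is a minimal simulation sketch in Python (the failure and dropout rates below are assumed, purely illustrative numbers, not data from any actual study). It shows how surveying only current adherents--the self-selected survivors--can make a substantial true FTT rate look small:

    import random

    # Illustrative assumptions only -- not measured values.
    TRUE_FTT_RATE = 0.40    # assumed fraction who would fail to thrive long-term
    DROPOUT_IF_FTT = 0.90   # assumed fraction of FTT cases who quit before any survey
    N = 100_000             # simulated people starting the diet

    random.seed(1)
    surveyed = []           # people still on the diet when the survey happens
    for _ in range(N):
        ftt = random.random() < TRUE_FTT_RATE
        # Most who fail to thrive drop out, so a survey of current
        # adherents never sees them (the data are "censored").
        if not (ftt and random.random() < DROPOUT_IF_FTT):
            surveyed.append(ftt)

    observed = sum(surveyed) / len(surveyed)
    print(f"true FTT rate: {TRUE_FTT_RATE:.0%}, observed among current adherents: {observed:.0%}")

In this toy setup, a true 40% failure-to-thrive rate shows up as roughly 6% in a survey of those still on the diet, simply because most of those who failed have already left and are never counted.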

Potential reactions to information in this paper: Part 1

Rationalizations about the Evidence for Omnivorous Diets from the Fossil Record & Comparative Anatomy
Diet gurus excel at spinning rationalizations
One thing many raw/veg*n advocates are very good at is quickly spinning new rationalizations and excuses when faced with new, contrary factual information. It's very easy to spin new rationalizations (which are usually presented with little or no evidence), whereas investigating such rationalizations--often only to find, ultimately, that they must be discredited--is very hard, detailed work. (The papers on this site are evidence of this, and it's probably why there are few such investigations to be found that go into more than a cursory level of explanation or detail.) Some of the potential rationalizations that may be raised against the overall thrust of this paper can be fairly easily predicted. A likely list follows below, with brief replies. (Some of the rationalizations are revisions or defenses of prevalent current rationalizations.)

RATIONALIZATION:

1. The fossil record is irrelevant because we were created, rather than evolved! Evolution is nonsense!

2. Our prehistoric ancestors were acting "against their nature" by eating meat for 2.5 million years! It's obviously maladaptive behavior!

REPLY: Both of the above have been covered in previous sections of this paper, so just a few additional brief remarks are given here.

1. As mentioned previously, to adopt creationism solely because it supports your "lunch philosophy" is intellectually dishonest.

2. Adaptation to a universal habit (meat-eating) is the ultimate and unavoidable result of evolution. The human genus (Homo) has gone through 3-4 separate species since its inception; this provides direct evidence of evolutionary adaptation to environmental (and universal cultural) selection pressures. Those who make claim #2 above simply have no credible evidence to support their views. Further, such claims suggest a truly massive ego on the part of the extremist; i.e., the raw/veg*n advocate supposedly knows better than all the people who, over 2.5 million years of evolution, ate animal foods for survival. Inasmuch as eating animal foods was clearly necessary due to ecological limits in most (and probably all) locations, the claim that "eating meat is maladaptive" could be restated as "survival is maladaptive." The latter statement clearly delineates the extremist ego.

RATIONALIZATION: The evidence of modern ape diets is irrelevant because we are a unique
species, and we evolved in a different environment (the African savanna) than the forest-dwelling great apes.


REPLY: There is much truth in the above; the differences in human vs. ape evolution have been discussed in earlier sections. However, the significance of ape diets, and of the above rationalization about them, stems from two points:

1. The 180-degree about-face from the previous stance on ape diets. The similarities between apes and humans have long been used to support the claim that humans are natural vegetarians. Those who have used the argument previously, and then not only drop it but make a point of proactively targeting it for dismissal--upon hearing that the great apes are not vegetarians--are behaving inconsistently in a fashion that, again, highlights the intellectual dishonesty (perhaps not entirely conscious) implicit in such aspects of the raw/veg*n "party line."

2. Ape diets may give insight into early evolutionary diets (pre-human). Despite the significant differences between apes and humans, the diet of apes is of some relevance, as it may provide insight into the diets of the early hominids, i.e., our "early" evolutionary diet. The statement that we evolved on the African savanna is of interest, because plant densities are lower on the savanna, and the density of animal life higher (especially when migrating herds visit), than in the rainforest. Foley [1982] reports the surprising but very interesting information that of the three major habitats--arid regions, savanna/grasslands, and forests--the savanna habitat contains the fewest plant foods that are edible to humans. (It's quite surprising that an allegedly "vegetarian human" would evolve in the habitat that provides the lowest levels of plant foods!) Thus, the difference in relative availability of foods (savanna vs. rainforest) suggests that the human evolutionary diet was lower in plant foods, and higher in animal foods, than the diet of the great apes (a conclusion that a number of other, separate lines of evidence also support).

RATIONALIZATION: The human gut is far too elastic for comparative anatomy to tell us anything about our natural diet.

REPLY: After years of citing the dubious analyses of gut morphology found in Fit Food for Humanity or Mills' The Comparative Anatomy of Eating, those who suddenly adopt this view when faced with contrary evidence are, again, behaving in an inconsistent and intellectually dishonest manner. That said, although the best scientific research to date indicates we are faunivores--meat-eaters--by anatomy, it is not "hard proof." However, the gut morphology data is only one part of a large set of data (from multiple, separate lines of evidence) that points to humans being natural faunivores.

RATIONALIZATION: The evidence of comparative physiology is irrelevant:

1. We can get adequate B-12 from unwashed produce.

2. The low bioavailability, in plant food, of protein, iron, zinc, and EFAs is because some/all of these things are "toxic," except in tiny (nutritional) quantities.

3. Physiological measurements made on (degenerate!) meat-eaters are invalid! Only measurements done on fruitarians are valid, and such measures will be different for "real" humans, i.e., fruitarians.

REPLY: Point #1 is common in the vegan movement; versions of points #2 and #3 above are occasionally made by fruitarian extremists. Let's briefly look at these.

1. Insufficient evidence to support claim #1. Despite the research of Mozafar [1994] discussed earlier herein, there is insufficient data at present to support a claim that wild plant foods (alone) could provide adequate B-12 to sustain a (hypothetical) vegan gatherer tribe.

2. This view is certainly not supported by real, honest science, though there are fruitarian extremists who present elaborate crank science "proofs" that allege protein in more than extremely modest amounts (extremely modest even by conservative estimates approved by the more mainstream conventional-vegan community) is effectively "toxic" because its metabolic by-products are allegedly toxic.

3. Point #3 is not only speculative, it is an example of "dietary racism." Note the implicit hatred in point #3, something that, unfortunately, is far too common in the fruitarian movement (i.e., dietary racism). This view is thoroughly egocentric as well. While it is true that levels of zinc, iron, B-12, taurine synthesis, and beta-carotene conversion will likely differ between fruitarians and non-vegetarians, without actual studies we cannot say much about such assumed differences. This claim is basically a diversion by fruitarian extremists.

Rationalizations in Response to the Evolutionary and Hunter-Gatherer Evidence for Omnivorous Diets
Similar to rationalizations offered in reaction to the comparative anatomy evidence for omnivorous adaptation are the following--sometimes more sophisticated--diversionary ploys that may be offered in response to evolutionary and hunter-gatherer evidence.

RATIONALIZATION: What happened back in the Paleolithic age doesn't really matter. We are different people today, in most every way. The diet of the Paleolithic is irrelevant nowadays
given the new conditions we live under that must be coped with.

REPLY: Genetically, we are in fact quite closely similar to the hunter-gatherers of the late Paleolithic era. As Eaton et al. [1988, p. 740] note:

Accordingly, it appears that the [human] gene pool has changed little since anatomically modern humans, Homo sapiens sapiens, became widespread about 35,000 years ago and that, from a genetic standpoint, current humans are still late Paleolithic preagricultural hunter-gatherers.

Because of this close (nearly identical) genetic similarity, the diet of Paleolithic times is relevant. Of course we do live under different circumstances, and we adapt accordingly. However, for evolutionary adaptation to be reflected in the gene pool is generally believed to require a very long time--considerably longer, in any event, than the time since humans took up agriculture and ceased to be exclusive hunter-gatherers (roughly 10,000 years ago; much less for some groups), which has produced only minimal genetic change.

RATIONALIZATION: Evolution is concerned with reproductive success, not longevity. An evolutionary diet does not have to provide excellent health; it only has to be good enough to allow one to survive to reproduce. We can improve on evolution, and raw/veg*n diets are a good example thereof. After all, veg*ns live longer than non-veg*ns!

REPLY:

Proposing veg*n diet as an improvement on evolution is a reversal of the logic implicit in veg*n naturalism claims. The above is probably one of the major defenses to be
employed by raw/veg*n advocates. First, the claim that we can improve on evolution can be restated as, "We can improve on nature." This rationalization neatly reverses the logic that underlies the bogus but popular claim that raw/veg*n diets are best because they (allegedly) are "most natural" for humans. Anyone who uses the naturalism argument for raw/veg*n diets, but then uses the rationalization above, is being inconsistent.

The claim that raw/veg*n diets are an improvement on nature has little if any real supporting evidence. There are some vegan diet advocates who are aware of the evolutionary and paleoanthropological evidence and honestly admit that vegan diets are not natural. That aside, the claim that veg*n diets are an improvement over evolution is simply speculative. There is certainly ample evidence to suggest that conventional veg*n diets are better than SAD/SWD diets, but no comparisons of veg*ns with hunter-gatherers, or with other groups today attempting to follow modern versions of an evolutionary diet (such as Paleodiet advocates), have been made in clinical trials or epidemiological studies. As previously discussed in this paper, implicitly equating the SAD/SWD with evolutionary diets just because both are omnivorous is fallacious. Hence one cannot say that veg*n diets are "healthier" than natural, evolutionary, hunter-gatherer-type diets. There is extensive anecdotal evidence for raw vegan diets indicating that long-term success stories are rare. On the other hand, anecdotal evidence for conventional vegan diets suggests they are much more successful, in the long run, than raw vegan diets. This doesn't tell us anything about how such diets would fare against evolutionary diets, however--long-term longitudinal studies of conventional veg*ns are lacking.

The claim conveniently avoids the failure-to-thrive issue. Another issue here, as discussed earlier--one that is obvious at least to those ex-vegetarians who have made concerted, intelligent attempts at vegan diets (even conventional vegan diets, i.e., including grains/legumes) and who have failed--is the problem of "failure to thrive." (See the earlier section on Drawbacks to Relying Exclusively on Clinical Studies of Diet, starting about one-third to one-half of the way down the page, for a discussion of how mechanisms of built-in structural bias operate in the vegan community to minimize or prevent awareness of failure to thrive or its relevance.) This is an issue that, by and large, has yet to be faced squarely and honestly by very many in the vegetarian movement (including most of its scientifically oriented advocates). Thus, claims that the vegan diet is an improvement over evolution ring hollow when so much remains unknown about actual rates of failure to thrive, and when the question continues to be largely glossed over by all but a very few in the vegetarian movement.

Evolutionary discordance. An important additional consideration here is the concept of "evolutionary discordance"--that is, negative repercussions as a result of behavior contrary to the organism's genetic design. Our genetic makeup reflects an evolutionary compromise between multiple, competing selective pressures. The organism's physiology is therefore a reflection of adaptations (to such multiple pressures) that must function simultaneously in concert, and that mutually and intimately affect one another. As a result, the body, as determined by its genetic "blueprint," is a densely interwoven mesh of interdependent physiological systems that are very tightly integrated.

Disregarding the genetic dietary range may cause negative repercussions due to the physiological interdependencies determined by evolution. The range of dietary adaptation supported by our genetic code is also the result of multiple selection pressures with their resulting physiological interdependencies. The problem with straying too far from the behavior/environment (which includes diet) that the body has adapted to genetically is that an alteration in one facet is likely to cause unanticipated repercussions, via the body's tightly woven and interdependent physiological systems. Attempted manipulation of one factor or attribute in the mix may occur at the expense of others, because of the multiple evolutionary selection pressures that have resulted in the particular "balance" of physiological factors that act as "constraints" on each other in the organism's functioning. Despite the fact that evolution may not (in the case of any given species) select for longevity, therefore, it does select for highly integrated physiological designs whose component systems mutually constrain each other. This places limits on how much the environment/behavior/diet of the organism can be changed without negative consequences for the majority of individuals in the gene pool (excepting, of course, those individuals with the requisite genetic polymorphisms--variations--to benefit from such changes).

It may be possible to alter one's diet significantly, i.e., to adopt an "evolutionarily discordant" diet, and to see short-term, positive results. A good (if extreme) example is provided by vegan forms of fruitarianism. In the short run, such a diet may actually enhance the health of the person who
follows the diet. (Warning: fruitarian diets are high-risk; consult a qualified health professional before undertaking a fruitarian diet for therapeutic purposes.) However, in the long run, perhaps because the diet may be beyond the range of adaptation, undesired consequences can occur. In the case of fruitarianism, these consequences can include diabetes-like symptoms, severe emaciation, fatigue, mood swings, loss of libido, and serious mental consequences: life-controlling food obsessions, eating-disorder behaviors, incredibly hateful fanaticism, and so on. The loss of libido reported by many fruitarians (and some regular raw veg*ns) suggests that those who follow such a diet will fail to reproduce, and quickly die out in evolutionary terms. That explains one way that evolutionary selection pressure can quickly eliminate a discordant diet. In contrast to the above real-world anecdotal experience, if following a fruitarian diet granted long-term robust health and increased sexual virility, it would enhance reproduction and thrive in the gene pool via survival of the fittest. Further, the enhanced reproduction inherent in such a diet would make such a diet predominant, over evolutionary time.

Side note: The above provides yet another argument against the claim that humans evolved as
fruitarians. If the diet were even half as good as claimed, those who followed the diet (or the closest approximation thereof, per local conditions) would thrive and out-reproduce those on meat-based diets, and the fruitarian diet would have become the standard diet at some point in our evolution. So, although an individual might adopt a discordant diet and benefit in the short run, unless the diet conveys--consistently--significant survival/reproduction advantages in the extant environment, the genes that support the diet will not survive to become "natural" or standard. Further, the data available to date (anecdotal data) does not support the idea that raw/veg*n diets convey a major survival advantage, in the long run.

Where is the credible, long-term longitudinal data on longevity for vegans vs. non-vegans? As Corder [1998, p. 130] observes, "Most of the literature which addresses diet and longevity is promotional, testimonial, observational or editorial." One wonders how many of the claims regarding veg*n longevity are promotional. It appears that many of the claims that veg*n diets promote longevity are based on short-term (cross-sectional) studies, and/or are extrapolations from studies that show trends toward better biomarkers (or lower disease rates) for people consuming more plant foods. The extrapolation of such studies from low levels of meat consumption to zero meat consumption is done at a considerably higher risk of statistical error, as such extrapolations go beyond the range of the data (a brief sketch of this effect follows below). Also, as mentioned previously, comparisons with the standard Western diet tell one nothing about likely comparisons with evolutionary or hunter-gatherer diets. (A further problem in comparing hunter-gatherer diets with veg*n diets is the low level of sanitation and the high "occupational risks" of traditional hunter-gatherer lifestyles.)

Even more relevant here is the apparent paucity of long-term, longitudinal data on the longevity of strict, long-term vegans. Claims based on biomarkers are certainly of interest, but actual longevity data, collected over a long period, would be more credible. And of course, one needs other, supplementary data to properly analyze longevity, e.g., data on smoking, alcohol use, accidental deaths, etc. Until such long-term longitudinal data sets are available, claims of veg*n longevity may be subject to challenge. [Note: Readers are invited to contact the author with citations for long-term longitudinal studies of veg*n longevity.]
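
For readers who want the extrapolation point made concrete, here is a minimal sketch in Python with NumPy (all numbers are hypothetical and assumed purely for illustration). For an ordinary least-squares fit, the standard error of a prediction grows with the squared distance of the prediction point from the center of the observed data, so extrapolating to zero meat intake from data gathered only at moderate intakes carries visibly larger uncertainty:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: an outcome measured only at intakes of 20-100 g/day.
    x = rng.uniform(20, 100, size=50)
    y = 2.0 + 0.05 * x + rng.normal(0, 1.0, size=50)  # assumed relationship + noise

    # Ordinary least-squares fit (np.polyfit returns the slope first for degree 1).
    b1, b0 = np.polyfit(x, y, 1)
    resid = y - (b0 + b1 * x)
    s = np.sqrt(resid @ resid / (len(x) - 2))     # residual standard error
    sxx = np.sum((x - x.mean()) ** 2)

    def pred_se(x0):
        # Standard error of the fitted mean at x0; grows as x0 leaves the data range.
        return s * np.sqrt(1 / len(x) + (x0 - x.mean()) ** 2 / sxx)

    print(f"prediction SE at the center of the data: {pred_se(x.mean()):.3f}")
    print(f"prediction SE at zero intake (extrapolation): {pred_se(0):.3f}")

The exact numbers are irrelevant; the point is that the second figure comes out a few times larger than the first, which is why zero-intake claims extrapolated from moderate-intake data deserve extra skepticism.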

RATIONALIZATION: Hunter-gatherers may eat some meat but they are not that far from being vegetarians. Many raw/veg*n advocates say that: (a) hunter-gatherers usually have a diet in which
plant foods are predominant, and also that (b) hunter-gatherers rarely hunt, and when they do they are usually unsuccessful. Ergo, their diet is nearly vegetarian. (Note: This general idea, in varying forms, is
held not just by raw extremists, but tends to be shared by those who identify themselves as conventional vegans as well.)

REPLY: This claim can probably be traced back to early (since superseded) anthropological work on hunter-gatherers, and may also be based on a flawed and outdated interpretation of a hunter-gatherer survey. The claim is a false myth, and one that certain "diet gurus" (and, as well, more scientifically oriented conventional vegans who should probably know better) are quite happy to propagate.

Popularized early studies of San Bushmen as one source of the myth. The myth of the minimally-meat-eating hunter-gatherer may have its roots in early publicity surrounding one of the first well-studied hunter-gatherer societies to gain widespread attention (in the 1960s/1970s) in the anthropological community--the San tribes of the Kalahari Desert in Africa. In particular, the !Kung San, widely reported on as a mostly peaceful, egalitarian tribe who ate much more plant food than meat, went on to become something of a cause celebre as the prototype of hunter-gatherers--as well as a Rorschach blot for anyone with a theory about humanity's inborn sociocultural inclinations to utilize as a jumping-off point. (For an overview of early romanticism about the !Kung, and how knowledge and views of hunter-gatherers have since grown and changed, see Lewin [1988].) But as it happened, further research and analysis has since shown that the San are unrepresentative of the majority of hunter-gatherer tribes where plant-to-animal-food ratios are concerned. (The !Kung San derive 37% of their food from animal foods, the Kade San 20% [Eaton 1985]; and in this connection, it's worth remarking here that even 20% is a significant amount--in reality not really that close to being "nearly vegetarian.")

Flawed early survey of worldwide hunter-gatherers as another source. Richard Lee did some of the influential early (and still well-regarded) anthropological work studying the !Kung. (See Lee [1979] for a report of his anthropological observations.) However, it is not just the diet of the !Kung San or related Bushmen that may be the source of the widespread (erroneous) "mostly gatherer" view of pre-agricultural people, but perhaps also a hunter-gatherer survey by Lee [1968]. Lee's survey did claim that hunter-gatherers gathered more than they hunted. However, as Ember [1978] explains, Lee's interpretation of the survey data (not the survey itself) was seriously flawed and unrepresentative of hunter-gatherers. The flaws in Lee's survey interpretation are as follows:

1. He reclassified shellfish-collecting as gathering, rather than hunting.

2. He (arbitrarily) excluded many North American hunter-gatherer tribes from the survey (which was based on the Ethnographic Atlas [Murdock 1967]).

The result, as described by Ember [1978], was to arbitrarily increase the apparent importance of gathering in the survey.

More recent and detailed analysis of survey data finds hunting more important than gathering. Ember [1978, table 2, p. 441] describes a more thorough survey utilizing all of the tribes in the Ethnographic Atlas (including the North American cases excluded by Lee), which found that:

o In 77% of the hunter-gatherer societies, gathering contributed less than half the food calories.
o Hunting and gathering contributed approximately equal calories in 13% of the societies.
o Gathering contributed more calories than hunting in only 10% of the societies.

(Also see the subsection on hunter-gatherers in "Metabolic Evidence of Human Adaptation to Increased Carnivory," below.) Thus we note that the claim that hunter-gatherers are incompetent hunters who rely primarily on gathering is false, and that hunter-gatherer societies where gathered plant food is a more important food source than hunting are a very small minority.


Outdated, incorrect claims that Australian Aborigines were "nearly vegetarian." The San Bushmen are not the only hunter-gatherer group whose diet has been misunderstood or misrepresented. One can also find inaccurate claims that the Aborigines of Australia were predominantly vegetarian. Lee [1996, p. 7] summarizes the situation nicely:

For many years it was believed that the Aboriginal diet was predominately vegetarian, and a statement that the central Australian desert diet was composed of "70-80% plant foods" [93] had been widely accepted [87, 101-103]. This is no longer believed to be the case. Meggitt [93] compared foods by estimated weight, which probably overassessed the importance of the vegetable component of the diet. Support for the vegetarian basis of the traditional desert diet had also been interpreted from botanical lists, observations illustrating the ripening of different plant food species throughout the year [96] and evidence of storage of plant foods. However, none of these sources attempted to quantify actual dietary intake, and the effects of climate, seasonality and specific location must also be considered [37, 70, 96]... There is increasing evidence that both tropical savanna/coastal and desert [86, 90] diets were meat-oriented; vegetable foods provided an important supplement, rather than an alternative to animal foods, with proportions changing throughout the seasons [10, 25, 36, 82, 84, 100]. For example, the coastal people of Cape York concentrated so much on the procurement of marine mammals that vegetable foods were considered a luxury [7]. Desert Aborigines have described themselves specifically as meat eaters [106]. Early observers commented on the intake of meat of smaller mammals as a "mainstay" of the diet of central Australian tribes [80].

The above also reminds us to be cautious in evaluating earlier studies, and to use all the available evidence when evaluating the diets of hunter-gatherers. Otherwise, one may end up making fallacious claims, as here, e.g., that Australian Aborigines (or other hunter-gatherers) were "nearly vegetarian."

RATIONALIZATION: There are no vegan gatherer tribes because they have not been exposed to the "enlightened" philosophy of veg*nism. They are living in ignorance and have not evolved spiritually.

REPLY: That there are no vegan hunter-gatherer tribes suggests the diet is neither feasible for them nor natural. Stop and think carefully about the nature of claims that a particular diet somehow makes you "enlightened" or "superior." Stripped of their idealistic rhetoric, the nature of such claims is, quite simply: "My lunch is better than yours, and that makes me a better person than you!" Even worse, a few fruitarian extremists actively promote this nonsense in dishonest and incredibly hateful ways. I hope that you can see that rather than "enlightened," such attitudes are really self-righteous, egotistical, and the dietary equivalent of racism. Because of this, it is best not to view veg*n philosophy as "enlightened."

RATIONALIZATION: The hunter-gatherer diet is not feasible for people living in modern times; it's just a bunch of academic "ivory-tower" theorizing about diet and nutrition. Vegan diets are real and work!

REPLY: The earlier section regarding FTT (failure to thrive) indicates that veg*n diets don't seem to work for everyone. As for the claim that hunter-gatherer diets are all academic, a partially tongue-in-cheek reply here is: Whether paleodiets are fully achievable in today's world might be debated--however, is it not obvious that attempts to approximate them will come much closer than attempts not to? More seriously, the section Should You Eat Meat? (later herein) briefly describes how it may be possible to approximate a hunter-gatherer diet today.


Evolution and Vegetarian Choice: Continued Dogma or a New Honesty?


Based on past experience (some of which was quite unpleasant), it is possible that some "spiritually advanced, peaceful, compassionate" raw/veg*n diet advocates will react to the information in this article with some of the following.

DIVERSIONARY TACTICS: PERSONAL ATTACKS AND NITPICKING

Attacking the messenger while ignoring the message

Hateful personal attacks by dietary extremists--e.g., pejorative name-calling, claiming that I am anti-vegetarian, anti-raw, etc.--can be fairly easily predicted. One fruitarian extremist group (that, in my opinion, behaves like a hate group--an opinion shared by many others) has attacked me on numerous occasions in the past with defamatory lies and threats of violence. I have also been attacked by other fruitarian extremists (e.g., promoters of crank science theories) and even a conventional veg*n extremist (a notorious Internet "kook").

Please note that: I am both pro-vegetarian and pro-raw. Readers should be aware that I am a long-time vegetarian (since 1970), a former long-time (8+ years) fruitarian (also a former vegan), have followed natural-hygiene-style and living-foods-style raw vegan diets, and am both pro-vegetarian and pro-raw. However, I am definitely not a promoter of, or a "missionary" for, any specific diet. In reality, I am tired of seeing raw and veg*n diets promoted in negative ways by extremists whose hostile and dishonest behavior is a betrayal of the positive moral principles that are supposedly at the heart of veg*nism. (See the site article Assessing Claims and Credibility in the Realm of Raw and Alternative Diets for insight into the behavior of extremists.)

Nitpicking inconsequential issues while ignoring the relevant ones


Any small errors in the material here (and, given the size and scope of the material, there are likely to be a few, despite a considerable amount of fact-checking) do not by themselves invalidate the major points or conclusions given here. No doubt extremists will greatly exaggerate the presence (and importance) of any small errors they can find. Past experience with certain fruitarian extremists who have dishonestly misrepresented my views so they could nitpick them suggests that the material on this site will be misrepresented and subject to nitpicking as well. Again, see the site article Assessing Claims and Credibility for a discussion of related extremist tactics.

How will you react?


As a reader of this paper, you have the choice of how you react to the information here. Will you use it as an opportunity for self-examination regarding the information you use to back up your dietary philosophy? Will you try to ignore it, using nitpicking and rationalizations as an emotional shield to avoid considering the material herein? Or, will you behave like an extremist and threaten or attack me or the other writers on this site, thereby promoting hatred and negativity? If you are tempted to follow that route because the material here challenges your "lunch philosophy," then please stop and think: Will raw/veg*n hatred make the world a better place? Are hatred and threats simply a new type of raw/veg*n "compassion"? I suggest that readers ignore personal attacks and nitpicking, and focus on the actual issues rather than getting sidetracked by such diversions. Instead, focus on evaluating the myths, fallacious logic, crank science, and unsubstantiated claims that are prevalent in the raw/veg*n movement, and ask yourself: Do I really want to participate in (or condone) promoting the raw/veg*n movements using such dubious means?

Dishonest raw/veg*n diet gurus


Every time I have raised the issue of the dishonesty, crank science, and even hostility that are common in the promotion of raw/veg*n diets, many people quickly rally to the defense of the diet gurus. In so doing, they are making certain implicit assumptions (though perhaps not consciously). The assumptions are that negative, dishonest means are acceptable (or may be condoned by silence or lack of opposing comment
when one is aware of it) and/or may even be necessary to promote raw/veg*n diets. (This is pretty much an "ends justifies the means" argument.) Needless to say, such assumptions are outrageous. I hope that you share my view that raw/veg*n diets can and should be promoted in honest, positive ways, with legitimate science, and with complete respect for those who choose other diets. There is no need for myths, crank science, or dietary racism in promoting raw/veg*n diets.

Should You Eat Meat?


The objective of this paper has been to examine the claims made in various comparative "proofs" of diets, and not to promote one diet over another. Similarly, this website does not seek to promote a particular diet. Instead, the focus here is on providing you with information for your consideration and evaluation. With that in mind, the question of whether you should eat meat is a question that only you can answer. The role of spiritual or moral factors in such a decision is at your discretion, i.e., you choose whether to include such factors in the decision process. As a veg*n for moral/spiritual reasons, I strongly encourage you to consider such factors, if appropriate in your case. It is certainly not my intent here to promote meat-eating, only to clarify the scientific facts, and to hopefully dispel some of the myths/crank science associated with raw/veg*n diets. However, ultimately, diet is your personal responsibility, and your personal decision, as well.


A few comments are relevant here, in context. The fact that domesticated/feedlot meat is potentially harmful has been mentioned a number of times in this paper.

SAD diet not a good idea. As the SAD/SWD diets have (deservedly) bad reputations and use domesticated/feedlot meats, adopting or following such a diet is not a very good idea.

Propensity to consume fauna might not serve best interests in modern society. Hamilton and Busse [1978, p. 765] note:

The human propensity to expand dietary meat consumption seems to be a legacy from our omnivorous primate heritage. Humans apparently share with most primates a tendency to increase the proportion of dietary animal matter whenever it is economical to do so. We inherited dietary preferences for animal matter, which historically have been limited by economics and a hierarchical society. Under current luxury diet circumstances, no such balance of diet to resource availability prevails and preference betrays best interest.

Times have changed, but the calorie content of fat has not. Allman [1994] summarizes the dietary paradox we face rather nicely (p. 204):

The human body absorbs 95% of the fat we ingest, suggesting that this compact source of calories was extremely hard to come by in ancient times. This legacy of our ancestor's eating has created a psychology of food preference that is celebrated in the local burger palace. Having existed for eons in an environment where only lean meat was available, our ancestors would have found a fast-food hamburger a gustatorial paradise. It is precisely what our ancestors loved about fat--its incredibly rich content of calories--that makes it so bad in modern times, when fat is available in great quantities. This evolutionary legacy in our food psychology is the reason fatty foods cause so much trouble for those of us who live in the food-rich industrialized West today: Having evolved in an environment where fat was scarce, our modern-day minds have a hard time knowing when to stop.


Possible approximation of hunter-gatherer diets, in modern societies. Although the evolutionary hunter-gatherer diet is not feasible for most people at present, it may be possible to approximate it via a diet that includes:

o Grass-fed buffalo, yak, emu, duck, or other less-domesticated animals (perhaps even pastured beef, assuming you can find it).

o Wild fish are a possibility as well, since in many cases they fairly closely approximate key characteristics of land game (low saturated fat, abundant EPA/EFAs, appropriate omega-3 to omega-6 ratio, etc.), though fish are not generally regarded as strict evolutionary foods, as many of our ancestors lived inland and lacked fishing technology. (The assertion that fish do not fall completely within the definition of an evolutionary diet is somewhat controversial, however, and subject to some debate.)

o Non-hybrid vegetables and wild plant foods, as available (using commercial plant foods--but no grains or legumes--when wild is not available). As well, many commercial plant foods, depending on the item in question (low glycemic index, for example, as one criterion), may be close enough to the profile of wild plant foods to include in a paleo-style diet.

o Insects, as an animal-food source, are also within the evolutionary definition.

The comments above are provided to stimulate discussion and research. The above is not intended as an individual dietary prescription or recommendation.

A New Beginning?
Newer knowledge can help in self-assessment. I hope that the information in this paper has been of interest to you. I also hope that you don't react with denial (rationalizations, excuses) or hostility (attacks, threats). Instead, I hope that you use the process of "mentally digesting" the information here as an opportunity for in-depth self-examination of the attitudes you may hold toward diet, and as a real opportunity to expand your vision about the human species'--your own--actual dietary heritage.
This common heritage we all share has had a profound impact on human physiology and anatomy, and also exerts its effects internally (mentally) in how we experience our bodies' programmed biological reactions to different foods. A helpful result of knowing this is that it more thoroughly explains why we crave certain food types, giving a better perspective on the challenges facing those of us in modern times eating diets that may significantly differ from the species' evolutionary diet.

The pervasive nature of the false claim that "humans are natural veg*ns" has had its own, long-term, impact on the veg*n movement. Unfortunately, many veg*ns have developed very strong
attachments to their "lunch philosophy," and it has become a pseudo-religion for some (with the dietary extremists being, in effect, hateful and fanatical followers of the pseudo-religion). However, if you follow a veg*n diet because of underlying moral or spiritual factors, please stop and think:

You really don't need the naturalness claim to be a veg*n! That is, moral/spiritual reasons alone are adequate to justify following a veg*n diet (assuming the diet works for
you, of course).

Further, if the motivation for your diet is moral and/or spiritual, then you will want the basis of your diet to be honest as well as compassionate. In that case, ditching the
false myths of naturalness presents no problems; indeed, ditching false myths means that you are ditching a burden.

The information presented in this paper should, hopefully, make clear that many raw/veg*n "diet gurus" are promoting misinformation, myths, and logical fallacies as part of the raw/veg*n "party line." Unfortunately, junk science and crank science are far too common in the raw/veg*n community, and some extremists have earned their very bad reputations for hostility and dishonesty, as well.


Myths and crank science: a good basis for the raw/veg*n movements? And now, consider
whether these factors--myths, logical fallacies, crank science, hostile behavior by some raw/veg*n extremists--are a good basis for the long-term growth of the raw/veg*n community. Are these the factors that will expand the community and build a stable basis for the future? Or, rather, are such factors simply the seeds of a long-term, self-destructive erosion of the raw/veg*n community's credibility? An erosion that only stands to spread as the increasing influence on nutritional science of emerging evolutionary knowledge sets the record straight and exposes the above myth-making to a wider public for what it is?

Personal choice: myths and crank science, or reality and honesty? Each of us has a choice: we
can cling to false myths and crank science, or we can open up to reality and strive for an honest basis for our own diets, and for the larger raw/veg*n communities as well. I hope that you will give serious thought to your choice.

EPILOGUE: A Personal Note to the Reader


Writing this paper has been a major effort for me. It consumed most of my spare time for months: evenings, holidays, Saturdays, Sundays, and so on. Believe it or not, this paper was a real labor of love. My motive for having written it is to educate individuals about the myths of raw/veg*n diets, and thereby to help in what way I can to encourage reform on these issues in the raw/veg*n movements; to encourage the raw/veg*n movements toward a more consistently honest basis; and to encourage people to be skeptical of the dishonest and/or delusional crank science and myths that are promoted by far too many "diet gurus."

I hope that you have enjoyed this paper. If so, I encourage you to read more of the material on this website, and also to invite your friends to visit the site and read the material here. Thank you for reading, and I wish you good health, and good thinking!

P.S. I encourage all to also read Appendix 1. It is relevant to the subject and will likely be of interest to many readers.

--Tom Billings

APPENDIX 1: The Carnivorous/Faunivorous Vegan


Acknowledgment: This appendix, including its title, is inspired by Moir [1994].

The objective of this brief appendix is to remind vegans, especially those who fall into the ego trap of self-righteousness, that, by and large, no one is really a "pure" or "true" vegan. Even the most obsessive, nitpicking, label-reading vegan is not a "pure" vegan. The reason is that virtually everyone not only indirectly uses animal products, but everyone also consumes some fauna or animal foods, even if inadvertently.


Breast-fed "vegan" infants are actually carnivores. This potentially distressing state of affairs (to the extremist) usually begins at birth, at least for those who are breast-fed. Moir [1994, p. 88] informs us (boldface emphasis mine):

Lactivory is an adapted form of carnivory in that the food source, milk, is derived directly from animal cells (Moir, 1991). It follows that no young mammal is initially herbivorous (White, 1985); all are carnivores for some period. While this is a general rule for mammals, there is ample evidence that it applies to a large range of invertebrates as well as diverse vertebrates; the initial development of the young borne of eggs, whether oviparous or viviparous, is explicitly dependent on succour from the female in the form of yolk or nutrient transfer.

Moir [1994] points out that most herbivorous mammalian neonates cannot use or obtain the normal food of adults. Further, most single plant foods are inadequate to support growth to maturity--they lack amino acids, typically lysine, and other nutritional factors (e.g., vitamin B-12). Finally, the significant energy and rapid growth requirements of a juvenile animal require food that is very high quality and easy to digest. Plant foods cannot meet these demanding requirements, but carnivory--including insectivory and lactivory (milk)--can. (The amino acid profile for milk is said to be "close" to that for animal tissues.)

Carnivorous diet for infants is a necessary survival tactic. Moir concludes [1994, p. 99]:
The apparent carnivory of juvenile herbivores is a necessary tactic to satisfy the requirement for available amino acids, particularly the lysine and methionine, required in a highly concentrated and digestible form that is constantly and universally available. While plant proteins, particularly rubisco [ribulose bisphosphate carboxylase/oxygenase, a protein], have the required amino acid structure, digestibility, biological value and concentration required, the levels and availability fluctuate because of seasonal and structural changes (Minson and Wilson, 1980) and other interfering substances (Carpenter, 1960), except in very specific niches. Therefore, the only way for young herbivores to realize their growth potential and survive to reach reproductive maturity is to behave nutritionally as "carnivores."

Thus we see that every "vegan" child who is breast-fed really starts off his or her life as a carnivore. And so does every other mammal, whether folivore, frugivore, or faunivore.

Note: The above is not intended to discourage breast-feeding by vegan mothers, or to increase
the negative attitudes (i.e., hatred) frequently displayed by allegedly compassionate vegans towards dairy.

Every vegan is actually a faunivore. Finally, readers should know that every vegan is actually a faunivore because of the inadvertent consumption of insects. From Taylor [1975, p. 33] (italic and boldface emphases both mine):

There's one last point I might call attention to in this consideration of our indirect eating of insects: strictly speaking, there is no such thing as a vegetarian, for vegetarians generally eat more insects than do non-vegetarians, simply because a larger percentage of their diet consists of food of plant origin. The more fruits, nuts, and vegetables in one's diet, the more insects one is going to eat. Since insects are animals, and their presence in our foods of plant origin is ubiquitous--although generally undetected--we cannot escape the fact that the so-called vegetarian is no vegetarian. Man is omnivorous [faunivorous], whether he wants to be or not.

Paleolithic Diet vs. Vegetarianism: What was humanity's original, natural diet?
A 3-Part Visit with Ward Nicholson

Copyright 1998 by Ward Nicholson. All rights reserved.


INTRODUCTORY REMARKS
The text of the interview is republished here much as it originally appeared in Chet Day's Health & Beyond newsletter, with a few small modifications necessary for the present web version. See the introductory remarks on the front page of the interview series for specifics.

For those unfamiliar, the term "Natural Hygiene," which appears periodically in these interviews,
is a health philosophy emphasizing a diet of mostly raw-food vegetarianism--primarily fruits, vegetables, and nuts--although for revisionists who eat some cooked food, it can also include significant supplementary amounts of grains, legumes, and tubers.

Ward transferred coordinatorship of the Natural Hygiene M2M to long-time member Bob Avery in 1997, and is no longer associated with the Natural Hygiene movement. To learn more about the N.H.
M2M (now called the Natural Health M2M), or for information about getting a sample copy, you can find out more here.

Introduction
Our guest this issue is Ward Nicholson, [former] Coordinator of The Natural Hygiene M2M--a unique "many-to-many" letter group forum created for correspondence between Natural Hygienists in a published format, which has been operating since 1992. [Note: Long-time M2M participant Bob Avery took over the Coordinatorship in December 1996 after Ward resigned and left the Natural Hygiene movement.] Participants in the M2M write letters about their Hygienic experiences, debate viewpoints, and offer support to each other. Each issue, Ward collates the letters together (exactly as-is with nothing edited out) into a 160- to 180-page bimonthly installment of the M2M, a copy of which is sent out to everyone in the group for ongoing response.

Two primary issues to be covered. What we'll be discussing with Mr. Nicholson in H&B are two
things:

Real-world accounts of how long-term vegans do on the Natural Hygiene diet. One of these consists of the ideas and conclusions Ward has reached about Hygienists' actual experiences in the real world (based on interacting with many Hygienists while coordinating the N.H. M2M)--which are often at variance with what the "official" Hygienic books tell us "should" happen.

The human evolutionary/dietary past. And the other is the meticulous research he has done tracking down what our human ancestors ate in the evolutionary past as known by modern science, in the interest of discovering directly what the "food of our biological adaptation" actually was and is--again in the real world rather than in theory.

Given the recent death of T.C. Fry, I consider Ward's analysis of special importance to those who continue to adhere strictly to the fruits, vegetables, nuts and seeds diet. We'll tackle this month the question of humanity's primitive diet. In two subsequent issues, we'll wrap that topic up and delve into what Ward has learned from coordinating the Natural Hygiene M2M about Hygienists' experiences in real life. You'll find that will be a recurring theme throughout our discussions with Mr. Nicholson: what really goes on in real life when you are able to hear a full spectrum of stories from a range of Hygienists, as well as what science says about areas of Hygiene that you will find have in some cases been poorly researched or not at all by previous Hygienic writers. Not everyone will agree with or appreciate what Mr. Nicholson has to say. But, as I've written more than once, I publish material in H&B that you won't find anywhere else, material and sound thinking that
interests me and calls into question my ideas and my assumptions about building health naturally. In this series of three interviews, I guarantee Ward will challenge many of our mindsets. Mr. Nicholson has a lot of ground to cover, so without further ado, I happily present our controversial and articulate guest for this issue of H&B.

Personal experiences with fasting, Natural Hygiene, and veganism

Health & Beyond: Ward, why don't we start out with my traditional question: How was it that you became involved with Natural Hygiene?
Ward Nicholson: I got my introduction to Natural Hygiene through distance running, which eventually got me interested in the role of diet in athletic performance. During high school and college--throughout most of the 1970s--I was a competitive distance runner. Runners are very concerned with anything that will improve their energy, endurance, and rate of recovery, and are usually open to experimenting with different regimens in the interest of getting ever-better results. Since I've always been a bookworm, that's usually the first route I take for teaching myself about subjects I get interested in.

In 1974 or '75, I read the book Yoga and the Athlete, by Ian Jackson, when it was published by Runner's World magazine. In it, he talked about his forays into hatha yoga (the stretching postures) as a way of rehabilitating himself from running injuries he had sustained. He eventually got into yoga full-time, and from there, began investigating diet's effect on the body, writing about that too. At first I was more interested in Are Waerland (a European Hygienist health advocate with a differing slant than Shelton), who was mentioned in the book, so I wrote Jackson for more information. But instead of giving me information about Waerland, he steered me in the direction of American Natural Hygiene, saying in his experience it was far superior.

I was also fascinated with Jackson's experiences with fasting. He credited fasting with helping his distance running, and had a somewhat mind-blowing "peak experience" while running on his first long fast. He kept training at long distances during his fasts, so I decided that would be the first aspect of the Hygienic program I would try myself. Then in the meantime, I started frequenting health-food stores and ran across Herbert Shelton's Fasting Can Save Your Life on the bookracks, which as we all know, has been a very persuasive book for beginning Natural Hygienists.

Initial experiences with fasting. So to ease into things gradually, I started out with a few 3-day "juice"
fasts (I know some Hygienists will object to this language, but bear with me), then later two 8-day juice-diet fasts while I kept on running and working at my warehouse job (during college). These were done--in fact, all the fasts I've experienced have been done--at home on my own. Needless to say, I found these "fasts" on juices difficult, since I was both working, and working out, at the same time. Had they been true "water" fasts, I doubt I would have been able to do it.

I had been enticed by the promises of more robust health and greater eventual energy from fasting, and kept wondering why I didn't feel as great while fasting as the books said I would, with their stories of past supermen lifting heavy weights or walking or running long distances as they fasted. Little did I realize in my naivete that this was normal for most fasters. At the time I assumed, as Hygienists have probably been assuming since time immemorial when they don't get the hoped-for results, that it was just because I "wasn't cleaned-out enough." So in order to get more cleaned-out, I kept doing longer fasts, working up to a 13-day true water fast, and finally a 25-day water fast over Christmas break my senior year in college. (I had smartened up just a little bit by this time and didn't try running during these longer fasts on water alone.)


I also tried the Hygienic vegetarian diet around this time. But as the mostly raw-food diet negatively affected my energy levels and consequently my distance running performance, I lost enthusiasm for it, and my Hygienic interests receded to the back burner. I was also weary of fasting at this point, never having reached what I supposed was the Hygienic promised land of a total clean-out, so that held no further allure for me at the time.

Health crash. After college, I drifted away from running and got into doing hatha yoga for a couple of
years, taught a couple of local classes in it, then started my own business as a typesetter and graphic designer. Things took off, and during the mid-to-late 1980s I worked 60 to 80 hours a week, often on just 5 to 6 hours of sleep a night, under extreme round-the-clock deadline pressures setting type at the computer for demanding advertising agency clients. I dropped all pretense of Hygienic living, with the exception of maintaining a nominally "vegetarian" regime. This did not, however, preclude me from guzzling large amounts of caffeine and sugar in the form of a half-gallon or more of soft drinks per day to keep going. Eventually all this took its toll, and by 1990 my nervous system--and I assume (in the absence of having gone to a doctor, like most Hygienists don't!) probably my adrenals--were essentially just about shot from all the mainlining of sugar and caffeine, the lack of sleep, and the 24-hour-a-day deadlines and accompanying emotional pressures. I started having severe panic or adrenaline attacks that would sometimes last several hours, during which time I literally thought I might die from a heart attack or asphyxiation. The attacks were so debilitating that it would take at least a full day afterwards to recover every time I had one.

Post-crash fasts and recommitment to Natural Hygiene. Finally, in late 1990/early 1991, after I had begun having one or two of these attacks a week, I decided it was "change my ways or else" and did a 42-day fast at home by myself (mostly on water, with occasional juices when I was feeling low), after which I went on a 95%-100% raw-food Hygienic diet. The panic attacks finally subsided after the 5th day of fasting, and have not returned since, although I did come close to having a few the first year or two after the fast. Soon after I made the recommitment to Hygienic living, when I had about completed my 42-day fast, I called a couple of Hygienic doctors and had a few phone consultations. But while the information I received was useful to a degree with my immediate symptoms, it did not really answer my Hygienic questions as I'd hoped, nor did it turn out to be of significant help in overcoming my health problems over the longer term. So in 1992 I decided to start the Natural Hygiene M2M to get directly in touch with Hygienists who had had real experience with their own problems, not just book knowledge, and not just the party line I could already get from mainstream Hygiene. With this new source of information and experience to draw on, among others, my health has continued to improve from the low it had reached, but it has been a gradual, trial-and-error process, and not without the occasional setback to learn from.

Improvements on fasts alternating with gradual downhill trend on vegan Natural Hygiene diet. One of the motivating factors here was that although fasting had been helpful (and continues to be), unfortunately, during the time in between fasts (I have done three subsequent fasts on water of 11 days, 20 days, and 14 days in the past five years), I just was not getting the results we are led to expect with the Hygienic diet itself. In fact, at best I was stagnating, and at worst I was developing new symptoms that, while mild, were headed in a disconcerting downhill direction. Over time, the disparity between the Hygienic philosophy and the results I was (not) getting started eating at me. I slowly began to consider, through reading the experiences of others in the M2M, that it was not something I was "doing wrong," or that I wasn't adhering to the details sufficiently, but that there were others who were also not doing so well following the Hygienic diet, try as they might. The "blame the victim for not following all the itty-bitty details just right" mentality began to seem more and more suspect to me.

This leads us up to the next phase of your Hygienic journey, where you eventually decided to remodel your diet based on your exploration of the evolutionary picture of early human diets as now known by science. Coming from your Hygienic background, what was it that got you so interested in evolution?


Well, I have always taken very seriously as one of my first principles the axiom in Hygiene that we should be eating "food of our biological adaptation." What is offered in Hygiene to tell us what that is, is the "comparative anatomy" line of reasoning we are all familiar with: You look at the anatomical and digestive structures of various animals, classify them, and note the types of food that animals with certain digestive structures eat. By that criterion of course, humans are said to be either frugivores or vegetarians like the apes are said to be, depending on how the language is used.

Shortcomings of the "comparative anatomy" rationale for determining our "natural" diet. Now at first (like any good upstanding Hygienist!) I did not question this argument, because as far as it goes it is certainly logical. But nonetheless, it came to seem to me that this was an indirect route for finding the truth, because as similar as we may be to the apes, and especially the chimpanzee (our closest relative), we are still a different species. We aren't looking directly at ourselves via this route; we are looking at a different animal and basically just assuming that our diet will be pretty much just like theirs based on certain digestive similarities. And in that difference between them and us could reside errors of fact. So I figured that one day, probably from outside Hygiene itself, someone would come along with a book on diet or natural foods that would pull together the evidence directly from paleontology and evolutionary science and nail it down once and for all. Of course, I felt confident it would basically vindicate the Hygienic argument from comparative anatomy, so it remained merely an academic concern to me at the time.

Exposure to the evolutionary picture and subsequent disillusionment with Natural Hygiene. And then one day several years ago, there I was at the bookstore when out popped the words The Paleolithic Prescription[1] (by Boyd Eaton, M.D., and anthropologists Marjorie Shostak and Melvin Konner) on the spine of a book just within the range of my peripheral vision. Let me tell you, I tackled that book in nothing flat! But when I opened it up and began reading, I was very dismayed to find there was much talk about the kind of lean game animals our ancestors in Paleolithic times (40,000 years ago) ate as an aspect of their otherwise high-plant-food diet,* but not a word anywhere about pure vegetarianism in our past, except one measly paragraph to say it had never existed and simply wasn't supported by the evidence.[2] I have to tell you that while I bought the book, red lights were flashing as I argued vociferously in my head with the authors on almost every other page, exploiting every tiny little loophole I could find to save my belief in humanity's original vegetarian and perhaps even fruitarian ways. "Perhaps you haven't looked far enough back in time," I told them inside myself. "You are just biased because of the modern meat-eating culture that surrounds us," I silently screamed, "so you can't see the vegetarianism that was really there because you aren't even looking for it!" So in order to prove them wrong, I decided I'd have to unearth all the scientific sources at the local university library myself and look at the published evidence directly. But I didn't do this at first--I stalled for about a year, basically being an ostrich for that time, sort of forgetting about the subject to bury the cognitive dissonance I was feeling.

News of long-time vegetarians abandoning the diet due to failure to thrive. In the meantime, though, I happened to hear from a hatha yoga teacher I was acquainted with, who taught internationally and was well-known in the yoga community both in the U.S. and abroad in the '70s and early '80s, and who, along with his significant other, had been vegetarian for about 17 years. To my amazement, he told me in response to my bragging about my raw-food diet that he and his partner had re-introduced some flesh foods to their diet a few years previously, after some years of going downhill on their vegetarian diets, and it had resulted in a significant upswing in their health. He also noted that a number of their vegetarian friends in the yoga community had gone through the same pattern of deteriorating health after 10-15 years as vegetarians since the '70s era.


Once again, of course, I pooh-poohed all this to myself because they obviously weren't "Hygienist" vegetarians and none of their friends probably were either. You know the line of thinking: If it ain't Hygienic vegetarianism, by golly, we'll just discount the results as completely irrelevant! If there's even one iota of difference between their brand of vegetarianism and ours, well then, out the window with all the results! But it did get me thinking, because this was a man of considerable intellect as well as a person of integrity whom I respected more than perhaps anyone else I knew.

Gradual personal health decline on vegan diet. And then a few months after that, I began noticing I was having almost continual semi-diarrhea on my raw-food diet and could not seem to make well-formed stools. I was not sleeping well, and my stamina was sub-par during both daily tasks and exercise--of concern to me after having gotten back into distance running again--and so real doubts began creeping in. It was around this time I finally made that trip to the university library.

And so what did you find?


Enough evidence for the existence of animal flesh consumption from early in human prehistory (approx. 2-3 million years ago) that I knew I could no longer ignore the obvious. For a while I simply could not believe that Hygienists had never looked into this. But while it was disillusioning, that disillusionment gradually turned into something exciting, because I knew I was looking directly at what scientists knew based on the evidence. It gave me a feeling of more power and control, and awareness of further dietary factors I had previously ruled out that I could experiment with to improve my health, because now I was dealing with something much closer to "the actual" (based on scientific findings and evidence) as opposed to dietary "idealism."

Paleontological evidence shows humans have always been omnivores

What kind of "evidence" are we talking about here?


At its most basic, an accumulation of archaeological excavations by paleontologists, ranging all the way from the recent past of 10,000-20,000 years ago back to approximately 2 million years ago, where ancient "hominid" (meaning human and/or proto-human) skeletal remains are found in conjunction with stone tools and animal bones that have cut marks on them. These cut marks indicate the flesh was scraped away from the bone with human-made tools, and could not have been made in any other way. You also find distinctively smashed bones occurring in conjunction with hammerstones that clearly show they were used to get at the marrow for its fatty material.[3] Prior to the evidence from these earliest stone tools, going back even further (2-3 million years), there is chemical evidence showing from strontium/calcium ratios in fossilized bone that some of the diet of earlier hominids was also coming from animal flesh.[4] (Strontium/calcium ratios in bone indicate relative amounts of plant vs. animal foods in the diet.[5]) Scanning electron microscope studies of the microwear of fossil teeth from various periods well back into human prehistory show wear patterns indicating the use of flesh in the diet too.[6] The consistency of these findings across vast eons of time shows that these were not isolated incidents but characteristic behavior of hominids in many times and many places.
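To make the strontium/calcium reasoning concrete, here is a minimal illustrative sketch in Python. The reference values and the linear interpolation are hypothetical placeholders of my own, not published figures--actual Sr/Ca baselines vary with local geology and must be calibrated per site against contemporaneous fauna. Only the direction of the effect (Sr/Ca falling as animal food rises in the diet) reflects the principle described above.

    # Illustrative sketch only: Sr/Ca ratios decline at each step up the
    # food chain, so a hominid bone ratio near local herbivores suggests a
    # plant-heavy diet, while a lower ratio suggests more animal food.
    # All numeric values below are hypothetical placeholders.

    def diet_signal(hominid_sr_ca, herbivore_sr_ca, carnivore_sr_ca):
        """Place a fossil's Sr/Ca ratio on a 0.0 (herbivore-like) to
        1.0 (carnivore-like) scale, using local fauna as endpoints."""
        span = herbivore_sr_ca - carnivore_sr_ca
        if span <= 0:
            raise ValueError("herbivore baseline should exceed carnivore baseline")
        position = (herbivore_sr_ca - hominid_sr_ca) / span
        return max(0.0, min(1.0, position))  # clamp to the calibration range

    # Hypothetical site calibration: contemporaneous grazers vs. predators.
    signal = diet_signal(hominid_sr_ca=0.42, herbivore_sr_ca=0.50,
                         carnivore_sr_ca=0.30)
    print(f"Position toward the carnivore endpoint: {signal:.2f}")  # -> 0.40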


Evidence well-known in scientific community; controversial only for vegetarians. The evidence--if it is even known to them--is controversial only to Hygienists and other vegetarian groups, few to none of whom, so far as I can discern, seem to have acquainted themselves sufficiently with the evolutionary picture other than to make a few armchair remarks. To anyone who really looks at the published evidence in the scientific books and peer-reviewed journals and has a basic understanding of the mechanisms for how evolution works, there is really not a whole lot to be controversial about with regard to the very strong evidence indicating flesh has been a part of the human diet for vast eons of evolutionary time. The real controversy in paleontology right now is whether the earliest forms of hominids were truly "hunters," or more opportunistic "scavengers" making off with pieces of kills brought down by other predators--not whether we ate flesh food as a portion of our diet at all.[7]

Timeline of dietary shifts in the human line of evolution


Can you give us a timeline of dietary developments in the human line of evolution to show readers the overall picture from a bird's-eye view, so we can set a context for further discussion here?
Sure. We need to start at the beginning of the primate line long before apes and humans ever evolved, though, to make sure we cover all the bases, including the objections often made by vegetarians (and fruitarians for that matter) that those looking into prehistory simply haven't looked far enough back to find our "original" diet. Keep in mind some of these dates are approximate and subject to refinement as further scientific progress is made.

65,000,000 to 50,000,000 B.C.: The first primates, resembling today's mouse lemurs, bush-babies, and
tarsiers, weighing in at 2 lbs. or less, and eating a largely insectivorous diet.[8]

50,000,000 to 30,000,000 B.C.: A gradual shift in diet for these primates, becoming mostly frugivorous by the middle of this period and mostly herbivorous towards the end of it, but with considerable variance between specific primate species as to lesser items in the diet, such as insects, meat, and other plant foods.[9]

30,000,000 to 10,000,000 B.C.: Fairly stable persistence of the above dietary pattern.[10]

Approx. 10,000,000 to 7,000,000 B.C.: Last common primate ancestor of both humans and the modern ape family.[11]

Approx. 7,000,000 to 5,000,000 B.C.: After the end of the previous period, a fork occurs branching
into separate primate lines, including humans.[12] The most recent DNA evidence shows that humans are closely related to both gorillas and chimpanzees, but most closely to the chimp.[13] Most paleoanthropologists believe that after the split, flesh foods began to assume a greater role in the human side of the primate family at this time.[14]

Approx. 4,500,000 B.C.: First known hominid (proto-human) from fossil remains, known as
Ardipithecus ramidus--literally translating as "root ape" for its position as the very first known hominid, which may not yet have been fully bipedal (walking upright on two legs). Anatomy and dentition (teeth) are very suggestive of a form similar to that of modern chimpanzees.[15]

Approx. 3,700,000 B.C.: First fully upright bipedal hominid, Australopithecus afarensis (the genus name means "southern ape," after the genus's initial discovery in southern Africa), about 4 feet tall, and known popularly from the famous "Lucy" skeleton.[16]


3,000,000 to 2,000,000 B.C.: Australopithecus line diverges into sub-lines,[17] one of which will eventually give rise to Homo sapiens (modern man). It appears that the environmental impetus for this "adaptive radiation" into different species was a changing global climate between 2.5 and 2 million years ago driven by glaciation in the polar regions.[18] The climatic repercussions in Africa resulted in a breakup of the formerly extensively forested habitat into a "mosaic" of forest interspersed with savanna (grassland). This put stress on many species to adapt to differing conditions and availability of foodstuffs.[19] The different Australopithecus lineages thus ate somewhat differing diets, ranging from more herbivorous (meaning high in plant matter) to more frugivorous (higher in soft and/or hard fruits than in other plant parts). There is still some debate as to which Australopithecus lineage modern humans ultimately descended from, but recent evidence based on strontium/calcium ratios in bone, plus teeth microwear studies, shows that whatever the lineage, some meat was eaten in addition to the plant foods and fruits which were the staples.[20]

2,300,000 to 1,500,000 B.C.: Appearance of the first "true humans" (signified by the genus Homo), known as Homo habilis ("handy man")--so named because of the appearance of stone tools and cultures at this time. These gatherer-hunters were between 4 and 5 feet in height, weighed between 40 and 100 pounds, and still retained tree-climbing adaptations (such as curved finger bones)[21] while subsisting on wild plant foods and scavenging and/or hunting meat. (The evidence for flesh consumption based on cut-marks on animal bones, as well as use of hammerstones to smash them for the marrow inside, dates to this period.[22]) It is thought that they lived in small groups like modern hunter-gatherers but that the social structure would have been more like that of chimpanzees.[23] The main controversy among paleoanthropologists about this time period is not whether Homo habilis consumed flesh (which is well established) but whether the flesh they consumed was primarily obtained by scavenging kills made by other predators or by hunting.[24] (The latter would indicate a more developed culture, the former a more primitive one.) While meat was becoming a more important part of the diet at this time, based on the fact that the diet of modern hunter-gatherers--with their considerably advanced tool set--has not been known to exceed 40% meat in tropical habitats* like those habilis evolved in, we can safely assume that the meat in habilis' diet would have been substantially less than that.[25]

1,700,000 to 230,000 B.C.: Evolution of Homo habilis into the "erectines,"* a range of human species often collectively referred to as Homo erectus, after the most well-known variant. The erectines were similar in height to modern humans (5-6 feet) but stockier, with a smaller brain; hunting activity increased over habilis, so that meat in the diet assumed greater importance. Teeth microwear studies of erectus specimens have indicated harsh wear patterns typical of meat-eating animals like the hyena.[26] No text I have yet read ventures any sort of percentage figure for this time period, but it is commonly acknowledged that plants still made up the largest portion of the subsistence.* More typically human social structures made their appearance with the erectines as well.[27] The erectines were the first human ancestor to control and use fire. It is thought that perhaps because of this, but more importantly because of other converging factors--such as increased hunting and technological sophistication with tools--that about 900,000 years ago, in response to another peak of glacial activity and global cooling (which broke up the tropical landscape further into an even patchier mosaic), the erectines were forced to adapt to an increasingly varied savanna/forest environment by being able to alternate opportunistically between vegetable and animal foods to survive, and/or move around nomadically.[28] For whatever reasons, it was also around this time (dated to approx. 700,000 years ago) that a significant increase in large land animals occurred in Europe (elephants, hoofed animals, hippopotamuses, and predators of the big-cat family) as these animals spread from their African home. It is unlikely to have been an accident that the spread of the erectines to the European and Asian continents during and after this timeframe coincides with this increase in game, as they probably followed the herds.[29]


Because of the considerably harsher conditions and seasonal variation in food supply, hunting became more important to bridge the seasonal gaps, as did the ability to store nonperishable items such as nuts, bulbs, and tubers for the winter when the edible plants withered in the autumn. All of these factors, along with clothing (and also perhaps fire), helped enable colonization of the less hospitable environment. There were also physical changes in response to the colder and darker areas that were inhabited, such as the development of lighter skin color that allowed the sun to penetrate the skin and produce vitamin D, as well as the adaptation of the fat layer and sweat glands to the new climate.*[30] Erectus finds from northern China dating to 400,000 years ago have indicated an omnivorous diet of meats, wild fruit and berries (including hackberries), plus shoots and tubers, and various other animal foods such as birds and their eggs, insects, reptiles, rats, and large mammals.[31]

500,000 to 200,000 B.C.: Archaic Homo sapiens (our immediate predecessor) appears. These human
species, of which there were a number of variants, did not last as long in evolutionary time as previous ones, apparently due simply to the increasingly rapid rate of evolution occurring in the human line at this time. Thus they represent a transitional time after the erectines leading up to modern man, and the later forms are sometimes not treated separately from the earliest modern forms of true Homo sapiens.[32]

150,000 to 120,000 B.C.: Homo sapiens neanderthalensis--or the Neanderthals--begin appearing in Europe, reaching a peak between 90,000 and 35,000 years ago before becoming extinct. It is now well accepted that the Neanderthals were an evolutionary offshoot that met an eventual dead-end (in other words, they were not our ancestors), and that more than likely, both modern Homo sapiens and Neanderthals were sister species descended from a prior common archaic sapiens ancestor.[33]

140,000 to 110,000 B.C.: First appearance of anatomically modern humans (Homo sapiens).[34] The last Ice Age also dates from this period--stretching from 115,000 to 10,000 years ago. Thus it was in
this context, which included harsh and rapid climatic changes, that our most recent ancestors had to flexibly adapt their eating and subsistence.[35] (Climatic shifts necessitating adaptations were also experienced in tropical regions, though to a lesser degree.[36]) It may therefore be significant that fire, though discovered earlier, came into widespread use around this same time[37] corresponding with the advent of modern human beings. Its use may in fact be a defining characteristic of modern humans[38] and their mode of subsistence. (I'll discuss the timescale of fire and cooking at more length later.)

130,000 to 120,000 B.C.: Some of the earliest evidence for seafoods (molluscs, primarily) in the diet by
coastal dwellers appears at this time,[39] although in one isolated location discovered so far, there is evidence going back 300,000 years ago.[40] Common use of seafoods by coastal aborigines becomes evident about 35,000 years ago,[41] but widespread global use in the fossil record is not seen until around 20,000 years ago and since.[42] For the most part, seafoods should probably not be considered a major departure,* however, as the composition of fish, shellfish, and poultry more closely resembles the wild land-game animals many of these same ancestors ate than any other source today except for commercial game farms that attempt to mimic ancient meat.[43]

40,000 to 35,000 B.C.: The first "behaviorally modern" human beings--as seen in the sudden
explosion of new forms of stone and bone tools, cave paintings and other artwork, plus elaborate burials and many other quintessentially modern human behaviors. The impetus or origin for this watershed event is still a mystery.[44]

40,000 B.C. to 10-8,000 B.C.: Last period prior to the advent of agriculture, in which human beings universally subsisted by hunting and gathering (also known as the "Late Paleolithic"--or "Stone Age"--period). Paleolithic peoples did process some of their foods, but these were simple methods that would have been confined to pounding, grinding, scraping, roasting, and baking.[45]


35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold
climate of Europe via big-game hunting, with meat consumption rising to as much as 50%* of the diet.[46]

25,000 to 15,000 B.C.: Coldest period of the last Ice Age, during which global temperatures
averaged 14°F cooler than they do today[47] (with local variations as much as 59°F lower[48]), with an increasingly arid environment and much more difficult conditions of survival to which plants, animals, and humans all had to adapt.[49] The Eurasian steppes just before and during this time had a maximum annual summer temperature of only 59°F.[50] Humans in Europe and northern Asia, and later in North America, adapted by increasing their hunting of the large mammals such as mammoths, horses, bison and caribou which flourished on the open grasslands, tundra, and steppes which spread during this period.[51] Storage of vegetable foods that could be consumed during the harsh winters was also exploited. Clothing methods were improved (including needles with eyes) and sturdier shelters developed--the most common being animal hides wrapped around wooden posts, some of which had sunken floors and hearths.[52] In the tropics, large areas became arid. (In South Africa, for instance, the vegetation consisted mostly of shrubs and grass with few fruits.[53])

20,000 B.C. to 9,000 B.C.: Transitional period known as the "Mesolithic," during which the bow-and-arrow appeared,[54] and gazelle, antelope, and deer were being intensively hunted,[55] while at the same time precursor forms of wild plant and game management began to be more intensively practiced. At this time, wild grains, including wheat and barley by 17,000 B.C.--before their domestication--were being gathered and ground into flour, as evidenced by the use of mortars-and-pestles in what is now modern-day Israel. By 13,000 B.C. the descendants of these peoples were harvesting wild grains intensely, and it was only a small step from there to the development of agriculture. [56] Game management through the burning-off of land to encourage grasslands and the increase of herds became widely practiced during this time as well. In North America, for instance, the western high plains are the only area of the current United States that did not see intensive changes to the land through extensive use of fire.[57] Also during this time, and probably also for some millennia prior to the Mesolithic (perhaps as early as 45,000 B.C.), ritual and magico-religious sanctions protecting certain wild plants developed, initiating a new symbiotic relationship between people and their food sources that became encoded culturally and constituted the first phase of domestication well prior to actual cultivation. Protections were accorded to certain wild food species (yams being a well-known example) to prevent disruption of their life cycle at periods critical to their growth, so that they could be profitably harvested later.[58] Digging sticks for yams have also been found dating to at least 40,000 B.C.,[59] so these tubers considerably antedated the use of grains in the diet. Foods known to be gathered during the Mesolithic period in the Middle East were root vegetables, wild pulses (peas, beans, etc.), nuts such as almonds, pistachios, and hazelnuts, as well as fruits such as apples. Seafoods such as fish, crabs, molluscs, and snails also became common during this time.[60]

Approx. 10,000 B.C.: The beginning of the "Neolithic" period, or "Agricultural Revolution," i.e., farming and animal husbandry. The transition to agriculture was made necessary by gradually increasing population pressures due to the success of Homo sapiens' prior hunting and gathering way of life. (Hunting and gathering can support perhaps one person per 10 square miles; Neolithic agriculture 100 times or more that many.[61]) Also, at about the time population pressures were increasing, the last Ice Age ended, and many species of large game became extinct (probably due to a combination of both intensive hunting and disappearance of their habitats when the Ice Age ended).[62] Wild grasses and cereals began flourishing,* making them prime candidates for the staple foods to be domesticated, given our previous familiarity with them.[63] By 9,000 B.C. sheep and goats were being domesticated in the Near East, and cattle and pigs shortly after, while wheat, barley, and legumes were being cultivated somewhat before 7,000 B.C., as were fruits and nuts, while meat consumption fell enormously.[64] By 5,000 B.C. agriculture had spread to all inhabited continents except Australia.[65] During the time since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased from an average of probably 65%/35%* during Paleolithic times[66] to as high as 90%/10% since the advent of agriculture.[67]
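Since the carrying-capacity figures quoted in the Neolithic entry above are easy to read past, a quick back-of-envelope restatement may help. The Python sketch below simply re-runs the arithmetic from the parenthetical--one person per 10 square miles for foraging, 100 times or more that for agriculture--over an arbitrarily chosen example area (my choice, not a historical datum).

    # Back-of-envelope restatement of the density figures quoted above.
    # The land area is an arbitrary example, not a historical datum.

    area_sq_miles = 1_000

    foragers = area_sq_miles / 10   # ~1 person per 10 sq. miles
    farmers = foragers * 100        # "100 times or more that many"

    print(f"Foraging:    ~{foragers:.0f} people on {area_sq_miles:,} sq. miles")
    print(f"Agriculture: ~{farmers:,.0f}+ people on the same land")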

Remains of fossil humans indicate decrease in health status after the Neolithic. In most respects, the changes in diet from hunter-gatherer times to agricultural times have been almost all detrimental, although there is some evidence we'll discuss later indicating that at least some genetic adaptation to the Neolithic has begun taking place in the approximately 10,000 years since it began. With the much heavier reliance on starchy foods that became the staples of the diet, tooth decay, malnutrition, and rates of infectious disease increased dramatically over Paleolithic times, further exacerbated by crowding leading to even higher rates of communicable infections. Skeletal remains show that height decreased by four inches* from the Late Paleolithic to the early Neolithic, brought about by poorer nutrition, and perhaps also by increased infectious disease causing growth stress, and possibly by some inbreeding in communities that were isolated. Signs of osteoporosis and anemia, which were almost non-existent in pre-Neolithic times, have been frequently noted in skeletal pathologies observed in the Neolithic peoples of the Middle East. It is known that certain kinds of osteoporosis found in these skeletal remains are caused by anemia, and although the causes have not yet been determined exactly, the primary suspect is reduced levels of iron, thought to have been caused by the stress of infectious disease rather than dietary deficiency, although the latter remains a possibility.[68]

Subjectively based vegetarian naturalism vs. what evolution tells us

So have Hygienists really overlooked all the [evolutionary] evidence you've compiled in the above timeline about the omnivorous diet of humans throughout prehistory? Are you serious?
It was a puzzle to me when I first stumbled onto it myself. Why hadn't I been told about all this? I had thought in my readings in the Hygienic literature that when the writers referred to our "original diet" or our "natural diet," that must mean what I assumed they meant: that not only was it based on comparative anatomy, but also on what we actually ate during the time the species evolved. And further, that they were at least familiar with the scientific evidence even if they chose to keep things simple and not talk about it themselves. But when I did run across and chase down a scientific reference or two that prominent Hygienists had at long last bothered to mention, I found to my dismay they had distorted the actual evidence or left out crucial pieces.

Could you name a name or two here and give an example so people will know the kind of thing you are talking about?
Sure, as long as we do it with the understanding I am not attempting to vilify anybody, and we all make mistakes. The most recent one I'm familiar with is Victoria Bidwell's citation (in her Health Seeker's Yearbook[69]) of a 1979 science report from the New York Times,[70] where she summarizes anthropologist Alan Walker's microwear studies of fossil teeth in an attempt to show that humans were originally exclusively, only, fruit-eaters. Bidwell paraphrases the report she cited as saying that "humans were once exclusively fruit eaters... eaters of nothing but fruit." And also that, "Dr. Walker and other researchers are absolutely certain that our ancestors, up to a point in relatively recent history, were fruitarians/vegetarians."[71] But a perusal of the actual article being cited reveals that:

Inappropriate absolutistic interpretation. The diet was said to be "chiefly" fruit, which was
the "staple," and the teeth studied were those of "fruit-eater[s]," but the article is not absolutistic like Bidwell painted it.


Meaning of "fruit" and "frugivore" misinterpreted. Fruit as defined by Walker in the article included tougher, less sugary foods, such as acacia tree pods. (Which laypeople like ourselves would be likely to classify as a "vegetable"-type food in common parlance.) And although it was not clarified in the article, anyone familiar with or conscientious enough to look a little further into evolutionary studies of diet would have been aware that scientists generally use the terms "frugivore," "folivore," "carnivore," "herbivore," etc., as categories comparing broad dietary trends, only very rarely as exclusivist terms, and among primates exclusivity in food is definitely not the norm.

Time period was at the very earliest cusp of the evolution of australopithecines into Homo habilis. The primate/hominids in the study were Australopithecus and Homo habilis--among the earliest in the hominid line--hardly "relatively recent history" in this context.

Study was preliminary and equivocal. The studies were preliminary, and Walker was cautious, saying he didn't "want to make too much of this yet"--and his caution proved to be well warranted. I believe there was enough research material available by the late 1980s (Health Seeker's Yearbook was published in 1990) that had checking been done, it would have been found that while he was largely right about Australopithecine species being primarily frugivores (using a very broad definition of "fruit"), later research like what we outlined in our timeline above has shown Australopithecus also included small amounts of flesh, seeds, and vegetable foods, and that all subsequent species beginning with Homo habilis have included significant amounts of meat in their diet, even if the diet of habilis probably was still mostly fruit plus veggies.

There is more that I could nitpick, but that's probably enough. I imagine Victoria was simply very excited to see scientific mention of frugivorism in the past, and just got carried away in her enthusiasm. There's at least one or two similar distortions by others in the vegetarian community that one could cite (Viktoras Kulvinskas' 1975 book Survival into the 21st Century,[72] for instance, contains inaccuracies about ape diet and "fruitarianism") so I don't want to pick on her too much because I would imagine we've all done that at times. It may be understandable when you are unfamiliar with the research, but it points out the need to be careful.

Subjective naturalism vs. the functional definition provided by evolution/genetics


Overall, then, what I have been left with--in the absence of any serious research into the evolutionary past by Hygienists--is the unavoidable conclusion that Hygienists simply assume it ought to be intuitively obvious that the original diet of humans was totally vegetarian and totally raw. (Hygienists often seem impatient with scientists who can't "see" this, and may creatively embellish their research to make a point. Research that is discovered by Hygienists sometimes seems to be used in highly selective fashion only as a convenient afterthought to justify conclusions that have already been assumed beforehand.) I too for years thought it was obvious in the absence of realizing science had already found otherwise.

The subjective "animal model" for raw-food naturalism. The argument made is very similar to the "comparative anatomy" argument: Look at the rest of the animals, and especially look at the ones we are most similar to, the apes.* They are vegetarians [this is now known to be false for chimps and gorillas and almost all the other great apes--which is something we'll get to shortly], and none of them cook their food. Animals who eat meat have large canines, rough rasping tongues, sharp claws, and short digestive tracts to eliminate the poisons in the meat before it putrefies, etc.

The trap of reactionary "reverse anthropomorphism." In other words, it is a view based on a philosophy of "naturalism," but without really defining too closely what that naturalism is. The Hygienic view of naturalism, then, simplistically looks to the rest of the animal kingdom as its model for that naturalism by way of analogy. This is good as a device to get us to look at ourselves more objectively from "outside" ourselves, but when you take it too far, it completely ignores that we are unique in some ways, and you cannot simply assume it or figure it all out by way of analogy only. It can become reverse anthropomorphism. (Anthropomorphism is the psychological tendency to unconsciously make human behavior the standard for comparison, or to project human characteristics and motivations onto the things we observe. Reverse anthropomorphism in this case would be saying humans should take specific behaviors of other animals as our own model where food is concerned.)

Subjective views of dietary naturalism are prone to considerable differences of opinion; don't offer meaningful scientific evidence. When you really get down to nuts and bolts about defining what you subjectively think is "natural," however, you find people don't so easily agree about all the particulars. The problem with the Hygienic definition of naturalism--what we could call "the animal model for humans"--is that it is mostly a subjective comparison. (And quite obviously so after you have had a chance to digest the evolutionary picture, like what I presented above. Those who maintain that the only "natural" food for us is that which we can catch or process with our bare hands are by any realistic evolutionary definition for what is natural grossly in error, since stone tools for obtaining animals and cutting the flesh have been with us almost 2 million years now.) Not that there isn't value in doing this, and not that there may not be large grains of truth to it, but since it is in large part subjectively behavioral, there is no real way to test it fairly (which is required for a theory to be scientific), which means you can never be sure elements of it may not be false. You either agree to it, or you don't--you either agree to the "animal analogy" for raw-food eating and vegetarianism, or you have reservations about it--but you are not offering scientific evidence.

Evolutionary adaptation/genetics as the functional scientific test of what's natural. So my view became, why don't we just look into the evolutionary picture as the best way to go straight to the source and find out what humans "originally" ate? Why fool around philosophizing and theorizing about it when thanks to paleoanthropologists we can now just go back and look? If we really want to resolve the dispute of what is natural for human beings, what better way than to actually go back and look at what we actually did in prehistory before we supposedly became corrupted by reason to go against our instincts? Why aren't we even looking? Are we afraid of what we might see? These questions have driven much of my research into all this. If we are going to be true dietary naturalists--eat "food of our biological adaptation" as the phrase goes--then it is paramount that we have a functional or testable way of defining what we are biologically adapted to. This is something that evolutionary science easily and straightforwardly defines: what is "natural" is simply what we are adapted to by evolution, and a central axiom of evolution is that what we are adapted to is the behavior our species engaged in over a long enough period of evolutionary time for it to have become selected for in the species' collective gene pool. This puts the question of natural behavior on a more concrete basis. I wanted a better way to determine what natural behavior in terms of diet was for human beings that could be backed by science. This eliminates the dilemma of trying to determine what natural behavior is by resorting solely to subjective comparisons with other animals as Hygienists often do.

Correcting the vegetarian myths about ape diets


You mentioned the "comparative anatomy" argument that Natural Hygienists look to for justification instead of evolution. Let's look at that a little more. Are you saying it is fundamentally wrong?
No, not as a general line of reasoning in saying that we are similar to apes so our diets should be similar. It's a good argument--as far as it goes. But for the logic to be valid in making inferences about the human diet based on ape diet, it must be based on accurate observations of the actual food intake of apes. Idealists such as we Hygienists don't often appreciate just how difficult it is to make these observations, and do it thoroughly enough to be able to claim you have really seen everything the apes are doing, or capable of doing. You have to go clear back to field observations in the 1960s and earlier to support the contention that apes are vegetarians. That doesn't wash nowadays with the far more detailed field observations and studies of the '70s, '80s, and '90s. Chimp and gorilla behavior is diverse, and it is difficult to observe and draw reliable conclusions without spending many months and/or years of observation. And as the studies of Jane Goodall and others since have repeatedly shown, the early studies were simply not extensive enough to be reliable.[73]

Citing of outdated science an earmark of idealism out of touch with reality. Science is a process of repeated observation and progressively better approximations of the "real world," whatever that is. It is critical, then, that we look at recent evidence, which has elaborated on, refined, and extended earlier work. When you see anybody--such as apologists for "comparative anatomy" vegetarian idealism (or in fact anybody doing this on any topic)--harking back to outdated science that has since been eclipsed in order to bolster their views, you should immediately suspect something.

Accumulation of modern post-1960s research shows apes are not actually vegetarians. The main problem with the comparative anatomy argument, then--at least when used to support vegetarianism--is that scientists now know that apes are not vegetarians after all, as was once thought. The comparative anatomy argument actually argues for at least modest amounts of animal flesh in the diet, based on the now much-more-complete observations of chimpanzees, our closest animal relatives with whom we share somewhere around 98 to 98.6% of our genes.[74] (We'll also look briefly at the diets of other apes, but the chimpanzee data will be focused on here since it has the most relevance for humans.)

Diet of chimpanzees. Though the chimp research is rarely oriented to the specific types of percentage numerical figures we Hygienists would want to see classified, from what I have seen, it would probably be fair to estimate that most populations of chimpanzees are getting somewhere in the neighborhood of 5%* of their diet on average in most cases (as a baseline) to perhaps 8-10%* as a high depending on the season, as animal food--which in their case includes bird's eggs and insects in addition to flesh--particularly insects, which are much more heavily consumed than is flesh.[75]

Meat consumption by chimps. There is considerable variation across different chimp populations in flesh consumption, which also fluctuates up and down considerably within populations on a seasonal basis as well. (And behavior sometimes differs as well: Chimps in the Tai population, in 26 of 28 mammal kills, were observed to break open the bones with their teeth and use tools to extract the marrow for consumption,[76] reminiscent of early Homo habilis.) One population has been observed to eat as much as 4 oz. of flesh per day during the peak hunting season, dwindling to virtually nothing much of the rest of the time, but researchers note that when it is available, it is highly anticipated and prized.[77] It's hard to say exactly, but a reasonable estimate might be that on average flesh may account for about 1-3% of the chimp diet.[78]

The more significant role of social-insect/termite/ant consumption. Now of course, meat consumption among chimps is what gets the headlines these days,[79] but the bulk of chimpanzees' animal food consumption actually comes in the form of social insects[80] (termites, ants, and bees), which constitute a much higher payoff for the labor invested to obtain them[81] than catching the colobus monkeys that are often the featured flesh item for chimps. However, insect consumption has often been virtually ignored[82] since it constitutes a severe blind spot for the Western world due to our cultural aversions and biases about it. And by no means is insect consumption an isolated occurrence among just some chimp populations. With very few exceptions, termites and/or ants are eaten about half the days out of a year on average, and during peak seasons are an almost daily item, constituting a significant staple food in the diet (in terms of regularity), the remains of which show up in a minimum of approximately 25% of all chimpanzee stool samples.[83]

Breakdown of chimpanzee food intake by dietary category. Again, while chimp researchers normally don't classify food intake by the types of volume or caloric percentages that we Hygienists would prefer to see it broken down for comparison purposes (the rigors of observing these creatures in the wild make it difficult), what they do record is illustrative. A chart for the chimps of Lope in Gabon classified by numbers of different species of food eaten (caveat: this does not equate to volume) shows the fruit species eaten comprising approx. 68% of the total range of species eaten in their diets, leaves 11%, seeds 7%, flowers 2%, bark 1%, pith 2%, insects 6%, and mammals 2%.[84] A breakdown by feeding time for the chimps of Gombe showed their intake of foods to be (very roughly) 60% of feeding time for fruit, 20% for leaves, with the other items in the diet varying greatly on a seasonal basis depending on availability. Seasonal highs could range as high as (approx.) 17% of feeding time for blossoms, 22-30% for seeds, 10-17% for insects, 2-6% for meat, with other miscellaneous items coming in at perhaps 4% through most months of the year.[85] Miscellaneous items eaten by chimps include a few eggs,[86] plus the rare honey that chimps are known to rob from beehives (as well as the embedded bees themselves), which is perhaps the most highly prized single item in their diet,[87] but which they are limited from eating much of by circumstances. Soil is also occasionally eaten--presumably for the mineral content according to researchers.[88]
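Because the Gombe figures above are given as seasonal ranges scattered through the prose, a small Python sketch can make the rough animal-food arithmetic explicit. The midpoints below are my own simplification of the quoted ranges, and feeding time is not the same thing as volume or calories, as the interview itself cautions.

    # Hypothetical midpoints of the Gombe seasonal-high ranges quoted
    # above (percent of feeding time); purely illustrative.

    seasonal_highs = {
        "insects": (10 + 17) / 2,  # quoted as 10-17% at seasonal highs
        "meat": (2 + 6) / 2,       # quoted as 2-6% at seasonal highs
    }

    animal_share = sum(seasonal_highs.values())
    print(f"Rough seasonal-high animal-food share: ~{animal_share:.1f}% of feeding time")
    # Compare with the ~5% baseline (to ~8-10% seasonal high) estimate
    # for animal food given earlier in the interview.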

Fluid intake in chimps not restricted to fruit, and includes water separately. For those who suppose that drinking is unnatural and that we should be able to get all the fluid we need from "high-water-content" foods, I have some more unfortunate news: chimps drink water too. Even the largely frugivorous chimp may stop 2-3 times per day during the dry season to stoop and drink water directly from a stream (but perhaps not at all on some days during the wet season), or from hollows in trees, using a leaf sponge if the water cannot be reached with their lips.[89] (Or maybe that should be good news: If you've been feeling guilty or substandard for having to drink water in the summer months, you can now rest easy knowing your chimp brothers and sisters are no different!)

The predilection of chimpanzees toward omnivorous opportunism. An important observation that cannot be overlooked is the wide-ranging omnivorousness and the predilection for tremendous variety in chimpanzees' diet, which can include up to 184 species* of foods, 40-60 of which may comprise the diet in any given month, with 13 different foods per day being one average calculated.[90] Thus, even given the largely frugivorous component of their diets, it would be erroneous to infer from that (as many Hygienists may prefer to believe) that the 5% to possibly 8% or so of their diet that is animal foods (not to mention other foods) is insignificant, or could be thrown out or disregarded without consequence--the extreme variety in their diet being one of its defining features. Over millions of years of evolution, the wheels grind exceedingly fine, and everything comes out in the wash. Remember that health is dependent on getting not just the right amounts of macroelements such as carbohydrates, fats, and proteins, but also critical amounts of trace minerals and vitamins, for instance. We require, and are evolutionarily adapted to, the behavior that is natural to us. Where chimps are concerned, 5% or 8% animal food--whatever it actually is--is a modest but significant amount, and not something you can just say is incidental or could be thrown out without materially changing the facts.

Other ape diets. In order of how closely related the other great apes are to humans, the gorilla is next after the chimpanzee, then the orangutan, and then the gibbon, in decreasing order.[91] I'll just briefly summarize a few basic facts about the other great apes here, concentrating primarily on the gorilla.

Diet of gorillas compared with chimps. Interestingly, while the gorilla has often been cited as a model in the modern mythology of "fruitarianism,"[92] on average it is actually the least frugivorous of the apes. Highland gorillas (where less fruit is available in their higher-altitude mountainous habitat) have become primarily folivorous (leaf/vegetative-eaters), while the lowland gorilla is more of a hybrid folivore/frugivore.[93] I might mention in this regard that there is some suggestion chimps seem not to prefer extra-high roughage volumes, at least compared to the gorilla. Certainly they do not seem to be able to physiologically tolerate as much cellulose from vegetative matter in their diet.*[94] Gorillas can, however, tolerate higher amounts of folivorous matter, due apparently to their more varied and extensive intestinal flora and fauna.[95] Chimps, however, are known to "wadge" some of their foods, which is a form of juicing that has the effect of reducing their fiber intake.[96] Wadging means that they make a wad of leaves which is mixed in with the primary food item (such as a fruit) as a mass, which is then used as a "press" against their teeth and palate to literally "juice" the main food, which they may suck on for up to 10 minutes before discarding the wadge of fiber after all the juice has been sucked out. Wadging may also serve as a way to avoid biting into potentially toxic seeds of certain fruits, from which they can then still extract the juices safely, or as a way to handle very soft items such as pulpy or overripe fruits, as well as eggs and meat.[97] Such behavior ought to debunk the prevalent Hygienic/raw-foods myth that it is always the more natural thing to do to eat "whole" rather than fragmented foods. This is not necessarily true, and again, such a view is based in subjective definitions out of touch with the real world. Another example here is that chimps (and gorillas as well) also eat a fair amount of "pith" in their diet--meaning the insides of stems of various plants--which they first have to process by peeling off the tough outer covering before the pith inside is either eaten or wadged.[98]

Other apes less closely related to humans. All the great apes, with the exception of the gorilla, are primarily frugivorous, but they do eat some animal products as well, though generally less than the chimp--although lowland gorillas eat insects at a comparable rate to chimps. In decreasing order of animal food consumption in the diet, the orang comes first after the chimp, then the bonobo chimp, the gibbon, the lowland gorilla, and the highland gorilla--the latter eating any animal foods (as insects) incidentally in or on the plants eaten. Again, remember, animal food consumption here does not equate solely with flesh consumption, as that is less prominent than insects in ape diets. The chimp and bonobo chimp are the only ones to eat flesh (other than a rare occurrence of an orang who was observed doing so once). All the apes other than the highland gorilla eat at least some social insects, with the chimp, bonobo chimp, and orang also partaking of bird's eggs.[99]

Update on fat consumption in preagricultural and hunter-gatherer diets


*** "...there was much talk [in The Paleolithic Prescription] about the kind of lean game animals our ancestors in Paleolithic times (40,000 years ago) ate as an aspect of their otherwise high-plant-food diet..."
As one of the few initial published summaries of the modern Paleodiet evidence, it may be that The Paleolithic Prescription's description of the levels of fat believed to have prevailed during the Paleolithic was somewhat on the conservative side. However, the picture at this point is still being fleshed out and undergoing debate. Of perhaps most relevance, though, is not so much the absolute levels of fat, but rather what the fat-intake profile would have been in terms of saturated vs. polyunsaturated and monounsaturated fats.

Organ meats favored in preference to muscle meats in hunter-gatherer diets. Observations of modern hunter-gatherers have shown that muscle meats (the leanest part of the animal) are least preferred, sometimes even being thrown away in times of plenty, in preference to the fattier portions. Eaten first are the organs such as brains, eyeballs, tongue, kidneys, bone marrow (high in monounsaturated fat), and storage fat areas such as mesenteric (gut) fat. (Even this gut fat is much less saturated in composition, however, than the kind of marbled fat found in the muscle meat of modern feedlot animals.) There is no reason to believe earlier hunter-gatherers would have been any different in these preferences, since other species of animals who eat other animals for food also follow the same general order of consumption.

Type of fat may be more important than amount of fat. As a related point, while it is likely to be controversial for some time to come, an increasing amount of recent research on fats in the diet suggests there may actually be little if any correlation between the overall level of fat consumption and heart disease, atherosclerosis, cancer, etc., contrary to previous conclusions. Instead, the evidence seems to point more specifically to the trans fats (hydrogenated vegetable oils used to emulsify and preserve foods) which permeate much of the modern food supply in supermarkets, and highly saturated fats (including dairy products and probably saturated commercial meats, but not lean game meats), along with, perhaps, a high consumption of starches and refined carbohydrates driving the hyperinsulinism syndrome. (See the postscript to Part 2 for a brief discussion of the recent findings on hyperinsulinism.)

Cardiovascular disease and cancer are rare in hunter-gatherers, despite their fat/protein intake. Given that atherosclerosis, heart disease, and cancer are almost non-existent even in longer-lived members of hunter-gatherer tribes eating large amounts of meat, and whose diets much better approximate what the evolutionary diet is thought to have been like than the diets of modern cultures, there is good reason to believe the earlier mainstream research on fats was faulty. (For details on health and disease in hunter-gatherers, see Hunter-Gatherers: Examples of Healthy Omnivores, on this site. The next page of Part 1 here will also discuss corrected anthropological survey data showing the average level of meat consumption of hunter-gatherers to be in excess of 50% of the diet.) Review papers and publications by fat and cholesterol researcher Mary Enig, Ph.D., and by Russell Smith and others have revealed a pattern of misinterpretation and misrepresentation of results of many earlier studies on dietary fat and cholesterol. (Enig recommends Smith's comprehensive Diet, Blood Cholesterol and Coronary Heart Disease: A Critical Review of the Literature [Vector Enterprises, Santa Monica, CA (1988)].)

Co-evolution of increased human brain size with decreased size of digestive system

Data points to increasing dependence on denser foods, processed by a less energy-intensive gut to free up energy for the evolving brain.
Also, left completely out of Part 1 of the interview above--because I had only a passing familiarity with the evidence at the time--are recent findings pointing to a correlation between increasing levels of animal flesh in the diet over the eons and the near-tripling of the human brain in size: from 375-550cc at the time of Australopithecus, to 500-800cc in Homo habilis, 775-1225cc in Homo erectus, and 1350cc in modern humans (Homo sapiens).

Sufficient amounts of long-chain fatty acids essential to support brain growth. While the specific evolutionary factor(s) that drove the increase in human brain size are still being speculated about, one recent paper suggests that--whatever the causes--the evolutionary increase in brain size would not have been able to be supported physiologically without an increased intake of preformed long-chain fatty acids, which are an essential component in the formation of brain tissue. [Crawford 1992]

Animal prey likeliest source for required amounts of long-chain fatty acids during human brain evolution. Lack of sufficient intake of long-chain fatty acids in the diet would therefore be a
limiting factor on brain growth, and these fatty acids are much more abundant in animal foods than in plant foods. (Relative brain size development in herbivorous mammals was apparently limited by the amount of these fatty acids available to them in plant foods.) Given the foods available in humanity's habitat during evolution, the necessary level of long-chain fatty acids to support the increasing size of the human brain would therefore presumably only have been available through increased intake of flesh.

Human brain size since the late Paleolithic has decreased in tandem with decreasing contribution of animal food to diet. In addition, a recent analysis updating the picture of
encephalization (relative brain size) changes in humans during our evolutionary history has revealed that human cranial capacity has decreased by 11% in the last 35,000 years, the bulk of it (8%) in the last 10,000 [Ruff, Trinkaus, and Holliday 1997]. Eaton [1998] notes that this correlates well with decreasing amounts of animal food in the human diet during this timeframe. (Of particular relevance here is that most of this decrease in animal foods correlates with the dawn of agriculture 10,000 years ago.)
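
Taking midpoints of the cranial-capacity ranges quoted above as single representative values--an illustrative simplification on my part, not measured data--the "near-tripling" and the recent reversal can be checked in a few lines:

```python
# Back-of-the-envelope check on the cranial-capacity figures quoted
# above. Midpoints of the quoted ranges stand in for single values --
# an illustrative simplification, not measured data.

capacities_cc = {
    "Australopithecus": (375 + 550) / 2,    # ~462 cc
    "Homo habilis":     (500 + 800) / 2,    # ~650 cc
    "Homo erectus":     (775 + 1225) / 2,   # ~1000 cc
    "Homo sapiens":     1350,
}

growth = capacities_cc["Homo sapiens"] / capacities_cc["Australopithecus"]
print(f"overall growth factor: {growth:.1f}x")   # ~2.9x -- the "near-tripling"

# An 11% decline over the last 35,000 years implies a Late Paleolithic
# peak of roughly 1350 / (1 - 0.11), i.e. about 1517 cc.
print(f"implied Late Paleolithic peak: {1350 / (1 - 0.11):.0f} cc")
```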

The central role of DHA in brain growth. Eaton [1998] also notes the obvious hypothesis here
would be that shortfalls in the preformed long-chain fatty acids important to brain development are logical candidates as the potentially responsible factors, most particularly docosahexaenoic acid (DHA), which is the long-chain fatty acid in most abundance in brain tissue, as well as docosatetraenoic acid (DTA), and arachidonic acid (AA). (The human body can synthesize these from their 18-carbon precursors linoleic acid (LA) and alpha-linolenic acid (ALA)--obtainable from plant foods--but the rate of synthesis does not match the amounts that can be gotten directly from animal foods. Additionally, an excessive amount of LA compared to ALA, which is likely when plant foods predominate in the diet, inhibits the body's ability to synthesize DHA endogenously, compounding the problem.) This evidence of decreasing brain size in the last 35,000 years, and particularly the last 10,000, represents important potentially corroborative evidence for the continuing role of animal foods in human brain development, since dietary changes in this most recent period of human prehistory can be estimated with more precision than dietary composition earlier in human evolution. While it should be clearly noted here that correlation alone is not causation, at the same time it should be acknowledged that there seem to be no other worthy hypotheses as yet to explain the dietary basis that could have supported the dramatic increase in brain size during human evolution.

Recent tuber-based hypothesis for evolutionary brain expansion fails to address key issues such as DHA and the recent fossil record. As a case in point, there has been one tentative alternative
hypothesis put forward recently by primatologist Richard Wrangham et al. [1999] suggesting that perhaps cooked tubers (primarily a starch-based food) provided additional calories/energy that might have supported brain expansion during human evolution. However, this idea suffers from some serious, apparently fatal flaws, in that the paper failed to mention or address critical pieces of key evidence regarding brain expansion that contradict the thesis. For instance, it overlooks the crucial DHA and/or DHA-substrate adequacy issue just discussed above, which is central to brain development and perhaps the most gaping of the holes. It's further contradicted by the evidence of an 8% decrease in human brain size during the last 10,000 years, despite massive increases in starch consumption since the Neolithic revolution which began at about that time. (Whether the starch is from grain or tubers does not essentially matter in this context.) Meat consumption levels--and therefore presumed DHA intake--trending both upward *and* downward over human evolution, on the other hand, track relatively well not simply with the observed brain-size increases during human evolution, but with the Neolithic-era decrease as well. [Eaton 1998] These holes, among others in the hypothesis, will undoubtedly be drawing comment from paleo researchers in future papers, and hopefully there will be a writeup on Beyond Veg as more is published in the peer-reviewed journals in response to the idea. At this point, however, it does not appear to be a serious contender in plausibly accounting for all the known evidence.

Co-evolution of increased brain size with concurrent reduction in size of the human gut.
Recent work is showing that the brain (20-25% of the human metabolic budget) and the intestinal system are both so metabolically energy-expensive that in mammals generally (and this holds particularly in

216

primates), an increase in the size of one comes at the expense of the size of the other in order not to exceed the organism's limited "energy budget" that is dictated by its basal metabolic rate. The suggestion here is not that the shrinkage in gut size caused the increase in brain size, but rather that it was a necessary accompaniment. In other words, gut size is a constraining factor on potential brain size, and vice versa. [Aiello and Wheeler 1995]
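
As a rough illustration of this "energy budget" constraint: the 20-25% brain share is the figure cited above, while the resting energy budget below is only an assumed ballpark figure for illustration:

```python
# Rough illustration of the fixed "energy budget" idea from Aiello and
# Wheeler. The 1,400 kcal/day basal metabolic rate is an assumed
# placeholder; only the 20-25% brain share comes from the text above.

basal_budget_kcal = 1400.0

for brain_share in (0.20, 0.25):
    brain_cost = basal_budget_kcal * brain_share
    print(f"brain at {brain_share:.0%}: ~{brain_cost:.0f} kcal/day")

# Under a fixed budget, every extra kcal the brain claims must be freed
# elsewhere -- e.g., by a smaller, metabolically cheaper gut, which in
# turn demands denser, more easily digested foods.
```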

Human gut has evolved to be more dependent on nutrient- and energy-dense foods than other primates. The relationship of all this to animal flesh intake is that compared to the other primates,
the design of the more compact human gut is less efficient at extracting sufficient energy and nutrition from fibrous foods and considerably more dependent on higher-density foods of higher bioavailability, which require less energy for their digestion per unit of energy/nutrition released. Again, while it is not clear that the increasing levels of animal flesh in the human diet were a directly causative factor in the growth of the evolving human brain, their absence would have been a limiting factor regardless, without which the change likely could not have occurred. Other supporting data suggest that in other animals there is a pattern whereby those with larger brain-to-body-size ratios are carnivores and omnivores, with smaller, less complex guts, and dependent on diets of denser nutrients of higher bioavailability. Vegetarian philosophy has traditionally relied on observing that the ratio of intestinal length to body trunk length in humans parallels that of the other primates as an indication that the human diet should also parallel their more frugivorous/vegetarian diet. However, this observation is based on the oversimplification that gut length is the relevant factor, when in fact both cell types and intestinal surface area are the more important operative factors, the latter of which can vary greatly depending on the density of villi lining the intestinal walls. In these respects, the human gut shares characteristics common to both omnivores and carnivores. [McArdle 1996, p. 174] Also, intestinal length does not necessarily accurately predict total gut mass (i.e., weight), which is the operative criterion where brain size/gut size relationships are at issue. The human pattern of an overall smaller gut with a proportionately longer small intestine dedicated more to absorptive functions, combined with a simple stomach, fits the same pattern seen in carnivores. [Aiello and Wheeler 1995, p. 206]

Corrected anthropological survey data shows meat averages over 50% of hunter-gatherer diets
*** "No text I have yet read ventures any sort of percentage figure from this time period [the historical period of Homo erectus' existence], but it is commonly acknowledged that plants still made up the largest portion of the subsistence."
The most significant correction to Part 1 of the interview series here involves newer Paleodiet analysis of the amount of plant vs. animal food in modern hunter-gatherer diets, which--in conjunction with optimal foraging theory and knowledge of ancient habitats--can be used as a basis for extrapolating backward to estimate what that of our hominid ancestors may have been. The widely quoted figures of Richard Lee (in Man the Hunter, 1968) stating an average of 65% plant and 35% animal food for modern hunter-gatherers have, upon review by other researchers, been discovered to have been somewhat flawed. (Lee's 65%/35% ratio was in turn repeated in the research sources I used for this interview, but I had not traced the figures to their ultimate source at the time.)

Previous meta-analysis of Ethnographic Atlas survey data on hunter-gatherer diets was performed incorrectly. As noted by Ember [1978], it turns out that in calculating his averages, Lee
somewhat arbitrarily threw out a portion of the North American hunter-gatherer cases (who often had higher rates of meat consumption); plus he classified shellfishing as a "gathering" activity (a category normally used for plant-food gathering). Taken together, these skewed the resulting average considerably. Reanalysis correcting for Lee's analytical errors shows a more likely average of somewhere between 50-65% meat consumption in modern hunter-gatherer diets. (For a brief enumeration of the reanalysis that researcher Loren Cordain's group has done, see the first section or two of Metabolic Evidence of Human Adaptation to Increased Carnivory.)
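
To see how those two analytical choices skew an average, here is a toy recomputation; every "society" and percentage below is invented purely to show the mechanics, not Lee's or Ember's actual data:

```python
# Toy illustration of how dropping high-meat cases and classifying
# shellfishing as "gathering" both pull an average downward.
# Each tuple: (% hunting/fishing, % shellfishing, % plant gathering).
# All numbers are invented for illustration.

societies = [
    (30, 0, 70),    # low-meat tropical foragers
    (40, 20, 40),   # coastal group relying heavily on shellfish
    (80, 0, 20),    # high-meat northern group (the kind dropped)
    (55, 10, 35),
]

def avg_animal_pct(cases, shellfish_is_animal, drop_high_meat):
    kept = [c for c in cases if not (drop_high_meat and c[0] >= 70)]
    totals = [h + (s if shellfish_is_animal else 0) for h, s, _ in kept]
    return sum(totals) / len(kept)

print("Lee-style average: ", avg_animal_pct(societies, False, True))   # ~41.7
print("corrected average: ", avg_animal_pct(societies, True, False))   # ~58.8
```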


Modern hunter-gatherer diets vis-a-vis optimal foraging theory and reconstructed prehistoric diets. An important issue worthy of mention here is the reliability of backward extrapolations
from behaviorally modern hunter-gatherers to more primitive hominids. The phrase "behaviorally modern" generally refers to the time horizon of approximately 40,000 B.C., during and after which humans began exhibiting behaviors we think of as modern: such as ritual burials, cave paintings, body ornamentation, other expressions of art, and most importantly for our purposes here, more sophisticated tool design, use, and techniques where hunting is concerned (eventually culminating in the bow-and-arrow, for example) which would presumably have increased hunting success. In light of this, critics have questioned how reliable the backward extrapolations may be. Proponents, in justification, note that well-established "optimal foraging theory" (which, reduced to its bare essentials, says that all creatures tend to expend the least foraging effort for the greatest caloric and/or nutritional return) can be used as a basis to make reasonable predictions. The kind of food intake that optimal foraging theory predicts for a particular species (modern or ancient) is in turn dependent on the surrounding environment and foods prevailing at the time (i.e., savanna habitat, generally, in the case of humans), along with what is known about the kind of foods their digestive physiology would likely be able to efficiently process. Based on this approach, proponents believe even with less sophisticated hunting technologies--given what is now known about ancient environments and availabilities of ancient plant and animal resources--the level of animal food in the diet beginning with erectus, at least, probably would have been in the 50% range with variations for local habitat. The increasing encephalization quotient (brain volume relative to body size) of Homo and particularly the jump in brain size with Homo erectus at 1.7 million years ago also tends to corroborate the suggestion of increasingly large amounts of meat in the diet at this time.
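
Reduced to arithmetic, optimal foraging theory is essentially a ranking of available resources by net caloric return per unit of foraging effort. A minimal sketch, with resource names and numbers invented for illustration:

```python
# Minimal sketch of the optimal-foraging logic: rank food resources by
# caloric return per hour of foraging effort. All names and numbers
# are invented for illustration only.

resources = {
    "large game":  (15000, 6.0),   # (kcal obtained, hours of effort)
    "tubers":      (3000, 2.5),
    "wild fruit":  (1200, 2.0),
    "small seeds": (800, 4.0),
}

ranked = sorted(resources.items(),
                key=lambda item: item[1][0] / item[1][1],
                reverse=True)

for name, (kcal, hours) in ranked:
    print(f"{name:12s} {kcal / hours:7.0f} kcal/hour")

# A forager maximizing return works down this list as local habitat
# and availability allow -- which is why habitat reconstruction matters.
```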

Meat consumption levels in early Homo species

*** "based on the fact that the diet of modern hunter gatherers...has not been known to exceed 40% meat in tropical habitats like habilis evolved in, we can safely assume that the meat in habilis' diet would have been substantially less than that."
As just discussed above regarding earlier analyses of anthropological data that have since been corrected, the average percentage of meat eaten by all modern hunter-gatherers studied to date who have been described in the Ethnographic Atlas (an anthropological compendium and database of hunter-gatherer data) has now been shown to be in the 50-65% range, looking at the entire range of habitats studied. There are tropical hunter-gatherers eating in this range of meat consumption as well. Therefore, the percentage of meat in habilis' diet becomes an even more interesting question than before. As the diet of habilis' precursor Australopithecus would presumably have been higher in meat, at some undetermined level, than that of modern chimps (who are in the 2% range of meat consumption), while habilis' successor erectus is now thought to have been near the range of modern hunter-gatherers (perhaps 50%), the question of where between those wide endpoints habilis' meat consumption fell is not something I am prepared to guess. However, when we note that habilis was the first known hominid tool user--and that those tools were specifically used for processing meat--it seems logical to suggest that the amount must have jumped well above that of Australopithecus.

*** "1,500,000 to 230,000 B.C.: Evolution of Homo habilis into the 'erectines'..."
New and controversial reanalysis of previous fossil discoveries in Java has suggested erectus' span of existence may possibly have extended to as late as 30,000-50,000 B.C. in isolated areas, although the new analysis is still undergoing intensive debate. [Wilford 1996]


*** "There were also physical changes in response to the colder and darker [more northerly] areas that were inhabited [by erectus after 700,000 years ago], such as the development of lighter skin color that allowed the sun to penetrate the skin and produce vitamin D, as well as the adaptation of the fat layer and sweat glands to the new climate."
To avoid confusion here, it should be mentioned that these adaptations are believed to have evolved a second time in modern Homo sapiens 100,000-200,000 years ago once they began migrating out of Africa--after having evolved from a group of erectines that had remained behind in Africa while other erectines were spreading northward during the earlier migrational wave 700,000 years ago.

Research updates relating to diet and health late in human evolution


*** "...but widespread global use [of seafoods] in the fossil record is not seen until around 20,000 years ago and since. For the most part, seafoods should probably not be considered a major departure, however, as the composition of fish, shellfish, and poultry more closely resembles the wild land-game animals many of these same ancestors ate than any other source today except for commercial game farms that attempt to mimic ancient meat."
The fact that fish is one of the foods some people are allergic to may be an indicator that evolutionary adaptation to this food is not yet complete, or is at least still problematic for some genetic subgroups; though compared to grains and dairy, there is of course a far better case for including fish in a Paleolithic diet. Strictly speaking, meat from land animals is the primary flesh-food adaptation for humans. (The small minority of researchers promoting the very controversial "aquatic ape" theory of human evolution--presently considered fringe science--which proposes a brief formative phase of human evolution taking place in coastal waters, might disagree, of course.) To this date, I have not myself seen or heard convincing research either way on how well-adapted the human species overall may be to fish consumption compared with land game animals.

*** "35,000 B.C. to 15-10,000 B.C.: The Cro-Magnons (fully modern pre-Europeans) thrive in the cold climate of Europe via big-game hunting, with meat consumption rising to as much as 50% of the diet."
At least one researcher has voiced the opinion that, given that Ice-Age Europe in many areas was not too dissimilar from the arctic climate that modern-day Eskimos inhabit [i.e., very little plant food available], meat consumption in Europe during the last Ice Age may have approached similar levels--that is, considerably more than 50%, up to as high as 90% in some areas.

*** "Wild grasses and cereals began flourishing [around 10,000 B.C.] making them prime candidates for the staple foods to be domesticated, given our previous familiarity with them."
This statement could be construed as giving the impression that wild grasses and cereals may have been flourishing in many places around the globe. While it is true that environmental conditions at this time became considerably more favorable to wild grasses and cereals, I have since learned that the ones which thrived naturally and lent themselves best to the development of agriculture (wheat, barley, etc.) were indigenous only to a limited number of locales around the globe, the most notable of which, of course, was the Near East, where agriculture got its earliest start. A more accurate statement would be that peoples in the locations where these grasses and cereals began flourishing indigenously took advantage of the improved weather and environment to exploit them even further. The grasses and cereals would not have flourished to the extent they did, nor--most importantly--spread as widely as they came to do (virtually around the globe), were it not for proactive human intervention.


*** "During the time since the beginning of the Neolithic, the ratio of plant-to-animal foods in the diet has sharply increased from an average of probably 65%/35% during Paleolithic times to as high as 90%/10% since the advent of agriculture."
As stated above, the average animal food consumption during the Paleolithic was more likely in the 50-65% range than 35%. This makes the decrease down to 10% of the diet in many places even more noteworthy in terms of suggesting the possible nutritional repercussions of modern-day diets.

*** "Skeletal remains show that height decreased by four inches from the Late Paleolithic to the early Neolithic..."
Various sources I've seen since say as much as a six-inch decrease in height. It is worth noting in this connection that vegetarians will often argue that the extra growth animal foods promote is "pathological" hypergrowth that leads to premature aging, etc. It should be noted, however, that recent studies show that the reduced growth sometimes seen in infants on vegan diets can lead to failure to meet accepted benchmarks for health status during the growth period. In refuting the common claim of vegetarians that the rebound in adult heights of modern populations eating more animal food is nothing but "pathological hypergrowth," it is also worth repeating that the historical decrease in height noted in skeletal remains after the Neolithic (when animal food consumption began plunging) was accompanied by skeletal signs of disease-stress, as mentioned in Part 1 of the interview. Also, longevity of Neolithic peoples, who ate diets higher in plant food (particularly grains) and lower in animal food, was in general a bit shorter than that of Paleolithic peoples. [Angel 1984]

Clarifications regarding chimpanzee diet

*** "The argument made is very similar to the "comparative anatomy" argument: Look at the rest of the animals, and especially look at the ones we are most similar to, the apes."
A brief note: the original article noted that humans are most genetically similar to chimpanzees. If more space had been available, I would have made the further distinction that we are just as genetically close to the bonobo chimpanzee. However, since the common chimpanzee has been more closely studied than the rare bonobo chimp, the research from these studies has (until recently, perhaps) provided more wide-ranging insights. So far as I am aware, the main observed differences between the common chimp and the bonobo, diet-wise, are that bonobos eat a higher percentage of fruit in the diet (approx. 80% vs. up to 60%-65% for the common chimp); and apparently insect and flesh consumption is less. Interestingly reminiscent of human behavior, the bonobos' highly sexual behavior in many common social situations seems to serve a communication function both as a social "glue" and as a conciliatory mechanism between individuals in potentially divisive situations. (Examples: bonobos will often engage in sex-play upon coming onto a major food source--sensory excitation often turns into sexual excitation, which is then discharged in fraternization with others in the group and seems to promote group cohesiveness. Also, when spats occur between bonobos, sexual engagement afterwards may serve as a way to smooth the incident over and reestablish normal relations.)

*** "...it would probably be fair to estimate that most populations of chimpanzees are getting somewhere in the neighborhood of 5% of their diet on average in most cases (as a baseline) to perhaps 8-10%..."
While the 5% figure was the best guess I could make at the time based on the difficulty in finding any research with precise figures, it does appear to be fairly close to the mark. I have since seen figures quoted in the 4% to 7% range for chimp animal-food consumption. 8-10% is in all likelihood too high as an average, though consumption could plausibly reach that high at the height of the termite season or during the peak months for red colobus monkey kills.

*** "An important observation that cannot be overlooked is the wide-ranging omnivorousness and the predilection for tremendous variety in chimpanzees' diet, which can include up to 184 species of foods..."
Something I overlooked in this factoid taken from Goodall's The Chimpanzees of Gombe was that 26 out of the 184 items were observed to be eaten only once. Nevertheless, the 158 remaining items still constitute a remarkable variety of food items. That the additional items would still be sought out, on top of an already high level of variety, can be looked at as an interesting indicator of the "opportunistic" nature of chimpanzees in taking advantage of whatever food they can find and utilize. (In other words, as a "model" for the kind of dietary behavior that vegetarians sometimes hold up that humans ought to be compared to, chimps' opportunism suggests they are not particularly constrained by arbitrary food categories in deciding what is appropriate or not to eat.)

*** "...there is some suggestion chimps seem not to prefer extra-high roughage volumes, at least compared to the gorilla. Certainly they do not seem to be able to physiologically tolerate as much cellulose from vegetative matter in their diet."
This could be a bit misleading if construed to mean chimps don't have a lot of roughage in their diet, because they certainly do. It remains true, however, that chimps do not possess the requisite kind of symbiotic intestinal flora (bacteria) to subsist on diets as high in leafy vegetative matter as gorillas do. The subsequent discussion on wadging was meant to show two things: that chimp behavior is inventive toward meeting their needs, and also that wadging can be a way of exploiting the part of fibrous foods they could otherwise not eat freely of due to the limitations of their gut. The relevance of these observations for human diet would be that the first hominids went much further in the direction of less fibrous foods than chimpanzees were to go after the divergence from our common ape ancestor, and toward a much more concentrated diet higher in animal foods containing denser nutrients of higher bioavailability, as we discussed above regarding the increasing size of the human brain that occurred concomitantly with a reduction in the size of the gut.

This concludes Part 1. In Part 2, we'll look into such things as fire and cooking in human evolution, rates of genetic adaptation to change, modern hunter/gatherers, diseases in the wild, and then turn to "the psychology of idealistic diets."

Knowledge gap in vegetarian community about evolutionary data/implications

Health & Beyond: In Part 1 of our interview, you discussed the extensive evidence showing that primitive human beings as well as almost all of the primates today have included animal foods such as flesh or insects in their diets. Why haven't Natural Hygienists and other vegetarians looked into all this information?
Ward Nicholson: My guess is that: (1) Most aren't aware that paleoanthropologists have by now assembled a considerable amount of data about our evolutionary past related to diet. But more importantly, I think it has to do with psychological barriers, such as: (2) Many Hygienists assume they don't have to look because the subjective "animal model" for raw-food naturalism makes it "obvious" what our natural diet is, and therefore the paleontologists' evidence must be in error, or biased by present cultural eating practices. Or: (3) They don't want to look, perhaps because they're afraid of what they might see.


Many Hygienists identify the system mostly with certain dietary details, even though the system itself flows from principles independent of those details. I think in spite of what most
Natural Hygienists will tell you, they are really more wedded to certain specific details of the Hygienic system that remain prevalent (i.e., raw-food vegetarianism, food combining, etc.) than they are truly concerned with whether those details follow logically from underlying Hygienic principles. The basic principle of Natural Hygiene is that the body is a self-maintaining, self-regulating, self-repairing organism that naturally maintains its own health when it is given food and other living conditions appropriate to its natural biological adaptation. In and of itself, this does not tell you what foods to eat. That has to be determined by a review of the best evidence we have available. So while the principles of Hygiene as a logical system do not change, our knowledge of the appropriate details that follow from those principles may and probably will change from time to time--since science is a process of systematically elucidating more "known" information from what used to be unknown. Thus the accuracy of our knowledge is to some extent time-based, dependent on the accumulation of evidence to provide a more inclusive view of "truth" which unfortunately is probably never absolute, but--as far as human beings are concerned--relative to the state of our knowledge. Science simply tries to bridge the knowledge gap. And a hallmark of closing the knowledge gap through scientific discovery is openness to change and refinements based on the accumulation of evidence. Open-mindedness is really openness to change. Just memorizing details doesn't mean much in and of itself. It's how that information is organized, or seen, or interpreted, or related to, that means something.

Hygienic and vegan diets are a significant restriction of the diet(s) on which humans evolved. What's interesting to me is that the evolutionary diet is not so starkly different from the Hygienic
diet. Much of it validates important elements of the Hygienic view. It is very similar in terms of getting plenty of fresh fruits and veggies, some nuts and seeds, and so forth, except for the addition of the smaller role of flesh and other animal foods (at least compared to the much larger role of plant foods) in the diet. That is the one exception. We have actually done fairly well in approximating humanity's "natural" or "original" diet, except we have been in error about this particular item, and gotten exceedingly fundamentalist about it when there is nothing in the body of Hygienic principles themselves that would outlaw meat if it is part of our evolutionary adaptation.

Avowed Shelton loyalists are actually the ones who have most ignored his primary directive.
But for some reason, even though Natural Hygiene does not rest on any "ethical" basis for vegetarianism (officially at least), this particular item seems to completely freak most Hygienists out. Somehow we have made a religion out of dietary details that have been the hand-me-downs of past Hygienists working with limited scientific information. They did the best they could given the knowledge they had available to them then, and we should be grateful for their hard work. But today the rank and file of Natural Hygiene has largely forgotten Herbert Shelton's rallying cry, "Let us have the truth, though the heavens fall." Natural Hygiene was alive and vital in Shelton's time because he was actively keeping abreast of scientific knowledge and aware of the need to modify his previous views if scientific advances showed them to be inadequate. But since Shelton retired from the scene, many people in the mainstream of Hygiene have begun to let their ideas stagnate and become fossilized. The rest of the dietary world is beginning to pass us by in terms of scientific knowledge.

Only two insights remain that are still somewhat unique to Natural Hygiene. As I see it, there
remain only two things Natural Hygiene grasps that the rest of the more progressive camps in the dietary world still don't:

Strong recognition of the principle of the body as homeostatic self-healing mechanism. An understanding of the fundamental health principle that outside measures (drugs,
surgery, etc.) never truly "cure" degenerative health problems. In spite of the grandiose hopes and claims that they do, and the aura of research breakthroughs, their function is really to serve as crutches, which can of course be helpful and may truly be needed in some circumstances. But the only true healing is from within by a body that has a large capacity, within certain limits, to heal and regenerate itself when given all of its essential biological requirements--and nothing more or less which would hamper its homeostatic functioning. The body's regenerative (homeostatic) abilities are still commonly unrecognized today (often classed as "unexplained recoveries" or--in people fortunate enough to recover from cancer--as "spontaneous remission") because the population at large is so far from eating anything even approaching a natural diet that would allow their bodies to return to some kind of normal health, that it is just not seen very often outside limited pockets of people seriously interested in approximating our natural diet.

Fasting as a tool to promote such self-healing. And the other thing is that Hygienists are
also keenly aware of the power of fasting to help provide ideal conditions under which such self-healing can occur.

But the newer branch of science called "Darwinian medicine" is slowly beginning (albeit with certain missteps) to grasp the principle of self-healing, or probably more correctly, at least the understanding that degenerative diseases arise as a result of behavior departing from what our evolutionary past has adapted us to. They see the negative side of how departing from our natural diet and environment can result in degenerative disease, but they do not understand that the reverse--regenerating health by returning to our pristine diet and lifestyle, without drugs or other "crutches"--is also possible, again, within certain limits; but those limits are less restrictive than most people believe.

In some ways, though, Hygiene now resembles a religion as much as it does science, because
people seem to want "eternal" truths they can grab onto with absolute certainty. Unfortunately, however, knowledge does not work that way. Truth may not change, but our knowledge of it certainly does as our awareness of it shifts or expands. Once again: The principles of Hygiene may not change, but the details will always be subject to refinement.

The rift in the Natural Hygiene movement over raw vs. cooked foods
Speaking of such details subject to refinement, I know you've been sitting on some very suggestive evidence to add further fuel to the fire-and-cooking debate now raging between the raw-foodist and "conservative-cooking" camps within Hygiene. Please bring us up to date on what the evolutionary picture has to say about this.
I'd be happy to. But before we get into the evolutionary viewpoint, I want to back up a bit first and briefly discuss the strange situation in the Hygienic community occurring right now over the raw foods vs. cooking-of-some-starch-foods debate. The thing that fascinates me about this whole brouhaha is the way the two sides justify their positions, each of which has a strong point, but also a telling blind spot.

Character of the rift: doctors vs. the rank-and-file. Now since most Natural Hygienists don't have
any clear picture, based on science, of what behavior was natural in the evolutionary past, the "naturalistic" model used by many Hygienists to argue for eating all foods raw rests on a subjective basis--i.e., what I have called "the animal model for raw-food naturalism." The idea being that we are too blinded culturally by modern food practices involving cooking, and to be more objective we should look at the other animals--none of whom cook their food--so neither should we. Now it's true the "subjective raw-food naturalists" are being philosophically consistent here, but their blind spot is they don't have any good scientific evidence from humanity's primitive past to back up their claim that total raw-foodism is the most natural behavior for us--that is, using the functional definition based on evolutionary adaptation I have proposed if we are going to be rigorous and scientific about this.


Now on the other hand, with the doctors it's just the opposite story. In recent years, the Natural Hygiene doctors and the ANHS (American Natural Hygiene Society) have been more and more vocal about what they say is the need for a modest amount of cooked items in the diet--usually starches such as potatoes, squashes, legumes, and/or grains. And their argument is based on the doctors' experience that few people they care for do as well on raw foods alone as they do with the supplemental addition of these cooked items. Also, they argue that there are other practical reasons for eating these foods, such as that they broaden the diet nutritionally, even if one grants that some of those nutrients may be degraded to a degree by cooking. (Though they also say the assimilation of some nutrients is improved by cooking.) They also point out these starchier foods allow for adequate calories to be eaten while avoiding the higher levels of fat that would be necessary to obtain those calories if extra nuts and avocados and so forth were eaten to get them.

One side ignores the need for philosophical consistency. The other denies practical realities and real-world results. So we have those with wider practical experience arguing for the inclusion of
certain cooked foods based on pragmatism. But their blind spot is in ignoring or attempting to finesse the inconsistency their stance creates with the naturalist philosophy that is the very root of Hygienic thinking. And again, the total-raw-foodists engage in just the opposite tactics: being philosophically consistent in arguing for all-raw foods, but being out of touch with the results most other people in the real world besides themselves get on a total raw-food diet, and attempting to finesse that particular inconsistency by nitpicking and fault-finding other implementations of the raw-food regime than their own. (I might interject here, though we'll cover this in more depth later, that although it's not true for everyone, the experience of most people in the Natural Hygiene M2M supports the view that the majority do in fact do better when they add some cooked foods to their diet.)

Is there a way these two stances in the conflict over cooking can be reconciled and accounted for scientifically? Now my tack as both a realist and someone who is also interested in being
philosophically consistent has been: If it is true that most people* do better with the inclusion of some of these cooked items in their diet that we've mentioned--and I believe that it is, based on everything I have seen and heard--then there must be some sort of clue in our evolutionary past why this would be so, and which would show why it might be natural for us. The question is not simply whether fire and cooking are "natural" by some subjective definition. It's whether they have been used long enough and consistently enough by humans during evolutionary time for our bodies to have adapted genetically to the effects their use in preparing foods may have on us. Again, this is the definition for "natural" that you have to adopt if you want a functional justification that defines "natural" based on scientific validation rather than subjectivity.

When was fire first controlled by human beings?


So the next question is obvious: How long have fire and cooking been around, then, and how do we know whether that length of time has been long enough for us to have adapted sufficiently?
Let's take the question one part at a time. The short answer to the first part of the question is that fire was first controlled by humans anywhere from about 230,000 years ago to 1.4 or 1.5 million years ago, depending on which evidence you accept as definitive.

Evidence for very early control of fire is sparse and ambiguous. The earliest evidence for control of fire by humans, in the form of fires at Swartkrans, South Africa and at Chesowanja, in Kenya, suggests that it may possibly have been in use there as early as about 1.4
or 1.5 million years ago.[100] However, the interpretation of the physical evidence at these early sites has been under question in the archaeological community for some years now, with critics saying these fires could have been wildfires instead of human-made fires. They suggest the evidence for human control of fire might be a misreading of other factors, such as magnesium-staining of soils, which can mimic the results of fire if not specifically accounted for. For indisputable evidence of fire intentionally set and controlled by humans, the presence of a hearth or circle of scorched stones is often demanded as conclusive proof,[101] and at these early sites, the evidence tying the fires to human control is based on other factors.

Earliest dates for control of fire accepted by skeptical critics. At the other end of the timescale, these same critics who are only willing to consider the most unequivocal evidence will still admit that at least by 230,000 years ago[102] there is enough good evidence at at least one site to establish fire was under human control by that time. At this site, called Terra Amata, an ancient beach location on the French Riviera, stone hearths are found at the center of what may have been huts; and more recent sources may put the site's age at possibly 300,000 years old rather than 230,000.[103] Somewhat further back--from around 300,000 to 500,000 years ago--more evidence has been accumulating recently at sites in Spain and France[104] that looks as if it may force the ultraconservative paleontologists to concede their 230,000-year-ago date is too stingy, but we'll see.

And then there is Zhoukoudian cave in China, one of the most famous sites connected with Homo erectus, where claims that fire may have been used as early as 500,000 to 1.5 million years ago have now largely been discredited due to the complex and overlapping nature of the evidence left by not just humans, but hyenas and owls who also inhabited the cave. (Owl droppings could conceivably have caught fire and caused many of the fires.) Even after discounting the most extreme claims, however, it does seem likely that at least by 230,000 to 460,000 years ago humans were using fire in the cave[105], and given scorching patterns around the teeth and skulls of some animal remains, it does appear the hominids may have done this to cook the brains (not an uncommon practice among hunting-gathering peoples today).[106] The most recent excavation with evidence for early use of fire has been within just the last couple of years in France at the Menez-Dregan site, where a hearth and evidence of fire has been preliminarily dated to approximately 380,000 to 465,000 years. If early interpretations of the evidence withstand criticism and further analysis, the fact that a hearth composed of stone blocks inside a small cave was found with burnt rhinoceros bones close by has provoked speculation that the rhino may have been cooked at the site.[107]

Crux of the question: first control of fire vs. earliest widespread use. Now of course, the crucial
question for us isn't just when the earliest control of fire was; it's at what date fire was being used consistently--and more specifically for cooking, so that more-constant genetic selection pressures would have been brought to bear. Given the evidence available at this time, most of it would probably indicate that 125,000 years ago is the earliest reasonable estimate for widespread control.*[108] Another good reason it may be safer to base adaptation to fire and cooking on the figure of 125,000 years ago is that more and more evidence is indicating modern humans today are descended from a group of ancestors who were living in Africa 100,000-200,000 years ago, who then spread out across the globe to replace other human groups.[109] If true, this would probably mean the fire sites in Europe and China are those of separate human groups who did not leave descendants that survived to the present. Given that the African fire sites in Kenya and South Africa from about 1.5 million years ago are under dispute, then, widespread usage at 125,000 years ago seems the safest figure for our use here.

Sequence of stages in control: fire for warmth vs. fire for cooking. One thing we can say about
the widespread use of fire probable by 125,000 years ago, however, is that it would almost certainly have included the use of fire for cooking.* Why can this be assumed? It has to do with the sequence of progressive stages of control over fire that would have had to take place before fire usage became commonplace. And the most interesting of these is that fire for cooking would almost inevitably have been one of the first uses it was put to by humans, rather than some later-stage use.*

The first fires on earth occurred approximately 350 million years ago--the geological evidence for fire in remains of forest vegetation being as old as the forests themselves.[110] It is usual to focus only on fire's immediately destructive effects on plants and wildlife, but there are also benefits. In response to occasional periodic wildfires, for example, certain plants and trees known as "pyrophytes" have evolved, for whose existence periodic wildfires are essential. Fire revitalizes them by destroying their parasites and competitors; such plants include grasses eaten by herbivores as well as trees that provide shelter and food for animals.[111]

Opportunistic exploitation of animal kills by predators after wildfires. Fires also provide other
unintended benefits to animals as well. Even while a wildfire is still burning, birds of prey (such as falcons and kites)--the first types of predators to appear at fires--are attracted to the flames to hunt fleeing animals and insects. Later, land-animal predators appear when the ashes are smoldering and dying out to pick out the burnt victims for consumption. Others, such as deer and bovine animals, appear after that to lick the ashes for their salt content. Notable as well is that most mammals appear to enjoy the heat radiated at night at sites of recently burned-out fires.[112] It would have been inconceivable, therefore, that human beings, being similarly observant and opportunistic creatures, would not also have partaken of the dietary windfall provided by wildfires they came across. And thus, even before humans had learned to control fire purposefully--and without here getting into the later stages of control over fire--their early passive exposures to it would have already introduced them, like the other animals, to the role fire could play in obtaining edible food and providing warmth.

Potential adaptation to cooking in light of genetic rates of change

So if fire has been used on a widespread basis for cooking since roughly 125,000 years ago, how do we know if that has been enough time for us to have fully adapted to it?
To answer that, we have to be able to determine the rate at which the genetic changes constituting evolutionary adaptation take place in organisms as a result of environmental or behavioral change--which in this case means changes in food intake.

Rates of genetic change as estimated from speciation in the fossil record. The two sources for
estimates of rates at which genetic change takes place are from students of the fossil record and from population geneticists. Where the fossil record is concerned, Niles Eldredge, along with Stephen Jay Gould, two of the most well-known modern evolutionary theorists, estimated the time span required for "speciation events" (the time required for a new species to arise in response to evolutionary selection pressures) to be somewhere within the range of "five to 50,000 years."[113] Since this rough figure is based on the fossil record, it makes it difficult to be much more precise than that range. Eldredge also comments that "some evolutionary geneticists have said that the estimate of five to 50,000 years is, if anything, overly generous."[114] Also remember that this time span is for changes large enough to result in a new species classification. Since we are talking here about changes (digestive changes) that may or may not be large enough to result in a new species (though changes in diet often are in fact behind the origin of new species), it's difficult to say from this particular estimate whether we may be talking about a somewhat shorter or longer time span than that for adaptation to changes in food.

Measurements of genetic change from population genetics. Fortunately, however, the estimates
from the population geneticists are more precise. There are even mathematical equations to quantify the rates at which genetic change takes place in a population, given evolutionary "selection pressures" of a given magnitude that favor survival of those individuals with a certain genetic trait.[115] The difficulty lies in how accurately one can numerically quantify the intensity of real-world selection pressures. However, it turns out there have been two or three actual examples where it has been possible to do so at least approximately, and they are interesting enough I'll mention a couple of them briefly here so people can get a feel for the situation. The most interesting of these examples relates directly to our discussion here, and has to do with the gene for lactose tolerance in adults. Babies are born with the capacity to digest lactose via production of the digestive enzyme lactase. Otherwise they wouldn't be able to make use of mother's milk, which contains the milk sugar lactose. But sometime after weaning, this capacity is normally lost, and there is a gene that is responsible. Most adults--roughly 70% of the world's population overall--do not retain the ability to digest lactose into adulthood[116] and this outcome is known as "lactose intolerance." (Actually this is something of a misnomer, since adult lactose intolerance would have been the baseline normal condition for virtually everyone in the human race up until Neolithic (agricultural) times.[117]) If these people attempt to drink milk, then the result may be bloating, gas, intestinal distress, diarrhea, etc.[118]

Influence of human culture on genetic selection pressures. However--and this is where it gets
interesting--those population groups that do retain the ability to produce lactase and digest milk into adulthood are those descended from the very people who first began domesticating animals for milking during the Neolithic period several thousand years ago.[119] (The earliest milking populations in Europe, Asia, and Africa began the practice probably around 4,000 B.C.[120]) And even more interestingly, in population groups where cultural changes have created "selection pressure" for adapting to certain behavior--such as drinking milk in this case--the rate of genetic adaptation to such changes significantly increases. In this case, the time span for widespread prevalence of the gene for lactose tolerance within milking population groups has been estimated at approximately 1,150 years[121]--a very short span of time in evolutionary terms.
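
For readers who want a feel for the kind of equation involved, below is a minimal sketch of the standard one-locus selection recurrence for a dominant advantageous allele (adult lactase persistence behaves roughly this way). The selection coefficient and starting frequency are illustrative assumptions, not published estimates; the point is only that plausible cultural selection pressures can move an allele to high frequency on millennial rather than geological timescales:

```python
# Minimal sketch: standard one-locus selection recurrence for a
# dominant advantageous allele A (fitnesses w_AA = w_Aa = 1+s, w_aa = 1).
# The values of s and the starting frequency are illustrative
# assumptions, not published estimates for lactase persistence.

def next_freq(p, s):
    q = 1.0 - p
    mean_fitness = 1.0 + s * (1.0 - q * q)
    return p * (1.0 + s) / mean_fitness

p, s, generations = 0.05, 0.05, 0
while p < 0.70:                       # run until the allele is widespread
    p = next_freq(p, s)
    generations += 1

print(f"{generations} generations, ~{generations * 25} years at 25 yr/gen")
# Larger s (stronger cultural selection pressure) shortens this sharply.
```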

Relationship between earliest milking cultures and prevalence of lactose tolerance in populations. There is a very close correlation between the 30% of the world's population who are tolerant
to lactose and the earliest human groups who began milking animals. These individuals are represented most among modern-day Mediterranean, East African, and Northern European groups, and emigrants from these groups to other countries. Only about 20% of white Americans in general are lactose intolerant, but among sub-groups the rates are higher: 90-100% among Asian-Americans (as well as Asians worldwide), 75% of African-Americans (most of whom came from West Africa), and 80% of Native Americans. 50% of Hispanics worldwide are lactose intolerant.[122] Now whether it is still completely healthy for the 30% of the world's population who are lactose tolerant to be drinking animals' milk--which is a very recent food in our evolutionary history--I can't say. It may well be that there are other factors, beyond the ability to produce lactase, involved in successfully digesting and making use of milk without health side-effects--I haven't looked into that particular question yet. But for our purposes here, the example does powerfully illustrate that genetic adaptations for digestive changes can take place with much more rapidity than was perhaps previously thought.*

Genetic changes in population groups who crossed the threshold from hunting-gathering to grain-farming earliest. Another interesting example of the spread of genetic adaptations since the
Neolithic has been two specific genes whose prevalence has been found to correlate with the amount of time populations in different geographical regions have been eating the grain-based high-carbohydrate diets common since the transition from hunting and gathering to Neolithic agriculture began 10,000 years ago. (These two genes are the gene for angiotensin-converting enzyme--or ACE--and the one for apolipoprotein B, which, if the proper forms are not present, may increase one's chances of getting cardiovascular disease.) [123]


In the Middle East and Europe, rates of these two genes are highest in populations (such as Greece, Italy, and France) closer to the Middle Eastern "fertile crescent" where agriculture in this part of the globe started, and lowest in areas furthest away, where the migrations of early Neolithic farmers with their grain-based diets took longest to reach (i.e., Northern Ireland, Scotland, Finland, Siberia). Closely correlating with both the occurrence of these genes and the historical rate of grain consumption are corresponding rates of deaths due to coronary heart disease. Those in Mediterranean countries who have been eating high-carbohydrate grain-based diets the longest (for example, since approximately 6,000 B.C. in France and Italy) have the lowest rates of heart disease, while those in areas where dietary changes due to agriculture were last to take hold, such as Finland (perhaps only since 2,000 B.C.), have the highest rates of death due to heart attack. Breast cancer rates in Europe are also higher in countries that have been practicing agriculture for the least amount of time.[124] Whether grain-based diets eaten by people whose ancestors only began eating them recently (and who therefore lack the appropriate genes) are actually causing these health problems (and are not simply correlated by coincidence) is at this point a hypothesis under study. (One study with chickens, however--who in their natural environment eat little grain--has shown much less atherosclerosis on a high-fat, high-protein diet than on a low-fat, high-carbohydrate diet.[125]) But again, and importantly, the key point here is that genetic changes in response to diet can be more rapid than perhaps once thought. The difference in time since the advent of Neolithic agriculture between countries with the highest and lowest incidences of these two genes is something on the order of 3,000-5,000 years,[126] showing again that cultural selection pressures related to diet can force more rapid genetic changes than might occur otherwise.

Recent evolutionary changes in immunoglobulin types, and genetic rates of change overall.
Now we should also look at the other end of the time scale for some perspective. The Cavalli-Sforza population genetics team that has been one of the pioneers in tracking the spread of genes around the world due to migrations and/or interbreeding of populations has also looked into the genes that control immunoglobulin types (an important component of the immune system). Their estimate here is that the current variants of these genes were selected for within the last 50,000-100,000 years, and that this time span would be more representative for most groups of genes. They also feel that in general it is unlikely gene frequencies for most groups of genes would undergo significant changes in time spans of less than about 11,500 years.[127] However, the significant exception they mention--and this relates especially to our discussion here--is where there are cultural pressures for certain behaviors that affect survival rates.[128] And the two examples we cited above: the gene for lactose tolerance (milk-drinking) and those genes associated with high-carbohydrate grain consumption, both involve cultural selection pressures that came with the change from hunting and gathering to Neolithic agriculture. Again, cultural selection pressures for genetic changes operate more rapidly than any other kind. Nobody yet, at least so far as I can tell, really knows whether or not the observed genetic changes relating to the spread of milk-drinking and grain-consumption are enough to confer a reasonable level of adaptation to these foods among populations who have the genetic changes, and the picture seems mixed.*

Rates of gluten intolerance (gluten is a protein in certain grains such as wheat, barley, and oats that
makes dough sticky and conducive to bread-baking) are lower than rates of lactose intolerance, which one would expect given that milk-drinking has been around for less than half the time grain-consumption has. Official estimates of gluten intolerance range from 0.3% to 1% worldwide depending on population group.[129] Some researchers, however, believe that gluten intolerance is but the tip of the iceberg of problems due to grain consumption (or more specifically, wheat). Newer research seems to suggest that anywhere from 5% to as much as 20-30% of the population with certain genetic characteristics (resulting in what is called a "permeable intestine") may absorb incompletely digested peptide fragments from wheat, with adverse effects that could lead to a range of possible diseases.[130]

What do common genetic rates of change suggest about potential adaptation to cooking? We have gone a little far afield here getting some kind of grasp on rates of genetic change, but I think it's been necessary for us to have a good sense of the time ranges involved. So to bring this back around to the question of adaptation to cooking, it should probably be clear by this point that, given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked.* I would include among these some of the vegetable foods--particularly coarser ones like starchy root vegetables such as yams, which have long been thought to have been cooked[131]--and perhaps others, as well as meat, from what we know about the fossil record.

Are cooking's effects black-and-white or an evolutionary cost/benefit tradeoff?


What about the contention by raw-food advocates that cooking foods results in pyrolytic byproducts that are carcinogenic or otherwise toxic to the body, and should be avoided for that reason?
It's true cooking introduces some toxic by-products, but it also neutralizes others.[132] In addition, the number of such toxins created is dwarfed by the large background level of natural toxins (thousands)[133] already present in plant foods from nature to begin with, including some that are similarly carcinogenic in high-enough doses. (Although only a few dozen have been tested so far,[134] half of the naturally occurring substances in plants known as "nature's pesticides" that have been tested have been shown to be carcinogenic in trials with rats and mice.[135]) Nature's pesticides appear to be present in all plants, and though only a few are found in any one plant, 5-10% of a plant's total dry weight is made up of them.[136] [The reason "nature's pesticides" occur throughout the plant kingdom is because plants have had to evolve low-level defense mechanisms against animals to deter overpredation. On one level, plants and animals are in a continual evolutionary "arms race" against each other. Fruiting plants, of course, have also evolved the separate ability to exploit the fact that certain animals are attracted to the fruit by enabling its seeds to be dispersed through the animals' feces.]

We have a liver and kidneys for a reason: there have always been toxins in natural foods that the body has had to deal with, and that is one reason these organs evolved. There are also a number of other, more general defenses the body has against toxins. These types of defenses make evolutionary sense given the wide range of toxic elements in foods the body has had to deal with over the eons. [Perhaps not clear enough in the original version of the interview is the point that a wide range of GENERAL defenses might therefore be reasonably expected to aid in neutralizing or ejecting toxins even of a type the body hadn't necessarily seen before, such as those that might be introduced by cooking practices.] Such mechanisms include the constant shedding of surface-layer cells of the digestive system, many defenses against oxygen free-radical damage, and DNA excision repair, among others.[137]

The belief that a natural diet is, or can be, totally toxin-free is basically an idealistic fantasy--an illusion of black-and-white thinking not supported by real-world investigations. The real question is not whether a diet is completely free of toxins, but whether we are adapted to processing those substances in our foods--in reasonable or customary amounts such as were encountered during evolution--that are not usable by the body. Again, the black-and-white nature of much Hygienic thinking obscures what are questions of degree rather than absolutes.

Cooking may favorably impact digestibility. Also, I know raw-foodists generally don't like to hear this, but there has long been evidence that cooking does in fact make foods of certain types more digestible. For example, trypsin inhibitors (themselves a type of protease inhibitor), which are widely distributed in the plant kingdom, particularly in rich sources of protein, inhibit the ability of digestive enzymes to break down protein. (Probably the best-known plants containing trypsin inhibitors are legumes and grains.) Research has shown the effect of most such protease inhibitors on digestion to be reduced by cooking.[138] And it is this advantage in expanding the range of utilizable foods in an uncertain environment that was the evolutionary advantage that helped bring cooking about and enhanced survival.* I want to make clear that I still believe the largest component of the diet should be raw (at least 50% if not considerably more), but there is provision in the evolutionary picture for reasonable amounts of cooked foods of certain types: at the very least yams, probably some other root vegetables, the legumes, some meat, and so forth. (With meat, the likelihood is that it was eaten raw when freshly killed, but what could not be eaten would likely have been dried or cooked to preserve it for later consumption, rather than wasting it.) Whether or not some foods like these can be eaten raw if one has no choice or is determined enough to do so is not the real question. The question is what was more expedient or practical for survival, and what prevailed over evolutionary time.

Cooking practices of Aborigines in light of survival needs. A brief look at the Australian Aborigines might be illustrative here.* What data is available since the Aborigines were first encountered by Europeans shows that inland Aborigines in the desert areas were subject to severe food shortages and prolonged droughts.[139] This of course made the most efficient use of whatever foods could be foraged paramount. Estimates based on studies of Aborigines in northern Australia are that they processed roughly half of their plant foods, but that no food was processed unnecessarily, any such preparation being done only to make a food edible, more digestible, or more palatable.[140] In general, food was eaten as it was collected, according to its availability during the seasons--except during times of feasts--with wastage being rare, such a pattern being characteristic of feast-and-famine habitats. Some food, however, was processed for storage and later retrieval (usually by drying), including nuts and seeds, though these may also have been ground and baked into cakes instead, before being buried in the ground or stored in dry caches.[141] Fresh foods such as fruits, bulbs, nectar, gums, flowers, etc., were eaten raw when collected. Examples of preparation before consumption include the cooking of starchy tubers or seeds, the grinding and roasting of seeds, and the cooking of meat.[142] That these practices were necessary to expand the food supply, and not merely induced by frivolous cultural practices as raw-foodists often tend to theorize, can be seen in the fact that after colonization by Europeans, Aborigines were not above coming into missions during droughts to get food.[143]

The role of individual experimentation given evolutionary uncertainties about diet


Fly in the ointment: dietary changes since advent of agriculture. But the more interesting and more pressing question, to my mind, is not whether we are adapted to cooking of certain foods, which seems very likely,* but how much we have adapted to the dietary changes since the Neolithic agricultural transition, given the 10,000 years or less it's been underway. At present the answer is unclear, although in general we can probably say there just hasn't been enough time for full adaptation yet--or if there has, only for people descended from certain ancestral groups with the longest involvement with agriculture. My guess (and it is just a guess) would be that we are still mostly adapted to a Paleolithic diet, but that for any particular individual with a given ancestral background, certain Neolithic foods such as grains--or, for even fewer people, perhaps modest amounts of certain cultured milk products such as cheese or yogurt (ones more easily digested than straight milk)--might be not only tolerated, but helpful. Especially where people are avoiding flesh products which is our primary animal food adaptation, these animal by-products may be helpful,* which Stanley Bass's work with mice and his mentor Dr. Gian-Cursio's work with Hygienic patients seem to show, as Dr. Bass has discussed previously here in H&B (in the April and June 1994 issues).


How are we to determine an optimum diet for ourselves, then, given that some genetic changes may be more or less complete or incomplete in different population groups?
I think what all of this points to is the need to be careful in making absolute black-and-white pronouncements about invariant food rules that apply equally to all. It is not as simple as saying that if we aren't sure we are fully adapted to something, we should just eliminate it from the diet to be safe. Adaptation to a food does not necessarily mean mere tolerance for that food; it also means that if we are in fact adapted to it, we would be expected to thrive better with some amount of that food in our diet. Genetic adaptation cuts both ways. This is why I believe it is important for people to experiment individually. Today, because of the Neolithic transition and the rates at which genetic changes are being discovered to take place, it is apparent humanity is a species in evolutionary transition. Due to the unequal flow and dissemination of genes through a population during times like these, it is unlikely we will find [more] uniform adaptation across the population, as we probably would have during earlier times. This means it is going to be more likely right now in this particular historical time period that individuals will be somewhat different in their responses to diet. And as we saw above (with the two genes ACE and apolipoprotein-B), these genetic differences may even confound attempts to replicate epidemiological dietary studies from one population to another unless these factors are taken into account.*

Conflicting data from various modern lines of evidence means people must experiment and decide for themselves. So while it is important to look for convergences among different lines of
evidence (evolutionary studies, biochemical nutritional studies, epidemiological studies and clinical trials, comparative anatomy from primate studies, and so forth), it is well to consider how often the epidemiological studies, perhaps even some of the biochemical studies, reverse themselves or come back with conflicting data. It usually takes many years--even decades--for their import to become clear based on the lengthy scientific process of peer review and replication of experiments for confirmation or refutation.

Openness means challenging any rigid assumptions we may have through experimentation.
So my advice is: don't be afraid to experiment. Unless you have specific allergies or strong food intolerances, the body has been made flexible enough by evolution to handle short-term variations in diet from whatever an optimal diet might be. If you start within the general parameters we've outlined here and allow yourself to experiment, you have a much better chance of finding the particular balance among these factors that will work best for you. If you already have something that works well for you, that's great. If, however, you are looking for improvements, then given the uncertainties we've talked about above, it's important to look at any rigid assumptions you may have about the "ideal" diet, and be willing to challenge them through experimentation. In the long run, you stand only to benefit by doing so.

Conflicts between paleo/anthropological vs. biochemical/epidemiological evidence


Despite the evolutionary picture you've presented here, there are still objections that people have about meat from a biochemical or epidemiological standpoint. What about T. Colin Campbell's China Study, for example?
Good point. Campbell's famous study, to my mind, brings up one of the most unremarked-upon recent conflicts in epidemiological data. In his lecture at the 1991 ANHS annual conference, reported on in the national ANHS publication Health Science, Campbell claimed that the China Study data pointed not just to high fat intake, but to the protein in animal food, as increasing cholesterol levels. (High cholesterol levels in the blood are now widely thought to be the biggest single factor responsible for increased rates of atherosclerosis--clogged blood vessels--and coronary heart disease.) According to him, the lower the level of animal protein in the diet (not just the lower the level of fat), the lower the cholesterol level in the blood. He believes that animal food is itself the biggest culprit, above and beyond fat levels in food.[144]

Campbell's conclusions about cholesterol and animal protein are contradicted by evidence from studies of modern hunter-gatherers. Yet as rigorous as the study is proclaimed to be, I have to tell you that Campbell's claim that animal protein by itself is the biggest culprit in raising blood cholesterol is contradicted by studies of modern-day hunter-gatherers who eat considerable amounts of wild game yet have very low cholesterol levels comparable to those in the China Study. One review of different tribes studied showed low cholesterol levels for the Hadza at 110 mg/dl (eating 20% animal food), San Bushmen at 120 (20-37% animal), Aborigines at 139 (10-75% animal), and Pygmies at 106--considerably lower than the now-recommended safe level of below 150.[145] Clearly there are unaccounted-for factors at work here yet to be studied sufficiently.

Large and significant differences between domesticated meat vs. wild game. One of them might be the difference in fat composition between domesticated meat and wild game: on average, five times as much fat in the former as in the latter. On top of that, the proportion of saturated fat in domesticated meat compared to wild game is also five times higher.[146] Another difference between these two meat sources is that significant amounts of EPA (an omega-3 fatty acid thought to perhaps help prevent atherosclerosis) are found in wild game (approx. 4% of total fat), while domestic beef, for example, contains almost none.[147] This is important because the higher levels of EPA and other omega-3 fatty acids in wild game help promote a low overall dietary ratio of omega-6 to omega-3 fatty acids for hunter-gatherers--ranging from 1:1 to 4:1--compared to the high 11:1 ratio observed in Western nations. Since omega-6 fatty acids may have a cancer-promoting effect, some investigators are recommending lower ratios of omega-6 to omega-3 in the diet, which would, coincidentally, be much closer to the evolutionary norm.[148] Differences like these may go some way toward explaining the similar blood cholesterol levels and low rates of disease in both the rural Chinese eating a very-low-fat, low-animal-protein diet and hunter-gatherers eating a low-fat, high-animal-protein diet. Rural Chinese eat a diet of only 15% fat and 10% protein, with the result that saturated fats contribute a low 4% of total calories. On the other hand, those hunter-gatherer groups approximating the Paleolithic norm eat diets containing 20-25% fat and 30% protein, yet the contribution of saturated fat to total caloric intake is nevertheless a similarly low 6% of total calories.[149]
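To make the arithmetic behind these percentages concrete, here is a minimal Python sketch. The saturated-fat fractions used are illustrative assumptions back-figured from the numbers quoted above, not values taken from the cited studies:

    def saturated_pct_of_calories(fat_pct_calories, saturated_fraction_of_fat):
        """Percent of total calories supplied by saturated fat, given the
        percent of calories from all fat and the fraction of that fat
        which is saturated."""
        return fat_pct_calories * saturated_fraction_of_fat

    # Rural Chinese: ~15% of calories from fat. If roughly 27% of that fat
    # is saturated (an assumed figure), saturated fat supplies ~4% of calories.
    print(saturated_pct_of_calories(15, 0.27))    # ~4.0

    # Hunter-gatherers near the Paleolithic norm: 20-25% of calories from fat,
    # but from much leaner, less-saturated wild game; with a similar assumed
    # saturated fraction, the result is a comparably low ~6% of calories.
    print(saturated_pct_of_calories(22.5, 0.27))  # ~6.1

The point of the exercise is that two diets with quite different total fat and protein levels can still converge on similarly low saturated-fat intakes once the composition of the fat is taken into account.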

What about the contention that high-protein diets promote calcium loss in bone and therefore contribute to osteoporosis?
The picture here is complex and modern studies have been contradictory. In experimental settings, purified, isolated protein extracts do significantly increase calcium excretion, but the effect of increased protein in natural foods such as meat is smaller or nonexistent.[150] Studies of Eskimos have shown high rates of osteoporosis eating an almost all-meat diet[151] (less than 10% plant intake[152]) but theirs is a recent historical aberration not typical of the evolutionary Paleolithic diet thought to have averaged 65% plant foods and 35% flesh.* Analyses of numerous skeletons from our Paleolithic ancestors have shown development of high peak bone mass and low rates of bone loss in elderly specimens compared to their Neolithic agricultural successors, whose rates of bone loss increased considerably even though they ate much lower-protein diets.[153] Why? Nobody knows for sure, though it is thought that the levels of phosphorus in meat reduce excretion of calcium; moreover, people in Paleolithic times also ate large amounts of fruits and vegetables[154] with an extremely high calcium intake (perhaps 1,800 mg/day compared to an average of 500-800 for Americans today[155]) and led extremely rigorous physical lives, all of which would have encouraged increased bone mass.[156]

Caveats with respect to using modern hunter-gatherers as dietary models


Okay, let's move on to the hunter-gatherers you mentioned earlier. I've heard that while some tribes may have low rates of chronic degenerative disease, others don't, and may also suffer higher rates of infection than we do in the West.
This is true. Not all "hunter-gatherer" tribes of modern times eat diets in line with Paleolithic norms. Aspects of their diets and/or lifestyles can be harmful just as modern-day industrial diets can be. When using these people as comparative models, it's important to remember they are not carbon copies of Paleolithic-era hunter-gatherers.[157] They can be suggestive (they are the best living examples we have), but they are a mixed bag as "models" for behavior, and it is up to us to keep our thinking caps on. We've already mentioned the Eskimos above as less-than-exemplary models. Another example is the Masai tribe of Africa, who are really more pastoralists (animal herders) than hunter-gatherers. They have low cholesterol levels ranging from 115 to 145,[158] yet autopsies have shown considerable atherosclerosis.[159] Why? Maybe because they deviate from the Paleolithic norm of 20-25% fat intake due to their pastoralist lifestyle, eating a 73% fat diet that includes large amounts of milk from animals in addition to meat and blood.*[160] Our bodies do have certain limits.

But after accounting for tribes like these, why do we see higher rates of mortality from infectious disease among other hunter-gatherers who are eating a better diet and show little incidence of degenerative disease?
There are two major reasons I know of. First, most modern-day tribes have been pushed onto marginal habitats by encroaching civilization.[161] This means they may at times experience nutritional stress resulting from seasonal fluctuations in the food supply (like the Aborigines noted above), during which relatively large amounts of weight are lost while they remain active. The study of "paleopathology" (the study of illnesses in past populations from signs left in the fossil record) shows that similar nutritional stress was not unknown among some hunter-gatherers of the past either, and at times was great enough to have stunted their growth, resulting in the "growth arrest lines" that can be seen in human bone under conditions of nutritional deprivation. Such nutritional stress is most likely for hunter-gatherers in environments where either the number of food sources is low (exposing them to the risk of undependable supply), or where food is abundant only seasonally.[162]

Fasting vs. extended nutritional stress/deprivation. Going without food--or fasting under conditions of total rest, as hygienists do as a regenerative/recuperative measure--is one thing, but nutritional stress or deprivation while under continued physical stress is unhealthy and leaves one more susceptible to pathologies, including infection.[163] The second potential cause of higher rates of infection is less artificially controlled sanitary conditions (one of the areas where modern civilization is conducive rather than destructive to health), due to hunter-gatherers having less control over their environment than modern civilizations do. Creatures in the wild are in frequent contact with feces and other breeding grounds for microorganisms such as rotting fruit and/or carcasses, to which they are exposed by skin breaks and injuries, and so forth.[164]


Animals in the wild on natural diets are not disease-free. Contrary to popular Hygienic myth, animals in the wild eating natural diets in a natural environment are not disease-free, and large infectious viral and bacterial plagues among wild animal populations, past and present, are known to have occurred. (To cite one example, rinderpest plagues in the African Serengeti occurred in the 1890s and again around 1930, 1960, and 1982 among buffalo, kudu, eland, and wildebeest.[165]) It becomes obvious when you look into studies of wild animals that a natural diet combined with living in natural conditions is no guarantee of freedom from disease and/or infection. Chimpanzees, our closest living animal relatives, for instance, can and do suffer bouts in the wild of a spectrum of ailments very similar to those observed in human beings, including pneumonia and other respiratory infections (which occur more often during the cold and rainy season), polio, abscesses, rashes, parasites, diarrhea, even hemorrhoids on occasion.[166] Signs of infectious disease in the fossil record have been detected in remains as far back as the dinosaur age, as have signs of immune system mechanisms to combat them.[167]

Uninformed naturalism and unrealistic expectations in diet


Pure "naturalism" often overlooks the large positive impact of modern environmental health advantages. One of the conclusions to be drawn from this is that artificial modern conditions are
not all bad where health is concerned. Such conditions as "sanitation" due to hygienic measures, shelter and protection from harsh climatic extremes and physical trauma, professional emergency care after potentially disabling or life-threatening accidents, elimination of the stresses of nomadism, plus protection from seasonal nutritional deprivation due to the modern food system that Westerners like ourselves enjoy today all play larger roles in health and longevity than we realize.[168]

Unrealistic perfectionism leads to heaping inhumanity and guilt on ourselves. Also, I would hope that the chimp examples above might persuade hygienists not to feel so guilty, or inevitably blame themselves, when they occasionally fall prey to acute illness. We read of examples in the Natural Hygiene M2M which sometimes seem to elicit an almost palpable sense of relief among others when the conspiracy of silence is broken and they find they aren't the only ones. I think we should resist the tendency to always assume we flubbed the dietary details. In my opinion it is a mistake to believe that enervation must always be seen simply as the instigator of "toxemia," which is then held to be the incipient cause of any illness. It seems to me you can easily have "enervation" (lowered energy and resistance) without toxemia, and that this in and of itself can be quite enough to upset the body's normal homeostasis ("health") and bring on illness. (Indeed, I have personally become ill once or twice during the rebuilding period after lengthy fasts when overworked--a situation in which it would be difficult to blame toxemia as the cause.) The examples of modern-day hunter-gatherers as well as those of chimps should show us that you can eat a healthy natural diet and still suffer from health problems, including infectious disease, due to excessive stresses--what we would call "enervation" in Natural Hygiene.

Health improvements after becoming ex-vegetarian


Ward, we still have some space here to wrap up Part 2. Given the research you've done, how has it changed your own diet and health lifestyle? What are you doing these days, and why?
I would say my diet right now* [late 1996] is somewhere in the neighborhood of about 85% plant and 15% animal, and overall about 60% raw and 40% cooked by volume. A breakdown from a different angle would be that by volume it is, very roughly, about 1/4 fruit, 1/4 starches (grains/potatoes, etc.), 1/4 veggies, and the remaining quarter divided between nuts/seeds and animal products, with more of the latter than the former. Of the animal foods, I would say at least half is flesh (mostly fish, but with occasional fowl or relatively lean red meat thrown in, eaten about 3-5 meals per week), the rest composed of varying amounts of eggs, goat cheese, and yogurt. Although I have to admit I am unsure about the inclusion of dairy products on an evolutionary basis given their late introduction in our history, nevertheless, I do find that the more heavily I am exercising, the more I find myself tending to eat them. To play it safe, what dairy I do eat is low- or no-lactose cultured forms like goat cheese and yogurt.* Where the grains are concerned, so far I do not experience the kind of sustained energy I like to have for distance running without them, even though I am running less mileage than I used to (20 miles/week now as opposed to 35-40 a few years ago). The other starches such as potatoes, squash, etc., alone just don't seem to provide the energy punch I need. Again, however, I try to be judicious by eating non-gluten-containing grains such as millet, quinoa, or rice, or else use sprouted forms of grains, or breads made from them, that eliminate the gluten otherwise present in wheat, barley, oats, and so forth.*

In general, while I do take the evolutionary picture heavily into account, I also believe it is important to listen to our own bodies and experiment, given the uncertainties that remain. Also, I have to say that I find exercise, rest, and stress management as important as diet in staying energetic, healthy, and avoiding acute episodes of ill-health. Frankly, my experience is that once you reach a certain reasonable level of health improvement based on your dietary disciplines, and things start to level out--but maybe you still aren't where you want to be--most further gains are going to come from paying attention to these other factors, especially today when so many of us are overworked, over-busy, and stressed-out. I think too many people focus too exclusively on diet and then wonder why they aren't getting any further improvements. Diet only gets you so far.

I usually sleep about 8-10 hours a night, and I very much enjoy vigorous exercise, which I find is necessary to help control my blood-sugar levels, which are still a weak spot for me. The optimum amount is important, though. A few years ago I was running every day, totaling 35-40 miles/week and concentrating on hard training for age-group competition, and I was more prone to respiratory problems like colds, etc. (not an infrequent complaint of runners). In the last couple of years, I've cut back to every-other-day running totaling roughly 20 miles per week. I still exercise fairly hard, but a bit less intensely than before, I give myself a day of rest in between, and the frequency of colds and so forth is now much lower.

I am sure people will be curious here, Ward: What were some of the improvements you noticed after adding flesh foods to your diet?
Well, although I expected it might take several months to really notice much of anything, one of the first things was that within about 2 to 3 weeks I noticed better recovery after exercise--as a distance runner I was able to run my hard workouts more frequently, with fewer rest days or easy workouts in between. I also began sleeping better fairly early on, was not hungry all the time anymore, and maintained weight more easily on lesser volumes of food. Over time, my stools became a bit more well-formed, my sex drive increased somewhat (which usually accompanies better energy levels for me), my nervous system was more stable and not so prone to hyperreactive panic-attack-like instability as before, and in general I found I didn't feel so puny or wilt under stress so easily as before. Unexpectedly, I also began to notice that my moods had improved and I was more "buoyant." Individually, none of these changes was dramatic, but as a cumulative whole they have made the difference for me. Most of these changes had leveled off after about 4-6 months, I would say.


Something else I ought to mention here, too, was the effect of this dietary change on a visual disturbance I had been having for some years prior to the time I embarked on a disciplined Hygienic program, and which continued unchanged during the two or three years I was on the traditional vegetarian diet of either all-raw or 80% raw/20% cooked. During that time I had been having regular episodes of "spots" in my visual field every week or so, where "snow" (like on a t.v. set) would gradually build up to the point it would almost completely obscure my vision in one eye or the other for a period of about 5 minutes, then gradually fade away after another 5 minutes. As soon as I began including flesh in my diet several times per week, these started decreasing in frequency and over the 3 years since have almost completely disappeared.

What problems are you still working on?


I still have an ongoing tussle with sugar-sensitivity due to the huge amounts of soft drinks I used to consume, and have to eat fruits conservatively. I also notice that I still do not hold up under stress and the occasional long hours of work as well as I think I ought to, even though it's better than before. Reducing stress and trying not to do so much in today's world is an area I really pay attention to and try to stay on top of. My own personal experience has been that no matter what kind of variation of the Hygienic or evolutionary "Paleolithic" diet I have tried so far, excessive prolonged stress of one sort or another (whether physical or mental) is a more powerful factor than lapses in diet in bringing on symptoms of illness, assuming of course that one is consistently eating well most of the time. And Chet, this brings up something I also want to emphasize: Just as you've freely mentioned about yourself here in H&B on numerous occasions, I'm not perfect in my food or health habits, and I don't intend to set myself up as some sort of example for anyone. Like anyone else, I'm a fallible human being. I still have the occasional extra-cheese pizza or frozen yogurt or creamy lasagna or whatever as treats, for example. Not that I consider having those on occasion to be huge sins or anything. I stick to my intended diet most of the time, but I don't beat myself up for the indiscretions. I would hope other people don't beat themselves up for it either, and that we can all be more forgiving of each other than that.

Uncertainties about earliest use of fire for cooking

*** "Given the evidence available at this time, most of it would probably indicate that 125,000 years ago is the earliest reasonable estimate for widespread control." *** "...given the time span involved (likely 125,000 years since fire and cooking became widespread), the chances are very high that we are in fact adapted to the cooking of whatever foods were consistently cooked." *** "...it is this advantage in expanding the range of utilizable foods in an uncertain environment that was the evolutionary advantage that helped bring cooking about and enhanced survival."
In hindsight here, it is worth pointing out that the above estimate of 125,000 years ago for widespread cooking is not based on very much evidence--nor, for that matter, is anything about the earliest origins of fire and cooking based on very much evidence. Much of what is inferred about the earliest occurrences of fire use is based on considerable deduction applied to very limited evidence (unlike the considerably more well-established, consistent, and converging lines of evidence for early meat consumption). There are some who believe a more reasonable estimate might be more like 40,000-60,000 years ago for widespread use of fire, based on the wider distribution of hearths by then, but in my conversations with those perhaps more familiar with the paleontological evidence for cooking and fire than I am, it still seems apparent that probably nobody knows for sure given the current state of the paleontological findings. While some believe the rarity of evidence such as hearths early on after control of fire points toward the early use of fire mostly for heating and protection from predators rather than cooking, discussions on the internet's PALEODIET listgroup have brought up the point that some modern hunter-gatherers have in fact been known on occasion to use cooking techniques that require no hearths or cooking vessels that would leave behind fossil evidence. And though modern hunter-gatherers have evolved behaviorally since ancient ones, this suggests it is not out of the realm of possibility that methods other than hearths could have been used to cook with fire.

*** "...the most interesting of these [stages of sequence for control] is that fire for cooking would almost inevitably have been one of the first uses it was put to by humans, rather than some later-stage use."
Again, I would now be more cautious in inferring this (my earlier inference about this was based on Johan Goudsblom's 1992 book, Fire and Civilization), after having been privy to further discussions with Paleodiet researchers. As the preceding paragraph mentions, it is possible the stage of use for warmth and protection from predators may not have so quickly progressed to the stage of using fire for cooking. But no one really seems to know one way or the other for sure from what I can tell.

*** "A brief look at the Australian Aborigines [who utilize cooking for survival purposes] might be illustrative here."
Elsewhere I believe studies of other hunter-gatherer tribes show that many of them cook about half their food. Again, whether this particular behavior of modern hunter-gatherers is representative of behavior in the more-distant evolutionary past should perhaps be regarded with caution.

Incompatibilities between dairy consumption and human physiology

*** "...for our purposes here, the example [of lactose tolerance having developed within 1,150 years in some segments of the population] does powerfully illustrate that genetic adaptations for digestive changes can take place with much more rapidity than was perhaps previously thought."
The estimate of 1,150 years is from the Cavalli-Sforza data. A somewhat more conservative estimate based on the prevalence of lactose tolerance in those of Northern European extraction is that the gene for adult lactose tolerance would have increased from 5% to 70% prevalence within about 5,000 years (approx. 250 generations). [Aoki 1991]
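As a rough illustration of what a frequency change like that implies about selection strength, here is a minimal Python sketch using a simple one-locus haploid selection model. This is a simplifying assumption (real lactase-persistence dynamics are diploid and dominance-dependent), and the selection coefficients tried are illustrative, not estimates from the literature:

    def generations_to_reach(p0, p_target, s):
        """Generations for an allele at frequency p0 to reach p_target under
        simple haploid selection with relative fitness advantage s per generation."""
        p, gens = p0, 0
        while p < p_target:
            p = p * (1 + s) / (p * (1 + s) + (1 - p))  # standard haploid update
            gens += 1
        return gens

    # With roughly a 2% per-generation advantage, going from 5% to 70%
    # prevalence takes about 190 generations--comfortably within the
    # ~250 generations (approx. 5,000 years) cited above.
    for s in (0.01, 0.02, 0.03):
        print(s, generations_to_reach(0.05, 0.70, s))

The exercise shows that even a quite modest per-generation survival advantage is enough to move an allele from rarity to majority prevalence on a Neolithic timescale.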

Genetic changes due to "neoteny" (such as adult lactose tolerance) not indicative of overall rates of adaptation. Even while these data for relatively quick evolutionary changes resulting in adult lactase production remain essentially true, however, an important point that should be clarified is that the gene for lactase production is already present and expressed in all humans (for breastfeeding) up through the time of weaning. Therefore, for lactase production to continue into adulthood would require only relatively small changes in the gene, e.g., via the process known as "neotenization" (the retention of juvenile traits into adulthood). Thus "brand-new" traits, so to speak--unlike polymorphisms such as the gene for lactase production, which already exist (even if not in a form previously expressed in adults)--would take much longer to evolve.

Additional indications of incongruence between dairy and human physiology. Further, I have since learned that many additional genetic changes, beyond that for lactose tolerance, would be required for more complete adaptation to milk consumption. A number of recent studies demonstrate problems of milk consumption that go considerably beyond whether or not a person is capable of handling lactose:

Lactose and heart disease. One is that lactose itself is a risk factor for heart disease, since in appreciable quantities it induces copper deficiency which, in turn, can lead through additional mechanisms to heart pathologies and mortality, as observed in lab animals.

Poor Ca:Mg ratio which can skew overall dietary ratio. Another problem is the calcium-to-magnesium ratio of dairy products of approximately 12:1, which is directly at odds with the ratio of 1:1 from a Paleolithic diet composed of meats, fruits, and vegetables. Depending on the amount of milk in the diet, the resulting overall dietary ratio can go as high as 4 or 5:1. This high ratio leads to reduced magnesium stores, which have the additional ramification of increasing the risk of coronary heart disease, since magnesium helps to lower levels of blood lipids (cholesterol), lower the potential for cardiac arrhythmias, lower the oxidation of LDL and VLDL cholesterol (oxidation of cholesterol has been linked to atherosclerosis), and prevent hyperinsulinism. (More about hyperinsulinism shortly below; see also the sketch of the ratio arithmetic following this list.)

Saturated fat. Milk has also been linked to coronary heart disease because of its very high saturated fat content.

Molecular mimicry/autoimmune response issues. Additionally, autoimmune responses are being increasingly recognized as a factor in the development of atherosclerosis. In relation to this, research has shown milk to cause exceptionally high production of certain antibodies which cross-react with some of the body's own tissues (an autoimmune response), specifically an immune response directed against the lining of the blood vessels. This process is thought to lead to atherosclerotic lesions, the first step that paves the way for consequent buildup of plaque.

[See Part 2 of Loren Cordain, Ph.D.'s posting of 10/9/97 to the PALEODIET list (relayed to the list and posted by Dean Esmay) for details and references relating to the above points about dairy consumption.]
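To make the ratio-skewing arithmetic concrete, here is a minimal Python sketch. The absolute intake figures are illustrative assumptions chosen only to reproduce ratios of roughly 12:1 for dairy and 1:1 for the remainder of the diet, not measured values:

    def overall_ca_mg_ratio(ca_dairy, mg_dairy, ca_other, mg_other):
        """Overall dietary Ca:Mg ratio from dairy plus all other foods (mg/day)."""
        return (ca_dairy + ca_other) / (mg_dairy + mg_other)

    # Assumed figures: dairy contributing 600 mg Ca and 50 mg Mg (~12:1),
    # on top of a Paleolithic-style base of 400 mg Ca and 400 mg Mg (~1:1).
    print(overall_ca_mg_ratio(600, 50, 400, 400))    # ~2.2:1
    # Heavier milk consumption (1,800 mg Ca, 150 mg Mg) pushes it toward ~4:1.
    print(overall_ca_mg_ratio(1800, 150, 400, 400))  # ~4.0:1

Note that ratios cannot simply be averaged; the overall ratio has to be computed from the absolute calcium and magnesium amounts each food group contributes.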

Signs of evolutionary mismatch between grains and human physiology

*** "Nobody yet, at least so far as I can tell, really knows whether or not the observed genetic changes relating to the spread of milk-drinking and grain-consumption are enough to confer a reasonable level of adaptation to these foods among populations who have the genetic changes, and the picture seems mixed."
The most succinct addendum to this assessment is: not any more, as we have seen above with milk. Where grains are concerned, several similar problems have since been uncovered, as I have learned. To list a few:

Certain wheat peptides appear to significantly increase the risk of diabetes through molecular mimicry of the body's own tissues, leading to autoimmune responses destructive of
cells that produce insulin. [See Loren Cordain, Ph.D.'s post of 6/23/97 on the PALEODIET listgroup for reference citations, and also his article on this site about the evolutionary discordance of grains and legumes in the human diet, for details.]


Increasing amounts of research suggest that celiac disease is probably also caused by autoimmune responses generated through molecular mimicry by certain peptides in wheat and other grains (known collectively as "glutens"). Additional studies on autoimmune problems have led some researchers to believe that numerous further chronic conditions are also traceable to autoimmune responses generated by the glutens in grains. [See Ron Hoggan's various postings in the archives of the PALEODIET listgroup for information and reference citations about this.]

It is well documented that the phytates in grains bind the minerals iron, zinc, magnesium, and calcium, which can impair bone growth and metabolism, among other problems. Antinutrients in grains also negatively affect vitamin D absorption, which at sufficient levels of intake can lead to rickets. Grain consumption also generates biotin deficiencies in experimental animal models--a deficiency which impairs fatty acid synthesis. [See Loren Cordain's post of 10/1/97 on PALEODIET for a brief summary and references pertaining to the preceding points, as well as the article on the evolutionary discordance of grains and legumes mentioned in the first bullet point just above.]

Hyperinsulinism and excess carbohydrate consumption. Lastly, and most importantly, significant amounts of grain consumption considerably increase the carbohydrate intake of the diet, and excessive carbohydrate consumption is the primary factor driving what has come to be known as the hyperinsulinism syndrome. Hyperinsulinism and its associated constellation of resulting symptoms, collectively known as "Syndrome X," is not yet well-accepted in the medical and mainstream nutritional communities, but recent research has been increasingly pointing in its direction as a potential underlying factor in the development of many "diseases of civilization," which may be linked together via hyperinsulinism as a common cause.

The etiology of hyperinsulinism leading to the symptomatology of Syndrome X is as follows:


All carbohydrates, from whatever source (natural or artificial), whether simple or complex, are ultimately broken down into glucose, or blood sugar. (And remember here that grains contribute the largest percentage of carbohydrates in most modern diets.) For glucose to be taken up by the cells of the body and used as fuel, the hormone insulin must be secreted by the pancreas. When chronically excessive levels of carbohydrates are eaten, insulin is overproduced, and the body eventually becomes dulled--more unresponsive--to insulin. This can become a vicious circle: Since the body is dulled to insulin, more has to be produced, which causes further dulling of sensitivity, leading the body to produce even more.
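As a purely illustrative toy model of this vicious circle--not a physiological simulation, and with arbitrary assumed constants--consider the following Python sketch, in which insulin output must rise as sensitivity falls, and chronically elevated insulin erodes sensitivity a little further each period:

    sensitivity = 1.0       # hypothetical tissue responsiveness to insulin
    baseline_insulin = 1.0  # relative insulin output needed at full sensitivity

    for period in range(1, 6):
        insulin = baseline_insulin / sensitivity  # more insulin needed as sensitivity falls
        sensitivity *= 1 - 0.05 * insulin         # chronic excess insulin dulls response further
        print(f"period {period}: insulin x{insulin:.2f}, sensitivity {sensitivity:.2f}")

Each pass through the loop requires more insulin than the last, which in turn degrades sensitivity further--the runaway feedback the paragraph above describes.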

Biomarkers indicating hyperinsulinism. High levels of insulin have been correlated with high blood pressure, high cholesterol, and high triglycerides. If the relationship is causative, then by extension hyperinsulinism would also presumably be causative of the health problems that these symptoms themselves lead to (i.e., heart disease, etc.). High levels of insulin also lead to obesity, since in addition to enabling glucose to be used as fuel, insulin promotes fat storage while inhibiting the burning of fat.

Hyperinsulinism and diabetes. The foregoing constellation of symptoms constitutes what has come to
be called "Syndrome X." The extreme end of Syndrome X is probably Type II diabetes, in which the body has become so insulin-resistant it can no longer control its blood sugar levels, or even Type I diabetes, in which the pancreas overloads and is no longer able to produce insulin at all.

Recent studies indicate diets higher in protein reduce symptoms of Syndrome X. Dovetailing with this research on Syndrome X are findings that diets higher in protein--or diets which lower carbohydrate levels by substituting animal protein--improve blood lipid (cholesterol) profiles by lowering LDL and VLDL (the "bad cholesterols") and total cholesterol, while increasing HDL ("good cholesterol") and improving other symptoms of Syndrome X. Conversely, repeated studies are showing that the low-fat, high-carbohydrate diets popular today do the opposite. [See Loren Cordain, Ph.D.'s posting of 3/26/97 to the PALEODIET list, relayed to the list by Dean Esmay, for reference citations on this.] Note particularly that these dietary changes which improve hyperinsulinism parallel the macronutrient composition that would have prevailed in the evolutionary diet of humans during Paleolithic times (i.e., fairly low carbohydrate intake combined with relatively higher levels of fat and protein, due to the prevalence of animal flesh in the diet and the more limited availability of fruits with high sugar content).

Why do cooked or denser foods often improve raw/vegan health?


*** "If it is true that most people do better with the inclusion of some of these cooked items in their diet that we've mentioned--and I believe that it is, based on everything I have seen and heard..."
In retrospect, I want to make clear here that this was true of the vegetarians I was in contact with during the time I ran the Natural Hygiene M2M (and others since). My opinion at this point is that the improved health observed in most otherwise raw-foodist vegetarians who later include some amount of cooked food in their diet occurs because cooking simply allows additional foods to be eaten that broaden the diet. Just as importantly, it also increases the nutrient density of the diet by making available more concentrated nutrition in an otherwise restricted spectrum of nutrient intake brought about by eating only the bulkier, less-dense foods that compose most vegetarian fare that is edible raw.

For those who do not thrive on raw vegan diets, do the benefits experienced from grains/dairy outweigh any downsides? With the previously discussed evidence that grain and milk consumption can carry certain health ramifications, an interesting question presents itself. Considering our earlier observation that vegetarians on raw-food diets often experience improvement when adding dairy products, or cooked items that allow more concentrated and nutrient-dense foods such as grains or legumes, one now has to ask: For vegetarians who otherwise refuse animal products such as meat--which is the primary Paleolithic adaptation to foods of animal origin--and eat a diet whose macronutrient content (higher carbohydrate and lower protein) is out of line with the evolutionary past, how much do the possible benefits of a wider spectrum of more concentrated nutritional intake from grains, legumes, and/or milk or milk by-products outweigh their disadvantages? Certainly in the short run it seems to help many vegetarians, based on anecdotal observations, especially those we observed in the Natural Hygiene M2M. (There will be more about this in Part 3 of the interview.)

Long-term concerns. In the long term, however, it appears from the recent evidence coming in, especially regarding hyperinsulinism, that there is a price to pay. While Americans have only been experimenting with vegetarianism in larger numbers since the 1960s and 1970s, epidemiological studies of populations in Southeast Asia, where cultures have lived on grain- and legume-based diets for centuries (often supplemented by dairy)--diets not so different in character from many vegetarian diets--show high rates of heart disease and widespread nutritional deficiencies. That we saw numerous health problems in the Natural Hygiene M2M that were often called "detox," but that would appear to clinicians more like deficiencies, adds at least some anecdotal support that similar results might prevail among vegetarians. (There is, unfortunately, a lack of any controlled studies of predominantly raw-foodists who also include some dairy and/or grain/legume products in their diets.)

Mitigating circumstances. Two observations worth noting that cloud the picture a bit, however, are the
following:

Noteworthy difference between lacto-vegetarian subpopulations and raw-foodists adding grains/dairy. There is a primary difference between present-day raw-foodists (or predominantly raw-foodists) who supplement the diet with dairy and grains vs. those traditional societies eating diets similar to (though not strictly the same as) lacto-vegetarianism. And that is the huge focus that raw eaters put on large amounts of fresh fruits and vegetables. Given the protective effects against disease that research has been showing consumption of fresh fruits/veggies confers, this point of distinction between the two groups would lead one to expect there may be interesting differences in results worth further consideration or study.

For previous raw-foodists, the supplemental amounts are usually relatively modest.
Secondly, those who may otherwise be predominantly raw eaters supplementing their diet with grains and dairy usually do so in modest amounts. How much their restricted intake of these problematic foods might mitigate what detrimental fallout there is from eating them would also be an interesting question to pursue.

*** "...the more interesting and more pressing question, to my mind, is not whether we are adapted to cooking of certain foods, which seems very likely, but how much we have adapted to the dietary changes since the Neolithic agricultural transition, given the 10,000 years or less it's been underway." Information about cooking's ultimate impact on health at the biochemical level of detail is still inconclusive. While as we've noted above, the detrimental repercussions of grains on health have
now been implicated in a number of ways, the picture about cooking is more unclear than I had thought at the time of the interview. And this is not only due to the uncertainties over when cooking began. As I have since found, there have been no "review" papers summing up the research on cooking in modern food science studies that have taken a unified look at the subject, and what other studies there are, are fragmented and scattered widely. (But see Looking at the Science on Raw vs. Cooked Foods on this site, as a first attempt to remedy this lack.) This lack of unified study on the subject shows, in one way, just how new the evolutionary perspectives embodied in Paleodiet research are to the field of mainstream nutrition. (Though of course, the question of raw vs. cooked foods has been around in modern times in the alternative health movement, particularly within Natural Hygiene, since probably at least the mid-1800s.)

Big picture is more clear: Impact of cooking is likely to be much less important than other overarching considerations. One interesting observation here, however, which may serve to put the cooking question in more realistic perspective, is that the studies that have been performed on modern hunter-gatherers (many of whom seem to cook about half their food) show them to be probably among the peoples most free of chronic degenerative disease on the planet. This suggests that whatever negative (or positive) effects cooking may or may not have, it probably does not play a very large role in adding to, or mitigating, the overall health factors that determine freedom from the degenerative diseases of civilization.

Magnitude of effect from macronutrient ratios likely plays the most influential role. Indeed,
as we outlined above, more and more Paleodiet-relevant research seems to be showing that it is the overall macronutrient profile of a diet, in terms of:

Types of fats, and their ratios and sources (considerations that seem to be more important than former analyses emphasizing simply the total amount of fat), and

Factors that may precipitate Syndrome X. Just as crucially, the balance between proteins and carbohydrates in avoiding hyperinsulinism. (As we have seen, substituting more protein for carbohydrate increases the "good" HDL cholesterol while lowering the "bad" LDL cholesterol, and significantly reduces risk for other symptoms of Syndrome X.)

It seems likely that these factors, plus avoiding known non-evolutionary foods to the degree possible, play the largest role in health, overshadowing what effect cooking may have. (Within reason, of course: There is evidence that overcooking to the point of charring should definitely be avoided, since taking it that far produces carcinogenic by-products in fats/proteins such as meat. At the same time, however, it should be remembered that even uncooked plant foods contain a certain natural level of mutagens and carcinogens. Again, these two points about cooking, taken together with the fact that cooking can also eliminate toxic antinutrients, suggest that, as was pointed out in Part 2 of the interview itself, the issue of potential toxic by-products from cooking is far from a clear or cut-and-dried one.)

Eating all natural foods or all-raw by itself does not automatically result in a prudent diet.
The upshot is that the obsession with a total raw-food diet in some sectors of the vegetarian community--while certainly not bad in the strictly technical sense--is considerably overhyped, assuming, that is, that, say, half the diet is raw to begin with (particularly getting sufficient amounts of fresh vegetables and/or fruits). It is becoming more clear that the idea among vegetarians that if a person just eats all-natural or all-raw foods they'll be okay ignores important factors, such as:

Which foods are, in fact, the most natural for humans (certainly not just vegetarian
ones--evolution shows meat is important as well);

Which natural foods can be important to minimize or avoid (i.e., legumes, grains, dairy--if they turn out to be problematic--for those who otherwise consider them natural); and

What balance of macronutrients, whether raw or cooked, best approximates the ancestral human diet and/or results in the best mix of nutrients to support health over the long term. (Using this standard for comparison, a totally raw vegetarian diet may be too low in protein (it is often too low in calories to maintain weight for many individuals without great effort) and markedly overabundant in carbohydrates, which, in spite of being an "all-natural" or all-raw diet, can still lead to long-term health problems due to possible hyperinsulinemia, particularly if the diet is too high in fruits.)

*** "Especially where people are avoiding flesh products which is our primary animal food adaptation, these animal by-products [cheese, eggs] may be helpful [for some vegetarians]..."
Again, while this may be true enough in the short term, the long term is a different story--although eggs are something of an exception and should not be lumped in with dairy when analyzing their effects. While eggs are native to the human diet, they would have been consumed only seasonally and in limited quantities, not daily or year-round as people consume them today. And while the nutritional content of eggs is quite high and they contain complete protein of high bioavailability, they also contain the antinutrients avidin and conalbumin in the egg white, the former of which inhibits biotin and other B vitamins, the latter of which binds iron. Egg white is also allergenic for some individuals. Thus not everyone can benefit from eggs, and those who can should still keep in mind that, while eggs are natural and a food early humans would have eaten, they would only have been available in limited amounts.

*** "This [differing adaptation between population groups to Neolithic practices begun 10,000 years ago, specifically grain consumption] means it is going to be more likely right now in this particular historical time period that individuals will be somewhat different in their responses to diet. And as we saw above (with the two genes ACE and apolipoproteinB) these genetic differences may even confound attempts to replicate epidemiological dietary studies from one population to another unless these factors are taken into account."
While this is of course true, the fact that most of the detrimental repercussions of grains that we have outlined above hold across all population groups implies that--whatever genetic advantages there may be within certain groups--they have apparently so far conferred only minimal (physiological) adaptive advantages, which, overall, fall well short of complete genetic adaptation. Note, on the other hand, that grains have conferred considerable cultural selective advantages. That is, they have built the agricultural base that has enabled the rise of settled, hierarchical civilizations, which support the social stratifications and specialized pursuits that have given rise to the huge technological advances since then. The problem here, of course, where physical health is concerned, is that genetic (physiological) adaptation always lags behind cultural selection in the behavior/culture evolutionary feedback loop (see the discussion of the feedback loop between evolution and culture elsewhere on the site for more information on this evolutionary process). Even assuming that civilizations worldwide were to remain based on grains indefinitely into the future, it would still probably be many more thousands, perhaps tens of thousands, of years before a fuller physiological adaptation could take place.

*** "In experimental settings, purified, isolated protein extracts do significantly increase calcium excretion, but the effect of increased protein in natural foods such as meat is smaller or nonexistent."
A recent posting about this concern on the PALEODIET list by Staffan Lindeberg, M.D., Ph.D., citing more studies than I had access to at the time of the interview [see next point for title and date of post], indicates that, taken as a whole, studies on calcium excretion due to increased protein intake show animal protein to have a more pronounced effect than other sources. (This is due primarily to the sulfates contained in animal protein [Breslau et al. 1988, as cited in Barzel and Massey 1998].)

Paradox of high bone mass in pre-agricultural skeletons despite large animal protein intake.
As stated in the body of the H&B interview, however--and as Lindeberg implies in his remarks--the debate as to animal protein's potential role in osteoporosis continues since it seems that hunter-gatherers in equatorial and temperate zones (excluding those in arctic regions such as Eskimos eating the highest levels of animal protein) have good bone mass parameters. (Archaeological specimens prior to the Neolithic certainly substantiate the picture of robust skeletal development in primitive hunter-gatherers.)

This would point to compensating factors in the complete picture of a Paleolithic diet that may render the protein/calcium-loss issue something of a non-concern when the diet as a whole is assessed. The physiological mechanisms that lead to this net result, however, have not been fully worked out by researchers, although the role of a number of other factors besides protein that also influence calcium balance has become more clear in recent years. (For a more in-depth discussion of the paradox of high bone mass in Paleolithic skeletons despite high protein intake, see "Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?" elsewhere on this website.)

*** "Studies of Eskimos have shown high rates of osteoporosis eating an almost all-meat diet (less than 10% plant intake) but theirs is a recent historical aberration not typical of the evolutionary Paleolithic diet thought to have averaged 65% plant foods and 35% flesh."
I did not know at the time of making this statement that the modern studies showing Eskimos to have high rates of osteoporosis are unfair because, in this case, the more recent epidemiological studies were performed with Eskimos already partway along the road to Western habits and diets, which has not been generally noted. Thus these particular studies are not reliable indicators of the effect of their native diet, for which one has to go back to observational reports in the early part of this century and before. Unfortunately, information on the Eskimos from this era is limited to observational reports from which no rigorous physiological data is apparently available. On the other hand, studies of prehistoric Eskimo skeletons do show some degree of osteoporosis, although how much may be due to diet vs. other lifestyle factors such as reduced sunlight exposure, compression fractures due to traumatic vibrations from extensive sledding, etc., is not known. The rate of osteoporosis is increased in prehistoric Eskimo populations living furthest north, and among these furthest-north-living Eskimos, this rate increases as one moves further east from Alaska, across Canada, and into Greenland compared to those living in the western end of their range. [See the post of 7/29/97 on the PALEODIET list titled "Osteoporosis in Eskimos" by Staffan Lindeberg, M.D., Ph.D. for an extensive discussion plus scientific references.]

*** "W.N.: I would say my diet right now is somewhere in the neighborhood of about..."
It should go without saying that, given the above updates to the Paleodiet research discussed here, I have made attempts to bring my current diet more in line with what is now known, although I certainly don't claim to be perfect in my habits, and recommend people make up their own minds about what to do. There is also the problem that economics, as well as current agricultural practices in how animals are raised, slaughtered, processed, and made available for sale (the dearth of organ meats available being one example), introduces difficulties in approximating a true Paleolithic diet today.

*** "...low- or no-lactose cultured forms like goat cheese and yogurt."
While cultured forms of milk do indeed have little or no remaining lactose, the other drawbacks of milk products outlined in the postscript above in all likelihood also would apply to goat dairy, even if the nutritional profile of goat's milk may be somewhat closer to human milk than is cow's milk, as is often asserted. [See Loren Cordain's post of 10/15/97 on the PALEODIET list for a brief summary of what is currently known and unknown about the nutritional composition of goat's milk compared to cow's milk.]

*** "...sprouted forms of grains, or breads made from them, that eliminate the gluten otherwise present in wheat, barley, oats, and so forth."
Although I have not been able to confirm the following to my complete satisfaction with documentation, sprouting of grains is probably at least as important for deactivating the antinutrients (presumably phytates) they contain. Whether sprouting is as effective in reducing the gluten content as sprouting enthusiasts believe is something I am no longer so sure about. For either process, however, much undoubtedly depends on how many hours or days the sprouts are allowed to germinate and what stage the growth process is allowed to reach before consumption.

Problem: Finding unsanitized reports about the full spectrum of real-world results with vegetarian diets
Health & Beyond: Ward, I briefly described what the M2M is in the first part of our interview, but why don't you recap it for us here?
Ward Nicholson: Sure. The term "M2M" is an acronym that stands for "many-to-many," and is something I sometimes refer to as "group mail" because people can get pretty passionate about it. It's like a huge penpal group, except it's operated as a newsletter. Ours is published bi-monthly, containing letters from participants that they send to me as its Coordinator. Normally, you might call me the "Editor," except one of an M2M's operating principles is that we agree to print what everybody says exactly as they send it in, with no editing other than setting a page limit for each person's letters. So an M2M is a free-speech kind of thing. Each issue, we set an optional topic to be discussed, usually relating to health or Natural Hygiene, but it's only a suggestion, and people can ignore it if they want and talk about anything--which, believe me, they sometimes do! We even encourage this, because sometimes you can glean more interesting insights from what people say that is supposedly "off topic," or only indirectly related, than from what they say directly "on topic." The conversation that develops between everyone is this amorphous thing that continues to change in an organic sort of fashion, shifting with the sands of who joins and drops out and what happens in the members' lives, and what things are said that hit a nerve with folks.

How many people are in the M2M?


The membership has hovered around 30 to 50 "active participants" (those who agree to write in letters regularly) for the last few years, plus about 10 to 20 "read-only" subscribers, some of whom become active themselves after seeing what goes on. These numbers are about where we have to keep things for logistical reasons, to keep the M2M from getting unwieldy. But it's also a kind of revolving door, since usually a couple of people join and a couple of people tend to drop out every issue or so, with a certain stable core of longer-term members. This is just a rough guess, but I would say in the four years the M2M has been in operation, overall we have had maybe 75, perhaps as many as 100 people write in letters to the M2M telling something about their experiences with Hygiene. The real value, though, is you get to know the people who stay on board for some length of time on an ongoing basis, and in considerable depth, which gives you much better insight into their Hygienic or other vegetarian or "radical diet" experiences and experiments, the predispositions in their thinking, and thus a basis for judging what is really going on with people. I would not claim that the M2M is a scientific sample. However, I do think it is a unique forum for getting at the truth of people's dietary and health experiences, because:

Those who write in are enthusiastic about Hygiene or vegetarianism or other "natural food" diets based on some kind of first principles, so you know they are trying hard.

Their words are unfiltered and unedited, which I believe gives a much better glimpse of the truth of the rank-and-file's results than what we get from the public statements made by the ANHS (the American Natural Hygiene Society, the national organizing body) and the IAHP (the International Association of Hygienic Physicians), who have an image to uphold.

The ability to cross-examine people tends to elicit the truth from them over time, with the resultant honesty it forces in people unless they are going to drop out.

And, I think you can pretty reliably assume that even where they are not being completely honest about their level of health and behavior, since people will tend to present even their failings in the best light, you can be pretty sure that if they have problems to discuss, they are really experiencing them.

I do think, therefore, this allows you to draw some pretty fair conclusions about what is truly going on with a range of Hygienic vegetarians in the real world, especially in the absence of any scientific studies.

So why did you start the M2M and what have you learned from it?
My original idea in doing the M2M was just to talk directly to "the horse's mouth" so to speak--to bypass "official" channels of information--so I could find out what was really happening with Hygienists in the real world. I learned plenty about this, of course. However, what surprised me--at first, anyway--is that I have learned far more instead about what I call "the psychology of idealistic diets." I discovered that a sizable proportion of Hygienists are experiencing problems and disappointing results--even pronounced problems--on the diet, often in spite of adhering faithfully to the Hygienic program over considerable periods of time (many months or a number of years).

Not everyone does well on raw and/or vegan diets ("failure to thrive"). So the first thing I would
say here to set the stage for discussing the psychology of an idealistic health and dietary system is that in Natural Hygiene's case, while it certainly works well for some people, it just doesn't for others, regardless, apparently, of the time span. Yet you don't hear about this in mainstream Hygiene--not from the ANHS or in the books and magazines that are sold anyway. There is a real see-no-evil, hear-no-evil, speak-no-evil syndrome. (I have also been told by a Hygienist not connected with the M2M, who has had contact with higher-ups within the ANHS, that they are aware of some of these failings, but simply choose not to mention them.) If problems are acknowledged, they are attributed to lack of discipline in following the program, or to screwing up the details, or to not giving things enough time, or to focusing only on diet (even when, in fact, the person isn't) to the exclusion of other health factors. But never are they attributed to the idea that Natural Hygiene could be incomplete in some way. I know that I myself found this hard to believe initially: having read many of the Hygienic books--including those by Herbert Shelton (whom I still have a tremendous amount of respect for as a great synthesizer and original thinker in many areas)--that were so very convincing as to the wonderful results obtained, I thought here at last was a system whose logic and results both showed it to work. And the system was so logical and seemingly all-encompassing as laid out by Shelton, how could it not work for everyone?


The controversies that arise over dietary failures provide an entry point for observing the psychology behind idealistic diets. Eventually, however, I could no longer deny the obvious, and most who have been with the M2M any length of time will tell you this is one of the more eye-opening things about participating in it and being exposed to lots of different people: There really are numerous problem cases on the Hygienic system of health and diet. Once faced with this, you have to come to terms with it one way or the other. And of course, it is in how people explain the problems where all the controversy lies. We will get into how people react to this state of affairs in depth, but for now I just want to put this on the table here for people to think about.

The attractions and pitfalls of purist black-and-white dietary philosophies


I take it, then, you have some other observations to introduce before we proceed further?
Yes. Let's back up at this point and set the context for why people get into Natural Hygiene or any other dietary system in the first place.

PROBLEM #1: Confusion and contradiction in the marketplace of dietary ideas and research. I think most of us have experienced before we got into Natural Hygiene how confusing the
dietary world is these days. Everyone has something different to say and even the scientific studies on nutrition seem to contradict themselves every few years. About the only things anyone seems to agree on right now are that lots of fresh fruits and vegetables are good, and too much fat in the diet is bad--and some people don't even agree about the details of the latter. On most every other topic there is divisive controversy. Most people are confused. There is nothing that leads to a good argument like food these days. Of course, maybe it's always been that way, but in any event, confusion and contradiction about diet are rampant and have been for a long time.

THE APPEAL: The psychological attractions of emotional certainty. So the first thing
to look at here is that people strongly crave a lot more certainty in this area. This is important from the psychological standpoint: It means there is going to be appeal in systems that make everything very simple, very black-and-white, systems that offer some sort of key litmus test by which you can assess everything and reduce it all to fundamental, delineated concepts, hopefully with few ambiguous gray areas--even in spite of the fact the real world may turn out to be far more complex.

THE PITFALL: Oversimplifications and illusory truths. Now there is nothing inherently wrong with desiring more certainty in and of itself. The problem comes in when the illusion of absolute certainty is so strongly desired, and people become so sure of themselves, that everything becomes oversimplified and things start getting left out of the equation. We'll talk more about this later too.

PROBLEM #2: Feeling powerless over microbes, genetics, or out-of-control authoritarian health-care. Another part of the psychological climate to look at here is that with the
dominant/submissive doctor/patient relationship that has dominated the Western medical approach for decades if not centuries, people don't feel in control of their health. They feel at the mercy of unknown forces--not only doctors, but microbes, inherited genetics, and so forth.

THE APPEAL: Giving people tools to regain individual control over their own health. So here, there is an inherent appeal in dietary systems that give back control to people by
putting tools and methods in their hands that give them more power to control their own health. Again, this is good, of course.


THE PITFALL: Perceptual reframing of health factors can blind people to real dangers. But on the downside, there can be the danger of a false sense of control that occurs
when people are not as much in control as they think they are, or think they know more than they do. This can cause one to view health or disease symptoms through perceptual filters which distort one's vision of what may actually be going on, which can get people into real trouble. We see this in the case of Natural Hygiene where all symptoms tend to be viewed as "detox," blinding people to other possibilities that may go unaddressed as their condition worsens while the person thinks they are getting better. This we will also discuss in depth.

PROBLEM #3: Thoughtful non-authoritarianism vs. mere reactionary rebellion.

THE APPEAL: Casting off outmoded or oppressive authority, and thinking for oneself. Another psychological backdrop for the appeal of Natural Hygiene and other give-control-back-to-the-people approaches is that they are anti-authoritarian. There is an inherent and legitimate appeal in systems of thought that take power away from oppressive systems like what much of our current modern health-care system has become. I would hope most of us laud this trend.

THE PITFALL: Myopia and self-delusion. However, there is also a danger here as well, in that if one becomes so reactionary that they begin not to listen to feedback from others, they risk becoming lost in an emotionally reactive, subjective world of their own. Feedback is the only corrective to self-delusion, and the feedback of information outside your own "self-certain" or myopic thinking processes can be vital if your world is not to become solely self-oriented to the point you lose your bearings.

Okay, this gives us some insights into what psychological factors are appealing to people about a dietary system like Natural Hygiene. However, people also usually get into health foods because of specific health problems. Why Hygiene and not some other system?
Well, this is something I have done a lot of thinking about, including thinking back to why I personally was attracted to Hygiene, and I've also observed what people new to the M2M talk about. And I don't think it would be exaggerating too much to say that it is the paradigm of "detoxification" that captures people's imaginations and satisfies their sense of logic. It powerfully addresses the three psychological desires we talked about above.

Number one, it answers the desire for a feeling of certainty with a simple, powerful theoretical and practical mechanism that appears to explain everything (or almost everything). It also gives a completely new perspective for most health-seekers who--like most of the populace--have been absorbed in the make-sure-you-get-all-the-right-nutrients approach or who have been mesmerized by the idea microbes are at the root of everything. This is what I think is responsible for the statement you often hear by converts to Hygiene that "it just had the ring of truth." What I think people are really describing here is the shift from one paradigm to another. Suddenly you have the new explanatory paradigm that in one stroke simplifies the entire health equation, relegating microbes to a secondary role and assuring you that if you just eat a variety of vegetarian natural foods, you will get all the nutrients you need, so what you end up mainly needing to focus on is eating clean foods. It is this simplicity that is partly what makes people feel so certain, along with the fact that this simplicity is so logically black-and-white, a characteristic of Hygienic thought we will also look at more as we proceed.


Second, just as convincingly, the need for control is answered by the fact one can actually see results by eating cleaner foods. This part of the Natural Hygiene system definitely works for almost everybody who has been eating what we call the "SAD" ("standard American diet")--at least at first, and up to a point--and so it is what one could call the "conversion experience" or the "rite of initiation" that furnishes the proof people need to be convinced of the validity of the entire system of Hygienic practice. Whether the entire fabric of Hygiene merits unquestioned credence on the basis of initial symptom disappearance due to detoxification is an issue most people don't think about. But the fact is, it does play the role of convincing people to follow the rest of the dietary admonitions of the system too, by extension; i.e., if this is so powerfully true, how can the rest not be also?

And third, of course, people love the anti-authoritarian blast-the-medicos rhetoric that blames the medical profession for everything short of the fall of Greek and Roman civilization. It gives people something to stand for--the underdog against Goliath, or the lone light of truth in a dark, SAD-eating world--and enables us to pump up our egos by showing how ignorant most people are, especially the M.D.s, who know nothing about nutrition, and who with their drugs are creating more toxification of people and creating the very problem that needs solving in the first place. Even the scientists are seen as stupid too, who, in spite of all their knowledge, can't see the forest for the trees like we Hygienists can see it, just by using a little common sense and breaking out of our previous paradigms.

Other unconscious needs also met. Psychologically, all of this serves a lot of unconscious needs by
giving people a sense of identity, villains to fight against, Gentiles to convert to the cause, and so forth. One can even be a martyr and a prophet not recognized in their own country (family) during holiday gatherings where they are ostracized for refusing to partake in heathen rituals like turkey-eating. And of course, as I have personally experienced firsthand, there are the heretics such as myself who are upbraided for having abandoned the cause, or the backsliders who are pitied in their weakness. It's a complete psychological package that people often get a lot more mileage out of than they realize.

Success/failure rates of vegan diets in Natural Hygiene


All right, the preceding gives us a basic psychological framework for the appeal that Hygiene makes to people's minds in the beginning. What I want to look at now are some of the actual results on people's bodies and health that we see in the M2M to add the next layer of events, and upon which further stages of Hygienic psychology are built. Because it is in dealing with or explaining the successes and failures that much of the psychology manifests itself.

Before you do that, let me slip in another question here to get the broad picture first: What is the percentage of Hygienists in the M2M who do well on the diet compared to those who don't? What's the basic split?
It's hard to say with any precision since a number of Hygienists follow versions of the diet that don't strictly match the traditional definition. But I'll make some very rough guesses here, as long as you understand that's what they are. If you are defining the diet to include those who are very strict about their behavior--an important point--and who follow either the all-raw regimen or the 80% raw/20% cooked plan, I would say perhaps 3 to 4 out of 10, maybe half if one is generous, do well. [Note: "Cooked" foods in this context refers to grains, legumes, and tubers; the relative amounts given would be by volume.] This is long-term, an important distinction we'll look at as we proceed. Significantly more do well short-term.

Dropouts over time and "cheating" on the diet complicate the assessment. There is also a certain amount of attrition of "strict-behavers" over time, so that the ranks of these individuals who do well long-term consist of those left after others have liberalized their behavior. Again, the distinctions one makes change the answer, making it very difficult to say. A real problem here in coming up with a number is that there are many who "cheat" a little (we'll discuss this behavior later) and include haphazard but regular driblet amounts of cheese or butter, even eggs. If you were to include these individuals, even more would be classed in the group doing well. All in all, I would guess what you are left with after this is a quarter of Hygienists who are experiencing some kind of difficulty, possibly up to half if one were being very tough about it. It all depends very much on just how you define "difficulty," which we'll be doing shortly. I don't want to quibble over specific figures because they can be argued. But a significant percentage--whatever it is--experience the problems we'll be listing, from minor to more troublesome.

The problem of vested interests among "official" sources in getting straight answers. Now
admittedly the M2M is not a scientific sampling, but frankly, one probably doesn't exist right now, so we have to go on what we hear from people when they appear to be being straight with us. It would be better if we could get statistics from Hygienic practitioners' caseloads, but since they are so heavily invested both monetarily and psychologically in Hygiene, it is difficult to assess how unbiased they are. Even though I have a lot of respect for their knowledge about applying currently accepted Hygienic practices to the circumstances of the lives of their patients, you have to admit that most are not going to allow the possibility to enter their minds that failures in their patients might be failures of the Hygienic diet. They are going to be more likely to see them as "failure to comply" or as due to other factors outside their control.

"Party line" views and ostracism of dissidents. I believe you know from having previously
interviewed practitioners Stanley Bass and Christopher Gian-Cursio here that when Gian-Cursio looked into the problems a number of his patients were experiencing, he came to the conclusion that the Hygienic diet was not sufficient for all patients--and that the problems were only turned around by carefully supplementing the diet with foods of animal origin such as cheese. When he wrote this up, he was essentially ostracized, or at least ignored, by professional Hygienedom for saying so. Many of the Hygienic practitioners, as Dr. Bass has pointed out in one of your earlier interviews here in H&B, often are dealing primarily with fasting patients, and thus not necessarily seeing how well all their patients do out in the real world away from a fasting institution. Initially, you certainly cannot argue with the empirical fact that very few people with health problems will not improve considerably with a Hygienic fast followed by a cleaner diet than what they have been used to eating. But what happens during a fast and its aftermath is only a small window into what goes on the rest of the time on a Hygienic regimen. I myself have gotten decent results from fasting, but have not done my best on the Hygienic diet itself. It was a repeating pattern of improve-on-a-fast followed-by-stagnation-and-eventual-downslide on the traditional Hygienic diet that got me thinking along these lines.

Special measures that may make the diet work for more individuals argue against its naturalness. You also have to remember that there is a certain amount of "plasticity" in organisms as to
what they can functionally adapt to, even if they are not optimally adapted genetically to something. With enough accumulated experience--as many Hygienic practitioners have--you can learn just what jots and tittles have to be followed in order for a Hygienic diet to work. Thus, there may be some truth to the fact that if people aren't getting results from a Hygienic diet (assuming they are following the other elements of the lifestyle such as proper rest, emotional poise, adequate exercise, clean air and water), then "they aren't doing something right." However, if you really have to follow so many dietary details down to the last iota, then that very strongly argues against such a diet being our natural one, given all the rules for behavior that have to be so painstakingly followed. It seems to me, most especially given the more rough-and-tumble evolutionary picture, that a "natural" diet will have more allowance for give-and-take variation in one's basic food intake than that.


Gap between uncensored reports and officialdom rarely surfaces publicly in a way that gets widespread attention. Bass and Gian-Cursio claim they were seeing some problems develop in
Hygienists in real life, especially over successive generations. However, their views were never allowed a forum by official Hygienedom, so very few have had a chance to hear their contentions. What we get is the party line from the IAHP and the ANHS. What one can observe in the M2M agrees more with what Bass and Gian-Cursio have said than what you hear officially.

What about those eating the all-raw-food version of the Hygienic diet? Don't they do better than this?
Unfortunately, no. As a group, they do worse. There are certainly some individuals who do well on the raw-food diet, but they are a small minority. We do have a few individuals in the M2M on all-raw who have done well for 10, 20, 30 years or more. But my estimate from having seen people come and go in the M2M, and the stories they have told about their attempts at sustaining an all-raw diet, is that somewhere in the neighborhood of about 10-15% of Hygienists are truly able to thrive on an all-raw diet. The rest do better when they have added a certain amount of cooked tubers, legumes, squashes, grains, etc., though they may of course be able to survive in whatever condition they find themselves on an all-raw diet.

The distinction between short-term and long-term results is critical in evaluating "success."
It is one thing to go on a raw-food diet and do well for some months or a year or two or whatever, but most who take it beyond that point begin to decline. The true test is long-term over a number of years. With the exception of those few who thrive, what we hear in the M2M from people attempting the all-raw diet for any significant length of time (this applies in spades to those attempting fruitarianism) is that over the long term they often start suffering from one or more of the following symptoms: They may lack energy, or their stools will become chronically diarrhea-like, they feel hungry all the time, they may not be able to maintain their weight without struggle, or they develop B-12 deficiency, lose sex drive, fatigue easily, and/or develop insomnia or a hyperreactive nervous system. With some individuals, we also sometimes see binge-eating develop when they try all-raw.

Are the rare all-raw success stories the "ideal," or simply "exceptions"? The all-raw diet works
well for some few individuals, but for these individuals to extrapolate (as most do) that therefore everyone would also do well if only they were "doing it right" ("like I do it," following certain jot-and-tittle procedural details) is fallacious and self-serving. It assumes that everybody is alike, or enough so for the same ideal diet to work for all--more black-and-white reasoning. If there is one thing modern genetics is showing, it is that people can vary considerably in certain respects. Especially given the evolutionary transition that the species has been in the midst of since the advent of agriculture 10,000 years ago, you would expect there to be more variability at this time.

Symptoms of "failure to thrive" on raw and/or vegetarian diets


All of this so far paints a disconcerting picture of the Natural Hygiene diet as one that works for some but not others. Why? Do you really feel people are so different that we can't agree on certain basic fundamentals of diet?
No. But I don't think it's too mysterious why some succeed and some fail on Hygienic or other vegetarian diets. In Part 1 and Part 2 here, we went to great lengths to detail what the so-called "original" or evolutionary diet of humanity was. What Hygienists miss is that the "basic fundamentals of diet" that you speak of comprise a wider variety of acceptable foodstuffs--including a certain percentage of foods such as cooked starches and lean animal game products--than have been considered "kosher" within Hygiene to this point in time. It is within the set of foodstuffs common to our evolutionary heritage that you find individual differences and the need to tailor things somewhat.


From this standpoint, then, the Hygienic diet is not so much different from the evolutionary diet as it is simply a restriction of it (as is true of all vegetarian diets). The foods we are eating are fine foods, but they are a subset of what the human body evolved on, and some individuals with the right kind of constitution or genetic plasticity--say, closer out toward the ends of the statistical "bell curve" of the average genetic make-up--can handle that restriction while others can't. And as we said previously, there are jots and tittles that experienced Hygienists may know how to implement in order to compensate to make the diet work for additional people. But they don't work for everyone.

So for the individuals who do not do well on a Hygienic diet--whether all-raw or the more mainstream 80% raw/20% cooked version--what are the symptoms they experience?
Well, I mentioned a few of them briefly above for the all-raw-fooders, but let's look at them in more depth, since they often also apply to the mainstreamers who are not doing so well, just to a lesser degree. Also, in most cases, whether all-raw or not, these individuals are more often total vegans, or close to it--with no dairy or eggs gotten on the side even occasionally. I want to mention here that I am well aware that the traditional Hygienic explanation for some of these symptoms would be that they are merely "detox" symptoms. Later I will go into just why I believe this idea is mistaken given the circumstances in which these long-term symptoms usually manifest.

A look at the most serious potential problems. First, here are a couple of severe problems we have seen. Keep in mind these are the most severe, and they are individual cases, but they are real possibilities.

Case of congestive heart failure in a decades-long Sheltonian hygienist. Probably the most sobering was that a year or two ago one older member who ended up dropping out of the M2M wrote to us that they had developed congestive heart failure. They said Hygienic practitioners told them it was due to lack of adequate protein and B-12 over an extended period. [No comment here as to potential accuracy of this attribution.] We later heard word from them through an intermediary in the M2M that it may have been due to having followed a diet too high in fruits percentage-wise for many years prior to that.* This even though they were apparently not extreme about it like a "fruitarian" would be, but had simply followed a traditional vegan Hygienic regimen over many, many years, only one higher in fruits. This particular situation I don't think anyone in the M2M really had much of a stock Hygienic explanation for. Most were simply stunned. I do think, however, it at least drove home the point with most people that diets too high in fruits are wise to stay away from.

Case of rickets in a vegan toddler eating a Natural Hygiene-style diet. Another sobering occurrence was that in the most recent issue of the M2M, one of our participants reported that their two- or three-year-old son--who was still breastfeeding (or had been until recently) and who, along with the mother, was eating vegan Hygienic foods, including getting sunlight regularly--had developed bowed legs as a manifestation of rickets due to vitamin D deficiency.* After getting professional help that included an initial megadose of vitamin D, the father began feeding the son raw goat's milk and cheese, and added a daily multivitamin/mineral supplement, with the result that the son's legs are improving and areas in his teeth where enamel had been missing are now re-enameling.

Cases of vitamin B-12 deficiency. As a perhaps less severe problem, but one that occurs somewhat more frequently, we have had several members on the Hygienic diet (even when not all-raw) report very low levels or deficiencies of vitamin B-12. These have been discovered on blood tests not just by mainstream doctors but also by Hygienic practitioners, the latter taking the problem seriously. To my knowledge, the low B-12 levels have so far responded quickly to supplementation, which is often recommended by Hygienic doctors these days.


Low-profile symptoms more commonly seen. The above are the most serious problems I can recall
offhand without going back and digging through back issues of the M2M to perhaps find a few others. Less severe but more frequent than the above symptoms are the following:

Continual diarrhea or poorly formed stools can develop in some individuals after some
months on the diet. This would occur mostly with people eating all-raw. This seems to be due to the very high roughage content which some individuals simply cannot handle well.

Some women's periods become quite erratic or may disappear.

Some people experience insomnia or unrestful sleep on an all-raw [and/or vegan] diet and get run down and chronically tired. Another contingent of people has very little energy eating all-raw and drags around all day. An alternate manifestation of this would be feeling languid much of the time or sleepy during the day even with regular nightly sleep. A mental effect sometimes seen may be lack of normal motivation to get regular daily tasks done--no "joie de vivre" ("joy in living").

Some individuals cannot maintain their weight well and become thinner than even they may think is healthy. We have one individual in the M2M who had this problem and who was a huge eater, adhering to the Hygienic diet since 1989, who has finally been gaining weight after having included regular raw animal flesh for the past couple of years. The volume of food required has dropped dramatically.

Most, although not all, athletic individuals, particularly endurance athletes, find they perform better when they include cooked starches in their diet--and that relying primarily on the sugars in fruits on a raw-food diet, and the calories in nuts and so forth, does not seem to provide the proper mix of carbohydrate fuels that provide the sustained energy they want.

Perhaps the most common complaint of all is that more than just a few people are "hungry all the time." Some then become obsessed with eating and have to eat or piece all day long.

Another common complaint is that sex drive decreases markedly for some, even to the point of virtually disappearing. This can occur in both males and females, but the complaint is heard the most often from men, many of whom are not accepting of, or happy with, the stock Hygienic explanation of a simple lack of "overstimulation," or the typical vegetarian view that it is a "spiritual" advance.

Some people experience more frequent respiratory problems and colds on an all-raw diet, even when they have been on the diet long-term, which might indicate a lowered immune system. For example, one individual who has been vegan since 1989, and Hygienic since 1991, has noted an increase in colds/respiratory problems whenever the raw component of their diet has climbed above the 80-90% range or so. When they include cooked grains and starches in their diet, they feel better, are not so hungry all the time, sleep more soundly, and the colds cease.

Some people's nervous systems become more reactive and subject to upset, which may be connected to the insomnia problem that some experience.

How people get trapped by "pure" diets that don't maintain long-term health
The "frog in slowly boiling water" syndrome: initial improvement followed by long-term decline. And the final thing I want to look at here is not really a specific symptom, but a long-term
syndrome that takes time to manifest. We could call this the "better-for-awhile-at-first-followed-by-slowdecline" syndrome, or the "frog-in-slowly-boiling-water" syndrome. In fact, this is the usual long-term pattern within which the above problems occur.


How can things end up bad when they started out so good? As we mentioned earlier, the key conversion mechanism for convincing people of the Natural Hygiene system--or other "clean-diet" approaches--is that when they go on a cleaner diet, almost everyone improves initially because of the detoxification that takes place. People often feel so incredibly good initially on a clean diet that they cannot believe--they do not want to believe--it could somehow go sour. However, over the longer term, many people's bodies simply may not be extracting sufficient nutrition from the diet and they begin to decline, or develop various symptoms even if they still feel okay, presumably due to deficiencies after their reserves are exhausted. Except in the case of B-12, these are not usually your typical "deficiency-disease" symptoms--like the case of rickets that was mentioned earlier. They are more subtle, but they have their effect, and they can be slowly debilitating over long-enough periods of time.

The lulling effect of imperceptibly slow declines in health. From a psychological standpoint, the most interesting thing about this syndrome is that it occurs slowly enough that people often mentally adjust to the lowered state of health and do not perceive it as such, particularly since they so strongly do not want to believe it is happening. Instead, they continue to feel they are doing fine--sometimes even when they describe some of the above symptoms, which they may interpret as improvements that indicate "continued detox"--while people who know them may tell them otherwise. Yet because of the extreme anti-authoritarianism, they don't listen to what others are telling them, until finally, the problems reach a point where they can't deny them anymore.

Emotional "certainty" shuts down one's ability to rationally assess symptoms. By the time this happens, though, people have been so thoroughly convinced of the entire Hygienic dietary system by their initial successes, and particularly are so won over mentally by the detoxification paradigm, that they are now psychologically invested in the "rightness" of everything about the Hygienic system, and cannot believe there could be any shortcomings in it, since it seems so internally self-consistent logically.

The willingness to make sober judgments of current symptoms is perpetually displaced into the future. Since they got results to start with, they feel their problems now
must be only a temporary setback, or evidence that further detoxification is taking place or that more patience for further healing is needed; or, if not that, then if they can just correct a few details, all will be well again and they will be back on the road to superior health. "The better you get, the more energy the body has to expel the toxins, and the more powerful and healing the crises become!" This is the thinking, which in earlier stages of detoxification might be true, but is increasingly questionable the longer one follows the program, which is the case for many of the symptoms we outlined above.

It sounds like you are saying people can become more concerned with "being right" about Hygiene than in whether they are actually getting good results.
Basically, yes. That's a good way of putting it. The logic of Hygiene comes to captivate people and satisfies them as much as results do--perhaps more so if they aren't getting the results desired. The dynamic, of course, is a little different for those who are getting good results. They, too, are plenty interested in being right about Hygiene, but since it has worked for them, they have a more solid sense of "certainty" about it, which of course they project to others. And because they have this greater sense of certainty, these individuals tend to become the repositories of traditional wisdom which is dispensed to the less fortunate. Those who are successful have a tendency to blame the other persons' behaviors ("you aren't following all the itty-bitty details right") or to doubt the failures are failures at all, seeing them rather as expressions of the truths of Hygiene at work when its principles are violated, proving once again its rightness.


Those who are successful are partners in reinforcing the tendency not to see failures as real failures. That's one of the biggest things holding all this in place: Those for whom Hygiene has worked
cannot afford to consider failures as failures of Hygiene either, because just like everyone, they uphold the ideal that it must work for all, and so if it does not, it is also a threat to their own beliefs in spite of their success. This shared ideal is what binds the successes and failures together in common cause. The less successful imbibe the certainty of the more successful to maintain the faith. This syndrome is of course not unique to Hygiene, but occurs with any idealistic system where there are both significant successes and failures. The important point to note here is that it shuts down the mind's openness to new interpretations. And if the mind cannot look at alternative explanations, what you will often see is that the favored paradigm gets pushed to extreme limits by those who are failing at it in order to try to reach some sort of resolution.

Such as?
Well, at this point, two things may happen, which are that:

In order to prove that the system does in fact work, people redouble their efforts at detoxification, and often attempt to go on an even stricter diet. And/or:

Considering perhaps they may not be getting enough of some nutrient after all, they begin to get more perfectionistic about the many vegetarian dietary details that are possible to manipulate. After all, one characteristic of exercising more control is the need to pay attention to the additional details that are necessary in order to implement such increased control. And if the details one used to pay attention to aren't working anymore, one must become more attentive to even smaller details.

Thus, what is essentially a simple, logical system of garbage-in/garbage-out ("toxemia is the basic cause of all disease") begins to increase in complexity. In the end, obsession and fanaticism may develop.

How obsessively striving for absolute dietary purity becomes a fruitless "grail quest"
Significant parallels with religious behavior. In the introductory passages of Part 2 of the interview
here, I was not merely making a metaphorical comparison in saying that Natural Hygiene resembled a religion in some ways, because the reaction of redoubled purification efforts in response to the better-for-awhile-but-then-worse syndrome has interesting parallels with certain religious motivations. It serves some of the same psychological needs.

Let's go into this more. How so?


Well, in religion, at least in the West, you have the idea that people have sinned, and that there are prayers or rituals you must go through to cleanse yourself of that sin. There is typically guilt or at least anguish over this state of assumed sin. Even if one does not believe in original sin, still, all it takes is one slip-up to establish an ad-hoc fallen state, after which you must redeem yourself. Guilt and redemption are strong motivators, because they go to the core of self-image.

The role of unseen and unverifiable "toxemia" as evidence of one's "sin." In Natural Hygiene,
if one makes a blanket assumption that "toxemia is the basic cause of all disease"--or at least the only one worth considering, since dietary sufficiency eating natural vegan foods is usually assumed to be a given--then the psychological motivations often cease to become merely practical and become coopted by a "redemptionist" quasi-religious quest. For those who are prone to taking it to an extreme, what was practical can become imbued with an element of superstition, or at least an unverifiable belief in unseen toxemia that one cannot really substantiate completely. So one's philosophy can begin to obscure whatever realities to detoxification there may be and blow them out of proportion.

The "relativity" of absolute purification transforms it into an ever-receding goal and goad.
One's overriding goal becomes greater and greater purification toward the holy grail of complete detoxification that ever recedes as one's perceived level of detoxification continually falls short of the postulated goal. One paints oneself into a corner where, because "no one is without sin" (we all goof up from time to time), one can never be sure after one has sinned whether that sin is really forgiven or not (cleansed from the body by normal processes). Thus to redeem oneself, one must always either fast or go on some sort of cleansing program. Following from this is the idea that to remain pure and undefiled, and also to serve as proof ("by their fruits ye shall know them," or the signs that one is more evolved), one should go even further and be able to live on the most restricted diet possible permanently--such as not just a raw-food diet, but sometimes fruits alone (fruitarianism). And if one is not able to do so, then it means one is not really purified, or evolved enough, and thus the only recourse is to do further cleansing until one is really pure and evolved enough to do so.

Self-restriction becomes its own virtue as absolute purity recedes. After a while, however, this easily becomes a habit divorced from any real reason. Even if one did reach perfect purity, the urge toward self-restriction would continue because of the psychological dynamics which have become ingrained. So for those not doing well on the Hygienic diet who fault themselves, it can end up becoming a self-reinforcing vicious circle, and one never reaches the "goal." Because as we've noted, what often happens by this time is that after being on the cleansing diets and restricted intakes for long enough, instead of toxemia being a problem, deficiency (depletion of one's reserves) sets in. So what ensues, if people are really hooked into this dynamic, is that they go through a cycle of cleansing and sinning, cleansing and sinning (because one's body is crying out for more nutrition and drives one to eat something different, anything different, in an attempt to make up the nutritional shortfall). Or if they do not sin, they continually feel they still must not have figured the "system" out completely (whatever the system may be), and so they go on searching here, and searching there.

Endgame: fundamentalist obsessive-compulsiveness. Once this goes on long enough, because it is self-reinforcing, many of these individuals become so locked into this way of thinking they are never again able to see food and eating any other way. (Just as fundamentalists rarely ever change their minds.) This particular syndrome is a potential breeding ground for obsessive-compulsives, and the individuals who fall into it often become the basket cases of Hygiene. I realize, of course, all this may sound rather ridiculous taken to this extreme, but it is a real dynamic that we see in the M2M in more individuals than you might think. Typically, of course, it's not nearly as extreme as painted above, but it's not an uncommon undercurrent in behavior. At some point a person has to decide between two perspectives: Is one going to take toxemia or detoxification to be all there is to health--by swinging from one's former all-American "more and more and more is better" outlook to its simplistic reactionary opposite of "less and less and less is better"? Or do we try to strike a balance point somehow?

Successful vegetarian diets require more than simple dietary purity

So how do those who are successful on the Hygienic program go about striking that balance?

More attention paid to robust nutritional intake and other health factors. Well, we see a few different ways, but they all boil down to two main areas, I think. Rather than getting so totally wrapped up in detox, successful Hygienists more often seem to pay a lot of proactive attention to making sure they get sufficient nutrition, rather than just assuming it will happen automatically if they eat all natural foods. But just as importantly, they also more often pay equal attention to the other factors of a healthy lifestyle such as adequate exercise, sleep, rest, sunshine, stress reduction (including lack of excessive mental stress), and even sometimes to living in more equable climates, which also reduces stress. I don't think the attention paid to these other factors is any accident. Long-time Hygienists are more likely to be aware on some level of the detrimental impact insufficient attention to these other factors can have, and intuitively seem to know they must be paid attention. These factors are important to health in and of themselves for anybody, of course. However, I believe there is an additional reason why they assume added importance for a Hygienic lifestyle in particular.

Stress and vegetarian diets. My own personal experience and observations corroborate what I have
heard from some other long-time former vegetarians outside Hygiene who have had the chance to observe successes and failures--both in themselves and widely in the vegetarian community. And this is that stress makes a vegetarian diet more difficult to thrive on--more so than for other diets. It just seems to be that when you look at a range of long-term vegetarians, the ones doing better are more often living "easier" lives. Their lives are less demanding work-wise or stress-wise, or if not, they have plenty of time to recharge. Again, I am not saying this is true of all, but it is a noticeable thread. Handling stress of course hinges on several other factors in the Hygienic system, such as adequate but not excessive exercise to create a state of fitness and better resilience to stress; and of course adequate rest and sleep to repair from stresses. And a number of Hygienic practitioners are noted for continually reminding the rest of us not to shirk these factors.

Stress creates less margin for error for nutrition's contribution to physiological maintenance. Why should it be that one may have to pay more attention than usual to alleviating stress
factors on a vegetarian diet if it is to succeed? Again, my opinion is that it goes back to the observation that compared to the "original" evolutionary diet of humanity, the Hygienic diet is a restricted one. This leaves less margin for error with regard to nutritional intake. And that would presumably also have the potential to affect any cellular rebuilding and repair necessitated by stresses. With less room for error in this area, greater attention must be paid to other factors besides food that either reduce stresses that make more demands on nutrition or on the stress-response system (adrenal glands, etc.), or that involve rebuilding or recuperative activities. This expands the margin again to create a bigger cushion.

Successful Natural Hygiene diets are often less strict and more diverse than traditional/"official" recommendations. Of course that's not the only thing to expand one's safety
margin. The second thing is that there is more diversity in the way Hygienists in the real world actually approach the dietary aspect of the system than is publicly acknowledged in the pages of ANHS's Health Science magazine, for instance. Some of the ones who are successful occasionally include--if not regularly supplement the diet on a low level with--things like yogurt, or some cheese, perhaps eggs. It may not be much, but the more-or-less even-if-haphazard continuing dribble of it is suggestive. Stanley Bass, one of the Hygienic practitioners as you know from his interviews here, and who is a member of the M2M, strongly recommends modest but regular amounts of cheese or eggs as necessary to complete the basic Hygienic diet nutritionally--based not only on his mice experiments, but experiences with real-life patients as well.

Special nutritional practices added by some individuals. A few individuals in the M2M take
special measures such as making flaxseed preparations which they feel provide crucial nutrients [i.e., essential fatty acids (EFAs) and/or their precursors]. Others may go in for BarleyGreen, such as yourself, Chet, or for the blended salads that Gian-Cursio developed and thought essential to increase assimilation, which you and Bass have also recommended. We have also heard privately from an individual who consulted with one noted Hygienic practitioner and says the practitioner acknowledged to them in person that there is nothing wrong with fish in the diet once a week or so--though of course this practitioner cannot come out and say so publicly. Still other people go beyond the traditional Hygienic whole-foods idea that says juices should be limited to a now-and-then basis, and regularly drink vegetable-juice preparations (carrot-celery-romaine is one that seems to be favored) as a way of getting more concentrated doses of minerals and so forth. And finally, of course, there are those such as myself and another individual or two in the M2M who have added modest amounts of animal flesh to our diets, based on the evolutionary picture of humanity's original diet, and have seen significant improvements over the results we got on the normal Hygienic diet. We still believe the basic Hygienic way of eating is a fine way to eat as far as it goes, and we continue to follow many of the usual Hygienic eating patterns including plenty of fruits, vegetables, and nuts; we just don't think it is quite sufficient, and have carefully added reasonable amounts of these other items to round out the diet, paying attention to issues of quality. Those of us in this last camp go along with those such as Bass in thinking that the paradigm of detoxification is valid to a point but only half the picture, and that actually deficiencies are an equal if not greater cause of problems for those on long-term vegetarian diets, accounting for many of the problems that arise and persist.

Potential role of involuntary "lapses" in filling nutritional gaps. Individuals like the above who
supplement the basic Hygienic diet, of course, do so consciously. In relation to this, there is another interesting, relatively unconscious, behavior here we should also look at. While I don't mean to explain away all the successes in Hygiene as due to people doing non-kosher things--because that isn't true--nevertheless it is interesting to observe that many Hygienists still "cheat" a little or have periodic minor indulgences even when they do not really intend to do so as their "ideal" regimen. Using butter or cheese on one's steamed vegetables on a weekly basis--or more often depending on the individual--isn't uncommon to hear about in the M2M. Perhaps less frequently some eggs and so forth. Consciously, they really do not "mean" to do it, but they do it anyway. Which raises the legitimate question of how much of a role these periodic low-level "indulgences" may be playing in filling in potential nutritional gaps in people's diets. The bottom line is, I don't think anyone really knows the answer. It's a question I doubt anyone has studied systematically, and it would be very difficult to do so.

Lapses usually automatically interpreted as "addictions" instead by adherents. What is interesting from the psychological standpoint, however, is that these so-called "lapses" are almost always viewed as discipline problems or addictions. If the Hygienic diet is supposed to be so "natural," why are so many enthusiastic and motivated followers not completely satisfied by it? One cannot help but get the strong feeling no one is interested in considering the possibility that the diet may not be satisfying people's physical needs as a primary underlying reason why they cannot, or are not, sticking to it, and that their bodies may be making them do this. This is another "secret" of sorts that's not talked about much publicly but that you see in the M2M: Many people struggle with sticking to the diet, and not just because of "social" pressures, but because of cravings. After a point, while it may explain some of the cases, to blame it all on past addictions or lack of discipline or poor-quality produce is too convenient.

Rationalizing dietary failures with circular thinking and untestable excuses


This type of habitual explanatory pattern--the strong tendency to fault the results first (or the individual responsible for them) whenever they do not measure up to the philosophy--suggests motivations rooted more in idealism than in practically evaluating one's philosophy by its results.

Can you clarify that a little further?


Pat answers and mantras. If one's interpretation of a theory is overidealistic--even though the theory
itself may satisfactorily answer a number of questions, as I agree that Natural Hygiene can--they will try to make their view of it impregnable to criticism with a pat answer for everything, so that you could imagine no outcome of any circumstance or test that they would accept as valid evidence against it.

Unfalsifiable excuses impervious to testability. In science there is a prohibition against explanations
that are not "falsifiable"--meaning those that cannot be subjected to a test where one outcome would negate ("falsify") the explanation, and the other outcome would support it. Otherwise the assertion is impervious to a fair test and will not be taken seriously by science because it is untestable. In other words, if in interpreting the results of an experiment you can always twist them so that they support your theory, and you cannot allow or conceive of any result that would count against the theory, then you are trying to have your cake and eat it too, and that is not allowed if you are going to be scientific.

How about some examples here?

"You're too addicted." Well, the above idea that people cannot stay on the Hygienic diet because they
are too addicted to other things is a circular kind of argument as normally stated. There could perhaps be ways of testing it, if one wanted, but not the way it is usually stated, which is designed to deflect any possible criticism: Why can't one stay on the Hygienic diet? Because they are too addicted to past foods. How do you know you are addicted to such foods? Because if you weren't you could easily stay on the Hygienic diet! It goes round and round. You see, the way they are formulated, you can't subject these contentions to a real test because they are supposed to be true by definition. When you start hearing explanations like this given out as pat answers, it indicates one doesn't want the theory to be argued with or subjected to a fair test where there could be a risk of an answer you don't want.

"You just haven't given it enough time yet." There are other statements like this in Hygiene designed more to protect belief than to truly explain. For example, another common one we hear a lot in Natural Hygiene is that if someone isn't getting good results, then they "just haven't given it enough time yet." "Hey, you expect to get well in just a few months or years from 20 or 30 years of prior bad living habits!?" That's the rhetoric. Now, acknowledging the need for patience needn't stop us from also requesting some reasonable criterion for determining whether there has been enough time. Because without one, you just don't know--you are clawing at the thin air of a foggy explanation designed to obscure the issue. If one were truly interested in being practical here, they could at least make the observation that if what one is doing is working, then symptoms would tend to lessen in frequency or severity over time. (In other words, you would at least see a trend, even if full results did require a much lengthier period.) If this were in fact observed, it would uphold such reasoning, while if the symptoms are increasing or persisting over the long term, then the hypothesis is falsified. But if all you are doing is telling people they haven't given it enough time no matter how long they have given it, then you are obviously not very interested in putting things to a realistic test.

What we find in the M2M is that usually people don't require all that long to notice some kind of results trend after a dietary change. A few to several months is usually ample to see at least some kind of trend, and many people see early results or trends within a few weeks to a month or two. If you are going longer than several months to a year, and you have not gotten any better than when you started, or are experiencing persistent trends for the worse, something is probably wrong.
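To make the idea of gauging a results trend concrete, here is a minimal sketch of how one might actually check it--purely illustrative, with hypothetical numbers; nothing like this appears in the Hygienic literature:

```python
# Hypothetical symptom log: number of symptom episodes recorded each month
# following a dietary change (real data would come from a personal diary).
monthly_counts = [9, 8, 8, 6, 5, 5, 3, 2]

# Fit a simple least-squares trend line. A negative slope means symptoms
# are lessening over time (supporting the regimen); a flat or positive
# slope over many months means the "give it more time" claim is failing
# the very test its proponents imply it would pass.
n = len(monthly_counts)
months = list(range(n))
mean_x = sum(months) / n
mean_y = sum(monthly_counts) / n
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(months, monthly_counts))
slope /= sum((x - mean_x) ** 2 for x in months)

print(f"Trend: {slope:+.2f} episodes per month")  # negative = improving
```

The point is simply that a declining count is testable in a way that "you're still too addicted" is not.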

"Unnatural overstimulation." Here's another example, this one more of a double-standard explanation
to discount good results that other people get eating different diets than Hygiene. It's one I hear myself all the time: If you are someone who has chosen to start including meat in your diet again and you have been
gradually feeling better, you inevitably hear the mantra that it's not really better health, it's only "unnatural overstimulation" due to the supposed excessive toxins or nasty animal protein in meat. Yet by the canon of Hygiene itself, the criteria for a stimulant as contained in Shelton's "Laws of Life" contradict this. Law X, "The Law of Stimulation or Dual Effect," and Law XI, "The Law of Compensation," together basically state that a stimulant saps the energy of the body for its effect and requires a compensating recovery period. In other words, stimulants produce a prostrating after-effect during which you feel worse and from which you have to recuperate. This is not what those of us who have added reasonable amounts of meat to diets otherwise Hygienic in character, and who are experiencing improved health, go through--quite the opposite: the improvement is gradual. Yet the mantra gets repeated over and over again on autopilot without seeming comprehension of its lack of correspondence to the actual circumstances. It is invoked as a pat answer simply to explain away the fact that someone is getting better results on a different diet one does not like.

When symptoms are always seen as "detox." The most important example, of course, of an
untestable answer repeated like a mantra that we need to address here would be when any symptoms encountered are always explained as symptoms of "detox"--while the alternative possibility that they could be indicative of a dietary deficiency, casting doubt on the sufficiency of the Hygienic diet, is ignored totally. Again, that a symptom is due to detoxification might be true depending on the circumstance, but without a way of reasonably assessing which might be the case, it's merely a convenient, unfalsifiable, and untestable explanation.

How would you test such an explanation? Well, short of some sort of blood test that might pinpoint biomarkers indicative of toxemia, you could at the least point out, as we did above, that the symptoms--detoxification in this case--should begin to diminish in frequency over time. And on the other hand, if after these symptoms have diminished or abated you are left with other symptoms that have persisted at the same level or even increased, you would have to admit to reasonable grounds for supposing that these symptoms may indicate deficiency or deterioration or metabolic imbalance of some kind. But merely to say without any further attempt at clarification that any symptom indicates one is "still too toxemic" or still too gunked-up--that it requires still further detoxification--shows that one is not very sincere in one's attempts to find out.

For these reasons I think the stock explanation of "continuing detox" is erroneous in cavalierly dismissing the symptoms I described earlier that are experienced by those who do not do well long-term on the Hygienic diet. Because for these individuals, the symptoms are persistent and do not diminish over time. And as we have said, usually these individuals had experienced improvement at first before the decline, which is why they themselves are so loath to accept the idea that the Hygienic diet may be insufficient for them.

Other meaningless, unhelpful, or unfalsifiable excuses. And a little more on the humorous side
here, when all else fails, one can always bring out the even heavier artillery of such unanswerable assertions as: "too badly damaged by prior living," "you weren't breastfed long enough," "degeneration of the race from prior generations' poor diet," or my own favorite and one I was once personally accused of: "psychosomatically sabotaging yourself by trying too hard." Could there be any possible validity to these excuses? Perhaps. Who knows? The point is, they could be given for any approach that fails, not just Hygiene--which shows just how little such excuses really explain about why Hygiene failed to give better results than all the rest.

Please understand, as I emphasized in the introduction to Part 2 of our interview, that in this critique I am not quibbling with the basic principles of Hygiene, which I agree are valid for the most part. What I am saying is that they are often interpreted in conveniently inconsistent, fallacious, or selective fashion to explain away detrimental results that threaten emotional attachments to dietary details or implementations believed to stem from those principles. And this can get people into serious trouble by obscuring their vision of what may really be happening to them.


5 tips for staying alert to the traps of excessive dietary idealism


What about ways of guarding against falling into these different traps of idealism? Are there things that can be done to prevent going unconscious about it?

1. Self-honesty instead of denial. Well, number one, I would say simply acknowledging all that we've
been looking at here--at the least considering it as a real "possibility"--is a big first step. The big problem I see is that even where people become aware of these problems, as happens in the M2M, they most often go into denial about it. Denial is the biggest hurdle. Just be honest with yourself about how you feel and if you are pleased with the results you are getting or not.

2. Focus first on results rather than theoretical certainty. Second, becoming less set on arriving at
a perfect theory and more interested in getting good results helps one remain more objective. Be wary of the desire for too much certainty about any one theory. Theories are very useful tools, but they usually change and become modified or at least refined over time in response to scientific advances. Being too certain theoretically can override your ability to perceive any detrimental results that might cast that certainty in doubt. If you think of a theory as an approximation subject to modification by results--which is what science is all about--you'll be much less likely to get into trouble.

3. Utilize reasonable timeframes to gauge trends. Third, being willing to place reasonable time
limits on one's dietary experiments--say, a few to several months per dietary adjustment to assess at least some sort of trend--will help keep you from falling into the frog-in-slowly-boiling-water syndrome if you don't get results.

4. Exercise at some activity that gently challenges your limits, to hone sensitivity to changes in your capacities and health. Fourth, a regular exercise program doing something you enjoy,
particularly one involving a high level of activity--such as some type of endurance sport, weight-lifting, or yoga; not necessarily those in particular, but something where you can accurately measure or get feedback about your results at a high level of activity--can be very helpful not only in achieving health, but in more realistically assessing it. You don't have to put large amounts of time into it; rather, intensity or focus is the key, where you play the edges of your capacities for at least brief periods of time without undue strain. When you do this, you will notice shortfalls in results and the impact of diet or changes in health much more quickly than if you are idling along far below what you are capable of.

5. Don't ignore feedback about how you are doing from people who know you well. And fifth,
actively seek out the opinions of other people whose judgment you trust about how they think you are doing. Maintain a dialogue with people and don't isolate yourself from feedback and others' opinions, or be so ready with dogma that you drown out what they are saying. Ultimately, if you aren't feeling well, looking well, and doing well in your daily life, why are we bothering with any of this?

Is there anything we haven't covered that you'd like to add before we close out this interview?
Yes. As much as I've tried to carefully document the scientific aspects in Parts 1 and 2 of these interviews, I have no doubt newer science will inevitably supersede portions of it. Also, given the complexity of some of the research, it's possible there may be a few instances where I overlooked differing interpretations of fact. I fully expect, given the controversial nature of what I am reporting and saying here, that there will be those who will point out any such inaccuracies, which I welcome. This is how knowledge advances, and I'm sure it will help me refine my own thinking or look at new interpretations as well.

Big picture more important than disputes over details. I trust, however, people can judge for
themselves that the overall thrust of the views I have been presenting does not depend on a few specific
details, but rather on how all of them as a whole point to a larger overarching pattern. That's my real interest--the broad thrust of all this. I don't really expect to convince very many people and I know many will be upset. I can well imagine as we speak that many of your readers will be busy writing their own rebuttals. What I am interested in is starting a dialogue on these issues in the vegetarian and Natural Hygiene communities, and generating awareness about them. I don't think most know yet that there is considerable scientific data on humanity's "original" diet now, or that as we have discussed in Part 3 here, there are significant numbers of Hygienists not doing well.

Tolerance for our own mistakes, tolerance for others. In this same vein, I'd also like to suggest to
Natural Hygienists and other "veg-raw" folks not to be too fundamentalist about their dietary beliefs and practices. Especially with other people--but even with yourself. Leave yourself open to the fact you could be wrong sometimes--because we all are. Don't paint other people who don't believe the way you do about diet, fasting, and health as willful ignoramuses, or people such as M.D.s as willfully evil promoters of bad or destructive information. Most of us are doing the best we know how with what we've got. We need to stop making people into villains, and diet into a religion that we feel a need to identify with as being somehow morally or socially "right" about. Because no matter how much you think you know about it, you can always be humbled by new information. I've really been razzed at times by close family and friends about the changes that have occurred in my thinking about diet as a result of my experiences and research. I'm sure more changes will come. It's somewhat embarrassing how overzealous I've seen myself be at times, even while considering myself open-minded and tolerant at the time. If you want to create an environment where people will be forgiving of you when you change your mind about something, you can't be coming across as an inflexible proselytizing zealot or they won't be tolerant of you either.

Open-mindedness is an ongoing process, not something one achieves and then "has." It's something you have to continually pay attention to and engage yourself in. It doesn't happen automatically. In fact, the tendency if we do nothing is for our minds to crystallize and become more closed as we age. We all die eventually no matter what diet we eat. I would hope we can all stay forever young at heart even as our bodies inevitably age. That's the most important thing.

Further observations about "failure to thrive" on vegan diets


*** "...the most sobering was that a year or two ago one older member who ended up dropping out of the M2M wrote to us that they had developed congestive heart failure. They said Hygienic practitioners told them it was due to lack of adequate protein and B-12 over an extended period. We later heard word from them through an intermediary in the M2M that it may have been due to having followed a diet too high in fruits percentage-wise for many years prior to that."
As discussed in the postscript to Part 2, although this is admittedly speculative, symptoms of heart disease in this situation would also be consistent with hyperinsulinism as a possible cause, given the high level of carbohydrate from the excessive fruits, combined with low protein consumption.

*** "Another sobering occurrence was that in the most recent issue of the M2M, one of our participants reported that their two or three-year-old infant son--who was still breastfeeding (or had been until recently) and who, along with the mother, was eating vegan Hygienic foods, including getting sunlight regularly--had developed bowed legs as a
manifestation of rickets due to vitamin D deficiency."

Case of rickets in vegan toddler. This example of serious nutritional deficiency was criticized by one
vegan raw-food advocate online, who made the accusation that crucial information must have been purposely withheld from my account, and suggested the rickets could have been due to an induced calcium deficiency from the binding action of phytates if the diet were rich in grains. For the record, here are the particulars of the father's and his son's diets:

Family environment and parents' dietary beliefs/practices. (From issue #23 of the N.H.
M2M): The father himself had been vegetarian for 13 years, and for the last four had maintained a diet of "80% living foods"--which we can take to mean roughly no more than about 20% of the diet (probably by volume since calories were not mentioned) would have been other than raw (but possibly juiced, or blended) foods. He also mentions eating potatoes and rice occasionally in this and another issue of the M2M. If we are being fair in making an estimate, we'd probably have to say that at most, rice would have constituted no more than, say, 10% of the father's own diet (half of the 20% non-living foods, given the mention of potatoes as a similarly occasional item).

Particulars of father's diet (reflected to some degree in toddler's diet). The father also gave the following rundown of the particulars of his own diet (which, as we'll see, has an influence on the character of his son's as well) and to some extent his family's diet. (Little information was available for the mother's diet, except that it was influenced to some degree by her husband's diet, and the implication is hers too was likely vegan, or mostly so.)

A typical breakfast would consist of bananas or grated apples with reconstituted raisins and a little tahini. Or alternatively a fruit smoothie consisting of fresh-squeezed apple juice, tahini, dates and bananas. Lunch was a salad of various vegetables at hand such as the usual items we are all familiar with: tomatoes, bell peppers, cucumbers, celery, onion, broccoli, carrots, lettuce, sprouts, all topped with a nut or seed dressing--often tahini--and/or avocado dressing. Fruit was a favored item for snacks later. Dinner was another raw salad for the whole family with the possible addition of steamed vegetables or potatoes.

The son's diet was not much different, though perhaps heavier in fruit. Like the father's, his was "a mostly living foods vegan diet." The father also states that his son was unvaccinated--a sign generally indicative in the vegan community of serious commitment overall to these types of dietary/lifestyle regimes. The son was still breastfeeding at the age of 2 years and 8 months. Dietary consumption consisted of "6-8 bananas a day, apple juice, oranges, kiwis, apples, celery, dates, steamed potatoes, all sorts of vegetables, organic spelt bread, ricecakes, and his favorite spread right now is made with avocado, raw tahini, lime juice, raw cider vinegar, kelp, and a couple of capsules of bluegreen algae."

Note that the father's statements quoted above depict a diet with a regular though not excessive level of grain consumption, which would tend to refute our critic's idea of an induced calcium deficiency stemming from inappropriately high levels of phytate in the diet.

Calcium deficiency rather than vitamin D deficiency as potential cause? The critic's suggestion
that the attribution of the rickets to vitamin D deficiency overlooks the possibility of calcium deficiency does have merit. The father's account (in issue #28 of the N.H. M2M) of visits to the pediatrician, however, strongly implies the doctors diagnosed or at least themselves believed the symptoms were due to vitamin D deficiency. In rechecking my sources here, it is worth noting that the father does make a statement elsewhere in the M2M that at the time of an earlier childhood bout of illness, they had a hair analysis done "that showed some heavy metal contamination and calcium levels in the blood at the low end of normal." (At the same time, here, it should also be noted that the accuracy of hair analysis can be controversial due to alleged problems with repeatability.) How much effect a low-normal calcium level (assuming it persisted, or existed in the first place) may have had at the time of the rickets diagnosis is not discussed.

Elimination of problem using supplements/animal foods demonstrates insufficiency of diet, regardless of exact cause. What all of this does suggest, however, is that regardless of the exact cause
of the rickets, the particular diet followed--one fairly typical of the way recent, and more sensible Natural
Hygiene vegan diets in general are practiced--was insufficient in some way. Our critic seems to ignore this general point.

Another criticism by the same raw-food vegan critic who took a dim view of our example here was that "fats will impede mineral absorption as well. Babies fed on early formulas with too much linoleic acid [a fatty acid] become anemic." This speculation, as it turns out, is easily refuted by the father's own account: he mentions in passing at one point (in issue #23 of the M2M) that the family was once threatened with having their son taken away by the authorities "because we wouldn't feed him formula and baby rice."

Tendency is to rationalize as to speculative possibilities while ignoring probability/plausibility. To wrap up this example, it's worth pointing out that these kinds of speculative
rationalizations are heard all the time in the vegan community, and are a good illustration of how dietary extremists are almost without fail more interested in finding ways "to explain things away" rather than simply accepting that these things can and do happen on vegan diets. No matter how strong the example, you can count on an extremist first looking for a way to rationalize it in preference to dealing with it on its own terms. We see here another prime earmark of the idealist's (dubious) logical method of criticism and pep-talking: engage in speculation about numerous possibilities that redirect attention away from the critic's favored regime as the cause, without regard for probability, reasonableness, parsimony, or plausibility in arriving at the most likely explanation given the evidence.

The lesson from our example of rickets above, again, is that even if we cede that we may not be able to pin down the proximate cause with absolute certainty in such an anecdotal example (even one verified by mainstream doctors), nevertheless the remedy--adding animal products in the form of goat dairy, plus a vitamin D and multivitamin supplement--solved the problem. The unavoidable implication is that the diet was deficient in some way, whatever those deficiencies or their causes may have been.

Special measures to make vegan diets work can be seen as compensations for lack of evolutionary congruence. As outlined in detail in Part 1 of these interviews, the evolutionary evidence
makes it clear that vegan diets are not the original, natural diet of humans, and never have been at any point in our history. Nevertheless, we can cede the point that with the right kind of manipulations, it may be possible in some number of cases to compensate for the unnaturalness of leaving animal products out of the human diet by instituting various special dietary practices within a particular vegan diet.

That even informed advocates acknowledge vegan diets should be carefully planned suggests that less margin for error is a real issue. Unfortunately, however, as we see above, some
individuals simply don't do well on vegan diets even when conscientious attempts may be made. Often, more scientifically oriented vegan advocates will object in such cases that the diet wasn't "intelligently planned" enough. Yet if one must go to such lengths to be so conscientious and careful, what does this suggest? The most logical conclusion is that there isn't as much margin for error on vegan diets in terms of nutrient deficiencies, while with the inclusion of animal products--a part of the natural human diet--or with the addition of artificial supplements to compensate for their absence, there is. (I do not mean, of course, to suggest that milk is ideal as an animal food. In view of the evolutionary and clinical evidence regarding potential long-term problems due to high saturated fat levels in dairy (among other things), flesh food would be the logical choice as the animal food we are most fit to eat. However, the vegetarian way of life does not permit this, so dairy (or eggs) often becomes the only choice available for those vegetarians who are willing to consider any animal products at all.)

*** The yo-yo syndrome as potential indicator of failure to thrive on strict diets:
One behavioral type of symptom that may be experienced on raw-food diets that was not discussed at much length is what could be called the yo-yo syndrome. Although in extreme cases this can be expressed as
binge/purge or alternate fast/feast behavior, more typically it manifests as on-the-wagon/off-the-wagon behavior. Those for whom vegan or raw-food diets don't work well may blame themselves as a consequence of their idealism, and go through a continual cycle of try/fail, try/fail, try/fail, attempting to get the diet to work, no matter how many times it is attempted. Typically, guilt feelings ensue and become reinforced by such behavior, and blame is then laid on "addictions" or lack of self-discipline. Since this type of behavior could potentially manifest itself with any diet that one puts up as an ideal different from past habits, however, I do not know at this time of any foolproof way one can tell for sure in any particular case whether it is indicative of nutritional insufficiency as opposed to "lack of will," so to speak. However, if it occurs in conjunction with one or more of the physical symptoms mentioned in the interview, one would have to consider the strong possibility that the behavior is biologically driven.

Viewing the yo-yo syndrome as valuable feedback to help pinpoint problems breaks the cycle of guilt over so-called "lapses." At the least, one can use the presence of the behavior as a
feedback signal that one is either not physically satisfied by a diet, or alternatively that something about one's implementation of the diet is not satisfying innate human needs for gustatory and emotional satisfaction from the foods eaten, over and above their nutritional sufficiency. (Anyone who has experienced a fast of any length of time can attest to the fact that food and eating satisfy not just physical needs, but also "feed" a certain human need for psychological or oral satisfaction in eating; not to mention the role that food and "eating time" play in structuring a "day in the life" of human behavior.)

A "biological" view of this behavior, of course, would point out the likelihood that on-the-wagon/off-the-wagon behavior may have little to do with "discipline" problems or "addictions," but could be the simple result of one's body forcing them to eat other things in spite of their determination not to, in order to get needed nutrients not available (or not in high enough quantity) from the "allowed" items in the diet. It also needs to be pointed out that even where one is following a prescribed diet that might be supposedly "right" for them (for instance, perhaps an evolutionary-type diet like we have discussed here), people still have individual needs that they will often have to arrive at individual solutions for.

Unhooking from guilt frees attention to seriously consider and evaluate practical solutions one may have been blind to before. A different approach, therefore--no matter what one's dietary
philosophy or knowledge--is that one can instead ask: What kind of drive toward satisfying some physical, emotional, or psychological need may have pushed them into the "lapse"? The trick here of course is to figure out what the "indulgence" may be signaling. But at least one is no longer blinded by the self-defeating cycle of guilt and recrimination, which frees up one's vision to look at serious alternatives. For example: Is there something in the food, or about the food itself, that the body may be craving even if one has been conditioned to regard the food as "bad"? Or, even if the food is bad, is there something in it that one is attracted to in attempting to satisfy some other legitimate need one could get from another, better food they have been avoiding? Perhaps there is even something about the experience of eating the food or the particular psychological satisfaction it brings, rather than strictly the dimension of nutritional considerations, that one finds satisfying about it: for instance, texture, taste, the bodily feelings generated after eating it, etc.

Experimental attitude requires new mental relationship with the question of "certainty."
Taking this perspective can be a way out of the self-blame and reactiveness of guilt, or self-destructive and futile "discipline," into a mode where one uses their behavior as a kind of feedback guide to better satisfy themselves on the way to reaching better health. Doing this, however, requires taking an experimental attitude and being willing to temporarily suspend the tendency toward instant judgment or wanting instant certainty, and to instead "live in the question" while realizing some answers in life require that one find them on their own without the assurance of an external authority.


*** Why is being "hungry all the time" on a veg-raw-food diet such a problem for some individuals even in the short-term before any deficiencies could have arisen?

Diet is lower in overall nutrient/energy-density than the one the human body evolved on.
With the benefit of recent studies indicating that the human gut evolved to be less energy-intensive in performing its functions, and more dependent on higher-density, higher-energy foods with higher protein and fatty acid content (much of them animal in origin) to support the evolution of the large, energy-intensive human brain (see postscript to Part 1), it appears we have a very likely explanation for why so many raw-foodists are so hungry all the time: They simply aren't eating enough concentrated food. (While it is true that some people's bodies seem to be able to functionally adapt to this to one degree or another, or they may be blessed with more efficient metabolisms to begin with, there are many more who can't make a go of it very well.)

Human digestive system not optimized for maximum extraction of nutrition/energy from a diet of all high-fiber foods. In avoiding animal foods and--if totally raw-foodist--also many of the more
concentrated vegetarian foods like cooked grains, legumes, tubers, etc., such individuals have relegated themselves to eating a diet significantly lower in overall nutrient and energy-density. (See The Calorie Paradox of Raw Veganism for a detailed discussion and analysis of the energy-density aspect of this equation.) When you combine this with the fact that the human gut is also not as efficient in extracting nutrients from these higher-fiber foods as it is from denser foods (which is not to say sufficient amounts of fiber do not have their rightful place as a valuable component of the diet), you have a good recipe for the urge so many raw-foodists have to eat large volumes of food, while still remaining hungry much of the time. The binge/purge or fast/feast behavior some raw-foodists get themselves into also becomes understandable as a confluence of this underlying metabolic cause combined with mental overidealism. Thus, while the resulting syndromes might be similar to bulimic-type behavior seen in more mainstream dieters, the underlying causes are somewhat different.
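As a rough back-of-the-envelope illustration of the energy-density side of this equation, the sketch below computes how much food per day various staples would require. The caloric densities are approximate typical values and the 2,500-calorie target is simply a hypothetical daily requirement, so treat the output as illustrative only:

```python
# Approximate energy densities (kcal per 100 g); rough illustrative values.
foods = {
    "lettuce": 15,
    "tomato": 18,
    "banana": 89,
    "avocado": 160,
    "cooked rice": 130,
    "almonds": 580,
}

DAILY_KCAL = 2500  # hypothetical daily energy requirement

for name, kcal_per_100g in foods.items():
    grams = DAILY_KCAL / kcal_per_100g * 100  # grams needed per day
    print(f"{name:>12}: {grams / 1000:5.1f} kg/day ({grams / 453.6:5.1f} lb)")
```

Even at avocado-level density one would be eating over a kilogram and a half a day; at leafy-green density the figure balloons past fifteen kilograms, which is the paradox in a nutshell.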

Eating like a gorilla leads to a life centered around food like a gorilla. It is also worth noting that
even gorillas, who have the kind of intestinal flora and fauna to handle a very high-roughage diet more efficiently, still spend considerably larger portions of their day eating than humans normally do. Humans by contrast have been evolving away from this to some degree, towards a more active social existence, which itself takes time. By attempting to "eat like a gorilla," one to some extent has no choice but to spend more of their day eating (or thinking about it, if they are going hungry). This in itself tends to lead to a food-centered existence that breeds obsession, which is reflected in the perception among others that raw-foodists often become fixated on food to the detriment of other aspects of human life.

*** Becoming highly dependent on "mainstay" foods in a veg-raw diet:

The frequency of dependence on avocados and/or nuts is explained by the human digestive system's design for denser foods. The smaller gut/dense, fattier foods/large-brain connection that arose
during evolution also goes a long way toward explaining why many veg-raw-foodists may become heavily dependent on foods like avocados and nuts to make the diet actually work. Many raw-foodists have something of a guilt complex about the large numbers of avocados and nuts they often eat. Intuitively they seem to feel that their diets contain disproportionate amounts of these items, yet at the same time they cannot stop eating them. This dependence on these foods (sometimes not even realized as such) may lead to guilt feelings, or may become expressed in strictures or rules meant to clamp down on what are felt to be "gluttonous" urges toward eating these foods. This is especially true in this day and age when the high-carbohydrate, low-fat gospel reigns supreme, which casts something of a pall over the joys otherwise to be had for vegetarians in eating these more-fatty, hunger-satiating foods.

Raw-foodist eating patterns and difficulties are predictable/understandable given evolutionary design of human gut. Yet from the perspective of the evolution of the smaller human gut
geared toward the digestion of denser, fattier foods (in the past it was primarily animal flesh, of course), it is easy to see why raw-foodists often become highly dependent on avocados and nuts in their diet and suffer without them. The only other alternatives--both ruled out for raw-foodists--are to eat cooked grains, legumes, and tubers to get more concentrated nutrient intake, or dairy products or eggs. Those having been eschewed, the choice occurs almost by default--really almost a foregone conclusion given the restrictions veg-raw-foodists live under--when analyzed from the perspective of the evolution of the human diet and digestive system.

Once all other, more energy-dense foods are eliminated, "fruitarianism" is the logical/inevitable outcome. To make matters worse, many raw-foodists living under the above strictures
and who actually do try to either eliminate nuts and avocados, or at least seriously limit them, understandably tend to find the only remaining somewhat concentrated foods--fruits--looming large as attractive taste treats, and often overindulge in these otherwise fine foods. But when made too large a part of the diet--and given that modern fruits have been bred for higher sugar content than their ancestors--this can lead to another set of problems resulting in either hyperinsulinism or sugar addiction. Of all the varieties of vegetarians and "living foodists" and raw-foodists, it is the so-called "fruitarians" who get into the most health trouble the most quickly. And again, it is not hard to see how this can happen with the progressively more limited set of foods allowed under the vegetarian raw-foods philosophy. It is not just the "ethical" consideration fruitarians put forth as a motivating factor--that fruits are the only "karmaless" food, obtainable without harming the plant that bears them. The digestive, metabolic (energy-producing), and satiation (hunger-satisfying) characteristics of foods that play out based on the dietary limitations one lives under, in fact, inexorably lead to these behaviors in logical, understandable ways.

The fallacy of fruitarianism: word games vs. the real world of practice and results
The following material on fruitarianism had to be cut from the original print interview due to space restrictions, but is included here on the web in expanded and rewritten form, since fruitarianism has proven to be a seductive philosophical and behavioral trap that those within the vegetarian movement who are particularly prone to idealism or extremism sometimes fall prey to. It can and often does lead to serious and long-lasting health consequences. (One unfortunate member of the Natural Hygiene Many-to-Many, for example, lost all their teeth as a result of a high-fruit diet, although the potential problems of high-fruit diets extend considerably beyond just loss of teeth.)

H&B: Let's hear more about the fruitarianism question--with regard to humans and apes both. I know you have plenty of opinions on the thinking and behavior of those subscribing to this particular system of diet.
W.N.: Well, first there is the problem of defining just what you mean by the words "fruit" and "fruitarian." There is a lot of gamesmanship, sleight-of-hand, and word redefinition that goes on among fruitarian advocates to redefine "fruit" away from the common definition (soft, pulpy, sweet, juicy fruits from tree or vine) so that it includes the so-called "vegetable fruits" like peppers, cucumbers, tomatoes and the like, or "nut-fruits" and so on, so as to broaden what is considered "fruitarian." In a botanical sense, these foods can be considered fruits, and thus--if we stretch things a bit--perhaps "technically" permissible in what might be called a "fruitarian" diet. The problem, however, is that most fruitarians don't even stop there either. Most go further and allow or even specifically recommend "greens" and/or "green-leafed vegetables" as essential, and of course neither of these qualify as fruit even in the botanical sense. Once you get this far, any sense of integrity about what "fruit" really means has been sacrificed to the realm of fast-talking slipperiness.

Word games over what qualifies as "fruit" usually expand the definition so far that the distinction means little. But you have to look at the fact that most people don't normally think of these
other items as fruits unless they are trying to wiggle out of the straitjacket the normal definition creates when you really start to think about being able to survive on nothing but fruits alone. So if one wants to play games and define a fruit as almost anything under the sun that is a "seed-bearing mesocarp" or whatever, well, fine, but in my view that's losing touch with the reality of what people mean in common parlance and mutual understanding of the language. Certainly if you define fruit broadly enough, as a botanist might, you may be able to make it as what I call a "technical fruitarian" (meaning extremely technically defined). We should recognize that this is just a game, though, because it's a moving target that people trying to be fruitarians use so it won't bother their conscience to use the label. Frankly, the way fruit is sometimes defined by fruitarians (to include nuts, seeds, greens, and/or green vegetables) doesn't really distinguish the diet much, if any, from an all-raw version of a Natural Hygiene diet of fruits, vegetables, nuts, and seeds.

Were our evolutionary primate predecessors really true "fruitarians"? If you are defining fruit
the way it is commonly defined as relatively high-sugar-content, juicy tree fruits plus things like melons, berries, and so forth, then we have something more concrete to talk about. As far as the normally defined fruits go, there is a partial grain of truth to apes as fruitarians, and for the ancestor common to us both, but only if you remember we are talking about what the greatest percentage of the diet was. While over half may have been fruit at one time (perhaps with the gracile Australopithecines 3-4 million years ago, although I don't know of any data to confirm this with any surety), there were also significant portions of other foods. To really make a case for anything approaching fruitarianism as ancestral to humans, you would have to go back at least to the common ape-like ancestor of both chimps and humans approximately 7 million years ago, and even this would not have been total fruitarianism. The closest approximation to fruitarianism (and not completely so, at that) in an ape-like species might have been in the time period of around 40 million years ago, but you have to realize these creatures were apes and not human. Humans themselves (the genus Homo, beginning with Homo habilis over 2 million years ago) have never been fruitarian. Even chimps, who are the nearest modern ape species to fruitarianism, do not eat above 2/3 fruit or so in the case of the common chimp, or perhaps as high as 80% in the case of the bonobo chimp--in either case still a significant amount short of 100%, especially when you consider they both also eat leaves, pith, insects, and a bit of meat too.

Simply put: Humans are not apes. As a number of Natural Hygiene practitioners have noted,
however, few humans can eat even 2/3 fruit over long periods of time without getting into serious difficulties. We had several M2M folks try near-fruitarian diets, and no one had any lasting success with them, although some did fine for several months at first, perhaps even a year or two. In fact, those that we knew of in the M2M reported getting into trouble trying to do so and later regretted their naivete in attempting it, due to the problems that eventually followed. The two most common repercussions of long-term attempts at fruitarianism are that the teeth are usually the first to go, then people's blood-sugar processing abilities, along with deficiencies.

People may do well at first, but this is because they are living off of past nutritional reserves, and when the stored reserves run out, the game's over. This is a theme we've probably beaten to
death here, but it warrants repetition, especially with regard to fruitarian diets: It is not enough for a diet to be "clean"--it must also be a sufficient diet. Fruitarianism and near-fruitarianism are the worst possible case, because in addition to progressive long-term deficiencies, the body's insulin-production capabilities are being simultaneously overwhelmed with the high carbohydrate load in the form of higher glycemic-index foods containing simpler sugars like glucose, sucrose, and fructose.

Advocates of "fruitarianism" frequently change their definition of it over time. Most people
who initially promote a total fruitarian diet are forced to back off and begin allowing the use of some nuts, seeds, and green vegetables from experience. This may extend the period over which a "fruitarian" can maintain their regimen, but it doesn't remove the underlying problem of the long-term consequences of
excessive sugar consumption and/or hyperinsulinism, not to mention the low intake of B-vitamins, certain minerals, etc., that is likely to result if the diet is continued long enough. (Although the following is just speculation, and there may certainly have been other potentially causative factors, it is worth considering that the relatively early death of near-fruitarian advocate T.C. Fry at age 70 recently--from atherosclerotic plaques in his legs that led to a coronary embolism--might possibly have been due at least in part to hyperinsulinism, which can promote atherosclerosis and heart disease. See Chet Day's investigative article, "The Life & Times of T.C. Fry," at the Health & Beyond website for more on Fry's life and the events leading up to his death. Caveat: You will need Adobe Acrobat Reader to read the PDF file you'll be served while online.)

Noteworthy 1970s-era exposés of numerous alleged fruitarians found no successes, and widespread misrepresentation of diets actually eaten.

4-Part "Fruit for Thought" article series, by the American Vegan Society's Jay Dinshah. An
interesting little piece of Hygienic history here that most are probably unaware of is that Jay Dinshah, a long-time vegan who was previously a staffer with ANHS (the American Natural Hygiene Society) many years ago, published an extensive article series on fruitarianism in the late 1970s, titled "Fruit for Thought," in his own newsletter Ahimsa--the most in-depth piece of journalism on the subject I have ever seen. [Dinshah 1976-77]

Fruitarian gurus weren't actually practicing what they preached, but followers who did ran aground. As part of the series, Dinshah told of his investigations, ranging over many years, of all the
fruitarian advocates he knew of at the time claiming to live on fruitarian diets. He himself had originally thought that fruitarianism seemed theoretically possible. But over the years, what he discovered was that none of them--and this included famed fruitarian advocates like Johnny Lovewisdom, Walter Siegmeister (pen name Raymond Bernard), and Viktoras Kulvinskas--were actually living on fruitarian diets, even as they defined the diet for themselves. Yet there were many others Dinshah met who had taken the advice of these people quite seriously and had gotten themselves into very serious health troubles.

Excuses, excuses. Lovewisdom was eating plenty of vegetables but made the excuse his orchards and
gardens in the Ecuadorian tropics were not in the right place to enable the fruit to be of the quality to support him in good health. Breatharian advocate Siegmeister was living not even as a fruitarian but as a vegan. And Kulvinskas was basically eating as a garden-variety raw-foodist vegetarian with periodic lapses back to cooked food--while he had previously championed liquitarianism (juices only), fruitarianism, and even breatharianism. [I am not aware of what Kulvinskas' living-food-type recommendations and practices are these days, many years later.] Through another route, I've seen reporting on Wiley Brooks [Mapes 1993], a reputed, so-called "breatharian," who readily admits to eating Big Macs and Twinkies now and then, but claims those are not what actually sustain him--it is really the air sustaining him in spite of what he is eating! (Now there is some truly creative, totally untestable, idealistic logic for you!)

Failures the rule, no successes ever came to light. Of the people who had lived on truly restrictive
(soft/sweet, tree-fruit-type) fruitarian diets, Dinshah's article series brought to light that their practices had put them in a bad way health-wise. Also included in Dinshah's article series was the substance of a conversation he had once had with Dr. Gerald Benesh [who has been previously interviewed in Health & Beyond] who reported that "in all his years of practice, in fasting, rebuilding, and advising people in even that wonderful climate [near Escondido, CA in the 1960s] and with the fine fruits available in the area and from below the Mexican border, he had nonetheless never been able to bring even one of his people to the point where they could live in good health on just fruits." [Dinshah 1976-77, Part 1, p. 7]


Little change in fruitarian movement between now and then. As well, Dinshah took a long, hard
look in his series at the rationale and reasoning behind fruitarianism as published in books on the subject at the time--echoing claims and propositions very similar if not identical to what we hear today--and found the literature of the time rife with logical as well as outright factual errors and assumptions. (For a look here on Beyond Veg at these types of errors, see the material available in Selected Myths of Raw Foods, and in our Waking Up from the Fruitarian Dreamtime section.)

Notable Natural Hygiene practitioner at the time related specific problems seen in numerous fruitarian patients. Printed in tandem with the article series was also a separate reprint of an
article by Benesh himself [Benesh 1971] that had previously been published in Herbert Shelton's Hygienic Review, detailing a few of the serious problems he had seen develop in people he had cared for who attempted a fruitarian diet--even on high-quality fruits available in season--for more than a few to several months. Benesh listed the following symptoms of people on long-term fruitarian diets that he had seen in his own Natural Hygiene practice, which we should note are not so very different from those mentioned earlier in this interview for the majority of other total-raw-foodists who experience long-term troubles:

[R]idged nails, gingivitis, dental caries, dry skin and brittle hair, lowered red blood cell count and low hemoglobin percentage. Over a long period of time (at least one year or more) the blood serum level drops to a point of an impending pathological state if not corrected. Many of them display serious signs of neurological disorders, while some experience emotional upsets and extreme nervousness and often complain of insomnia. When their nutritional program is corrected these signs disappear and the patient finds himself in a much improved state of health.

I recently spoke with a health-minded medical doctor, who embarked on this lopsided program and did very well, experiencing a high state [of] health for about a year, when almost suddenly a loss of weight was experienced and neurological signs were evident. This doctor took a series of blood and serum tests plus other pertinent tests, which verified what I have observed in fruitarians and excessive fruit-eaters, and corroborates my findings.

Another cardinal lack that occurs quite often is a distinct lack of vitamin B-12. This lack of B-12 gives rise to the neurological signs that indicate a serious deprivation of this vital element needed to keep the nervous system operating at a so-called normal level.

Potential explanation for digestive difficulties in long-term fruitarians. Long-term attempts at
fruitarianism can also lead to other problems. According to Benesh, the excessive amounts of organic acids from a fruitarian diet are unable to be adequately buffered by the digestive system, with the result that these acids end up in the large colon, where they interfere with the proper balance and/or amount of intestinal flora. The ultimate result can be compromised bowel function, which in a few cases can result in serious health problems such as colitis, if the person continues to persist on the diet. Benesh theorizes that the excessive acids cause salts to be drawn from the cells of the colon, resulting in flaccidity and interference with peristalsis. (Note that if a similar process occurred in the small intestine as well, this might be a possible explanation for why extended attempts at fruitarianism sometimes result in the apparent inability to digest so-called "heavier" foods [nuts, cooked potatoes or rice, etc.], which may pass through the digestive tract relatively little digested, as reported by some fruitarians.) Benesh also notes that hemorrhoids are not uncommon, and that fecal analysis may reveal occult blood as a result of minute petechial hemorrhages of the small blood vessels in the mucosal lining of the intestine.

Beyond the myth of fruitarianism is the empowerment that freedom from fantasy brings.
Fruitarianism is a myth that dies hard, but the price of the illusion is having to learn the hard way. If the reality seems disillusioning, consider what is gained instead: freedom from a deceptive, if well-intentioned, fantasy that ends up hurting people. Freedom from fibs that people tell themselves and others that only
compromise the capacity for self-honesty and the ability to more objectively assess how their health is actually being affected.

When evaluating claims, look beyond the word games. What you should do when you encounter
those advocating fruitarianism is to look at what they actually eat. To find out the truth, look at what they do, not what they say. So far, what has inevitably been found is that such fruitarians are either in poor health if they have truly been eating solely sweet and succulent fruits for any length of time; or, if in seemingly good health, they invariably include other foods that the rest of us would consider vegetables or "greens"--and most often nuts and avocados as well (which are hard to do without and still obtain sufficient fat and protein on this kind of diet). In some cases the game-playing can reach rather absurd levels, where even things like sea vegetables and dulse, or steamed potatoes or rice on occasion (eggs have even been called "hen-fruit" before), are somehow defined as fruits or technically "permissible" in a fruitarian diet. So when you find fruitarians splitting hairs over what technically qualifies as a fruit, you should be honest enough to recognize fruitarianism for the word game it actually is, instead of getting carried away by idealistic fantasies. There is no empirical support for successful fruitarianism (as it would be commonly thought of) either in hominid evolution or, just as importantly, in the real world of results for people who have tried it. If you run across people who do claim to be successful, you should be very skeptical, since fibbing, prevaricating, and word-gamesmanship have proven upon investigation to constitute a necessary pillar of support in the lore and history of fruitarianism.

A more "evolved" path, or only more extreme? So if it is true that the "successful" fruitarians are so
only because they define fruit very broadly as any "seed-bearing mesocarp," what real difference does it make? Why are they so concerned to define themselves as one? In the end, it usually comes down to an ideal that fruitarianism is somehow more physically "pure" or more spiritually "evolved" than garden-variety vegetarianism. People want to think of themselves as special (which in itself is natural and understandable, of course), and fruitarianism is painted as being the "next step" in dietary evolution. What it really is, however, is a next step toward extremism, philosophical dishonesty, and ultimately failure of health.

Judge the diet, not yourself--by bottom-line results, not high-sounding philosophy. Think very
seriously before giving credit to those who claim the mantle of fruitarianism or other extremist/idealist dietary programs, and especially before putting your health at risk based on their advice. Instead, put the power of reality-based thinking and knowledge to work in opening up workable avenues for dietary experimentation based on honesty. Rather than judging yourself and your behavior by a dietary philosophy, turn the equation around and evaluate the worth of a diet by the concrete results it gives.

The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance
by Loren Cordain, Ph.D.

Copyright 1999 by Loren Cordain. All rights reserved.

Introduction: The principle of evolutionary discordance
To set the context for this discussion, let's first briefly recap the basic evolutionary processes to which all species are subject. These principles are fundamental to understanding the repercussions of grains or any other food on the genetics that govern human biology. Mutations in genes cause variability in the population-wide genetic makeup of a species.


Selective pressures due to environment (which includes diet) cause individuals with genetic makeups best suited to that environment to thrive. These individuals achieve better rates of survival and differential reproductive success, thus leaving behind more descendants compared to those without the more advantageous traits. This process is iterative (repeats itself) with each generation. Thus, with repeated generations, the characteristics of organisms in a population become more finely tuned to their environment, which, again, includes the diet they habitually eat.

After a point, assuming environmental factors are not themselves undergoing great change, the population(s) making up a species will tend to reach a point of relative genetic equilibrium. That is, the species' genetic makeup remains fairly stable, as well-tuned to its environment as it is likely to be, though including some variability due to genetic mutations.

When or if environmental conditions again change significantly, most individuals in the population experience what can be termed in plain language "evolutionary discordance"--the negative results of having genes not as well-suited to the new environment (which again includes diet) as to the former one. Such discordance results in poorer survival rates and less reproductive fitness for the majority of the population. Those individuals whose genetic variability is better suited to the new conditions survive and reproduce better, thus leading to another round of evolutionary/genetic adaptation.

Such genetic adaptation, however, takes time--that is, many successive iterative generations--to achieve. Evolution is conservative, and relatively permanent changes in the genetic makeup of a population do not take place without sustained changes in environment, which, once again in the context of this paper, includes diet. The time span for relative genetic equilibrium to be reestablished can span many thousands of years for a species (i.e., humans) which reproduces a new generation approximately once each 20 to 25 years.
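To give a feel for the timescales involved, here is a minimal sketch of the iterative process just described, using a deliberately simple one-locus selection model with hypothetical numbers (the 1% fitness advantage, starting frequency, and 25-year generation time are illustrative assumptions, not figures from this paper):

```python
# Minimal one-locus selection sketch (illustrative assumptions only).
s = 0.01     # hypothetical 1% relative fitness advantage of a new variant
p = 0.001    # hypothetical starting frequency of the variant
generations = 0

# Standard replicator recurrence for haploid selection:
#   p' = p(1 + s) / (p(1 + s) + (1 - p))
while p < 0.99:
    p = p * (1 + s) / (p * (1 + s) + (1 - p))
    generations += 1

print(f"{generations} generations ~= {generations * 25:,} years "
      f"(at ~25 years/generation)")
```

Even under these favorable assumptions the variant needs on the order of a thousand generations--tens of thousands of years--to approach fixation, which is the sense in which evolution is "conservative."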

An appreciation of the above process of evolutionary change is fundamental to the science of evolutionary biology itself, and it can also elicit a deeper understanding of dietary fitness and health: it is the "how" and "why" that explain the reasons species come to have the nutritional requirements they do. The foods that humanity originally evolved to eat and those we now eat in modern civilization are in many cases significantly different--yet our underlying genetic inheritance remains basically the same as it was before agriculture, having evolved only very slightly since then. Thus, many of the foods we now eat are discordant with our genetic inheritance. (This is not simply an idle or "just so" hypothesis. As we proceed, we will look at the considerable clinical evidence supporting this picture.) Such "evolutionary discordance" is a fundamental aspect of the evolutionary equation that governs fitness and survival (in which health plays a key role), and it bears directly on the question of the diet humans are evolved to handle best from the genetic standpoint.

To begin with, we will examine evolutionary discordance from a general standpoint by looking at the mismatch between the characteristics of foods eaten since the "agricultural revolution" that began about 10,000 years ago and our genus' prior two-million-year history as hunter-gatherers. As the article progresses, however, we'll look at some of the actual genetics involved, so it can be seen that "evolutionary discordance" is not merely a theoretical concept but a very real issue with relevance to how diseases can be genetically expressed in response to dietary factors. With this key concept in mind, let's now begin with a look at the history of grains and legumes in the human diet (quite recent in evolutionary time), after which we'll move on to some of the evolutionarily discordant effects of their consumption on human beings, as seen in modern clinical and genetic studies.

Evidence for the late evolutionary role of grains in the human diet

Timeframe for cereal grain domestication. There are 8 major cereal grains which are consumed by modern man (wheat, rye, barley, oats, corn, rice, sorghum, and millet) [Harlan 1992]. Each of these grains was derived from wild precursors whose original ranges were quite localized [Harlan 1992]. Wheat and barley were domesticated only ~10,000 years ago in the Near East; rice was domesticated approximately 7,000 years ago in China, India, and southeast Asia; corn was domesticated 7,000 years ago in Central and South America; millets were domesticated in Africa 5,000-6,000 years ago; sorghum was domesticated in East Africa 5,000-6,000 years ago; rye was domesticated ~5,000 years ago in southwest Asia; and oats were domesticated ~3,000 years ago in Europe. Consequently, because of their limited geographic distribution, the present-day edible grass seeds simply would have been unavailable to most of mankind until after their domestication. Also, the wild versions of these grains were much smaller than the domesticated versions and extremely difficult to harvest [Zohary 1969].

How recent is grain consumption in terms of our total evolutionary dietary experience? The first member of the human genus, Homo, was Homo habilis, which has now been dated to ~2.33 million years ago (MYA) [Kimbel et al. 1996]. Homo erectus, who had postcranial (the rest of the body below the skull) body proportions similar to modern humans, appeared in Africa by about 1.7 MYA and is thought to have left Africa and migrated to Asia by 1 MYA or perhaps even earlier [Larick and Ciochon 1996]. Archaic Homo sapiens (called by some Homo heidelbergensis) has been dated to 600,000 years ago in Africa and to about 400,000 years ago (or perhaps earlier) in Europe [De Castro et al. 1997]. Anatomically modern Homo sapiens appear in the fossil record in Africa and the Mideast by about 90,000-110,000 years ago, and behaviorally modern H. sapiens are known in the fossil record by ~50,000 years ago in Australia and by ~40,000 years ago in Europe. The so-called "Agricultural Revolution" (primarily the domestication of animals, cereal grains, and legumes) occurred first in the Near East about 10,000 years ago and spread to northern Europe by about 5,000 years ago [Cavalli-Sforza et al. 1993]. The industrial revolution occurred roughly 200 years ago, and the technological revolution which brought us packaged, processed foods is primarily a development of the past 100 years, one which has seen enormous growth in the last 50 years.

To gauge how little geologic or evolutionary time humans have been exposed to foods wrought by the agricultural revolution, let's do a little paper experiment. Take a stack of computer paper (the kind in which the pages are connected to one another) and count out 212 eleven-inch (28-cm) pages. Then unfold the stack of paper and lay it out end to end--it will form a continuous 194-foot (59-meter) strip. Now, let's assume that 1 inch (2.54 cm) equals 1,000 years in our 194-foot strip of computer paper; thus, the first part of the first page represents the emergence of our genus 2.33 MYA, and the last part of the last page represents the present day. Now, take a slow walk down all 194 feet of the computer paper, and carefully look at each of the individual eleven-inch sections. When you get to the very last eleven-inch section (the 212th section), this represents approximately the beginning of agriculture in the Mideast 10,000 years ago; during the preceding 211 sheets, humanity's foods were derived from wild plants and animals. This little experiment will allow you to fully grasp how recent in the human evolutionary experience are cereal grains (as well as dairy products, salt, and the fatty meats of domesticated animals).
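For readers who want to check the arithmetic of the paper-strip experiment, the short Python sketch below reproduces it. The figures (2.33 million years, 10,000 years of agriculture, 11-inch pages) and the 1-inch-per-1,000-years scale come from the text above; the script itself is merely illustrative:

    # Check of the paper-strip timeline described above.
    # Scale from the text: 1 inch = 1,000 years.
    GENUS_HOMO_YEARS = 2_330_000  # emergence of our genus ~2.33 MYA
    AGRICULTURE_YEARS = 10_000    # beginning of agriculture in the Mideast
    YEARS_PER_INCH = 1_000
    INCHES_PER_PAGE = 11

    total_inches = GENUS_HOMO_YEARS / YEARS_PER_INCH     # 2,330 inches
    pages = total_inches / INCHES_PER_PAGE               # ~211.8 --> 212 pages
    feet = total_inches / 12                             # ~194 feet
    meters = total_inches * 2.54 / 100                   # ~59 meters
    farming_inches = AGRICULTURE_YEARS / YEARS_PER_INCH  # just 10 inches

    print(f"{pages:.1f} pages = {feet:.0f} ft ({meters:.0f} m) of paper")
    print(f"agriculture: {farming_inches:.0f} in, or "
          f"{AGRICULTURE_YEARS / GENUS_HOMO_YEARS:.1%} of the whole strip")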
Humans may indeed have eaten these foods for "millennia," but even 10 millennia represents only about 0.4% of the overall timeframe of human existence. Because the estimated amount of genetic change (0.005%) which has occurred in the human genome over this time period is negligible, the genetic makeup of modern man has remained essentially unchanged from that of pre-agricultural man [Eaton et al. 1985]. Consequently, the human genome is most ideally adapted to those foods which were available to pre-agricultural man: namely lean muscle meats, limited fatty organ meats, and wild fruits and vegetables--but, significantly, not grains, legumes, dairy products, or the very high-fat carcasses of modern domesticated animals.

Processing technology required. Clearly, grass seeds have a worldwide distribution and would have been found in most environments that early man inhabited. However, because almost all of these seeds are quite small, difficult to harvest, and require substantial processing before consumption (threshing, winnowing, grinding, and cooking), it would have been virtually impossible for pre-behaviorally modern humans (i.e., before circa 35,000-40,000 years ago) to exploit this food source. To harvest and process grains on a large scale, sickles, winnowing trays (baskets), threshing sticks, grinding stones, and cooking apparatus are required, and there is no reliable evidence to indicate that this combination of technology was ever utilized by hominids until the late Pleistocene. The advent of grinding stones in the Mideast approximately 15,000 years ago heralds the first large-scale evidence of regular cereal grain consumption by our species [Eaton 1992]. There is substantial evidence that certain modern-day hunter-gatherers such as the Australian Aborigines and the American Great Basin Indians utilized grass seeds [Harlan 1992]; however, these grass seeds were not a staple: they represented only a small percentage of total caloric intake and were eaten for only a few weeks out of the year. For virtually all of the rest of the studied hunter-gatherer populations, cereal grains were not consumed.

Optimal foraging theory. In view of the substantial amount of energy required (as just outlined) to harvest, process, and eat cereal grains, optimal foraging theory suggests that they generally would not have been eaten except under conditions of dietary duress [Hawkes et al. 1985]. It seems likely that during the Late Paleolithic and before, when large mammals abounded, our ancestors would almost never have consumed the seeds of grasses.

Comparison with other foraging primates. Except for some species of baboons, no primate consumes gramineae (grass) seeds as a part of its regular natural diet. Primates in general evolved in the tropical rainforest, in which dicotyledons predominate; consequently, monocotyledons (gramineae) would not have been available to our primate ancestors.

Primate digestive physiology. The primate gut is not equipped with the enzyme systems required to derive energy from the specific types of fiber which predominate in gramineae. Consequently, unless cereal grains are milled to break down the cell walls and cooked to gelatinize the starch granules (and hence make them more digestible), the proteins and carbohydrates are largely unavailable for absorption and assimilation. Thus, until the advent of regular fire use and control (as evidenced by hearths ~125,000 years ago), it would have been virtually impossible, energetically, for our species to consume cereal grains to supply the bulk of its daily caloric requirements.

Repercussions of antinutrient load. As John Yudkin suggested almost 30 years ago, cereal grains are a relatively recent food for hominids, and our physiologies are still adjusting and adapting to their presence. Clearly, no human can live on a diet composed entirely of cereal grains (for one thing, they contain no vitamin C). But that is only one consideration, since eating raw cereal grains (as well as cooked cereal grains) wreaks havoc on the primate gut because of the high antinutrient content of grains. When cereal grain calories reach 50% or more of the daily caloric intake, humans suffer severe health consequences: one has to look no further than the severe pellagra epidemics of late-19th-century America and the beri-beri scourges of southeast Asia to confirm this. Additionally, not only in human beings but in virtually every animal model studied (dog, rat, guinea pig, baboon, etc.), high cereal grain consumption promotes and induces rickets and osteomalacia [Robertson 1981; Ewer 1950; Sly 1984; Ford 1972, 1977; MacAuliffe 1976; Hidiroglou 1980; Dagnelie 1990]. Recent research has also implicated zinc deficiency, due to the effects of excessive cereal grain consumption, in retarded skeletal growth [Reinhold 1971; Halsted 1972; Sandstrom 1987; Golub 1996], including cases of hypogonadal dwarfism seen in modern-day Iran.

The pathologies introduced by the higher levels of cereal grain consumption discussed above are due primarily to the effects of phytates in grains, which bind minerals and prevent adequate uptake. To this point, we haven't even touched upon the other antinutrients which inflict damage on a wide variety of human physiological systems--protease inhibitors, alkylresorcinols, alpha-amylase inhibitors, molecular-mimicking proteins, etc. We will look further at these additional problems below. Clearly, however, cereal grains cannot contribute substantial calories to the diet of primates unless they are cooked and processed.

Digestive considerations and technology required


Question: Granted that grains would not have made up a large portion of the diet. Nevertheless, if people could in some way have comfortably eaten some amount of wild grains without technology, then given the opportunistic nature of human beings, there's not much reason to think they wouldn't have, is there?

Commentary: People can put many plant items, as well as non-edible items (stones, bones, feathers, cartilage, etc.), into their gastrointestinal tracts by way of putting them into their mouths. The key here is the ability of the GI tract to extract the nutrients (calories, protein, carbohydrate, fat, vitamins, and minerals). Bi-gastric herbivores (those having second stomachs) have evolved an efficient second gut with bacteria that can ferment the fiber found in leaves, shrubs, grasses, and forbs (broad-leaved herbs other than grass) and thereby extract nutrients in an energetically efficient manner--that is, there is more energy in the food than is required to digest it. Humans can clearly put grasses and grass seeds into our mouths; however, we do not have a GI tract which can efficiently extract the energy and nutrients.

The starch, and hence the carbohydrate and protein calories, in cereal grains occurs inside the cell walls of the grain. Because the cell walls of cereal grains are almost completely resistant to the mechanical and chemical action of the human GI tract, cereal grains have been shown to pass through the entire GI tract and appear intact in the feces [Stephen 1994]. In order to make the nutrients in cereal grains available for digestion, the cell walls must first be broken (by milling) to liberate their contents, and the resultant flour must then be cooked. Cooking causes the starch granules in the flour to swell and be disrupted by a process called gelatinization, which renders the starch much more accessible to digestion by pancreatic amylase [Stephen 1994]. It has been shown that the protein digestibility of raw rice is only 25%, whereas cooking increases it to 65% [Bradbury 1984].

The main cereal grains that humans now eat (wheat, rice, corn, barley, rye, oats, millet, and sorghum) are quite different from the wild, ancestral counterparts from which all were derived in the past 10,000 years. We have deliberately selected for large grains, with minimal chaff, which are easily harvestable. The wild counterparts of these grains were smaller and difficult to harvest; further, separation of the chaff from the grain was time-consuming and required fine baskets for the winnowing process. Once the chaff is separated from the grain, the grains have to be milled and the resultant flour cooked. This process is time-consuming and obviously could only have come about in very recent geologic times. Further, the 8 cereal grains now commonly eaten are endemic to very narrow geographic locations and consequently, by their geographic isolation, would have been unavailable to all but a select few populations of hominids.

As touched upon previously, the issue of antinutrients in raw cereal grains is a very real one. There are components in raw cereal grains which wreak absolute havoc with human health and well-being. The primary storage form of phosphorus in cereal grains is phytate, and phytates bind virtually all divalent ions (i.e., minerals, for our purposes). Excessive consumption of whole-grain unleavened breads (50-60% of total calories) commonly results in rickets [Robertson 1981; Ewer 1950; Sly 1984; Ford 1972, 1977; MacAuliffe 1976; Hidiroglou 1980; Dagnelie 1990], retarded skeletal growth [Reinhold 1971; Halsted 1972; Sandstrom 1987; Golub 1996] including hypogonadal dwarfism, and iron-deficiency anemia (references available upon request). The main lectin in wheat (wheat germ agglutinin) has catastrophic effects upon the gastrointestinal tract [Pusztai 1993a]. Additionally, the alkylresorcinols of cereals influence prostanoid tone and induce a more inflammatory profile [Hengtrakul 1991], as well as depressing growth [Sedlet 1984].

Given the barriers to grain consumption that primitive hominids--who did not possess the more sophisticated technology seen only since about 15,000 years ago--would have faced, optimal foraging theory again strongly suggests that any consumption would have been at extremely minimal levels. Given also the human gut's lack of adaptations to prevent the negative effects of grain consumption--effects which are only partially mitigated by such technology--it is extremely unlikely that cereal grains were ever more than a very minute fraction of the human diet until very recent times.

Genetic changes to the human gut in evolutionary perspective


Question: What evidence is there for the speed at which genetic changes that govern the way the gastrointestinal tract functions can occur? Isn't there evidence showing that, for example, the genes governing lactose intolerance can change quite rapidly in evolutionary terms? What basis is there for believing that the human gut is really the same as that of our hominid ancestors during Paleolithic times?

Commentary: There are calculations which estimate how long it took to increase the gene for adult lactase persistence (ALP) in northern Europeans from a pre-agricultural incidence rate of 5% to its present rate of approximately 70% [Aoki 1991]. (Note: The enzyme lactase is required to digest the sugar lactose in milk, and normally is not produced in significant quantity in human beings after weaning.) In order for the gene frequency to increase from 0.05 to 0.70 within the 250 generations which have occurred since the advent of dairying, a selective advantage in excess of 5% may have been required [Aoki 1991]. Therefore, some genetic changes can occur quite rapidly, particularly in polymorphic genes (those with more than one variant of the gene already in existence) with wide variability in their phenotypic expression. ("Phenotypic expression" means the physical characteristic(s) which a gene produces.) Because humans normally maintain lactase activity in their guts until weaning (approximately 4 years of age in modern-day hunter-gatherers), the type of genetic change (neoteny) required for adult lactase maintenance can occur quite rapidly if there is sufficient selective pressure. Maintenance of childlike genetic characteristics (neoteny) is what occurred with the geologically rapid domestication of the dog during the late Pleistocene and Mesolithic [Budiansky 1992]. (A simple numerical model illustrating how quickly such a frequency shift can occur is sketched at the end of this commentary.)

The complete re-arrangement of gut morphology, or the evolution of new enzyme systems capable of handling novel food types, is quite unlikely to have occurred in humans in the short time period since the advent of agriculture. Some populations have had 500 generations to adapt to the new staple foods of agriculture (cereals, legumes, and dairy), whereas others have had only 1-3 (e.g., the Inuit, Amerindians, etc.). Because anatomical and physiological studies among and between various racial groups indicate few differences in the basic structure and function of the gut, it is reasonable to assume that there has been insufficient evolutionary experience (500 generations) since the advent of agriculture to create large genetic differences among human populations in their ability to digest and assimilate various foods. The population differences in gastrointestinal function which have been identified are generally associated with an increased ability to digest disaccharides (lactose and sucrose) via varying disaccharidase activity.

Although insulin metabolism is not a direct component of the gastrointestinal tract, there is substantial evidence to indicate that recently acculturated populations are more prone to hyperinsulinemia and its various clinical manifestations, including non-insulin-dependent diabetes mellitus (NIDDM), obesity, hypertension, coronary heart disease, and hyperlipidemia [Brand-Miller and Colagiuri 1994].

It is thought that these abnormalities, collectively referred to as "syndrome X" [Reaven 1994], are the result of a so-called "thrifty gene" [Neel 1962], which some groups have suggested codes for glycogen synthase [Schalin-Jantti 1996]. Consequently, the ability to consume increasing levels of carbohydrate without developing symptoms of syndrome X is likely genetically based and a function of the relative time of exposure of populations to the higher carbohydrate content of agricultural diets [Brand-Miller and Colagiuri 1994]. There are no generally recognized differences among human populations in the enzymes required to digest fats or proteins. Additionally, no human group, regardless of its genetic background, has been able to overcome the deleterious effects of the phytates and other antinutrients in cereal grains and legumes: Iranian, Inuit, European, and Asian populations all suffer from divalent-ion (calcium, iron, zinc, etc.) sequestration with excessive (>50% of total calories) cereal or legume consumption. Nor has any racial group evolved gut characteristics which allow it to digest the food energy potentially available in the major type of fiber contained in cereal grains. Further, most of the antinutrients in cereal grains and legumes (alkylresorcinols, amylase inhibitors, lectins, protease inhibitors, etc.) wreak their havoc upon human physiology irrespective of differing genetic backgrounds. Thus, most of the available evidence supports the notion that, except for the evolution of certain disaccharidases and perhaps changes in some genes involving insulin sensitivity, the human gut remains relatively unchanged from Paleolithic times.
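To make the lactase calculation mentioned above concrete, here is a minimal, deliberately simplified selection model in Python. It is not Aoki's actual calculation: it assumes a single dominant allele with a constant selective advantage s and purely deterministic selection (no drift or migration), and iterates the standard one-locus recursion. Even under these toy assumptions, an advantage of about 5% carries the allele from a 5% to a 70% frequency in roughly 125 generations--comfortably within the ~250 generations available since the advent of dairying:

    # Simplified one-locus selection model for a dominant allele
    # (illustrative only: constant s, deterministic, no drift or migration).
    def generations_to_reach(p0, target, s):
        """Iterate p' = p(1+s) / (1 + s*p*(2-p)) until frequency >= target."""
        p, gens = p0, 0
        while p < target:
            # denominator is the population's mean fitness
            p = p * (1 + s) / (1 + s * p * (2 - p))
            gens += 1
        return gens

    # Frequencies from the text: ~5% pre-agriculture to ~70% today.
    print(generations_to_reach(0.05, 0.70, s=0.05))  # roughly 125 generations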

Celiac disease as evidence of genetic and evolutionary discordance


Simoons' classic work on the incidence of celiac disease [Simoons 1981] shows that the distribution of the HLA-B8 haplotype of the human major histocompatibility complex (MHC) nicely follows the spread of farming from the Mideast to northern Europe. Because there is strong linkage disequilibrium between HLA-B8 and the HLA genotypes that are associated with celiac disease, this indicates that those populations with the least evolutionary exposure to cereal grains (wheat primarily) have the highest incidence of celiac disease. This genetic argument is perhaps the strongest evidence supporting Yudkin's observation that humans are incompletely adapted to the consumption of cereal grains.

Thus, the genetic evidence for human disease (here I have used celiac disease, though other models of autoimmune disease could have been used) is supported by the archeological evidence, which in turn supports the clinical evidence. The extrapolation of paleodiets has thereby provided important clues to human disease--clues which might have gone unnoticed without the conglomeration of data from many diverse fields (archaeology, nutrition, immunology, genetics, anthropology, and geography). For a celiac, a healthy diet is definitely cereal-free--why is this so? Perhaps now the evolutionary data is finally helping to solve this conundrum.

Biotin deficiency and the case of Lindow Man


Lindow Man, whose preserved body was found in a peat bog in Cheshire, England in 1984, is one of the more extensively studied of the so-called "bog mummies" [Stead, Bourke, and Brothwell 1986]. The principal last meal of Lindow Man likely consisted of a non-leavened whole-meal bread probably made of emmer wheat, spelt wheat, and barley. Unleavened whole-grain breads such as this represented a dietary staple for most of the less-affluent classes during this time. Excessive consumption of unleavened cereal grains negatively impacts a wide variety of physiological functions which ultimately present themselves phenotypically (i.e., via changes in physical form or growth). The well-documented phytates of cereal grains sequester many divalent ions, including calcium, zinc, iron, and magnesium, which can impair bone growth and metabolism. Further, there are antinutrients in cereal grains which directly impair vitamin D metabolism [Batchelor 1983; Clement 1987]; and rickets is routinely induced in animal models via consumption of high levels of cereal grains [Sly 1984].

Less well appreciated is the ability of whole grains to impair biotin metabolism. My colleague Bruce Watkins [Watkins 1990], as well as others [Blair 1989; Kopinski 1989], has shown that biotin deficiencies can be induced in animal models by feeding them high levels of wheat, sorghum, and other cereal grains. Biotin-dependent carboxylases are important enzymes in the metabolic pathways of fatty-acid synthesis, and deficiencies severely inhibit the chain elongation and desaturation of 18:2n6 (linoleate) to 20:4n6 (arachidonic acid). Human dietary supplementation trials with biotin have shown this vitamin to reduce the fingernail brittleness and ridging that are associated with deficiencies of this vitamin [Hochman 1993]. Careful examination of the photograph of Lindow Man's fingernail (still attached to a phalange of the right hand [Stead 1986, p. 66]) shows the characteristic "ridging" of biotin deficiency. It is likely that regular daily consumption of high levels (>50% of daily calories) of unleavened cereal-grain breads, such as Lindow Man may have consumed, caused a biotin deficiency, which in turn caused the nail ridging.

Antinutritional properties of legumes


Question: So far we have been discussing grains. What about legumes? Could they have been realistically eaten as a staple by primitive groups without cooking? And if they are natural to the human evolutionary experience, why do they cause gas, which indicates fermentation of indigestible products in the gut? If they are not natural for us, how do we account for the !Kung and other primitive groups who eat them?

Commentary: As with grain consumption, there are hunter-gatherers who have been documented eating legumes. However, in most cases the legumes are cooked, or the tender early sprouts are eaten raw rather than the mature pod. Some legumes in their raw state are less toxic than others; however, most legumes in their mature state are non-digestible and/or toxic to most mammals when eaten in even moderate quantities. I refer interested readers to:

Liener IE (1994) "Implications of antinutritional components in soybean foods." Crit Rev Food Sci Nutr, vol. 34, pp. 31-67.

Gupta YP (1987) "Antinutritional and toxic factors in food legumes: a review." Plant Foods for Human Nutrition, vol. 37, pp. 201-228.

Noah ND et al. (1980) "Food poisoning from raw red kidney beans." Brit Med J, vol. 2, pp. 236-237.

Pusztai A et al. (1981) "The toxicity of Phaseolus vulgaris lectins: Nitrogen balance and immunochemical studies." J Sci Food Agric, vol. 32, pp. 1037-1046.

These references summarize the basics about legume indigestibility/toxicity; however, there are hundreds if not thousands of citations documenting the antinutritional properties of legumes. Legumes contain a wide variety of antinutrient compounds which influence multiple tissues and systems, and normal cooking procedures do not always eliminate them [Grant 1982]. A variety of compounds in beans cause gas: mainly, these are the non-digested carbohydrates raffinose, stachyose, and sometimes verbascose, which provide substrate for intestinal microflora to produce flatus [Calloway 1971].

Starch digestion and alpha-amylase


Question: If it's true that starches such as grains and legumes are a relatively recent addition to the human diet, why then do humans have such an extraordinary ability to secrete the starch-digesting enzyme alpha-amylase, which is present in both saliva and pancreatic secretions? Some biochemists have characterized the level of secretion as "alpha-amylase overkill."

Commentary: The highest levels of alpha-amylase occur in the human pancreas, followed by the parotid (salivary) glands. The amylase isozyme levels in the parotid glands are an order of magnitude less than those in the pancreas [Sobiech 1983]. Because starch boluses do not remain in the mouth for more than a few seconds, parotid-derived alpha-amylase has little influence upon immediate starch digestion. Additionally, if the starch is wheat-based, there are endogenous alpha-amylase inhibitors in wheat (also in legumes) which effectively inhibit salivary amylase [O'Donnell 1976]. Further, wheat alpha-amylase inhibitors also influence pancreatic amylase secretion [Buonocore 1977] and have been shown to result in pancreatic hypertrophy in animal models [Macri 1977]. Legumes contain trypsin inhibitors which inactivate native pancreatic trypsin so as to abnormally increase pancreatic cholecystokinin levels, and these also cause pancreatic enlargement in animal models [Liener 1994]. The point here is that humans obviously have adequate salivary and pancreatic amylase levels to digest moderate amounts of certain kinds of starch. However, antinutrients in our main starch sources (grains and beans), when consumed in excessive quantities, may negatively impact endocrine function.

Question: Isn't it true, though, that amylase inhibitors are very heat-labile and denatured by cooking, so that they then present little problem in digestion?

Commentary: Both alpha-amylase inhibitors (in cereals and legumes) and trypsin inhibitors (primarily in legumes) are not fully denatured by normal cooking processes. It is reported: "Protein alpha-amylase inhibitors may represent as much as 1% of wheat flour and, because of their thermostability, they persist through bread-baking, being found in large amounts in the center of loaves." [Buonocore 1977] Further, in his treatise on antinutrients, Liener states: "However, because of the necessity of achieving a balance between the amount of heat necessary to destroy the trypsin inhibitors and that which may result in damage to the nutritional or functional properties of the protein, most commercially available edible-grade soybean products retain 5 to 20% of the trypsin inhibitor activity originally present in the raw soybeans from which they were prepared." [Liener 1994]

Genetic changes in human populations due to agriculture


Question: What exactly are some of the known genetic differences among populations as a result of the spread of agriculture; what is the timescale for any changes; and what is the evidence?

Commentary: It took roughly 5,000 years for agriculture to spread from the Mideast to the far reaches of northern Europe. In this part of the world, agrarian diets were characterized by a cereal staple (wheat or barley early on; later, rye and oats), legumes, dairy products, salt, and the flesh of domesticated animals (sheep, goats, cows, and swine). There is strong evidence to suggest that the retention of lactase (the enzyme required to digest lactose in milk) into adulthood is related to the spread of dairying [Simoons 1978]. Most of the world's populations which were not exposed to dairying did not evolve the gene coding for adult lactase retention.

Favism is an acute hemolytic anemia triggered by ingestion of fava beans in genetically susceptible subjects with severe deficiency of glucose-6-phosphate dehydrogenase (G6PD). G6PD deficiency is thought to confer protection against malaria only in those geographic areas where favism exists [Golenser 1983]. A substance in fava beans called isouramil (IU) triggers the hemolytic anemia in G6PD-deficient individuals, and it is this interaction of IU with G6PD-deficient erythrocytes which renders these red blood cells incapable of supporting the growth of the malarial pathogen (Plasmodium falciparum). Thus, the spread of agriculture (fava beans in this case) to geographic locations surrounding the Mediterranean was responsible for the selection of G6PD deficiency in early farmers.

Celiac disease is an autoimmune disease in which the body's white blood cells (T-lymphocytes) destroy intestinal cells, causing malabsorption of many nutrients. The disease is caused by consumption of gliadin (a protein found in wheat, rye, barley, and possibly oats). Withdrawal of gliadin-containing cereals causes complete remission of the disease symptoms. Only genetically susceptible individuals (those with certain HLA haplotypes) develop the disease upon consumption of gliadin-containing cereals. There is a geographic gradient of susceptible HLA haplotypes in Europe, with the lowest incidence in the Mideast and the highest in northern Europe, paralleling the spread of agriculture from the Mideast 10,000 years ago. This information is interpreted as showing that agriculture (via wheat, rye, and barley) genetically altered portions of the human immune system [Simoons 1981].

Diseases of insulin resistance, particularly non-insulin-dependent diabetes mellitus (NIDDM), occur in greater frequency in populations that are recently acculturated compared to those with long histories of agriculturally based (high-carbohydrate) diets. It has been hypothesized that insulin resistance in hunter-gatherer populations is perhaps an asset, as it may facilitate consumption of largely animal-based diets [Brand-Miller and Colagiuri 1994]; whereas when high-carbohydrate, agrarian-based diets replace traditional hunter-gatherer diets, insulin resistance becomes a liability [Brand-Miller and Colagiuri 1994] and promotes NIDDM.

Question: What about D'Adamo's ideas that the ABO blood groups correlate with adaptation to different dietary patterns? Type O is said to be the original hunter-gatherer blood type, with Types A and B having originated more recently in response to agriculture, and supposedly being blood types that are more adapted to vegetarian diets.

Commentary: In regard to D'Adamo's ideas concerning ABO blood groups, diet, and disease susceptibility, I suspect that the relationship is significantly more complex than what he has proposed. There are numerous examples in the literature showing an association between blood types and diet-related disease [Hein 1992; Dickey 1994]; however, it is unclear whether a causal relationship is present. It is generally conceded that human blood types have evolved in response to infectious disease [Berger 1989]. Because there are 30 common blood-cell surface antigens (groups) in addition to the ABO group, it seems improbable, if blood typing is associated with certain dietary-induced maladies, that those maladies would be exclusively a function of only the ABO groups. Indeed, the two references I have cited demonstrate a relationship with Lewis blood types, not ABO. Consequently, it is more probable that a complex relationship exists between blood-cell surface antigens, diet, and disease, one that likely involves multiple blood-group types. Further, because of the confounding effect of genetic disequilibrium (the associated inheritance of genotypes that do not follow Hardy-Weinberg equilibrium patterns), the relationship may be only serendipitous in nature and not causal as proposed by D'Adamo.

Evidence of genetic discordance as seen in autoimmune diseases


Up to this point, we have only briefly touched upon the role cereal grains may have in inducing autoimmune disease (except for a brief look at celiac disease). There is substantial evidence (both epidemiological and clinical) showing the role cereal grains may play in the etiology of such diverse autoimmune diseases as multiple sclerosis (MS), insulin-dependent diabetes mellitus (IDDM), rheumatoid arthritis, Sjögren's syndrome, dermatitis herpetiformis, and IgA nephropathy. Although this proposal may at first seem preposterous, there is strong data to suggest that cereal grains may be involved in all of these diseases through a process of molecular mimicry, whereby certain amino-acid sequences within specific polypeptides of the gramineae family are homologous to (have the same structural form as) a variety of amino-acid sequences in mammalian tissue. These homologous amino-acid (AA) sequences can ultimately confuse our immune systems, so that it becomes difficult to recognize "self" from "non-self." When this happens, T-cells, among other immune-system components, launch an autoimmune attack upon a body tissue with AA sequences similar to those of the dietary antigen.

It seems that grass seeds (gramineae) have evolved these proteins with similarity to mammalian tissue to protect themselves from predation by mammals, other vertebrates, and even insects. This evolutionary strategy of molecular mimicry to deter predation, or to exploit another organism, has apparently been with us for hundreds of millions of years and is a quite common evolutionary strategy for viruses and bacteria. It has only been realized since about the mid-1980s [Oldstone 1987] that viruses and bacteria are quite likely to be involved in autoimmune diseases through the process of molecular mimicry. Our research group has put together a review paper compiling the evidence (and the evidence is extensive) implicating cereal grains in the autoimmune process, and with a little bit of luck it should be published during 1998. [Editorial note as of June 1999: The paper has now been published; the citation is: Cordain L (1999) "Cereal grains: humanity's double-edged sword." World Review of Nutrition and Dietetics, vol. 84, pp. 19-73.] Without the evolutionary template, and without the evidence provided us by the anthropological community showing that cereal grains were not part of humanity's original dietary experience, the idea that cereal grains had anything to do with autoimmune disease would probably never have occurred to us. This new electronic medium has allowed instant cross-fertilization of disciplines which would rarely have occurred as recently as five years ago.

How peptides in cereal grains may lead to molecular mimicry and autoimmune disease
In the human immune system, there are a number of individual mechanisms which allow the body to distinguish self from non-self, so that foreign proteins (i.e., those of bacteria, viruses, etc.) can be recognized, destroyed, and eliminated. Perhaps the most complex system which nature and evolution have engineered to accomplish this is the human leucocyte antigen (HLA) system. This system was discovered when early physicians found that tissue from one human could not be grafted to another without rejection. The physiological function of this system, however, was not to foil the efforts of transplant surgeons, but to initiate an immune response to pathogens (viruses, bacteria).

All cells of the body manufacture HLA proteins, whose function is to bind short peptides (protein fragments) and display them on the cell surface. Most of the peptides are derived from the body's own proteins (self-peptides). However, when the body is infected by a virus or bacterium, the HLA molecules pick up peptides derived from broken-down proteins of the virus or bacterium and present them to T-lymphocytes. The purpose of T-lymphocytes is to continually scan the surfaces of other cells so as to recognize foreign peptides while ignoring self-peptides. Once a T-cell receptor "recognizes" a foreign peptide, a complex series of steps is set into play which ultimately destroys the cell presenting the foreign peptide, as well as living viruses or bacteria in the body which also have peptide sequences similar to those which were presented. When the HLA system loses the ability to distinguish self (self-peptides) from non-self (foreign peptides), T-lymphocytes attack self-tissue, resulting in what is known as an autoimmune disease (e.g., celiac disease, IDDM, multiple sclerosis, dermatitis herpetiformis, ankylosing spondylitis, etc.).

The HLA proteins which present foreign peptides to circulating T-lymphocytes are coded by DNA sequences on chromosome 6. The entire HLA system includes more than 100 genes and occupies a region more than four million base pairs in length, which represents 1/3,000 of the total human genome. On chromosome 6, the HLA is sub-divided into Class I (HLA-A, HLA-B, HLA-C) and Class II segments (HLA-DR, HLA-DQ, HLA-DP). Individuals with autoimmune disease inherit characteristic HLA combinations which identify their disease. People with celiac disease have genetic markers (HLA-DR3, HLA-B8, and HLA-DQ2) which are associated with the disease; people with insulin-dependent diabetes mellitus (IDDM) almost always have characteristic DQ and DR genotypes. Thus, the manner in which foreign proteins are presented to circulating T-cells by HLA proteins tends to be different for individuals with autoimmune diseases compared to those without these maladies.

As mentioned previously, the incidence of a variety of autoimmune diseases follows a southeasterly gradient, from northern Europe (highest incidence) to the Mideast (lowest incidence). This gradient occurs because the incidence of susceptible HLA haplotypes increases as one moves northwesterly from the Mideast.

This gradient--which occurs for both the incidence of autoimmune diseases and of HLA haplotypes--is not a serendipitous relationship, but occurred as a result of the spread of agriculture from the Mideast to northern Europe [Simoons 1981]. Consequently, as agriculture spread into Europe, there were environmental elements associated with this demic expansion ("demic" means the spread of genes through either the migration or interbreeding of populations) which progressively selected against HLA haplotypes (combinations of HLA genes inherited from the two chromosomes in each cell) that were originally present in the pre-agrarian peoples of Europe.

Now, the question is: what were those environmental selective elements? In the case of celiac disease, it doesn't take a rocket scientist to determine that it was wheat. Increasing consumption of wheat caused increased mortality from celiac disease--thus, the incidence of celiac disease and of its susceptible HLA haplotypes (HLA-B8, HLA-DQ, HLA-DR) is lowest in those populations with the most chronologic exposure to wheat (Mideasterners and southern Europeans) and greatest in those populations with the least exposure (northern Europeans). Similar arguments can be made for IDDM and a host of other autoimmune diseases. There are a substantial number of animal studies showing that consumption of wheat by rats increases the incidence of IDDM [Scott 1988a; Scott 1988b; Scott 1991; Elliott 1984; Hoofar 1993; Storlien 1996; Am J Physiol 1980;238:E267-E275; Schechter 1983].

How is it that wheat can wreak such havoc with the immune system? Our research group believes that wheat contains peptide sequences which remain undigested and which can enter into systemic circulation. These peptide sequences are homologous to a wide variety of the body's tissue peptide sequences and hence induce autoimmune disease via the process of molecular mimicry. For example, macrophages ingest the circulating wheat peptides. HLA molecules within the macrophage then present amino-acid sequences of the fragmented peptide to circulating T-lymphocytes, which through clonal expansion create other T-cells to "attack" the offending dietary antigen and any other self-antigen which has a similar peptide sequence--i.e., the body's own tissues.

The original non-agricultural HLA haplotypes conferred selective advantage in earlier evolutionary times because these genotypes provided enhanced immunity from certain types of infectious diseases. However, with the advent of cereals in the diet, they became a liability. Thus, the genetic data clearly shows that a recently introduced food type has resulted in genetic discordance between our species and foods derived from the gramineae family.

Possible autoimmune connection between dietary peptides and some forms of autism
Autism in children is a neuro-developmental disorder characterized by few or no language and imaginative skills, repetitive rocking and self-injurious behavior, and abnormal responses to sensations, people, events, and objects. The cause of the syndrome is unknown, but there is increasing evidence that it may be autoimmune in nature. Reed Warren's group [Singh 1993] found that 58% of autistic children maintained antibodies to myelin basic protein (a protein found in the myelin sheaths of nerves and suspected of being the target protein [self-antigen] for T-lymphocytes in the autoimmune disease multiple sclerosis). Additional support for the concept that autism may be autoimmune in nature comes from work showing that 46% of autistic children maintain major histocompatibility complex (MHC) alleles associated with the disease [Warren 1996].

The function of the MHC is to present self- and foreign peptides to circulating T-lymphocytes at the surface of all cells throughout the body. Thus, if foreign peptides are presented by the MHC, circulating T-lymphocytes can mount an immune response on the cell or cells that present, via the MHC, those foreign peptides, and destroy them. The MHC not only presents foreign peptides, but also presents peptides derived from the proteins of genes comprising the MHC itself.

The susceptibility genes for autism are DRB1*0404, DRB1*0401, and DRB1*0101 [Warren 1996]. In a particular portion of these genes (the third hypervariable region [HVR-3]), there is a common amino-acid sequence shared by all three genes. This amino-acid sequence is either QKRAA (glutamine-lysine-arginine-alanine-alanine) or QRRAA (glutamine-arginine-arginine-alanine-alanine). Thus, either the QKRAA amino-acid motif or the QRRAA amino-acid motif can be presented to circulating T-lymphocytes. This particular shared epitope increases susceptibility to a number of autoimmune diseases, including rheumatoid arthritis [Auger 1997]. (An epitope is the part of an antigen recognized by an antigen receptor, i.e., a specific amino-acid sequence of a protein.)

The QKRAA or QRRAA amino-acid motif also occurs quite frequently in pathogens which reside in the human gastrointestinal tract, including Escherichia coli, Proteus mirabilis, Lactobacillus lactis, Brucella ovis, and many other anaerobic gut bacteria [Auger 1997]. The QKRAA or QRRAA sequences are found specifically in a particular type of protein contained in gut bacteria, called DnaJ proteins. DnaJ proteins normally have a bacterial partner/ligand protein called a heat-shock protein (HSP70), and it is the QKRAA or QRRAA amino-acid sequence of DnaJ which allows it to bind HSP70. When the MHC presents endogenously derived DRB1 alleles which contain the QKRAA or QRRAA amino-acid motif, then circulating HSP70 proteins (which normally bind DnaJ proteins) can bind the body's own MHC-presented QKRAA or QRRAA sequences. Circulating CD4+ T-lymphocytes recognize this HSP70/QRRAA complex as foreign, and mount an immune response on all cells presenting this (HSP70) amino-acid motif. We believe that myelin basic protein contains an amino-acid sequence that is homologous to an amino-acid sequence found in HSP70, and it is this three-way mimicry between DRB1 peptides, bacterial peptides, and self-peptides which causes self-tolerance to be broken.

So, how does a paleodiet have anything to do with this process? Paleodiets are characterized by their lack of cereal grains, legumes, dairy products, and yeast-containing foods. Both cereal grains and legumes contain glycoproteins (conjugated proteins that have a carbohydrate as the non-protein component) called lectins, which bind intestinal epithelial cells and change the permeability characteristics of these intestinal cells [Liener 1986; Pusztai 1993b]. Not only do these lectins cause an increase in the translocation of gut bacteria to the periphery, they cause an increased overgrowth of gut bacteria as well as a change in the gut flora [Liener 1986; Pusztai 1993b]. Further, cereal- and legume-derived lectins (WGA and PHA, respectively) cause increased expression of intracellular adhesion molecules (ICAM) in lymphocytes [Koch 1994], which allows bacterial/immune complexes to move from the gut to the affected tissue. Additionally, cereal and legume lectins increase lymphocytic expression of common inflammatory cytokines such as tumor necrosis factor alpha (TNFa), interleukin 1 (IL-1), and IL-6, which are known promoters of autoimmune disease. The cell walls of cereals and legumes contain a storage protein, GRP 180, which also can act as a ligand to self-presented MHC peptides [Dybwad 1996]. Further, peptides contained in dairy proteins (bovine serum albumin--BSA--among many) also may contain peptide sequences which can interact with endogenously presented peptides [Perez-Maceda 1991].

Cereal-, legume-, dairy-, and yeast-free diets thus potentially have therapeutic benefit in many autoimmune-related disorders, via their ability to reduce gut permeability and decrease the exogenous antigenic load, both from pathogenic bacteria and from potentially self-mimicking dietary peptides.
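As a purely illustrative aside: the molecular-mimicry argument is, at bottom, a claim about short amino-acid motifs shared between dietary or bacterial proteins and self-proteins, and that kind of claim can be checked mechanically by simple string searching. The Python sketch below scans sequences for the QKRAA/QRRAA shared epitope discussed above; the example sequences are made-up toy strings for demonstration, not real protein data:

    # Toy demonstration of motif sharing: scan amino-acid sequences for
    # the QKRAA/QRRAA shared epitope. All sequences here are hypothetical.
    SHARED_EPITOPES = ("QKRAA", "QRRAA")

    def find_epitopes(sequence):
        """Return (motif, position) pairs for every epitope match."""
        hits = []
        for motif in SHARED_EPITOPES:
            start = sequence.find(motif)
            while start != -1:
                hits.append((motif, start))
                start = sequence.find(motif, start + 1)
        return hits

    # Hypothetical fragments, for illustration only -- not real sequences.
    toy_sequences = {
        "dietary_peptide": "GLIADQKRAAFPQQ",
        "self_peptide": "MKTLQRRAAVSGE",
        "unrelated": "GGGTTTAAACCC",
    }
    for name, seq in toy_sequences.items():
        print(name, find_epitopes(seq))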

Conclusion
The study of the evolutionary relationship of diet to health, and of the repercussions of evolutionary discordance on human biology, is still in its infancy. As an interdisciplinary study that depends on the correlation of insights from numerous scientific fields, such as archaeology, anthropology, ethnobotany, evolutionary biology, genetics, autoimmunity, and clinical nutrition research, its insights have to this point been appreciated by only a relative few.

However, the case of cereal grains and legumes in the diet presents one of the clearest examples where data from these various disciplines is beginning to come together in a way that can be tested and verified by controlled clinical studies. As such, it represents an example of the power of the evolutionary approach to provide new directions for research that can give us insights into human diet that may not have been heretofore possible.

Cereal grains are a particularly good example of both the promise and the perils of the evolutionary process--in both its physiological and cultural guises. Without cereal grains as the agricultural base for modern civilizations, it is exceedingly doubtful whether humanity would have developed the culture or the technologies that have led to the accomplishments and scientific insight we now enjoy. Obviously, modest amounts of cereal grains can be a part of the diets of most people, with effects that, at least given our current state of knowledge, can be considered negligible. At the same time, however, relying on cereals for more than a supplemental part of the diet increases the likelihood that they will lead to problems due to their evolutionary discordance. For some people with the "wrong" genetic heritage (even though it may be a very normal aspect of the inherent variability of the human genome), any cereal grains at all are destructive enough--as in the case of celiac disease--that they are simply not an option if such individuals wish to avoid health problems due to their consumption.

As well, with the advent of the young and newly expanding field of autoimmune research, there is increasing recognition of the role that autoimmunity may play in more disease conditions than has perhaps heretofore been thought. Given the potential that peptides in cereal grains have shown for precipitating autoimmune dysfunction, it may be that linkages with further disease conditions will be discovered as time goes on--which is cause for concern and points to the need for more intensive research in this area.

I hope you have enjoyed our look at this example of the power of the interdisciplinary approach employed in the field of paleolithic diet to bring new insights to the study of human diet and uncover compelling new areas for nutritional research.

--Loren Cordain, Ph.D.

Dietary Macronutrient Ratios and their Effect on Biochemical Indicators of Risk for Heart Disease
Comparing High-Protein/Low-Carbohydrate Diets vs. High-Carbohydrate/Low-Fat Diets
by Loren Cordain, Ph.D. Copyright 1999 by Loren Cordain. All rights reserved.
IMPORTANT WORD DEFINITIONS:
CHD: Coronary heart disease, i.e., a form of cardiovascular disease or atherosclerosis.
HUFA: Highly unsaturated fatty acids.
In vitro: Biochemical effects obtained in an artificial environment outside the living organism.
In vivo: Biochemical effects as they occur inside the living organism.
Isocaloric: Calorie-for-calorie. Used when describing controlled replacement of one dietary macronutrient for another.
Lipids: Fatty acids, i.e., dietary "fats."
Mesenteric fat: Fat surrounding/protecting the intestinal area.
n3 and n6 fats: Omega-3 and omega-6 fatty acids, respectively.
Perinephral fat: Fat surrounding and/or protecting the kidney.
Plasma: Refers to levels of nutrients as measured in the blood.
Post-prandial: After a meal. Used in describing biochemical responses in the post-meal period.
Polymorphism: A variant form of a gene. Often genes have several naturally occurring variants.
PUFA: Polyunsaturated fatty acids.
Serum: Synonym for blood, as in measured blood levels of nutrients.

Introduction
Conventional wisdom about macronutrient ratios challenged in last decade. The conventional wisdom of orthodox nutritionists for the past 20-25 years regarding macronutrient intake has been that a high-carbohydrate, low-fat diet is the optimal diet for humans and is beneficial for virtually all pathological conditions, ranging from heart disease to cancer. In the past 10-12 years, however, this concept has been seriously questioned in light of the deleterious changes in blood-lipid profiles (elevated triglycerides and VLDL, and lowered HDL) which result from such advice. Increasingly, influential scientists (Scott Grundy, Walter Willett, Gerald Reaven) and institutions (Harvard School of Public Health) have recognized this shortcoming of high-carb, low-fat diets, and are now recommending monounsaturated fat in lieu of carbohydrate [Grundy 1986; Mensink et al. 1987].

Despite the thousands--perhaps tens of thousands--of clinical trials which have manipulated the macronutrient (protein, carbohydrate, and fat) content of the diet, however, there are perhaps no more than a half-dozen which have examined the influence of a high-protein, low-carbohydrate diet (with varying fat levels) upon human health and metabolism. This is somewhat ironic, in that this macronutrient pattern appears to be the one which nourished humankind (members of the genus Homo) for all of our time (2.5 million years) on this planet, except for the last 10,000 years since the advent of grain-based agriculture.
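Since the discussion that follows turns on macronutrient intakes expressed as percentages of total energy, it may help to see how such figures are derived. The Python sketch below uses the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat); the gram figures are hypothetical, chosen only to illustrate the calculation:

    # Convert macronutrient grams into percent of total energy using the
    # standard Atwater factors: protein 4 kcal/g, carbohydrate 4, fat 9.
    KCAL_PER_GRAM = {"protein": 4, "carbohydrate": 4, "fat": 9}

    def energy_percentages(grams):
        """Return each macronutrient's share of total calories."""
        kcal = {m: g * KCAL_PER_GRAM[m] for m, g in grams.items()}
        total = sum(kcal.values())
        return {m: round(100 * k / total, 1) for m, k in kcal.items()}

    # Hypothetical day's intake, for illustration only.
    print(energy_percentages({"protein": 150, "carbohydrate": 100, "fat": 100}))
    # --> {'protein': 31.6, 'carbohydrate': 21.1, 'fat': 47.4}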

The specific dietary context in which fat and the other macronutrients occur is key, but often overlooked
There is little doubt that Paleolithic man consumed (probably preferentially) the fatty portions of wild game animals. During certain times of the year (late summer and early fall), the total lipid content of large herbivorous animals was considerable, and at these times saturated-fat consumption could have been high. At other times it would have been modest (except perhaps for high-latitude peoples, who were the most dependent on animal meat due to a colder climate less hospitable to plant life), even while the overall yearly consumption of all forms of fat could have been relatively high. However, despite consuming a largely animal-based diet that at times can include a high overall level of fat, most modern-day hunter-gatherers exhibit low serum cholesterol levels, low blood pressure, and low to non-existent mortality rates from coronary heart disease (CHD) [Eaton 1988; Bang and Dyerberg 1980; Leonard et al. 1994]. At first, this data seems paradoxical in light of the almost universal recommendation of a low-fat, high-carbohydrate diet in the treatment of CHD. However, there are important and subtle differences between the high-fat/animal-based diets of pre-agricultural man and the high-fat diets of modern man which can account for this paradoxical situation:

Differences in trans fats and oxidation of cholesterol. There is increasing recognition that for the atherosclerotic process to occur, there must not only be elevated levels of LDL cholesterol, but the lipids and cholesterol carried by LDL must also be oxidized [Steinberg et al. 1989]. The macrophages which take up oxidized LDL molecules and eventually become the foam cells of the atherosclerotic plaque have a scavenger receptor which is different from native LDL receptors and which does not down-regulate. Consequently, continually elevated levels of oxidized LDL in the plasma tend to promote the atherosclerotic process. High levels of dietary linoleate increase LDL oxidizability [Louheranta et al. 1996]. Because refined vegetable oils were not present in pre-agricultural diets, the linoleate levels would have been lower than in Western diets, wherein vegetable-fat consumption has increased 300% since 1910 while animal-fat consumption has decreased slightly [ASCN/AIN Task Force 1996]. Thus the relatively high levels of vegetable oils consumed in the Western diet, along with relatively high levels of saturated fats, promote a lipid profile in which LDL cholesterol is elevated and more prone to oxidation, and hence to the development of CHD.

Differences in protein intake levels. The protein content of the paleolithic diet was significantly higher than the average 12-15% of the Western diet. Recent studies [Wolfe 1995; Wolfe et al. 1991] show that isocaloric (calorie-for-calorie) replacement of carbohydrate with protein lowers total cholesterol, LDL, VLDL, and triglycerides (TG) while elevating HDL cholesterol--all of which are favorable responses in terms of blood-lipid levels. Consequently, a high dietary protein content, even in the face of increasing overall fat or increasing saturated fat, serves to lower serum cholesterol levels and reduce the risk for CHD.

Differences in carbohydrate intake. The carbohydrate content of pre-agricultural diets was generally lower than the 45-55% of the Western diet. Consequently, the post-prandial (after-meal) lipemic excursions (changes in blood-lipid levels)--during which LDL molecules are most prone to oxidation--would have been reduced, since the addition of carbohydrate to a fat-rich meal exacerbates this swing [Chen et al. 1992]. Pre-agricultural eating patterns indicate that fat and protein were generally eaten together, whereas carbohydrate meals were eaten separately; this eating pattern would have reduced post-prandial lipemic excursions. Additionally, the reduced carbohydrate content of pre-agricultural diets would have improved the portions of the blood-lipid profile (TG, VLDL, HDL, Lp(a)) which are worsened by high-carbohydrate diets [Reaven 1995].

Differences in fatty-acid intake profiles. Work from our laboratory [Cordain et al. 1998], as well as that of others, has shown that the fatty-acid profiles of storage as well as structural fat are quite different when contrasting wild with domesticated animals. Of the dietary saturated fats, 12:0, 14:0, and 16:0 are known to elevate plasma cholesterol levels, whereas 18:0 is neutral or perhaps hypocholesterolemic (cholesterol-lowering). The saturated fat of bone marrow and depot fat in wild animals contains greater levels of 18:0 and lower levels of 14:0 and 16:0 when compared to domestic animals. Additionally, the structural (e.g., polyunsaturated) lipid content of game meat is also quite different from that of domestic meat: there generally are higher levels of all n3 ("omega-3") fats, and higher levels of all 20- and 22-carbon fats of both the n3 and n6 ("omega-6") varieties, in game meat. The n6/n3 ratio of beef averages about 15, whereas in wild animals it is about 4-5 (see the worked example below). Again, the higher levels of n6 lipids in domestic animals, particularly linoleate, tend to increase LDL oxidizability, whereas the higher levels of n3 fats in game animals are cardio-protective. Consequently, the consumption of saturated fats in pre-agricultural diets occurred against a background of dietary lipids which was much different from the background fats in the modern diet. Recent evidence clearly shows that the composition of fatty acids in a meal can improve serum lipid values despite widely varying fat levels [Nelson et al. 1995].
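For clarity, the n6/n3 figures quoted above are simple ratios of total omega-6 to total omega-3 fatty acids, as in the tiny Python sketch below. The per-100-gram values are hypothetical, picked only so the outputs land near the ratios cited in the text (~15 for beef, ~4-5 for wild game):

    # n6/n3 ratio = total omega-6 fatty acids / total omega-3 fatty acids.
    def n6_n3_ratio(n6_g, n3_g):
        return n6_g / n3_g

    # Hypothetical grams of each fatty-acid family per 100 g of meat.
    print(f"beef: {n6_n3_ratio(3.0, 0.2):.0f}")   # ~15
    print(f"game: {n6_n3_ratio(1.0, 0.22):.1f}")  # ~4.5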

Absence of dairy fats. Pre-agricultural diets by definition would not have included dairy fats.
In modern Western diets, about a third of the saturated fat is contributed by dairy foods (milk, butter, cheese, ice cream). In metabolic ward studies, butter fat raised LDL cholesterol levels significantly higher than beef tallow [Denke and Grundy 1991]. Further, milk consumption is the best worldwide predictor of CHD mortality of all dietary elements [Artaud-Wild et al. 1993]. Bovine milk fat is quite low in the long-chain cardio-protective n3 fats, and has a high n6/n3 ratio. Additionally, the calcium-to-magnesium (Ca/Mg) ratio in milk and dairy products is quite high compared to the average 1:1 ratio in foods available to pre-agricultural man. Elevated Ca/Mg ratios have been shown to be positively related to CHD [Varo 1974]. These data suggest dairy products would be quite atherogenic, particularly when consumed against a background of the other dietary elements in the Western diet.

Large disparity in activity levels. Modern man eating a high-fat diet is generally quite inactive compared to pre-agricultural man [Cordain, Gotshall, and Eaton 1997]. High levels of activity serve to improve insulin sensitivity and lower TG and VLDL while increasing HDL cholesterol.

In all likelihood, the dietary fat levels of pre-agricultural man could have been quite high (even by modern standards). However, because of differing types and amounts of carbohydrate, protein, and fatty acids, as well as differing levels of fiber, antioxidant vitamins, and phytochemicals from a diet rich in plant foods as well as meats, these types of diets generally would not have elevated cholesterol levels (as confirmed by values seen in modern hunter-gatherers [Bang and Dyerberg 1980; Leonard et al. 1994]), nor increased LDL oxidizability. One final comment: Not only does the high sodium content of the Western diet predispose us to hypertension, osteoporosis, urinary tract stones, Ménière's syndrome, stomach cancer, insomnia, asthma, and the initiation and promotion of all types of cancer, it also seems to do the same in our closest relative, the chimp [Denton et al. 1995].

Saturated/unsaturated fat composition of wild animal tissues, and consumption levels in modern vs. pre-agricultural peoples
Our data on the fatty-acid distribution in tissues of wild animals, presented at a recent conference on the return of n3 fats to the food supply held at the National Institutes of Health in Bethesda, Maryland, has recently been published in World Review of Nutrition and Dietetics [Cordain et al. 1998]. These data refute contentions made by some that the overall PUFA content of wild-animal tissues is low. To the contrary, it is relatively high in both brain (26%) and muscle (36%), as our data show, corroborating the earlier work of Crawford et al. [1969]. The difference in polyunsaturated fatty acids (PUFAs) between the Western diet and the so-called "paleolithic diet" is that the PUFAs in the Western diet are predominantly 18-carbon lipids (vegetable oils), with huge amounts of 18:2n6 (linoleic acid). The PUFA content of the paleolithic diet is higher than that of the Western diet (19.2% vs. 12.7% [Bang and Dyerberg 1980]), with much higher levels of HUFA (>20-carbon lipids) of both the n6 and n3 families. Once again, it should be emphasized that while pre-agricultural peoples certainly did consume saturated fat, their intake cannot compare with the levels consumed by modern Western populations. Bang and Dyerberg's data [1980] on Eskimo populations who ate a high-meat diet are particularly illustrative of this: saturated fats comprised 22.8% of total dietary fats in Inuit people, whereas they comprised 52.7% of total dietary fats in a control population of Danes. To point to saturated fat consumption in pre-agricultural groups as license to eat freely of such fats ignores the ecological constraints that would have made modern levels of consumption highly unlikely for our paleolithic ancestors, and ignores as well the voluminous clinical data that shows their detrimental effects. High levels of saturated fat consumption on a year-round basis only became possible when domesticated animals were bred and fed in a manner which allowed accumulation of depot fat on a year-round basis. Wild animals almost always show a seasonal variation in storage fat, and even the very fattest wild land mammals contain 60-75% less total fat than the average domesticated animal. Thus, until the advent of the "Agricultural Revolution" 10,000 years ago, it would have been extremely difficult, or perhaps impossible, to eat high levels of saturated fat on a daily basis throughout the year.

Limitation of the Keys equation in predicting expected serum cholesterol levels from fat and cholesterol in the diet
In our group over the last month or so, we have bandied about the question of ancestral macronutrient composition (i.e., percent fat, protein, and carbohydrate) and how it influences health. Clearly, in the normal Western diet (approximately 45-50% carbohydrate, 35-40% fat, and 10-15% protein), if dietary saturated fats are reduced, then total and LDL cholesterol are also reduced. Keys [1965] published an equation which has been used extensively to predict changes in serum cholesterol from dietary lipids and cholesterol. Others [Mensink 1992] have more recently confirmed Keys' equation.
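For reference, here is a minimal Python sketch of the predictive equation, assuming the commonly cited 1965 form (the coefficients below come from the general literature, not from this article):

    import math

    def keys_delta_cholesterol(d_sat_pct, d_poly_pct, chol_new, chol_old):
        """Predicted change in serum total cholesterol (mg/dl).

        Assumes the commonly cited form of the Keys equation:
            dChol = 1.35 * (2*dS - dP) + 1.5 * dZ
        where dS and dP are changes in percent of calories from saturated
        and polyunsaturated fat, and Z = sqrt(dietary cholesterol in mg
        per 1,000 kcal).
        """
        d_z = math.sqrt(chol_new) - math.sqrt(chol_old)
        return 1.35 * (2 * d_sat_pct - d_poly_pct) + 1.5 * d_z

    # Hypothetical illustration: cutting saturated fat from 15% to 10% of
    # calories (dS = -5), polyunsaturates unchanged, dietary cholesterol
    # steady at 150 mg/1000 kcal, predicts roughly a 13.5 mg/dl drop.
    print(keys_delta_cholesterol(-5, 0, 150, 150))  # -13.5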


However, in perhaps the most well-controlled, modern dietary study of Greenland Eskimos [Bang and Dyerberg 1980], it has been shown that ischemic heart disease is very uncommon in these people (3.5% vs. a 45-50% mortality rate in Western countries). The dietary macronutrient content of these partially Westernized Eskimos was 38% carbohydrate, 39% fat, and 23% protein, whereas the values for the control group of Danish people were 47% carbohydrate, 42% fat, and 11% protein. Mean total cholesterol levels in the Eskimos (5.03 mmol/liter) were significantly lower than in the Danes (6.18 mmol/liter), whereas triglycerides (TG) (0.57 vs. 1.23 mmol/liter) and VLDL (0.43 vs. 1.29 mmol/liter) were much lower in the Eskimos, and HDL levels were significantly higher (4.00 vs. 3.34 mmol/liter). Based upon the Keys equation, the difference between the Eskimos' and Danes' total cholesterol levels should have been 0.67 mmol/liter, whereas in actuality it was 1.15 mmol/liter. These data suggest that the Keys equation may be invalid under circumstances wherein high quantities of animal products replace traditionally grain-dominated diets. Possible reasons for this discrepancy include the following characteristics of the Eskimos' diet:

Higher protein levels in the face of lowered carbohydrate, which may induce different lipoprotein transport mechanisms [Wolfe 1995], and/or

Differences in polyunsaturated fats between the two diets (high levels of n3 fats, and high levels of preformed long-chain fats of both n3 and n6 families).

The bottom line here is that present-day hunter-gatherers maintain quite low serum lipid levels despite high consumption of animal-based foods.
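A unit note may help here, since the Eskimo figures above are in mmol/liter while studies quoted later in this section report mg/dl. For cholesterol, 1 mmol/liter is about 38.67 mg/dl (cholesterol's molar mass of roughly 386.7 g/mol, divided by 10); this factor applies to cholesterol only, not to triglycerides. A quick Python sketch:

    # Cholesterol only: 1 mmol/liter ~= 38.67 mg/dl.
    def chol_mmol_to_mgdl(mmol_per_liter):
        return mmol_per_liter * 38.67

    print(round(chol_mmol_to_mgdl(5.03), 1))  # Eskimo mean total cholesterol: ~194.5 mg/dl
    print(round(chol_mmol_to_mgdl(6.18), 1))  # Danish mean total cholesterol: ~239.0 mg/dl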

Comment: To clarify the above, it appears that the likely reason the Keys equation fails to correctly predict cholesterol levels in situations such as the Eskimo study above is that it does not take into account the effects of carbohydrate on insulin secretion. Hyperinsulinemia now appears to be one of the largest risk factors for CHD. Both the high protein and the low carbohydrate intakes seen in the Eskimo study inhibit excess insulin.

Exactly. Ancestral, pre-agricultural diets were quite high in animal protein, and the carbohydrate that was consumed was generally of a low glycemic index. These populations also selectively consumed the fatty portions of the killed animal (brain, bone marrow, depot fat, perinephric fat, mesenteric fat, tongue, organs, etc.). However, available evidence from living hunter-gatherers shows that these surrogates of our Stone-Age ancestors maintain low risk factors for CHD (blood lipid profiles, blood pressure, insulin sensitivity, body composition, etc.)--all of this on a diet which contains an average 50-65% of its total calories derived from animal foods, and which therefore necessarily entails lower carbohydrate consumption. Clearly, the Keys equation breaks down when either the macronutrient content (high protein and low carbohydrate) or the fatty-acid composition of the diet (or both) varies beyond the range of conditions from which Keys originally derived his regression. Although there is much circumstantial evidence to indicate that the Keys equation is erroneous under these conditions, there is no empirical data that I am aware of which has specifically investigated or confirmed this concept.

Clarification of the role of saturated fats in promoting high cholesterol

Comment: Some who promote diets based on those of traditional peoples--who may at times have eaten higher levels of saturated fat--suggest that the modern (high) levels of CHD do not have anything to do with saturated fat from animal sources. Rather, they point to modern processing techniques as having introduced new food substances into the human diet with detrimental effects, particularly excess polyunsaturates, hydrogenated oils, and refined carbohydrates.


There is much evidence to support the second half of this sentiment, but the evidence does not agree with the first part.

Hydrogenated oils (trans fatty acids), refined carbohydrates, and polyunsaturates.


There is now substantial evidence that hydrogenated oils (trans fats) are atherogenic via their hypercholesterolemic effects [Willett et al. 1994]. In terms of their cholesterol-raising properties they may be worse than saturated fats, because they also cause a decrease in HDL cholesterol [Ascherio et al. 1997]. Refined carbohydrate (sucrose in particular) has been known for more than 30 years [Yudkin 1972] to have CHD-promoting effects, probably through increases in VLDL (the precursor to LDL), triglycerides, and total cholesterol, and perhaps decreases in HDL [Hollenbeck et al. 1989]. Recently it has been recognized that although dietary polyunsaturates may lower serum cholesterol levels, they may actually increase the risk for CHD by increasing the susceptibility of LDL to oxidation [Louheranta et al. 1996]. So I am in agreement that hydrogenated fats, refined carbohydrates, and excessive polyunsaturated fats (primarily linoleic acid, 18:2n6) contribute to the development of CHD via hypercholesterolemic and LDL-oxidizing mechanisms.

Saturated animal fats in overall dietary context. However, I cannot agree with the statement that saturated fats from animals in modern diets have nothing to do with CHD. It may be possible that the hypercholesterolemic effects of saturated fats (12:0, 14:0, 16:0) can be negated or somewhat ameliorated by extremely low levels of dietary carbohydrate (particularly in insulin-resistant subjects) or by high levels of dietary protein (>20% of total calories) via protein's VLDL-suppressing effects [Kalopissis et al. 1995]. However, it is clear beyond a shadow of a doubt that dietary saturated fats (12:0, 14:0, and 16:0) elevate serum cholesterol levels within the context of the "average American diet." A recent meta-analysis of 224 published studies encompassing 8,143 subjects (many under metabolic ward conditions) has unequivocally demonstrated the hypercholesterolemic effect of dietary saturated fats [Howell et al. 1997]. The cellular basis for this observation stems from the regulation of low-density lipoproteins (LDLs). When the amount of cholesterol or saturated fat coming into the body is increased, there is an expansion of the sterol pools within liver cells, and to a lesser extent peripheral cells, which causes a down-regulation of LDL receptors. As a consequence, LDL in plasma increases [Dietschy 1997]. Some have argued that increases in total plasma cholesterol and LDL may not necessarily have a direct relationship to mortality from CHD [Stamler et al. 1986]. Clearly, there are a wide variety of independent risk factors for CHD--including hypertension, homocysteine (increased by deficiencies primarily in folate and vitamin B-6, and secondarily in B-12), catecholamines, the n6/n3 fatty-acid ratio, antioxidant status (vitamins E, C, beta-carotene, phytochemicals, etc.), dietary fiber, cigarette smoking, and ethanol (alcohol) consumption--which influence a variety of physiological systems involved with CHD. However, there is powerful evidence (n = 356,222) to indicate that the relationship between serum cholesterol levels and the risk of premature death from CHD is, nevertheless, continuous and graded [Stamler et al. 1986]. Therefore, the recommendation by some that it is harmless to consume high levels of dietary saturated fats within the context of the "average American diet" appears to be not only erroneous, but probably deadly. Our hunter-gatherer ancestors consumed high levels of animal food (probably >55% of their total daily calories); however, the context under which this was done was much different from present-day conditions. As I have previously mentioned, the carbohydrate content of the diet was low (~<35% of total calories) and composed of plant foods with high soluble-fiber and low starch content. The protein content of the diet would have exceeded 20% and may have been as high as 30-40%. The polyunsaturated fats consumed would have had a low n6/n3 ratio, and there would have been ample levels of 20- and 22-carbon fats of both the n6 and n3 varieties. Since marrow contains 70-75% monounsaturated fats and was a favored food, it is likely that although the fat content of the diet may have been as high as 40%, it was composed of not only a much more favorable n6/n3 polyunsaturated fat ratio, but higher levels of monounsaturated fats and non-atherogenic saturated fats such as stearic acid (18:0) as well.

The role of essential fatty acids (EFAs) and the balance of omega-6 to omega-3 fats

Question: Regarding saturated fats in context, is it not the case that:

A low-fat diet that is deficient in (polyunsaturated) essential fatty acids (EFAs) will cause heart disease, whereas

A diet high in saturated fat but containing plenty of polyunsaturated n3 fats (i.e., omega-3s, found in animal foods) will keep arteries clear, such as in the Eskimo?

The first part of this is essentially correct. Indeed, levels of EFAs (essential fatty acids--particularly the chain-elongated 20- and 22-carbon forms of both the n6 and n3 families) are inversely related to levels of coronary heart disease (CHD). Paradoxically (at least in terms of the American Heart Association dietary recommendations), Hindu vegetarians from India, whose diet is composed largely of low-fat grains and pulses (legumes), maintain CHD rates equal to [Begom et al. 1995] or higher than [Miller et al. 1988] those in the USA and countries of Europe, despite their diets' lower total fat content when compared to American and European diets. Indian populations have consistently exhibited high plasma n6/n3 ratios, low levels of 20:5n3 and 22:6n3, and high levels of 18:2n6 when compared to Western populations [Miller et al. 1988; Reddy et al. 1994; Ghafoorunissa 1984; McKeigue et al. 1985]. All of these EFA profiles are conducive to CHD, and they occur because of the lack of an appropriate n6/n3 balance and the almost total lack of 20- and 22-carbon fatty acids in commonly consumed plant-based foods. Regarding the second part of the above comment, it is partially correct to say that omega-3 (n3) fats provide protection against CHD, but this has little to do, directly, with keeping the arteries clear (i.e., with atherosclerosis). N3 fats provide protection from CHD in that they lower triglycerides and perhaps VLDL; additionally, they reduce platelet adhesiveness and decrease thrombotic tendencies, as well as reducing cardiac arrhythmias [Leaf et al. 1988]. However, recent large-scale meta-analyses [Harris 1997] show that n3 fats actually cause a 5-10% rise in LDL cholesterol and a small rise (1-3%) in HDL. Eskimo populations indeed do consume higher levels of both saturated fat and polyunsaturated n3 fats than do Western populations; they also exhibit significantly lower serum LDL and total cholesterol levels than Europeans [Bang and Dyerberg 1980]. Thus, logic (derived from the meta-analytical data) dictates that the n3 fats are not the element responsible for the lower total and LDL serum cholesterol in these populations. Careful analysis of Bang and Dyerberg's data [1980] reveals a much higher protein intake (26% of total calories) compared to the 11% value in Danes. High protein intakes are known to cause drastic inhibition of hepatic VLDL synthesis [Kalopissis et al. 1995] (VLDLs are the source of LDLs), and high-protein diets in humans have been clinically shown to reduce total cholesterol, LDL cholesterol, and triglycerides while simultaneously increasing HDL [Wolfe 1995]. Further, acute consumption of high levels of low-fat (6.5%), lean-beef protein is not associated with a post-prandial rise in insulin but rather with an increase in glucagon levels [Westphal et al. 1990]. Consequently, the major reason why Eskimo diets keep serum cholesterol levels low and atherosclerosis at bay is primarily their high protein content. There is no doubt that n3 fats also contribute to lowering CHD, but the effect is not directly mediated by a lowering of LDL cholesterol; rather, it operates by the other mechanisms previously outlined.

Effect of fat, protein, and carbohydrate on glucagon levels

Follow-up question: To clarify the above statement that low-fat (6.5%) beef stimulates higher glucagon levels, isn't it the protein content of beef, rather than the fact that it is also low in fat, that stimulates glucagon? My impression was that fat intake level has little effect on the insulin/glucagon response to food. (Editorial note: Release of insulin not only causes uptake of glucose into cells, but also promotes fat storage. Glucagon is the other side of the equation, causing mobilization of body fat for conversion to energy; that is, it causes fat to be burned.)
As far as I know, there are no good, recent data evaluating the effects of varying protein/fat mixtures upon insulin/glucagon responses in humans. Most of the data involve manipulating carbohydrate, with varying amounts of fat; protein is usually held constant. The Westphal et al. [1990] paper evaluates the effect of protein/carbohydrate mixtures on serum glucagon responses. Pure dietary carbohydrate (50 g glucose) shows no rise in plasma glucagon, whereas pure protein (actually 93.5% lean beef, 6.5% fat, as mentioned above) causes the greatest rise in glucagon after 1 hour, with roughly equal areas under the curve after 3 hours when comparing pure protein to protein/carbohydrate mixtures (50 g glucose/50 g protein). Thus, there appears to be a dose/response effect on glucagon with protein/carbohydrate mixtures, and from the data it can probably be inferred that there is a dose/response effect with pure protein. As far as the insulin response goes (as opposed to the glucagon response), fat/carbohydrate mixtures cause a greater rise than carbohydrate meals alone, presumably because of the stimulatory effect of fat upon glucose-dependent insulinotropic peptide (GIP) [Collier et al. 1988]. Thus, as indicated by the question, there is a dose-dependent effect of dietary protein upon glucagon secretion which is largely independent of either carbohydrate or fat.
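To illustrate what "area under the curve" means operationally here, the sketch below computes a trapezoidal incremental AUC in Python. The glucagon readings are invented for illustration only; they are not Westphal's data:

    import numpy as np

    # Hypothetical plasma glucagon readings (pg/ml) at 0, 60, 120, 180 min.
    t_min = np.array([0, 60, 120, 180])
    protein = np.array([90, 140, 125, 110])  # pure-protein meal (invented)
    mixed = np.array([90, 120, 130, 115])    # protein/carbohydrate meal (invented)

    # Incremental area above baseline, trapezoidal rule.
    auc_protein = np.trapz(protein - protein[0], t_min)
    auc_mixed = np.trapz(mixed - mixed[0], t_min)
    print(auc_protein, auc_mixed)  # roughly comparable 3-hour areas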

Protein levels and their effect on blood lipids


Some would dismiss the idea that dietary protein can have any influence upon cardiovascular disease with the argument that there is no difference in CHD incidence in populations consuming high- vs. low-protein diets. However, a serious problem with this argument is that there is not enough variability in current protein-consumption levels worldwide to support this line of reasoning via epidemiological comparisons. Global surveys of the world's populations indicate a remarkably limited range of protein consumption, varying from about 10 to 15% of total calories [Speth 1989]. Further, except for reports of Inuit and Eskimo diets, I know of no references showing any contemporary populations consuming 15-20% of their calories as protein, much less high-protein diets in the 30-40% range of consumption such as our ancestors or recent hunter-gatherers have sometimes eaten. Speth [1989] has extensively studied protein intakes in contemporary worldwide populations and notes that most human populations today obtain between 10-15% of their total energy requirements from protein. For Americans the value is 14%; for Swedes it is 12%; for Italian shipyard workers it is 12.5-12.8%; for Japanese it is 14.4%; and for West Germans it is 11.1%. Even among athletes, values rarely exceed 15%. Speth [1989] shows that Italian athletes consumed 17-18% of their caloric intake as protein; Russian athletes consumed 11-13%; and Australian athletes competing at the 1968 Olympic Games consumed 14.4% of their daily calories as protein. These data clearly demonstrate the relative homogeneity of protein consumption levels amongst contemporary global populations.


That protein consumption may have anything to do with the atherosclerotic process, and hence CHD, is an obscure topic which has rarely been examined by the medical and nutritional communities. It is not surprising that few are aware of the literature which supports this concept. However, there are now at least three human clinical trials [Wolfe et al. 1991; Wolfe et al. 1992; Wolfe 1995] demonstrating that isocaloric (calorie-for-calorie) substitution of protein (ranging from 17-27% of total daily calories) for carbohydrate reduces triglycerides, VLDL, LDL, and total cholesterol while increasing HDL cholesterol. Further, acute consumption of high levels of beef protein without carbohydrate evokes an extremely small rise in serum insulin levels and a concomitant substantial rise in glucagon [Westphal et al. 1990]. Both of these acute responses would tend to be associated with a reduced risk for CHD. Lastly, in animal models, high levels of protein are known to dramatically inhibit hepatic VLDL synthesis [Kalopissis et al. 1995]; VLDLs are the precursor molecules for LDL cholesterol. In their classic study of the Inuit, Bang and Dyerberg [1980] showed that the serum cholesterol levels of the Inuit were 0.48 mmol/liter lower than what would have been predicted by the Keys equation, which estimates plasma lipid levels from dietary saturated fats, polyunsaturated fats, and cholesterol. At the time (1980), it was suggested that the paradoxically low serum cholesterol levels may have resulted from the higher omega-3 (n3) fats found in the Eskimos' seafood-based diet. However, after almost 30 years of research, meta-analytical studies have shown that n3 fatty acids slightly elevate (by 5-10%) LDL cholesterol concentrations, but do not materially affect total cholesterol [Harris 1997]. Consequently, it may have been the higher dietary protein intake (23-26% of total calories) in the Inuit compared to the Danish controls (11% of total calories as protein) which accounted for these differences. However, since the Keys equation considers dietary monounsaturated fats neutral (which more recent research indicates is not the case [Gardner et al. 1995]), it is possible that the higher monounsaturated fat content (57.3% of total fat) in the Inuit diet (vs. 34.6% in the Danes) may also have contributed to the plasma cholesterol differences.

Low-carbohydrate diets by themselves do not eliminate the cholesterol-raising effects of high-saturated-fat diets
Many people who have been influenced by the recent interest in low-carbohydrate dieting would argue that the cholesterol-lowering effect of the Eskimo diet stemmed from its low carbohydrate content. However, in one of the few (and best-controlled) metabolic ward trials of a carbohydrate-free (<20 g/day) diet, Phinney and colleagues [Phinney et al. 1983] demonstrated a rather large rise in serum cholesterol (159 to 208 mg/dl) in nine lean, healthy males who participated in this 35-day in-patient trial. The protein content of the diet was estimated to be 15%, whereas the fat content represented 83-85% of total daily calories. Consequently, during the dietary trial, the protein content remained similar to the average daily intake in the U.S. and was not increased. This experiment shows that a carbohydrate-free diet composed of "ground beef, breast of chicken, water-packed tuna fish, powdered egg solids, and cheddar cheese with mayonnaise, heavy cream, sour cream, and cream cheese as primary lipid sources" was definitely hypercholesterolemic. In a less-well-publicized but highly controlled clinical research center (CRC) study, Gray et al. showed similar results in a 3-week study of 10 healthy males who consumed a diet composed of 73-75% fat, 7-9% carbohydrate, and 16-20% protein. Compared to their standard (normal-carbohydrate) diet, the high-fat diet increased total cholesterol from 156.5 mg/dl to 167.6 mg/dl, and LDL cholesterol increased from 46.6 mg/dl to 55 mg/dl. The total cholesterol/HDL ratio, however, improved on the high-fat diet, going from 3.36 to 3.20. High-fat, low-carbohydrate diets--as in the Phinney [1983] and Gray studies--characteristically induce other beneficial lipid changes such as increased HDL levels and decreased triglyceride levels. These blood lipid changes (increased HDL and reduced triglycerides) have also been frequently demonstrated with reduced-carbohydrate diets [Jeppesen et al. 1997; Coulston et al. 1983] in which carbohydrate has been reduced, but not as drastically as in the Phinney and Gray trials. So in summary, the animal foods of our Stone-Age ancestors were probably non-atherogenic because they contained high levels of protein (>20% of total calories), lower levels of saturated fats, higher levels of monounsaturated fats, higher levels of n3 polyunsaturated fats, little or no trans fats, and higher levels of HUFA (>18-carbon) fats of both the n6 and n3 varieties than modern Western meat-based diets. The higher consumption of animal-based foods would have necessarily reduced the carbohydrate content of the diet, and this would also have benefited certain aspects of the lipid profile, as just enumerated.
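As a quick check on the magnitudes involved, the percent changes implied by the Gray et al. figures quoted above can be computed directly:

    def pct_change(old, new):
        """Percent change from old to new."""
        return 100.0 * (new - old) / old

    # Gray et al. values quoted above (mg/dl, plus the unitless TC/HDL ratio):
    print(round(pct_change(156.5, 167.6), 1))  # total cholesterol: +7.1%
    print(round(pct_change(46.6, 55.0), 1))    # LDL cholesterol: +18.0%
    print(round(pct_change(3.36, 3.20), 1))    # TC/HDL ratio: -4.8% (improved)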

Glycemic response of fat combined with carbohydrate


In a previous comment, I suggested that meals of pre-agricultural peoples tended to produce less of a glycemic response than do modern Western meals. This was based on the observation that hunter-gatherer meals generally were not the elaborate mixtures of fat/carbohydrate/protein that are typical of Western meat/potato meals. Hunter-gatherers quite often would eat only the animal killed for a meal without added plant courses. Thus, protein/fat macronutrient mixtures were the norm. Carbohydrates generally were consumed as they were collected, or separate from animal-based meals. It has been well-established that by mixing fat with carbohydrate, the glycemic response worsens [Collier et al. 1988].

What is the relevance of genetic differences in individual blood lipid response to high and low-fat diets?
In view of recent discussions about low-carbohydrate diets and the reevaluation of the effects of high-carbohydrate diets, there has been speculation regarding human blood-lipid responses to high- and low-carbohydrate diets, and whether or not there is a genetic basis for differential responders. To follow up on this, there is substantial evidence to show that blood-lipid response to variation in dietary fat and cholesterol intake varies widely among individuals [Mistry et al. 1981; Jacobs et al. 1983; Katan et al. 1988], and that this variability is likely attributable to genetic factors, with polymorphisms [variant forms of a gene] at several genetic loci, including genes for apolipoproteins and for low-density lipoprotein (LDL) particle size and density [Dreon et al. 1992]. There is an LDL subclass called pattern "B" which is characterized by a preponderance of small, dense LDL particles, elevated triglycerides, low high-density cholesterol (HDL), and increased coronary heart disease (CHD) risk. LDL pattern "B" occurs in approximately 30% of the male population [Austin et al. 1988]. LDL subclass pattern "A" is characterized by larger, more buoyant LDL particles. Low-fat, high-carbohydrate diets induce a reduction in the atherogenic small, dense LDL in individuals displaying pattern "B", and also cause greater reductions in LDL cholesterol than in subjects displaying pattern "A" [Dreon and Krauss 1997]. These data suggest that low-fat, high-carbohydrate diets may be more effective in lowering LDL cholesterol and small, dense LDL in about 30% of the population, and less effective in the other 70%. LDL subclass pattern "B" is influenced by a major gene or genes, with a prevalence in the American population estimated to be 25% [Austin et al. 1988]. The specific gene or genes responsible for this trait have not been identified, but there is evidence to show linkage to polymorphic markers near the LDL receptor gene on chromosome 19p [Nishina et al. 1992]. To date, there are no experimental data evaluating the effects of quite low-carbohydrate diets (<30% of total energy) upon blood-lipid responses in LDL subclasses "A" or "B". However, Krauss et al. [1995] have clearly shown that all subjects (n = 105), whether subclass "A" or "B", responded to a high-fat diet (46% of energy) with substantial increases in LDL cholesterol, and responded to a low-fat diet (23.9% of energy) with decreases in LDL cholesterol. The difference was simply in the magnitude of the negative effect experienced, not whether it occurred. This information does not support the contention by some that differential responders to high- and low-fat diets bias the interpretation of dietary intervention trials, nor does it lend support to the proposal that high-fat diets can improve blood-lipid profiles. I contend that any improvement in total cholesterol or LDL cholesterol on uncontrolled, self-administered low-carbohydrate diets is an artifact of:

Reductions in total caloric intake,

Increases in total protein,

Unknowing changes in the dietary polyunsaturated/monounsaturated/saturated fat (P:M:S) ratio, or

A combination of the three.

Further, improvements in triglycerides, VLDL, and HDL can be mainly attributed to reductions in carbohydrate. Under isocalorically (calorie-for-calorie) controlled conditions in which dietary saturated fat is increased at the expense of any other lipid or macronutrient, there will be a characteristic increase in LDL cholesterol, as shown time and again with meta-analyses [Howell et al. 1997] and under metabolic ward conditions [Phinney et al. 1983], and corroborated by in vitro and in vivo data showing that LDL receptors are down-regulated by dietary saturated fat [Brown and Goldstein 1976].

Conclusion
As I hope the foregoing has demonstrated, studies performed in the years since the low-fat, high-carbohydrate viewpoint first became standard have revealed additional factors affecting blood lipids, and have shown that the previous view was too simplistic. Serious drawbacks have become apparent in the conventional wisdom about low-fat, high-carbohydrate diets: this view no longer adequately explains what we have come to know over the last decade, with increasing resolution at the biochemical level, about the effects of macronutrient content. We are entering an era of dietary research in which the details of the underlying biochemical processes that govern lipid responses are becoming increasingly well-understood. Certain of these details validate the positive health effects that may accrue from the dietary pattern suggested by recently emerging studies of diet in human evolution. Hunter-gatherers who eat high levels of protein, lower levels of carbohydrate, and similar or even higher levels of fat (but with a much different lipid profile) compared to modern Western diets exhibit extremely positive blood lipid profiles and quite low rates of CHD. This presents a serious challenge for researchers, since this result would not be predicted by previous theories about fat in the diet. While the detrimental role of high levels of saturated fat by itself has been increasingly well-validated, the overall picture of the various other types of fat is turning out to be more complex. Fat is as essential a nutrient as the other macronutrients. More important than the overall level of fat in the diet are the roles and ratios of specific types of fat, such as the positive role of monounsaturated fats and a high n3/n6 polyunsaturated ratio, and the negative effects of trans fatty acids and deficiencies in EFAs. Where the polyunsaturated fats are concerned, modern diets contain excessive amounts of the n6 fat linoleic acid (which would have been present in lesser amounts in pre-agricultural diets), which promotes oxidation of LDL cholesterol and consequently formation of atherosclerotic plaque. Also, what saturated fats were consumed by pre-agricultural peoples came from wild animal tissues. Compared to the tissues of modern domesticated animals, these are much higher in the non-atherogenic saturated fat stearic acid and lower in the 14:0 and 16:0 fats that promote high cholesterol.


At the same time, as the roles of various fats in the diet are becoming better understood, attention has recently begun to turn to investigation of the biochemical effects of the other macronutrients on blood lipids. By comparison with the voluminous studies performed in recent decades on fatty acids, these have been relatively ignored. However, only by devoting the same detailed attention to the effects of carbohydrate and protein on blood-lipid response will we fully understand the role of all the macronutrients, in relation to each other, in health. As previously mentioned, the hyperinsulinemic effect of excess carbohydrates is looming large as a subject warranting much further study. And the initial studies that have been performed on higher protein consumption levels show that it exerts very positive effects on blood-lipid profiles. In this ongoing investigation, the "paleolithic" picture of the foods and macronutrient ratios that would have prevailed during human evolution provides a valuable template: one that can yield key insights for guiding future study into the food consumption patterns to which the human species is genetically best adapted. --Loren Cordain, Ph.D.

Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?


by Loren Cordain, Ph.D. Copyright 1999 by Loren Cordain. All rights reserved.

IMPORTANT WORD DEFINITIONS:

Calciuresis: Excretion of calcium in the urine.

Dermis: The second layer of skin underneath the epidermis, or surface, layer.

Renal: Of, or relating to, the kidneys.

Serum: Refers to levels of nutrients as measured in the blood.

The paradox of high bone mass in Paleolithic skeletons


The question of high levels of protein in the diet and the issue of calcium excretion is of particular interest in light of Paleolithic diet research, for two reasons. First, because estimates of the levels of protein--and specifically animal protein--in the human diet during at least the last 1.7 million years of human evolution (from the time of Homo erectus) are much higher than is considered prudent in some sectors of the nutritional research community today. And second because, at the very same time, the fossil evidence shows Paleolithic humans to have had high bone mass that would have been robust and fracture-resistant compared to that of modern Western humans: in exact opposition to some current nutritional theory about the alleged role of protein in causing osteoporosis. Here we'll take a look at this apparent paradox.

Protein's effect on bone loss. Three recent papers which concern the topic of high dietary protein intake and bone loss are:

Heaney RP (1998) "Excess dietary protein may not adversely affect bone." Journal of Nutrition, vol. 128, pp. 1054-1057.

Massey LK (1998) "Does excess dietary protein adversely affect bone? Symposium overview." Journal of Nutrition, vol. 128, pp. 1048-1050.

Barzel US, Massey LK (1998) "Excess dietary protein can adversely affect bone." Journal of Nutrition, vol. 128, pp. 1051-1053.

Heaney [1998] concluded that if the calcium-to-protein ratio (mg:g) is greater than or equal to 20, then there is probably adequate protection for the skeleton. Using an assumed plant/animal subsistence ratio of 35:65 as representative of estimated intake during Paleolithic times, the calcium/protein ratio for modern diets based upon Stone-Age food categories (fruits, vegetables, and lean meats, including fish, poultry, shellfish, game meat, and organ meats) would be about 2.47. (The plant/animal-food ratio of 35:65 comes from our research group, based on an updated reanalysis of recent-day hunter-gatherer diets as recorded in the Ethnographic Atlas; we hope to publish this analysis in an upcoming journal paper.) If one assumes a subsistence ratio exactly opposite (65% plant and 35% animal), as my colleague Boyd Eaton has [Eaton et al. 1985], and uses his wild plant and animal food database, then the calcium/protein ratio would be 6.29 for Stone-Age humans. The median calcium/protein ratio for 50- to 59-year-old women as shown using NHANES III (National Health and Nutrition Examination Survey) data is 9.30 [Heaney 1998].
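Heaney's index is simple enough to compute directly; here is a small Python sketch (the intake figures in the example are hypothetical, chosen to land near the 2.47 Stone-Age estimate above):

    def calcium_protein_ratio(calcium_mg, protein_g):
        """Dietary calcium-to-protein ratio in mg:g, per Heaney's [1998] index."""
        return calcium_mg / protein_g

    def skeleton_probably_protected(calcium_mg, protein_g, threshold=20.0):
        # Heaney [1998]: a ratio of 20 or greater probably protects the skeleton.
        return calcium_protein_ratio(calcium_mg, protein_g) >= threshold

    # Hypothetical high-meat day: ~700 mg calcium against ~280 g protein.
    print(round(calcium_protein_ratio(700, 280), 2))  # 2.5
    print(skeleton_probably_protected(700, 280))      # False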

The paradox of Stone-Age humans. From these data, it appears that it would be virtually impossible
for Stone-Age humans, living in their native environment and eating wild plant and animal foods, to have come remotely close to a dietary calcium/protein ratio equaling or exceeding 20. Paradoxically, the fossil record shows Stone-Age humans to have thick bones with large cortical cross-sectional areas [Ruff et al. 1993]. Consequently, it is likely that the skeleton of Stone-Age man and woman would have been more robust and fracture-resistant than that of Western, industrialized men and women [Ruff et al. 1993; Cordain, Gotshall, and Eaton 1997].

Multiple factors aside from protein also affect calcium balance


How could this apparent paradox be possible? There are several notable differences between modern diets and Paleodiets other than protein consumption that can affect calcium balance.

Sodium chloride (salt) consumption. One is that Stone-Age men and women did not consume
supplemental dietary sodium chloride (salt), which like protein can also cause increased calciuresis (calcium excretion) [Nordin et al. 1993] and loss of bone mass [Devine et al. 1995]. Because the kidney must obligatorily excrete calcium with sodium [Nordin et al. 1993], high levels of dietary sodium are now generally recognized to be the single greatest dietary risk factor for osteoporosis [Matkovic et al. 1995; Devine et al. 1995; Cappuccio 1996]. It should go without saying that in this context, "high" levels of dietary sodium are simply normal levels in Western societies.

Imbalance in the calcium/magnesium ratio. Further, the calcium/magnesium ratio was about 1:1 in
pre-agricultural diets, whereas in modern Western diets it can be as high as 4:1 [Varo 1974]. High dietary calcium can cause magnesium deficiencies, even when normal levels of magnesium are ingested [Evans and Weaver 1990]. Because supplemental magnesium appears to prevent bone fractures and can result in increased bone density [Sojka and Weaver 1995], it is possible that the high consumption of dairy products (which are high in calcium), at the expense of magnesium-rich fruits and vegetables, may unexpectedly result in reduced bone-mineral density. A few details regarding the specific effects of magnesium on calcium retention are of particular interest here in light of the aforementioned differences between Paleolithic diets and the typical modern Western diet. In pre-agricultural diets consisting of meats, fruits, vegetables, nuts, etc., the calcium (Ca) to magnesium (Mg) ratio is approximately 1:1. Because the Ca:Mg ratio of milk and dairy products is 12:1, the inclusion of milk and milk products in post-agricultural diets can raise the overall dietary Ca:Mg ratio to 3-4:1 [Varo 1974]. In animal models, it has been shown that rats develop clinical signs of magnesium deficiency after three weeks on high-calcium, normal-magnesium diets [Evans and Weaver 1990; Luft et al. 1988; Sellig et al. 1974]. Ironically, high-calcium diets may thus have a deleterious effect upon bone mineralization because of their hypomagnesic (magnesium-depleting) effect. Conversely, magnesium deficiency is a known cause of hypocalcemia (low calcium) [Rude et al. 1976]. (In other words, for either calcium or magnesium utilization to be optimum, both must be in balance with each other.) The resultant hypocalcemia stems from parathyroid hormone (PTH) unresponsiveness [Rude et al. 1978], since the effects of PTH are magnesium-dependent [Estep et al. 1969]. Gross, clinical hypocalcemia and hypomagnesemia tend not to occur in otherwise healthy post-menopausal, osteoporotic women; however, serum measures (blood levels) of magnesium concentration are not good indicators of magnesium status, and subjects with magnesium deficiencies (as measured intracellularly) frequently maintain normal serum magnesium levels [Ryzen et al. 1990]. Consequently, over a lifetime, a marginal or reduced intracellular magnesium level may adversely influence PTH responsiveness, which in turn likely compromises bone-mineral content. A recent review article [Sojka and Weaver 1995] showed that post-menopausal women given magnesium supplements over a two-year period had a significant increase in their bone-mineral density, whereas meta-analyses of calcium supplementation and bone-mineral density have been equivocal.
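The arithmetic of how a dairy-heavy diet shifts the overall ratio is easy to see; in the sketch below the milligram figures are hypothetical, chosen only to land near the ratios cited above:

    def ca_mg_ratio(foods):
        """Overall dietary Ca:Mg ratio from (calcium_mg, magnesium_mg) pairs."""
        total_ca = sum(ca for ca, mg in foods)
        total_mg = sum(mg for ca, mg in foods)
        return total_ca / total_mg

    # Hypothetical: a plant/meat base near the pre-agricultural 1:1 ratio
    # (400 mg Ca, 400 mg Mg) plus a dairy contribution near 12:1
    # (1200 mg Ca, 100 mg Mg) yields roughly the 3-4:1 Western ratio.
    print(round(ca_mg_ratio([(400, 400), (1200, 100)]), 1))  # 3.2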

Acid/alkaline dietary load. Additionally, bone mass is also dependent upon the relative acid/alkaline
dietary load [Massey 1998; Barzel and Massey 1998]. Acid generated by the diet is excreted in the urine and can cause calciuresis. Meat and fish have a high potential renal acid load (PRAL) whereas fruits and vegetables have a negative PRAL, meaning they reduce acid excretion. The human kidney cannot excrete urine with a pH lower than 5; consequently the acids (mainly phosphate and sulfate) of acid-producing foods such as meats, fish, and some cereals must be buffered partially by calcium which is ultimately derived from the skeleton [Massey 1998; Barzel and Massey 1998]. Because fruits and vegetables can act as alkaline buffers for the acids derived from meats and fish, they have been recently shown to decrease urinary calcium excretion even when dietary protein and calcium are held constant [Appel et al. 1997]. In other words, without reducing either dietary protein or calcium in the diet, calcium balance is improved when the percentage of fruits and vegetables in the diet is increased. Thus, the high levels of fruits and vegetables that Stone-Age people consumed may have partially counteracted the calciuretic effects of high-protein diets.
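For readers who want to estimate PRAL for a day's intake, the sketch below uses the Remer-Manz estimation coefficients; note that this particular formula is an assumption here, since the article cites the PRAL concept but does not give a formula:

    def pral_meq_per_day(protein_g, phosphorus_mg, potassium_mg,
                         magnesium_mg, calcium_mg):
        """Estimated potential renal acid load (mEq/day).

        Coefficients follow the Remer-Manz estimation model (assumed here;
        not quoted from this article). Positive results are net acid-forming;
        negative results are net alkalizing.
        """
        return (0.49 * protein_g
                + 0.037 * phosphorus_mg
                - 0.021 * potassium_mg
                - 0.026 * magnesium_mg
                - 0.013 * calcium_mg)

    # Illustrative (hypothetical) meat-heavy intake; generous fruit/vegetable
    # potassium pulls the total down, but it remains net acid-forming.
    print(round(pral_meq_per_day(150, 1800, 3500, 400, 800), 1))  # ~45.8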

Excessive intake of cereals (grains). The effects of cereal phytate in limiting mineral absorption are well-known and unfavorably impact calcium uptake, though this is often not noted when considering the overall picture of calcium balance in relation to the potential for osteoporosis. In addition, the acidifying effect of cereals upon the urine [Barzel and Massey 1998] (via the kidney) causes calcium carbonates from bone-mineral reserves to be used to buffer the slight metabolic acidosis that cereals produce.

Exposure to sunlight and vitamin D. An additional factor is that our Stone-Age ancestors would have
likely had higher plasma levels of vitamin D than modern man because of their greater exposure to sunlight. Vitamin D, synthesized in the dermis via ultraviolet radiation, enhances calcium absorption and can prevent bone loss.

Physical activity levels. Lastly, because our ancestors were more active than modern humans [Cordain
et al. 1997], their increased activity levels may have also improved their bone mass despite a high protein intake.

Conclusion
In summary, calcium retention or excretion is dependent on additional key factors besides protein consumption. Fully assessing the net effect requires an analysis of the entire diet and lifestyle in its overall context rather than focusing on any one factor in isolation.

--Loren Cordain, Ph.D.


Metabolic Evidence of Human Adaptation to Increased Carnivory


by Loren Cordain, Ph.D. Copyright 1998 by Loren Cordain. All rights reserved.

IMPORTANT WORD DEFINITIONS:

Dentition: Teeth.

Hominids: Bipedal primates, including human beings and our bipedal ancestors in evolutionary prehistory.

Obligate, as in "obligate carnivore" (cats are obligate carnivores in their natural environment): Able to exist or survive only in a particular environment or by assuming a particular role (definition from The American Heritage Dictionary, 3rd edition).

Pongid: The class of ape most closely related to humans, consisting of--in order of genetic and evolutionary relatedness to us--the chimpanzee, gorilla, and orangutan.

Plasma (as in "plasma levels of taurine"): Refers to levels of nutrients as measured in the blood.

Precursor (as in "precursor fatty acids"): Dietary nutrients which can be converted by the body into other needed nutrients not obtained in the diet itself.

Hepatic: Of, or related to, the liver.

Lipids: Fatty acids, i.e., dietary "fats."

Coprophagy: Ingestion of fecal material, whether intentional or unintentional.

I am in agreement with previous posts that human dentition is adapted for a generalized diet composed of both plant and animal foods, and that human populations show amazing variability in their plant-to-animal food subsistence ratios. However, it is important to recognize that hominids have evolved important metabolic and biochemical adaptations which are indicative of an increasing physiological dependence upon animal-based foods. Further, comprehensive compilations of hunter-gatherer subsistence strategies indicate that whenever it is ecologically possible, humans will almost always consume more animal food than plant food.

Background: Hunter-gatherer plant/animal subsistence ratios.


Our laboratory has recently compiled the plant/animal-food subsistence ratio data in the Ethnographic Atlas [Murdock 1967] for all worldwide hunter-gatherer populations which have been studied either historically or by contemporary anthropologists. The analysis shows that in the majority (61.3%) of worldwide hunter-gatherers, gathered plant food represents 35% or less of the total food utilized. Only 2.2% of the world's hunter-gatherers derive 66% or more of their total foods from plants. Further, not a single hunter-gatherer population derives 86% or more of its total calories from plant foods. The most frequently occurring (mode) plant/animal subsistence ratio for worldwide hunter-gatherers is 16-25% plant/75-84% animal, and the median value is 26-35% plant/65-74% animal. These values corroborate five careful modern studies of hunter-gatherers showing a mean energy (caloric) intake from animal-food sources of 59% [Leonard et al. 1994].


Comparing the human gut with ape guts and biochemical adaptations of carnivores.
Pongids (the primates to which humans are most closely related), because their diet is largely plant-based, must maintain large and metabolically active guts to process the fibrous plant foods which compose 93% or more of their dietary intake. In contrast, the human gut is much smaller and less metabolically active than the ape gut. Presumably this adaptation (reduction in gut size and metabolic activity) evolved in humans because the inclusion of nutrient-dense, animal-based foods by our early hominid ancestors allowed the selective pressure for a large, metabolically active gut to be relaxed [Leonard et al. 1994; Aiello and Wheeler 1995]. In addition to the smaller gut that humans maintain relative to apes, there are other metabolic and biochemical clues which point to increased utilization of animal food by humans over our evolutionary history. By evaluating the metabolic and biochemical dietary adaptations of cats (obligate carnivores) and those of humans (omnivores), it becomes apparent that evolution has shaped both hominid and feline metabolic machinery towards a diet in which animal food was predominant. Obligate carnivores, such as cats, must obtain all of their nutrients from the flesh of other animals and have therefore evolved certain biochemical adaptations which are indicative of their total dietary dependence upon animal-based foods. Most of these biochemical adaptations involve the loss (or reduced activity) of certain enzymes required for the synthesis of essential nutrients. These adaptations generally occurred because the evolutionary selection pressure to maintain these metabolic pathways was relaxed as cats gradually increased the amount of animal food in their diet, progressing evolutionarily from omnivory into obligate carnivory.

Taurine
Taurine is an amino acid which is not found in any plant-based food [Laidlow et al. 1990] and which is an essential nutrient in all mammalian cells. Herbivores are able to synthesize taurine from precursor amino acids derived from plants, whereas cats have completely lost the ability to synthesize taurine [Knopf et al. 1978]. Since all animal-based foods (except cow's milk) are rich sources of taurine [Laidlow et al. 1990], cats have been able to relax the selective pressure required for taurine synthesis because they obtain all of this nutrient that they need from their exclusive meat-based diet. Humans, unlike cats, still maintain the ability to synthesize taurine in the liver from precursor substances; however, this ability is quite limited and inefficient when compared to herbivores. Vegan vegetarians following diets devoid of animal products display low levels of both plasma and urinary taurine [Laidlow 1988]--levels which are indicative of the poor ability of humans to synthesize taurine. Similar to cats, this inability to efficiently synthesize taurine has come about because the selective pressure to produce this amino acid has been gradually reduced due to humankind's long reliance upon animal food, a food which is quite high in taurine.

20- and 22-carbon fatty acid requirements


Plant-based foods contain 18-carbon fatty acids of both the omega-3 and omega-6 families, but are virtually devoid of the 20- and 22-carbon fatty acids that are required for the normal functioning of all mammalian cells, whether the mammal is an herbivore or carnivore. Herbivores have evolved hepatic (liver) enzymes (desaturases and elongases) which allow these precursor, plant-based 18-carbon fatty acids to be chain-elongated and desaturated to their 20- and 22-carbon products. Cats have extremely low levels of the enzymes required to make 20- and 22-carbon fatty acids from 18-carbon fatty acids [Salem et al. 1994]. Again, the selection pressure to synthesize 20- and 22-carbon lipids (fatty acids) has been almost entirely removed because cats obtain sufficient quantities of these long-chain fatty acids by eating animal tissues, which are rich sources of these lipids. Humans, though not as inefficient as the cat, also have relatively inefficient elongase and desaturase enzymes [Salem et al. 1994]. Again, this metabolic change has occurred largely because the need to desaturate and chain-elongate 18-carbon plant fatty acids to their 20- and 22-carbon products has been reduced: humans, like cats, have obtained a large portion of their 20- and 22-carbon lipids directly by eating other animal tissues.

Vitamin A synthesis
All animals, whether herbivore or carnivore, require vitamin A. Vitamin A is not found in any plant-based food; consequently, herbivores must synthesize it in the liver from beta-carotene consumed in plant-based foods. Cats have lost the ability to synthesize vitamin A from beta-carotene [MacDonald et al. 1984], and must obtain all of their vitamin A from the organs (liver, kidney) of their prey. Again, cats have lost the ability to synthesize vitamin A because the selective pressure (need) to provide adaptive energy for the synthesis of proteins to catalyze the production of vitamin A was reduced as cats progressively increased the amount of animal foods in their diets. Recently, it has been shown that humans have a limited capacity to absorb beta-carotene from plants [de Pee and West et al. 1995] (the bioavailability of beta-carotene from plants is low for humans compared to its bioavailability from other sources), presumably because humans, like cats, have consumed vitamin A-rich animal food sources for eons and are in a transitional state from omnivory to obligate carnivory.

Vitamin B-12
Vitamin B-12 is an essential nutrient for both herbivorous and carnivorous mammals. Because B-12 is not found in higher plants, herbivorous mammals must rely solely upon absorption of B-12 from bacteria that synthesize it in their gut. Cats can neither synthesize B-12 nor absorb it from their gut; consequently they have become wholly dependent upon animal flesh as their source for this essential nutrient. Humans, like cats, cannot depend on the absorption of bacterially produced vitamin B-12 from the gut, and are reliant upon animal-based sources of this essential vitamin, since it does not occur in a biologically active form in any of the plant foods which humans normally eat. While some viable B-12 is synthesized in the human colon, the site of absorption is the ileum, which is "upstream" from the colon at the lower end of the small intestine; thus for humans, B-12 synthesized in the colon is unavailable and must come from the food eaten [Herbert 1988]. Regarding possible B-12 synthesis in the small intestine above the ileum, the consensus of the scientific literature indicates that any amounts potentially produced are not significant or reliable enough to serve as a dependable or sole source for most individuals. Additionally, while most cases of B-12 deficiency in omnivores are due to problems of impaired absorption rather than a deficiency of nutritional intake [Herbert 1994], the opposite situation prevails in vegetarians eating only minimal amounts of animal by-products [Chanarin et al. 1988]. Further, studies in vegans have shown that despite physiological recycling and conservation mechanisms that become increasingly efficient as B-12 intake falls below normal daily requirements--so that very little is lost from the body--the likelihood is high that B-12 deficiency will eventually develop (after 20 years or more) in pure vegans who refrain without fail from ingesting any animal-based products and do not take B-12 supplements [Herbert 1994]. Recent work has delineated four stages of inadequate B-12 levels in strict vegetarians [Herbert 1994; Herbert 1988]. Vegans in early Stage I depletion--prior to the ongoing depletion of stores and declining blood levels of Stage II, the biochemical deficiency and impaired DNA synthesis of Stage III, and the clinical deficiencies of Stage IV--are able to maintain normal serum B-12 levels. However, this occurs by drawing from stored reserves in the liver and elsewhere which gradually become depleted, eventually to the point where actual deficiency develops many years later in those who maintain strict habits.


It is this negative metabolic B-12 balance--which occurs soon after exogenous B-12 ceases to be ingested in appreciable quantities, many years prior to actual deficiency--which points to the human requirement for animal-based B-12 sources if one is to maintain a positive B-12 balance. One need not show actual cases of deficiency and end-stage megaloblastic anemia, but only the trend of long-term negative B-12 balance, to demonstrate the human metabolic need for animal-based foods to maintain a neutral or positive homeostatic balance. It is now possible to determine negative homeostatic B-12 balance directly by distinguishing and measuring levels of the two forms in which B-12 is carried in the blood: either bound to the transcobalamin II "delivery" molecule (called TCII, for short), or bound to the haptocorrin molecule, which is a form of "circulating storage." Haptocorrin maintains equilibrium with body stores, meaning that haptocorrin levels reflect current reserve stores of B-12. TCII, however, being the "delivery" molecule for B-12, transports and gives up its B-12 to cells that are actively using B-12 in DNA synthesis, and has a half-life in the blood of only 6 minutes. Thus, when exogenous intake of B-12 falls below normal, levels of TCII-carried B-12 begin to reflect the deficit rapidly, and subnormal levels will show up within one week, demonstrating negative B-12 balance [Herbert 1994]. It is probable, therefore, that vegans with long-term normal B-12 balance are ingesting--inadvertently or otherwise--at least small amounts of B-12, if not from supplements, then from unreported animal-based sources in their food or contamination by such sources. (In one study of vegans for which this was observed, the cause was eating unwashed vegetables that had been grown in gardens containing intentionally manured soils, from which the B-12 came [Herbert 1988]. Ironically, the manure in this case was the vegans' own excrement, which, as pointed out above, harbors bacteria that produce B-12 in the human colon--where B-12 cannot be absorbed. Not unless, of course, it is reingested, as in the unintentional coprophagy occurring in this instance, so that it can pass back through the small intestine again to the ileum, where B-12 is actually absorbed.) An indication of the masking effect of previously stored B-12 reserves in obscuring ongoing negative B-12 balance can be seen in long-term vegan mothers and their infants. Such mothers may maintain blood levels of haptocorrin B-12 in the normal range for lengthy periods (years) due to increasingly efficient recycling of B-12 as their reserve stores become depleted, and in adult vegans with such improved B-12 reabsorption, clinical deficiency may take 20-30 years to manifest. However, infants of such mothers are born with almost no reserve stores (little or none are available in the mother's body to pass on to them) and go into clinical deficiency much more rapidly [Herbert 1994]. In summary, the inability of humans to absorb bacterially produced B-12 in the colon, and the evidence that strictly behaving vegans will show negative TCII-carried B-12 balance even when total serum levels are in the normal range, are indicative of the long evolutionary history of animal-based foods in our diet.

Recap
These metabolic and biochemical adaptations in humans in response to increasingly meat-based diets, as well as the anthropological evidence provided by both contemporary and historical studies of hunter-gatherer diets, provide strong evidence for the central role of meat and animal tissues in the human diet. Although it is true that human populations can survive under broad plant/animal subsistence ratios, the consensus evidence supports the notion that whenever it was ecologically possible, animal calories would have always represented the majority of the total daily energy intake.

--Loren Cordain, Ph.D.


Longevity & Health in Ancient Paleolithic vs. Neolithic People


Not what you may have been told

by Ward Nicholson

Copyright 1997, 1999 by Ward Nicholson. All rights reserved. Contact author for permission to republish.

Special update as of April 1999: LATE-BREAKING ADVANCES IN PALEOPATHOLOGICAL AGE-ESTIMATION TECHNIQUES have suggested that studies based on earlier techniques (as in the paper discussed here) may underestimate the age at death of older individuals and overestimate that of younger individuals. It's possible the range of estimation errors involved could be substantial. Thus, the profile of age-distribution results in compilation studies like the one discussed below may be flattened or compressed with respect to "true age." On the other hand, this consideration does not affect the "relative age," so to speak, of comparisons between the ages at death of different skeletal specimens, as summarized here; nor does it materially impact inferences about health status as indicated by skeletal data. For that reason, the results presented here remain of considerable interest in comparing the ages/health status of late Paleolithic peoples vs. the Neolithic agricultural peoples who followed them. At a later date, updated information may be provided to supplement this report concerning estimated age-at-death figures.

How does the health/longevity of late Paleolithic hunter-gatherers compare with that of the Neolithic farmers who succeeded them? Periodically one will hear it stated in online discussion forums devoted to raw foods and vegetarianism that Paleolithic peoples only lived to be 25 (or 30, or 35) years, or whatever age. (The lack of exactitude in such figures illustrates how substantiating one's "scientific facts" is not usually a very highly emphasized value in these forums.) The intended point is usually that those terribly debauched flesh-eating cavemen--and women, presumably--were not living very long due to their consumption of meat. As is often the case with such "facts," however, if one looks at the documented sources, one sees a different picture.

Here we present a summary of a classic paper on the health and longevity of late Paleolithic (pre-agricultural) and Neolithic (early agricultural) people. [Source: Angel, Lawrence J. (1984) "Health as a crucial factor in the changes from hunting to developed farming in the eastern Mediterranean." In: Cohen, Mark N.; Armelagos, George J. (eds.) (1984) Paleopathology at the Origins of Agriculture (proceedings of a conference held in 1982). Orlando: Academic Press. (pp. 51-73)]

Note that these figures come from studies in the field of "paleopathology" (the investigation of health, disease, and death through archaeological study of skeletons) of remains in the eastern Mediterranean (defined in Angel's paper to also include Greece and western Turkey), an area where a more continuous data sample is available from ancient times. Due to the unavoidable spottiness of the archaeological record in general, however, samples from the Balkans, the Ukraine, North Africa, and Israel were included for the earliest (Paleolithic and Mesolithic) periods. While the populations in the region were not always directly descended from one another, focusing the study within the eastern Mediterranean minimizes bias in the data due to genetic change over time.

The table below is adapted and condensed considerably from Angel's full table in the above paper. Regarding the indicators shown, Angel comments that, archaeologically, lifespan is the simplest indicator of overall health. Growth and nutrition status are generally indicated by skull base height, pelvic inlet depth index, and adult stature--the latter two of which are shown here in addition to lifespan.

HEALTH & LONGEVITY OF ANCIENT PEOPLES

Columns: Historical Time Period | Pelvic Inlet Depth Index, % (higher is better) | Average Adult Stature, Male, cm (ft/in) | Average Adult Stature, Female, cm (ft/in) | Median Lifespan, Male (yrs) | Median Lifespan, Female (yrs)

30,000 to 9,000 B.C. ("Late Paleolithic," i.e., roughly 50/50 plant/animal diet--according to the latest figures available elsewhere) | 97.7 | 177.1 (5'9.7) | 166.5 (5'5.6) | 35.4 | 30.0
9,000 to 7,000 B.C. ("Mesolithic," transition period from Paleolithic to some agricultural products) | 86.3 | 172.5 (5'7.9) | 159.7 (5'2.9) | 33.5 | 31.3
7,000 to 5,000 B.C. ("Early Neolithic," i.e., agriculture first spreads widely: as diet becomes more agricultural, it also becomes more vegetarian in character--relatively much less meat, at roughly 10% of the diet, and much more plant food, much of it grain-based) | 76.6 | 169.6 (5'6.8) | 155.5 (5'1.2) | 33.6 | 29.8
5,000 to 3,000 B.C. ("Late Neolithic," i.e., the transition is mostly complete) | 75.6(?) | 161.3 (5'3.5) | 154.3 (5'0.7) | 33.1 | 29.2
3,000 to 2,000 B.C. ("Early Bronze" period) | 85 | 166.3 (5'5.4) | 152.9 (5'0.2) | 33.6 | 29.4
2,000 B.C. and following ("Middle People") | 78.8 | 166.1 (5'5.4) | 153.5 (5'0.4) | 36.5 | 31.4
Circa 1,450 B.C. ("Bronze Kings") | 82.6(?) | 172.5 (5'7.9) | 160.1 (5'3.0) | 35.9 | 36.1
1,450 to 1,150 B.C. ("Late Bronze") | 79.5 | 166.8 (5'5.7) | 154.5 (5'0.8) | 39.6 | 32.6
1,150 to 650 B.C. ("Early Iron") | 80.6 | 166.7 (5'5.6) | 155.1 (5'1.1) | 39.0 | 30.9
650 to 300 B.C. ("Classic") | 83.5 | 170.5 (5'7.1) | 156.2 (5'1.5) | 44.1 | 36.8
300 B.C. to 120 A.D. ("Hellenistic") | 86.6 | 171.9 (5'7.7) | 156.4 (5'1.6) | 41.9 | 38.0
120 to 600 A.D. ("Imperial Roman") | 84.6 | 169.2 (5'6.6) | 158.0 (5'2.2) | 38.8 | 34.2
Medieval Greece | 85.9 | 169.3 (5'6.7) | 157.0 (5'1.8) | 37.7 | 31.1
Byzantine Constantinople | 87.9 | 169.8 (5'6.9) | 154.9 (5'1.0) | 46.2 | 37.3
1400 to 1800 A.D. ("Baroque") | 84.0 | 172.2 (5'7.8) | 158.0 (5'2.2) | 33.9 | 28.5
1800 to 1920 A.D. ("Romantic") | 82.9 | 170.1 (5'7.0) | 157.6 (5'2.0) | 40.0 | 38.4
"Modern U.S. White" (1980-ish presumably) | 92.1 | 174.2 (5'8.6) | 163.4 (5'4.3) | 71.0 | 78.5

One can see from the above data that things are rarely as clear-cut as dietary purists would like them to be. For any period in time, there is good and there is bad. The main thing to note here about the short average lifespans compared to modern times is that the major causes are thought to have been "occupational hazards": accidents, trauma, the stresses of nomadism, and so forth. It is not always clear how strongly other conclusions can be drawn about the effect of diet from these figures, but, all other things being equal:

- Median longevity decreased slightly during the first several millennia after the introduction of agricultural foods, during which plant foods became a greater part of the diet, and meat a lesser part, than previously. This would seem to indicate that meat/protein consumption itself was not the factor responsible for decreased longevity (since less of it was being eaten after the late Paleolithic).

- From some of the later time periods involved, when civilizations were on the rise and fall, it appears that social factors had the biggest impact on longevity, particularly since longevity never rose above about age 45 for long--often falling below that figure for centuries at a time--until the 1900s, since which time it has almost doubled.

- Perhaps the most reliable conclusion to be drawn from the data here is that while diet is a significant influence on longevity, it is only part of the mix, and perhaps not as powerful a determinant as other factors.

Angel himself comments on the interplay among them:

The table shows two differing breakdowns of health with subsequent advances. [Note: in the original table, there were additional data besides the above that indicated health status as based on skeletal indices.] First, there was a fairly sharp decline in growth and nutrition during the confusions and experiments of the transformation from hunting to farming, with its many inventions and increasing trade and disease, between about 10,000 and 5,000 B.C. Partial recoveries and advances in health occurred during the Bronze Age rise of civilization; then real advance (e.g., a 7- to 11-year increase in longevity) occurred with the rise of Hellenic-Roman culture. Second, there was an increase in disease and crowding during the decline and religious metamorphosis of the Roman Empire, eventually leading to an irregular breakdown of general, but not nutritional, health under a complex disease load, from about A.D. 1300 to 1700. (p. 58)

Other interesting tidbits on diet and health from Angel's paper relating to the Paleolithic/Neolithic transition:

- In prehistoric times (which would include the Paleolithic, Mesolithic, and Neolithic periods in the table above), human infant mortality was 20-30%. (For wild animals, the figure is 60-80%.) Few people lived much past the end of their fertile reproductive period.

- Paleolithic females died younger than males due to the stresses of pregnancy and childbirth while still carrying the burdens of food-collecting and moving camp. "The best explanation for relatively short [Paleolithic] life span is the combination of stresses of nomadism, climate, and warfare. The latter is especially clear in the Jebel Sahaba population, where projectile wounds affecting bone are very common and 'almost half the population probably died violently.' [Wendorf 1968]" (pp. 59-60) [Note: violence/trauma as a major cause of death was true of the Mesolithic as well.]

- A somewhat more sedentary pattern during the Mesolithic increased longevity of females slightly due to lessened migration stress. On the other hand, the incipient decreases seen in stature indicate a somewhat increased level of disease (such as malaria and hookworm), likely resulting from more settlements near water and marshes. (Increased seafood consumption in lieu of red meat may also have had the effect of reduced caloric consumption, a contributor to nutritional stress as well.)

- Drops in stature due to nutritional stress begin appearing in places during the Mesolithic, although in general stature is still good. One site shows signs of seasonal growth arrest. [Note: Growth-arrest lines in bone are seen in the young of populations experiencing seasonal food shortages and consequent nutritional shortfall.] There are also a few site-specific (i.e., localized, not widespread) indications of anemia (i.e., porotic hyperostosis, which is bone marrow-space thickening and porosity), possibly due to thalassemia, a new disease which apparently evolved a few thousand years earlier. Hunting continued at a high enough level, however, that protein and vitamin D levels were maintained sufficiently to sustain relatively healthy growth, and only small losses in adult stature are seen overall compared to the Paleolithic.

- Mesolithic subsistence was characterized by four new practices and inventions: (1) the use of "composite" tools fashioned from multiple rather than single materials, including harpoons, arrows, and sickles; (2) the bow-and-arrow (which partially replaced spears and atlatls [an atlatl is a spear-throwing device]); (3) domestication of the dog for hunting (dogs also became pets); and (4) harvesting of wild grain (prior to actual cultivation later).

- A 100-meter rise in sea level at this time, due to climatic warming, led to encroachment of water further inland, promoting a northward spread of malaria into populations not yet adapted. The rise in sea level tended to restrict migration; however, trading for obsidian (a type of volcanic stone/glass prized for sharp-edged tools) helped offset this, and promoted knowledge and the spread of farming practices, as well as sailboats and fishing.

- During the Neolithic, population density increased 10- to 50-fold over the Paleolithic, supported by the spread of grain-farming. Angel estimates meat consumption fell to 10-20% of the Paleolithic level with this transition in subsistence. Neolithic sites show an increasingly settled way of life, as exemplified by evidence of food storage. However, farming was hard work, and skeletal evidence shows signs of the heavy effort needed, which--combined with a diet adequate in calories but barely or less than adequate in minerals, due to the depleting effects of phytate (phytates in grains bind minerals and inhibit absorption)--led to a state of low general health. The considerable decrease in stature at this time (roughly 5-6 inches, or 12-16 cm, shorter than in pre-agricultural times, per the table above) is believed to have resulted from restricted blood calcium and/or vitamin D, plus insufficient essential amino acid levels, the latter resulting from the large fall in meat consumption at this time (as determined by strontium/calcium ratios in human bone remains). Most disease stressors in evidence at this time came from crowded settlement, and included hookworm, dysentery, and malaria, consequent upon more frequent location of settlements near marshes/streams without tree cover. Also at this time, genetic adaptation to endemic infectious diseases such as malaria began to occur.

- Low nutritional and health status continued from the late Neolithic, with only slight fluctuations, until Classical times 5,000 years later, as told in the evidence of skull base height 15% below the Paleolithic norm, a pelvic inlet depth index 7% below, and 3 to 4 times higher rates of dental disease. (Efficient early-childhood growth is reflected in skull base height and in evidence of dental health, while pelvic inlet depth index and long-bone roundness are indicators of the degree of late-childhood nutrition.) Strontium/calcium ratios point to low levels of red meat consumption. However, zinc levels were on a par with those of modern times (zinc being a mineral typically obtained in the largest quantities from animal foods), strongly suggesting it was coming from fish: red meat consumption was low, and the zinc levels found are beyond the amounts possible from plant-food consumption alone. Given this animal-food source for critical skeletal-building minerals--which would normally also be reflected in good values for skull base height, pelvic inlet depth, and adult stature--the poor mineral status reflected in these measurements points, as part of the explanation, to the effect of continued phytate intake from grains, a substance which binds minerals and prevents efficient absorption.

Angel sums up the Paleolithic-to-Neolithic-and-beyond transition as follows [p. 68]:


Disease effects were minor in the Upper [Late] Paleolithic except for trauma. In postglacially hot areas, porotic hyperostosis [indicative of anemia] increased in Mesolithic and reached high frequencies in Neolithic to Middle Bronze times. [Reminder note: The end of the last Ice Age and the consequent melting of glaciers which occurred at the cusp of the Paleolithic/Neolithic transition caused a rise in sea level, with a consequent increase in malaria in affected inland areas which became marshy as a result.] Apparently this resulted mainly from thalassemias, since children show it in long bones as well as their skulls. But porotic hyperostosis in adults had other causes too, probably from iron deficiency from hookworm, amebiasis, or phytate, or the effect of any of the malarias. The thalassemias necessarily imply falciparum malaria. This disease may be one direct cause of short stature. The other pressure limiting stature and probably also fertility in early and developing farming times was deficiency of protein and of iron and zinc from ingestion of too much phytic acid [e.g., from grains] in the diet. In addition, new diseases including epidemics emerged as population increased, indicated by an increase of enamel arrest lines in Middle Bronze Age samples.... We can conclude that farmers were less healthy than hunters, at least until Classical to Roman times. [Due to the difficulty in disentangling all relevant factors, as Angel explains a bit earlier] [w]e cannot state exactly how much less healthy they were, however, or exactly how or why.

--Ward Nicholson


Is Cooked Food Poison?


Looking at the Science on Raw vs. Cooked Foods

by Jean-Louis Tu
Introduction to the controversy

The cooking = toxicity = disease paradigm. One of the most common ideas in raw-food circles is that most diseases are due to toxemia, and that one cannot be truly healthy without discontinuing what is believed to be one of the major sources of such internal toxemia: cooked food, even if such food is unprocessed in any respect other than heating. While it certainly appears true that many people experience (sometimes impressive) improvements when first coming to raw food, it also appears that long-time pure raw-foodists who have maintained the diet for many years are rare. Anecdotal evidence also suggests (no peer-reviewed research is available on the issue, to our knowledge) that those eating 100% raw foods do not appear to be any healthier on average than people eating predominantly raw, and that raw diets are not the only diets that may work.

Evidence, experience, and arguments addressed in this paper. The material presented here is based on (A) an extensive review of scientific literature and the logical conclusions to be drawn or inferred from it, as well as (B) personal experience with eating 100% or close to 100% raw food, plus reading about many other people's experiences (since little if any scientific research is available on raw-fooders). After examining here what are, we believe, virtually all of the arguments traditionally offered from both sides for and against cooking, the conclusion we are led to is that the dangers of cooking have been largely overstated. At the same time, however, it would obviously be erroneous to say that eating raw doesn't affect our health in any way, and in fact we do believe the available knowledge indicates that eating at least partially raw is important.

Subject is not black-and-white. The objective here will be to investigate one by one all of the known effects of heating on food, and to examine with a critical eye all the classical raw-foodist claims about the necessity of eating raw. In particular, we'll see that some of these claims appear to be true or partially true; others wrong or very doubtful; and also presented will be some benefits of cooking in certain situations--which, as we will see, depend very much on the particular food in question. The present paper is unavoidably quite long, due to the complexity of the problem. Things are not black-and-white in this subject, contrary to what many people believe. The hope here is to at least convince the reader of this last point.

Logical sequence of paper. We will begin our look first in Part 1 with such questions as: Is cooked food toxic? What is the influence of cooking on carcinogenesis? etc. Then, in Part 2, we'll attempt to determine whether raw food is more nutritious than cooked food (what about enzymes, vitamins, etc.?). Finally, in Part 3, we'll discuss the question of 100% raw diets versus "predominantly raw" diets.


PART 1: Is cooked food "toxic"?


Specific evidence needed rather than vague claims
This part is sometimes fairly technical, but since the raw-foodist claims that cooked food is toxic are so often assertions lacking in specific support, it is necessary that we first examine in detail whether claims about toxicity of cooked food are justified by specific evidence. In particular:

- Maillard molecules have been accused by Guy-Claude Burger (founder of Instinctive Nutrition) of all evils. Below, we show that they do have some (mild) antinutritional properties, but on the other hand are anticarcinogenic.

- Natural and non-natural carcinogens. It has been said that cooking results in the formation of carcinogens. We review the literature on the subject, and observe that these carcinogens are only formed in meat and fish cooked at high temperatures, and that by themselves they do not explain all incidences of cancers related to food consumption. In particular, there are many other natural and non-natural carcinogens, and the potency of a carcinogen can be affected, positively or negatively, by numerous dietary factors.

- Toxins in raw foods. We review some of the toxins found naturally in raw foods, and the effect (or absence of effect) of cooking on them.

- Other common claims about cooking. Then, we investigate other effects of cooking, such as how heating affects fats, and discuss two standard raw-foodist arguments: "Pottenger's Cats" and digestive leukocytosis.

Maillard molecules and heterocyclic amines


The Maillard reaction was discovered in 1916, and typically occurs in all forms of cooking. Our objective here is to examine one by one all the effects of these molecules, and determine whether they have a significant impact on our health or not. We will see that, at least when gentle methods of cooking are used, the effects of Maillard molecules on food quality are minimal.

How the Maillard reaction(s) occur

The Maillard reaction is not a single reaction, but in fact a series of reactions between proteins and carbohydrates. The reactions occur during storage at room temperature, as well as during cooking, with the rate of reaction accelerating as temperature increases. It should be pointed out that virtually all foods contain both proteins and carbohydrates. Even meat contains very small amounts of carbohydrate, i.e., glycogen (muscles store energy in the form of glycogen) and glucose (blood contains some glucose). Cooked meat contains fewer Maillard molecules than foods high in both protein and carbohydrate, such as milk, that have been heated under the same conditions; but we'll see below that Maillard molecules are the precursors of carcinogenic compounds called "heterocyclic amines" in high-temperature grilled meat and fish.

Browning, aromas, and flavors. So-called "Amadori products" are the result of early Maillard reactions. Then, brown pigments are created, giving the characteristic color of some cooked foods like bread crust, as well as volatile compounds which give various odors such as roasting aromas. More than 2,000 volatile compounds have been identified (and certainly many more exist) [Finot et al. 1990]. (Note: It may be that Maillard reactions are not responsible for all browning that occurs during cooking and aging; oxidation may also be responsible. For instance, meat browns quite easily despite the minuscule amounts of carbohydrates present with which to react with proteins.)


Generation of Maillard products depends on a variety of factors. The proportions and amounts of different Maillard products depend on processing time, temperature, water activity, and pH, resulting--in particular--in a variety of flavors and colors. This explains why under- or overcooking can spoil the flavor of a meal. (Note: "Water activity" is a number which reflects the active portion of the moisture content of a product, i.e., the part which generates a vapor pressure at the surface of the product. This quantity is correlated with total moisture content. For pure water, water activity is equal to 1. For more technical details, see http://www.rotronic-usa.com/Technical/Aw_define.htm.)
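For reference, the standard definition of water activity is simply a ratio of vapor pressures (the notation below is ours, not the cited source's):

    a_w = p / p_0

where p is the partial vapor pressure of water at the surface of the food, and p_0 is the vapor pressure of pure water at the same temperature. Pure water thus has a_w = 1, while drier or more solute-bound products have lower values.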

Adaptation and toxicity. One of the main arguments of Instinctive Nutrition is as follows: Since humans have been cooking for only a relatively short period of time, they can't possibly have adapted (in the Darwinian sense) to so many different chemical by-products which don't occur naturally. (Side note of interest: Instincto literature typically claims cooking has only been occurring for the last 10,000 years, when in fact it has been practiced regularly for at least roughly the last 40,000 years according to paleontological evidence, and perhaps considerably longer; see Fire and Cooking in Human Evolution on this site, including the postscript, for a more in-depth discussion of the evidence for prehistoric fire use.) The aim of this section is to give a brief review of what is known about the toxicological consequences of the Maillard browning reaction.

Are Maillard molecules unnatural?


First of all, since there are no known raw-food human cultures, even among hunter-gatherers or other more primitive peoples, and since widespread cooking has been around for at least 40,000 years (perhaps considerably longer), one might legitimately wonder whether eating a 100% raw-food diet is still totally natural for human beings, but that is outside the scope of our discussion here.

Both stored and cooked foods contain Maillard products. The second remark is that, since the reaction can and does occur at room temperature, many of the Maillard compounds are certainly found in uncooked foods, though in different (usually lower) concentrations than in cooked foods. One may also observe that many raw-fooders are reluctant to consume anything heated above 104F (40C), even for a few minutes, while they will readily use foods preserved for months (like nuts or olives) which contain a substantial amount of Maillard reaction products and, arguably, are thus hardly more natural than cooked foods.

The body's normal metabolic processes also produce Maillard molecules via non-food pathways. Finally, there has been growing recent interest in studying the Maillard reaction in vivo (in living organisms, as opposed to in vitro, i.e., in "test tubes" or other situations outside the living organism), and more particularly in relation to diabetes and aging. It is thought that the cross-linking between long-lived proteins such as collagen and free sugars (especially fructose, which has a high cross-linking potential) produces Advanced Glycation Endproducts, or AGEs (the products of the Maillard reaction at an advanced stage), which contribute to tissue degeneration [Baynes and Monnier 1989]. For the intrigued reader, fructose is an intermediate product of a chain of reactions called the "sorbitol pathway," one of the several possible pathways of glucose metabolism. (I would also like to point out here that other theories of aging exist, e.g., those related to telomere length.)

Production of Maillard molecules via elevated blood sugar (diabetes, high-fruit diets) may be more of a concern for raw-fooders. The point of the above is that there is no reason to fear Maillard molecules excessively, since they are produced naturally inside our body whether we eat 100% raw or 100% cooked. Moreover, if one of the goals of the raw-fooder is to increase longevity, then it may be more important to regulate blood sugars (since the Maillard reactions that occur among the body's own tissues are accelerated in diabetics [Baynes and Monnier 1989]) than to worry excessively about avoiding dietary Maillard molecules.


Note again that dietary Maillard molecules involve FOOD proteins, not the BODY'S proteins; there is no reason to believe that Maillard reaction products consumed in food in any way participate in the body's own internal cross-linking reactions that contribute to aging, since the latter Maillard reaction products are produced as part of normal cellular metabolism, and via a separate biochemical pathway. Given that so many raw-fooders who attempt to entirely avoid cooked foods often tend to eat high-fruit diets (which are higher in sugars), this is a consideration such individuals may want to keep in mind.

Metabolic defenses against AGEs. Furthermore, the body is not defenseless, since AGEs forming on body proteins such as collagen are recognized and endocytosed (engulfed) by macrophages [Vlassara et al. 1989], i.e., destroyed by white blood cells. (Below, however, we'll see that dietary Maillard molecules can pose risks to the vascular system and kidneys in diabetics, but there is no evidence that such toxic effects occur in non-diabetic individuals.)

How important is genetic adaptation regarding cooking?

Assessing the meaningfulness of the distinction between natural and non-natural toxins

Ames et al. [1990] have discussed the effects of synthetic versus natural chemicals in the body, and their arguments can be extended to the question of Maillard molecules. It is assumed by proponents of Instinctive Nutrition that, because plants have long been part of human evolutionary history whereas cooking is more recent, the mechanisms that animals have developed to cope with the toxicity of natural chemicals will fail to protect us against Maillard molecules. But this assumption is flawed for several reasons:

Mechanisms have evolved for dealing with natural toxins. Various natural toxins cause cancer in vertebrates; thus, foods are not toxin-free to begin with, and the body has already developed mechanisms to deal with natural toxins. Also, since Maillard reactions occur in nature and, as noted above, in the body itself, one might presumably expect the body to be equipped with detoxification mechanisms (in the liver or other organs) to handle Maillard products, at least in some measure.

Defenses that animals have evolved against exogenous toxins are mostly of a general type:
- Shedding of surface layers of the mouth, esophagus, stomach, intestine, colon, skin, and lungs.
- General detoxification mechanisms, such as antioxidant defenses, and glutathione transferase for detoxifying alkylating agents (some of which are mutagens, either directly or once activated by microsomal [liver] enzymes).
- Efflux (direct excretion) of toxins out of liver and intestinal cells. Klohs and Steinkampf [1988] discuss how efflux can cause drug resistance in colon tumors. The same process may provide protection against natural toxins, e.g., plant alkaloids.
- DNA repair.

Even modern "natural" foods can contain novel substances new to the human experience. Very few of the plants that humans eat in modern times would have been present in
the diet of ancient African hunter-gatherers (our original ancestors). Thus, to implicitly expect that our bodies can handle these new foods and the new toxins they may contain, but not also the toxins cooked foods may contain is based on a certain amount of naivete and overidealism. New foods include: cocoa, potatoes, tomatoes, avocados, mangoes, olives, and kiwi fruit. In addition, cruciferous vegetables such as cabbage were used in ancient times primarily for medicinal purposes, and thus used only occasionally. As a consequence, they can almost be considered as a "new" food.

Some anticarcinogens in the diet protect against artificial carcinogens as well as against natural ones.


Synergistic toxic effects may occur with both natural and non-natural carcinogens. It has been argued that synergism between synthetic carcinogens can multiply hazards; however, the same is also true of natural chemicals.

Some natural/non-natural toxins may share common detoxification pathways. Some natural toxins have the same mechanisms of toxicity as synthetic toxins; thus, even some specific (as opposed to generic) detoxifying mechanisms which work against natural substances may work against some artificial substances as well.

General nature of most detoxification mechanisms blurs some of the distinction between hazards of natural/non-natural toxins
None of this is to say that adaptation is not important (it is), but rather that one should not be unduly afraid of the fact that possibly thousands of unknown chemical variants are produced by cooking, because there are also thousands of toxins in natural foods already. Indeed, while humans could not possibly have developed a distinct detoxification mechanism for each one of these molecules, there may be no particular need to.

Comparing degree of toxicity between natural/non-natural chemicals is not necessarily easily predictable or black-and-white. Recall that the body's detoxification mechanisms, even for the natural toxins it has encountered during its evolutionary history, are mostly of a more general nature, and may therefore be reasonably expected to apply to a wide range of "artificial" chemicals that might be produced by cooking as well. And we know that some molecules humans haven't previously encountered can be safe--even safer than toxicants occurring in nature. Indeed, artificial preservatives are allowed only after testing on laboratory animals (while much has been said against preservatives, at least no one dies or becomes violently sick from their ingestion); and malathion, the main synthetic organophosphate pesticide residue in our diet, is not a carcinogen in rats or mice. On the other hand, intoxications by cyanogenic glycosides (present in apricot kernels and cassava tubers) have been reported, and natural carcinogens are quite widespread (see below). (Note: It's not being claimed that malathion, mentioned just above, doesn't have other toxic effects; but as stated, it has not been found to be a carcinogen in rats or mice. It should also be mentioned that Ames' research, referred to above, is sometimes misused by the chemical industry to gloss over very legitimate safety concerns. However, the lesson from Ames' research is that some natural toxins are as toxic as some synthetic ones, and that some "natural" pesticides used in organic agriculture--as well as some artificial chemicals--should be tested and possibly prohibited if judged too dangerous.)

Plant toxins are the result of an evolutionary "arms race" to protect against overexploitation by animals. One must be realistic: the very reason the body has evolved detoxification and excretion methods in the first place is that it is quite normal and necessary for it to be able to handle a given load of toxicants within some reasonable, if modest, range anyway. Nutrition is a cost/benefit transaction in which nutrients are gained at the cost of dealing with accompanying toxins and other unusable material. All plants have evolved toxin defenses as part of their survival strategy to discourage overpredation by animals. (This is common knowledge among evolutionary biologists but generally unrecognized, or at least unappreciated, in dietary circles.) Some toxins, of course, may be less prevalent in certain plant parts than in others--in fruits, for example, which are a "Trojan horse" strategy by plants to encourage certain animal species to distribute their seeds, via feces, by eating the fruits. Nonetheless, toxins are present at some level in fruits. (Also, very few mammals can exist primarily on fruits; and in any event, fruits are not continuously available in the wild, necessitating a broader range of feeding habits for mammals who do eat fruit as a component of their diets.) Thus, toxins in one form or another are an unavoidable "price" of the search for food.

Even in "wild nature," nutritional cost/benefit trade-offs drive opportunistic human decisions about eating raw vs. cooked. As we will see with the hunter-gatherer tribes covered in Part

310

3, it may not always be the case that the net available usable nutrients from a food (once toxins and antinutrients are neutralized) is better in the raw plants that are available in a given environment than in their cooked form. Here, we see that rawist ideas about a toxin-free natural diet are exercises in idealism rather than reality, especially when we see some raw-fooders becoming emaciated from missing out on valuable nutrients in their quest to avoid all toxins.

Antinutritional properties of Maillard reaction products


First, it is important to assess which antinutritional properties have the most deleterious consequences on, say, the average American or Westerner. (Definition: An "antinutrient" is a substance that is not necessarily toxic per se, but is detrimental because it either inhibits digestion of certain nutrients or else binds with them during digestion to prevent uptake.) It appears, from Pao and Mickle [1981], that the "problem" nutrients, i.e., the nutrients that at least 30% of the population are deficient in (<70% of the RDA), are: vitamin B-6, magnesium (Mg), calcium (Ca), iron (Fe), vitamin A, and vitamin C. Adults usually have an adequate intake of proteins, but protein quality is more important for infants and hospitalized patients.
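To make the criterion just described concrete, here is a minimal Python sketch of how one might flag a "problem" nutrient from survey data. The function and the intake figures are hypothetical illustrations of the arithmetic only, not part of Pao and Mickle's actual methodology:

    # Hypothetical sketch: a nutrient counts as a "problem" nutrient when at
    # least 30% of the surveyed population takes in less than 70% of its RDA.
    def is_problem_nutrient(intakes_as_fraction_of_rda, cutoff=0.70, share=0.30):
        below = sum(1 for x in intakes_as_fraction_of_rda if x < cutoff)
        return below / len(intakes_as_fraction_of_rda) >= share

    # Invented survey figures (each value = one person's intake / RDA):
    sample_b6_intakes = [0.55, 0.65, 0.80, 1.10, 0.60, 0.95, 1.30, 0.50]
    print(is_problem_nutrient(sample_b6_intakes))  # -> True (4 of 8 below 70% RDA)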

Lysine availability
The loss of lysine via Maillard reactions generally occurs in storage. The main effect of Maillard reaction products is that they reduce availability of some amino acids, primarily lysine.

Tests under extreme conditions. Panigrahi et al. [1996] tested the effects of storage (under tropical conditions) on the lysine content of maize. They found that the most severely discolored maize lost 50% of its lysine content, primarily due to Maillard reactions. Other amino acids were affected as well, such as arginine (37% reduction) and glycine (15%). The Maillard products also lowered digestibility by pancreatin. The consequence was that the growth rate of chicks fed the storage-damaged maize was 10% lower than that of the controls. However, no other effect on the weight of the liver or pancreas of the treatment chicks was observed.

From Hurrell [1990], whole milk powders containing 2.5% moisture were stored at 60C (140F) for 9 weeks, during which 40% of the initial lysine was progressively transformed into lactulosyl-lysine, a product that blocks assimilation of lysine. At 70C (158F), 50% of the initial lysine units were blocked as lactulosyl-lysine after only 2 weeks of storage (in a product which still retained its natural color). From 3 weeks onward, however, the product became a deep brown color as lactulosyl-lysine was destroyed and advanced Maillard reactions took place. Albumin stored with glucose for 30 days at 37C (98.6F) was found to have about 30% of its original lysine units intact.

Results under more commonly expected conditions. Extreme storage conditions such as these would never occur in actual practice, of course, so it is important to evaluate the consequences of more common practices. Conventional sterilization of milk can give 10-15% lysine blockage. Keep in mind that milk is a food containing a lot of protein and sugar in solution, so the amount of Maillard molecules produced is much higher in milk than in other (conventionally) cooked foods, in which we would reasonably expect a reduction of lysine availability of 0 to 10%.

In summary, the conclusion of the above is that, except possibly for infants, the reduction of lysine availability is not an important health hazard for humans. We will also see in Part 2 of our article that cooking may sometimes improve the nutritional value of a food for other reasons, so the net benefit may or may not be positive, depending on the food used and the cooking method.


Certainly, all of these arguments speak against cooked milk; but since some foods may benefit from cooking, generalizing from experiments on milk to cooking in general is of doubtful value.

Effects of Maillard products on vitamins

Maillard molecules also have antinutritional properties with regard to vitamins. When spray-dried whole milk powder was stored for 9 weeks at 60C (140F) and underwent early Maillard reactions, there was a moderate loss (less than 20%) of some vitamins, especially B-1 and B-6. At 70C (158F), however, there was a much more rapid and dramatic destruction of the vitamins B-1, B-6, B-12, and pantothenic acid (over 50% loss in 3 weeks) [Ford et al. 1983]. Again, these conditions of storage were extreme, and milk is particularly sensitive; thus, parallel to the situation above with lysine, one might expect considerably less loss under more commonly encountered conditions. Also, as we will see in a later section of this paper, the total amounts of vitamins lost in foods after conventional cooking procedures (i.e., from all causes, including heat destruction) are relatively modest. Thus, the portion of that loss attributable to Maillard molecules produced during normal cooking must also be relatively small. Unfortunately, however, at the time this paper was written we were not able to locate more relevant data to assess the exact effect. (The role of heat in the destruction of heat-labile vitamins will be covered in Part 2.)

Effects of Maillard products on minerals


Mineral metabolism is complex because of certain factors that can affect absorption, retention, and excretion, and which don't necessarily come into play with respect to other nutrients such as most vitamins. For example, where absorption is concerned, chelation or other binding effects (such as occurs with phytic acid) will impact absorbability of divalent minerals. Where retention is at issue, a calcium ion is not necessarily stored in bone, for instance, because the acid/alkaline balance can affect excretion, or loss of calcium from bone. A few studies have been made to determine whether Maillard molecules induce mineral depletion.

Effect on zinc possibly mildly problematic. Citing two studies, Johnson [1991] notes that Maillard products in solutions for parenteral nutrition--that is, when fed intravenously--increased urinary losses of minerals: specifically iron, zinc, and copper. Johnson [1991] also discusses relevant studies she has conducted, e.g., feeding human volunteers a diet high in Maillard reaction products (MRPs; via baked goods, which included browned grain cereals in some studies) and other studies on rats, as well as the results of other, related research. The results of these studies are mixed: there is some evidence of zinc-binding in substrates of browned albumin due to the presence of Maillard reaction products, but human feeding studies don't necessarily show lower zinc absorption as a result of diets that include "browned" foods (i.e., foods that contain Maillard reaction products). Johnson [1991] also reviews the results of published studies on feeding rats mixtures that include Maillard reaction products. The work of O'Brien et al. [1989] (as cited in Johnson [1991]) on feeding rats mixtures of glucose, monosodium glutamate, and MRPs found increases in urinary losses of several minerals--calcium, magnesium, zinc, and copper--with the effect on zinc the highest among the minerals mentioned. Furniss et al. [1989] studied the effect of feeding rats diets that included varying types of Maillard reaction products, and found that casein-glucose MRP mixtures caused a 6-fold increase in the urinary excretion of zinc, and casein-lactose MRP mixtures a 2-fold increase. They noted that urine is a minor excretion route for zinc (when compared to fecal excretion) and that Maillard products appear to have no influence on overall zinc retention or on zinc status in rats. However, Lykken et al. [1986] found that zinc absorption in humans was 47% from corn grits but 37% from corn flakes.

Problems with zinc perhaps more of a concern in relation to other factors (phytates in grains). From the above results--keeping in mind once again, as with the data on lysine and vitamins, that the storage conditions were extreme, and that the Maillard reaction occurs at a much faster rate in casein-lactose mixtures than in foods other than milk--it seems that the only mineral deficiency likely to be caused by cooking would be a (mild) zinc deficiency. Let's add, as a side note, that zinc deficiencies are not uncommon in people eating raw, especially those who are vegetarians. The reason is that zinc is more available from animal sources than from vegetarian ones, where it is often bound to phytic acid (as in grains). There is evidence [Donovan et al. 1995] that vegetarians often have a lower zinc status than omnivores. (That the same is true of raw vegetarians is anecdotal, since virtually no studies are available on the latter.) For more on the question of zinc in omnivorous compared to vegetarian diets, see Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism: Zinc.

Maillard molecules and heterocyclic amines: realistically assessing the risks


Do Maillard products carry potential risks for mutagenicity and carcinogenicity?

The relationship between mutagenicity and carcinogenicity. We shall see that Maillard molecules are the precursors of heterocyclic amines, which form in meat and fish cooked at high temperatures. These latter compounds are known to be mutagenic, but whether they are carcinogenic (might induce cancers) in humans using normal methods of cooking is controversial. Let's recall here how mutagenicity and carcinogenicity relate to one another. A carcinogen is a substance that promotes the disease called cancer. A mutagen is a substance that promotes mutations (modifications of the DNA). A mutation may or may not cause cancer. (Cancer is a multi-stage process that involves much more than a single mutation; DNA can be repaired; and many more mutations affect non-coding portions of the DNA than the portions that govern phenotype/physiological functioning.) In fact, it is natural for some mutations to occur: at modest levels they are what drive the evolutionary process. Some mutagens are not carcinogens. Conversely, almost all known carcinogens are mutagens--but not all, because some substances can have growth-promoting activities, or be involved in non-mutational events in the development of a tumor.

Mutagenicity gauged experimentally in single-celled organisms. So, since the two concepts are distinct, and carcinogenicity is usually of most direct concern to human health, why bother studying mutagenicity? The answer is that the two are correlated, and mutagenicity can be studied and quantified much more easily and quickly--in test tubes, in single-celled organisms such as bacteria--whereas the very notion of cancer only makes sense for multicellular organisms, and cancer takes much longer to develop. We'll also see that the Maillard molecules themselves are not carcinogenic; in fact, they appear to have anticarcinogenic effects. Finally, we note that many factors in diet may influence, in either direction, the onset and development of cancer. Looking at any one factor in isolation may mean little without also evaluating its relationship to, and potential influence on, other factors in the overall mix.

Mutagenicity and heterocyclic amines


The formation of heterocyclic amines, which involves the Maillard reaction, is thought to require three classes of precursors: creatine or creatinine, free amino acids or dipeptides, and sugar [Skog 1993, 1995]. Creatine and creatinine are present mostly in meat and fish. Mutagens in meat are usually produced (in parts per billion) in the crust during frying, broiling, and baking [Jagerstad et al. 1991]. Another important source is meat extracts, consumed as gravies and meat bouillons. Mutagenicity increases with temperature and with cooking time. Patties made of bovine (cow) muscle or tongue were not significantly mutagenic when fried at 150C (302F) for 3 minutes on each side [Jagerstad et al. 1991, pp. 92-93]. An experiment by Knize et al. [1994] on fried beef patties showed that the increase in PhIP and MeIQx formation (two heterocyclic amines) is exponential over the range 0 to 11 minutes and 150 to 230C (302 to 446F), which might be a good argument in favor of raw or rare meat versus well-done. The table below presents some examples of cooked foods with their PhIP and MeIQx contents (which account for a large part of cooked-food mutagenicity).

AMOUNTS OF HETEROCYCLIC AMINES IN SELECTED COOKED MEATS


Figures are in nanograms of heterocyclic amine per gram of food. (Note: a result of zero means "undetectable" amounts.)

FOOD / Method of Preparation | PhIP | MeIQx | Total HCAs | Information Source
Well-done flame-grilled chicken breast | ? | ? | > 300 | Knize MG, et al. 1997
Well-done grilled chicken | 226 | ? | ? | Holder CL, et al. 1997
Fried fish | 69.2 | 6.44 | ? | Zhang XM, et al. 1988
Very well-done oven-broiled bacon | 30.3 | 4 | ? | Sinha R, et al. 1998
Pan-fried very well-done sausage | ? | 5.4 | ? | Sinha R, et al. 1998
Sausage links | ? | 1.3 | ? | Sinha R, et al. 1998
Broiled pork chops | ? | 0 | ? | Sinha R, et al. 1998
Hot dogs | 0 | 0 | 0 | Sinha R, et al. 1998
Ham slices | 0 | 0 | 0 | Sinha R, et al. 1998
Microwave-cooked hamburger | 0 | 0 | 0 | Holder CL, et al. 1997

From Sugimura et al. [1990], the mutagenicity of heterocyclic amines has been tested on the TA98 and TA100 strains of Salmonella bacteria. The results, compared with aflatoxin B1 (a mutagen present in some molds) and benzo(a)pyrene (a polycyclic aromatic hydrocarbon, or PAH--more on these later), are shown in the table below. Results were obtained using the Ames assay (http://embryo.ib.amwaw.edu.pl/~dslado/invittox/prot/30.htm), a standard test that measures the mutagenicity of chemicals on certain strains of bacteria. These bacteria (e.g., Salmonella TA98 or TA100) have been selected (bred) to require histidine for growth. After a standard time (48 hours) at a standard temperature (37C = 98.6F), the number of "revertants" (bacteria which have mutated back to a non-histidine-requiring state) is measured.
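As a minimal illustration of the arithmetic behind the "revertants per microgram" figures in the table that follows (the plate counts here are invented; only the calculation mirrors the assay's bookkeeping):

    # Hypothetical sketch, in Python, of how per-microgram mutagenicity figures
    # are derived from raw Ames-assay plate counts.
    def revertants_per_microgram(colonies_with_substance, spontaneous_colonies,
                                 dose_micrograms):
        # Net colonies induced by the test substance, normalized per microgram.
        net_revertants = colonies_with_substance - spontaneous_colonies
        return net_revertants / dose_micrograms

    # Invented counts for a potent mutagen tested at a 0.01-microgram dose:
    print(revertants_per_microgram(4360, 30, 0.01))  # -> 433000.0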

MUTAGENICITY OF HETEROCYCLIC AMINES in TA98 and TA100 Salmonella


Figures in the table are expressed in number of revertants per microgram. (Higher figures represent greater levels of mutagenicity.)

Substance | Source | Mutagenicity on Salmonella TA98 | TA100

HETEROCYCLIC AMINES
IQ | Broiled sun-dried sardine | 433,000 | 7,000
MeIQ | Broiled sun-dried sardine | 661,000 | 30,000
IQx | Fried meat | 75,000 | 1,500
MeIQx | Fried beef | 145,000 | 14,000
4,8-DiMeIQx | Heated mixture of creatinine, threonine, and glucose | 183,000 | 8,000
7,8-DiMeIQx | Heated mixture of creatinine, glycine, and glucose | 163,000 | 9,900
PhIP | Fried beef | 1,800 | 120
Trp-P-1 | Tryptophan pyrolysate | 39,000 | 1,700
Trp-P-2 | Tryptophan pyrolysate | 104,200 | 1,800
Glu-P-1 | Glutamic acid pyrolysate | 49,000 | 3,200
Glu-P-2 | Glutamic acid pyrolysate | 1,900 | 1,200
Phe-P-1 | Phenylalanine pyrolysate | 41 | 23
A(alpha)C | Soybean globulin pyrolysate | 300 | 20
MeA(alpha)C | Soybean globulin pyrolysate | 200 | 120

FOR COMPARISON
Aflatoxin B1 | -- | 6,000 | 2,800
Benzo(a)pyrene | -- | 320 | 600

As seen from the above, some compounds are potent mutagens in Salmonella. The specific mutagenic activity of PhIP is relatively low, but the content of PhIP in food is more than 10 times higher than that of other heterocyclic amines. It is thought that the main food mutagens are MeIQx, PhIP, DiMeIQx, and IQ [Augustsson 1997, Johansson 1994, Zhang 1988]. However, in contrast with their potent mutagenicity in Salmonella, heterocyclic amines are, surprisingly, only moderately carcinogenic in rats and mice. Each heterocyclic amine has one or more specific target organs (e.g., liver, colon, breast, etc.). [See also Knasmuller 1992, Nagao 1993.]


Heterocyclic amines and cancer risk: lab vs. real-world conditions

Doses in lab animals many times higher than normally encountered in food. In evaluating the potential real-world effects here, note that the preceding discussion and tables concern lab results that utilized extremely high doses to produce easily measurable effects. An important proviso to be kept in mind is that cancers in laboratory rats and mice are induced by doses (per unit of body weight) several orders of magnitude higher than what are usually ingested in normal meals by humans. (Typically, the amounts of HCAs are over a million times higher, based on calculations from data furnished in Augustsson [1999].) In fact, by itself, the ingested daily amount of heterocyclic amines is very probably too small to explain the development of human cancers, and the same is true for numerous other carcinogens. Thus, the simultaneous presence of heterocyclic amines with other genotoxic [i.e., inducing DNA damage] carcinogens and with tumor-promoting agents or tumor-promoting conditions makes it very difficult to make a numerical calculation for risk estimation.

Epidemiological studies done so far remain inconclusive. Some attempts have been made to determine whether heterocyclic amines have a consistent carcinogenic effect in humans by conducting epidemiological studies (such as trying to correlate the rate of colon cancer with a preference for well-done meat), but these are not conclusive (yet?): e.g., Sinha et al. [1997], Probst-Hensch et al. [1997], Ward et al. [1997], and Augustsson et al. [1999]. Let's take a look at the latter study by Augustsson et al. The study's authors randomly selected 553 controls (that is, people who did not have cancer), 249 cases of rectal cancer, 273 cases of bladder cancer, and 138 cases of kidney cancer. Cancer risk calculations were based on previous laboratory analyses of HCA concentrations observed under given cooking conditions, and on an extensive food-frequency questionnaire including 188 food items, asking for the type of meat and fish ingested, frequency of consumption, portion size, cooking methods, and degree of surface browning. (Participants were asked to compare the color of the foods eaten with color photographs showing dishes fried at four different temperatures.) The population was then divided into five quintiles according to total HCA intake, where quintile 1 consisted of the 20% with the lowest intake, quintile 2 the next 20%, and so on, up to quintile 5, the 20% with the highest intake. The results obtained are summarized in the table below:

CANCER RISK BY RELATIVE AMOUNT OF HETEROCYCLIC AMINE INTAKE

Expressed as risk relative to quintile 1, where quintile 1 = the lowest (reference level) HCA intake in the study population, and quintile 5 = the highest. Figures are relative risk (95% confidence interval).

Cancer Site | Quintile 2 | Quintile 3 | Quintile 4 | Quintile 5
Colon | 1.1 (0.7-1.7) | 0.8 (0.5-1.3) | 0.7 (0.5-1.1) | 0.6 (0.4-1.0)
Rectum | 1.3 (0.8-2.0) | 0.8 (0.5-1.4) | 0.7 (0.4-1.1) | 0.7 (0.4-1.1)
Bladder | 1.5 (0.9-2.6) | 1.5 (0.8-2.6) | 1.4 (0.8-2.4) | 1.2 (0.7-2.1)
Kidney | 1.3 (0.7-2.3) | 1.2 (0.6-2.2) | 1.1 (0.7-2.2) | 1.0 (0.5-1.9)

These numbers were adjusted for age, sex, and daily energy intake (and smoking status for bladder and kidney cancer). Quintile 1 (low HCA intake) here is the "reference category," in the sense that the risk for cancer in this quintile was normalized at 1. Thus, the entry "0.6" for quintile 5 concerning colon cancer means that people who cook their meat and fish at the highest temperatures had a decreased risk (by 40%) compared to those who cook their meat and fish at the lowest temperatures; on the other hand, HCAs appeared to increase risk for bladder cancer.
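For readers who want the quintile construction spelled out, here is a minimal Python sketch using invented data. The real study computed covariate-adjusted estimates from case-control data; this unadjusted toy version only illustrates the ranking-and-binning step and the normalization of quintile 1 to a relative risk of 1:

    import random

    random.seed(0)
    # Invented population: (daily HCA intake in ng, cancer yes/no). Cancer is
    # assigned independently of intake, so relative risks should hover near 1.
    people = [(random.expovariate(1 / 100.0), random.random() < 0.3)
              for _ in range(1000)]

    ranked = sorted(people, key=lambda p: p[0])   # rank everyone by HCA intake
    size = len(ranked) // 5
    quintiles = [ranked[i * size:(i + 1) * size] for i in range(5)]

    def cancer_rate(group):
        return sum(1 for _, cancer in group if cancer) / len(group)

    baseline = cancer_rate(quintiles[0])          # quintile 1 = reference = 1.0
    for i, q in enumerate(quintiles, start=1):
        print(f"quintile {i}: relative risk {cancer_rate(q) / baseline:.2f}")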

Overall impact, if any, appears to be small. Numbers inside parentheses represent the 95% confidence interval (CI). For instance, in quintile 5 of the general population, there is a 95% probability that the factor by which the risk for bladder cancer due to elevated HCA intake is increased (compared to those who have a low intake) lies in the interval 0.7-2.1. We see that the results lack precision; nevertheless, it is quite apparent that HCA intake cannot have more than a very small impact on the incidence of cancer in the colon, rectum, bladder, or kidney. The article also includes a table (not reproduced here) recording the individual effects of IQ, MeIQ, MeIQx, DiMeIQx, and PhIP. (These HCAs are suspected to be the main possible carcinogens in humans.) Again, each of the HCAs was found to slightly decrease the risk for some cancers, and slightly increase the risk for others. The authors note that: "Heterocyclic amines have been shown to be carcinogenic in animals. This carcinogenic effect is induced by high doses, such as 10-400 mg/kg of body weight. The lack of carcinogenic effect of heterocyclic amines in our study may be due to the much lower intake in the study population (median 1 ng/kg of body weight)." In other words, doses required to induce cancer in laboratory animals are typically 10 million times higher (or even more) than those obtained by ordinary cooking.
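The arithmetic behind that "10 million times" figure is straightforward, using the doses quoted just above:

    (10 mg/kg)  / (1 ng/kg) = (10,000,000 ng/kg)  / (1 ng/kg) = 10^7   (ten million)
    (400 mg/kg) / (1 ng/kg) = (400,000,000 ng/kg) / (1 ng/kg) = 4 x 10^8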

Extreme cooking times/temperatures may represent the main risk. While the median HCA intake was less than 100 ng, it was observed that seven participants had intakes higher than 1,900 ng--and all of them were cancer cases. A sample of only seven people is certainly too small to provide statistically significant results, but this finding suggests that HCAs may induce cancer at the very highest doses that could conceivably be encountered in food (i.e., in people who consume well-done, fried/roasted meats very frequently). As a final note, the authors also observe that: "[T]he effect of heterocyclic amines may be strongly modified by genetic factors, such as acetylation status (Roberts-Thompson et al., 1996), that vary between different populations." In other words, since most of us do not know what our acetylation status is, it is preferable to be moderate with regard to intake of fried or roasted meat and fish (just in case one is genetically predisposed to HCA sensitivity). At the same time, in light of the paper by Augustsson et al. [1999], we also note that one needn't fear cooked meat as if it were a violent poison, as many raw purists seem to do. Further complicating the picture is that many other substances that can contribute to cancer exist, some of which are found in raw plant foods themselves [Ames 1990]. Thus, an environment totally free of carcinogens is not possible. Finally, it should be pointed out that some mechanisms of metabolism and detoxification of heterocyclic amines are known to exist and have been described [Turesky 1990].

Confounding factors regarding composition of domesticated meats vs. wild game are not accounted for in current HCA studies. Aside from cooking, one confounding effect that afflicts the
interpretation of current epidemiological studies with regard to meat in the diet is the widely divergent fat profile of domesticated/feedlot meat in Westernized diets compared to the wild game in the hunter-gatherer diets on which humans evolved. Among other differences, domesticated meat has roughly 5 times the fat, and the fat present is 5-6 times as saturated [Eaton 1996, p. 1735]. In addition, the overall balance of foodstuffs and nutrients in the omnivorous diets of hunter-gatherers also differs greatly from that in omnivorous modern Western diets. Failing to account for such differences--especially given that high levels of saturated fat are under investigation as a potential trigger for cancer--represents a widespread but unappreciated fallacy/oversight often involved in studies of the health effects of meat and/or omnivorous diets in general, whether or not cooking is involved. For more in-depth information on these points, see the articles Hunter-Gatherers: Examples of Healthy Omnivores; The "Omnivorism = Western Diet" Fallacy; and Logical Fallacies and the Misinterpretation of Research elsewhere on the site.

Mutagenicity/carcinogenicity of Maillard reaction products

Maillard molecules are not carcinogenic, and instead appear to have antioxidant properties. In contrast with heterocyclic amines, Maillard molecules do not have carcinogenic properties. On the contrary, Maillard reaction products seem to have an antioxidative effect in vivo [Chuyen et al. 1990]. From Aeschbacher [1990] (see also Powrie et al. [1986]), the Maillard reaction products that occur in heat-processed carbohydrate-rich foods (mainly bakery products, coffee, caramel) are predominantly of the melanoidin type, and some are compounds involved in aroma formation. Studies in vivo show that these compounds are not involved in cancer induction (unlike heterocyclic amines). They might even protect against cancer due to their antioxidant properties. One of the reasons why heated carbohydrate-rich food products don't induce cancer in vivo is that certain liver enzymes, called "catalases," are able to abolish the (weak) mutagenic effect of these foods. In contrast, the mutagenic activity of heterocyclic amines (found in heated protein-rich foods) is activated by these catalases.

Other anticarcinogenic effects of Maillard products in vivo include: inhibition of the formation of (carcinogenic) nitrosamines; scavenging of active oxygen (it is known that oxidative damage to cells--in particular, but not only, to DNA--contributes to carcinogenesis); and changes in the chemical structures of carcinogens. From Shibamoto [1989], no significant mutagenic activity from MRPs is obtained below 130C (266F); boiling alone doesn't produce mutagenic activity. On the other hand, one may argue that heating destroys some natural antioxidants such as vitamin C. It appears, however, that the two effects offset each other: in an experiment by Nicoli et al. [1997] on tomatoes, the overall antioxidant properties of the food products were maintained or even enhanced by the development of Maillard reaction products.

Other factors influencing carcinogenesis

Fat. Alldrick et al. [1987a,b] have investigated the role of dietary fat as a modifier of metabolic enzyme activity. They concentrated on the effect of fat on the conversion of three heterocyclic amines: IQ, MeIQ, and MeIQx. High-fat diets increased the conversion of these compounds into bacterial mutagens, and the magnitude of the increase was dependent on the type of fat used (olive oil gave the greatest response, and sunflower oil the least).

Fiber. From Kada et al. [1984], Lindeskog et al. [1988], and Howes et al. [1989], high-fiber diets decrease the mutagenicity of heterocyclic amines, possibly due to a binding effect, among other mechanisms.

Heated meat. Paradoxically, heated meat products also contain constituents which might inhibit the carcinogenicity of the heterocyclic amines also present in heated meat [Ha 1987].

Plant polyphenols. It is known that polyphenolic acids and flavonoids inhibit the mutagenicity of benzo(a)pyrene, which is a polycyclic aromatic hydrocarbon (see below) [Wood et al. 1982, Huang et al. 1983, Mukhtar et al. 1984, Teel et al. 1985].


Experiments performed by Alldrick et al. [1986] showed that a number of polyphenolic acids failed to inhibit the mutagenicity of a number of heterocyclic amines, but that three flavonoids (quercetin, morin, and myricetin) all inhibited IQ, MeIQ, MeIQx, Trp-P-1, and Trp-P-2 in vitro. However, in vivo, the mutagenicity of MeIQ and Trp-P-2 was actually increased [Alldrick et al. 1989].

Allergenicity

Most food allergens are natural proteins whose allergenic properties are either destroyed or unaffected by heating [Yunginger 1991]. Examples: Persons allergic to fresh tuna or fresh salmon may be able to tolerate ingestion of canned tuna or canned salmon (the prolonged cooking undergone by these products may denature the allergenic components). The major egg (white) allergen, ovomucoid, is heat-stable, and people sensitive to this component may react to foods containing cooked eggs as well as to raw eggs. Many peanut allergens are present in both raw and heated peanuts, but high-temperature-processed peanut oil is not allergenic, unlike cold-pressed peanut oil.

Lactose/protein combination an exception. However, beta-lactoglobulin allergenic activity is enhanced by the Maillard reaction between lactose and protein [Matsuda et al. 1990]; elevated skin reactivity to heated cow's milk proteins had earlier been demonstrated by Bleumink [1966, 1968, 1970]. A case of allergy to heated and/or aged pecan nuts has been reported by Malanin et al. [1995]. However, such reports (cases of people having allergic reactions to a given food when it has been heated but not when it is raw) are quite rare in the scientific literature. In summary, except for dairy products, cooked foods are generally equally or less allergenic than the corresponding raw foods.

Risks for the vascular system and the kidney?


Let's investigate here the possible effects of Maillard molecules on the vascular system and the kidney. Rats fed 5% to 10% Maillard reaction products (MRPs) show various pathologies, including of the kidney [O'Brien et al. 1988]. Rats fed for 8 to 10 weeks with a mixture of casein and glucose which had previously been heated for 4 days at 65C (149F) showed abnormalities of the kidney [Erbersdobler et al. 1984].

There is evidence that MRP clearance by the kidney of diabetics is impaired, which causes elevated serum AGEs (advanced glycation endproducts) [Koschinsky et al. 1997]. This may constitute an added chronic risk for renal-vascular injury in diabetes: it had previously been shown, by injecting AGEs into nondiabetic rabbits, that elevated serum AGEs contribute to atherogenesis (clogging of the arteries) [Vlassara et al. 1995].

Dietary AGEs are much more problematic for diabetics than for nondiabetics because (a) there is
an accelerated formation of AGEs in the body, and (b) AGE clearance is impaired in diabetics. However, note that in the above-mentioned experiments [O'Brien et al. 1988, Erbersdobler et al. 1984], rats were fed a considerable amount of MRPs: no one cooks their food for 4 days, and formation of MRPs is faster in artificial mixtures of casein + glucose than in whole foods.

No evidence of risks in nondiabetics. More to the point, in the concentrations at which they appear in
traditionally cooked foods, we find no evidence that MRPs can cause any risks in nondiabetic people. (And it is doubtful diabetes would exist except in rare cases if people were to consume an evolutionarily congruent diet with limited carbohydrate levels, so as not to promote or exacerbate insulin resistance in the first place.)

Polycyclic aromatic hydrocarbons (PAHs) and sensitivity to cooking method

Polycyclic aromatic hydrocarbons are produced when any incomplete combustion occurs [Lijinsky 1991]. Thus, they are found in polluted air, cooking-oil fumes, tobacco smoke, smoked foods [Gomaa 1993], and foods cooked at high temperature. Let's note here that comparisons between cooked food and smoking are of limited value, not only because PAHs are absorbed through the digestive tract in one case and through the lungs in the other, but also because the main carcinogens in tobacco smoke are thought to be nitrosamines. Moreover, cooked food (in general) doesn't contain tar or nicotine. Air pollution usually contains far fewer PAHs than what a smoker inhales, and in fact much less than the average level of PAHs in food. (Even raw food can contain PAHs, due to air pollution [Lodovici 1995, Wickstrom 1986].)

Most PAHs are not carcinogenic, although a few are (such as benzo(a)pyrene). They appear mainly in meats cooked at high temperature--grilling, for example. Microwaving doesn't produce PAHs, and foods other than meats contain negligible amounts of PAHs. Foods low in fat, or cooked beneath the source of heat, contain far fewer PAHs, so both the type of food cooked and the method of cooking are important. More quantitatively, the table below shows the amounts of pyrene and benzo(a)pyrene (the latter a known carcinogen), expressed in micrograms per kilogram of food. There are of course other carcinogenic PAHs, and many more noncarcinogenic ones, but the point here is just to get an idea:

AMOUNTS OF POLYCYCLIC AROMATIC HYDROCARBONS (PAHs) IN SELECTED COOKED MEATS
Figures are expressed in mcg per kg of food.

Food Source / Cooking Method               Pyrene     Benzo(a)pyrene
Frankfurter sausage, log fire              20-450     6-212
Frankfurter sausage, cone fire             21-84      2-31
Frankfurter sausage, fried, oven-baked     1-3        <1
Barbecued pork ribs                        42         11
Pork, charcoal-broiled                     24         8
Steak, flame-cooked                        20         4
Beef patties, charcoal-broiled             7-8        3-11
Smoked haddock                             1          <1


As we see above, the ranges are quite wide, probably due to the extreme sensitivity of PAH formation to the cooking method. Oven baking produces minimal amounts of PAHs, even in the fattiest meats, such as frankfurter sausages.
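For a rough sense of what these concentrations mean at the level of a single meal, the short sketch below converts the table's mcg/kg figures into amounts per portion. The 100-g portion size is our own illustrative assumption, not a figure from the underlying studies:

    # Convert PAH concentrations (mcg per kg of food) into mcg per portion.
    PORTION_KG = 0.100  # assumed 100-g portion, purely for illustration

    benzo_a_pyrene_mcg_per_kg = {
        "frankfurter sausage, log fire (upper bound)": 212,
        "barbecued pork ribs": 11,
        "pork, charcoal-broiled": 8,
        "frankfurter sausage, oven-baked (ceiling)": 1,  # table reports "<1"
    }

    for food, conc in benzo_a_pyrene_mcg_per_kg.items():
        print(f"{food}: ~{conc * PORTION_KG:.1f} mcg benzo(a)pyrene per portion")

Even for the worst case in the table (flame-charred sausage), a portion works out to roughly 20 mcg of benzo(a)pyrene, versus well under a tenth of a microgram for the same food oven-baked.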

Natural toxic constituents in food, and effect (or absence of effect) of cooking
Here, we will investigate only a few examples, since the list of all natural toxic constituents would be extremely long [Ames 1983], besides which not all have been studied yet. We will see that heating does not destroy all of these constituents, and that some (but not all) of the toxins listed below are found in foods that are commonly eaten cooked but that are inedible raw, except perhaps in small quantities (like potatoes). Thus a good argument in favor of eating raw is that one thereby reduces exposure to many natural toxins.

"Avoiding toxins" does not solve the nutritional cost/benefit trade-off. On the other hand, as we
shall see in Part 3, there is a trade-off between toxicity and deficiencies, in that if one tries to avoid as many toxins as absolutely possible, they would have to avoid so many foods that deficiencies would likely ensue from the severe dietary restriction. Also, from the examples that will be presented below, trying to eliminate toxins completely is a hopeless task in any event (for raw as well as cooked-food eaters).

"Alien" proteins vs. cooked forms of proteins. Finally, we notice that with the Paleolithic diet,
which is not an all-raw diet, foods that are not edible raw are avoided but for different reasons than toxicity per se. From the Paleolithic diet point of view, it is more important to avoid "alien" proteins that cause problems via molecular mimicry (i.e., autoimmune reactions) than to avoid cooked forms of proteins we are adapted to. In this view, it is more a consideration that foods requiring processing were ones that were introduced relatively late in human history, so that genetic adaptation to them is not yet complete, irrespective of any cooking/toxicity concerns, which are seen as more minor issues.

Considering the practicalities. Remark: We do not recommend worrying excessively about completely avoiding (all of) the foods listed below--first, because the list is not exhaustive and toxins are quite widespread in nature; and second, because our body has many detoxification mechanisms which allow it to handle moderate, relatively normal amounts of toxins without deleterious consequences. Our goal, rather, is to show that one needn't be excessively concerned about potential harmful effects of the chemical constituents created by cooking, since all living animals are naturally exposed to quite a variety of toxins in the foods they eat anyway.

Mycotoxins
From Marth [1990], mycotoxins (toxins produced by molds) are completely destroyed only at their melting points, which are generally quite high: 164C (327F) for zearalenone, 170C (338F) for rubratoxin. When peanuts are roasted, the toxicity of aflatoxin B1 is reduced by 70%, and that of aflatoxin B2 by 45%. Since ordinary cooking rarely reaches such temperatures, and destroys only part of the toxin load, heat treatment cannot be considered a satisfactory means of eliminating mycotoxins.
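Restating those roasting figures in terms of what survives makes the point plainly; the small calculation below is ours, based only on the percentages just cited:

    # Fraction of each aflatoxin's toxicity that survives the roasting of
    # peanuts, based on the reductions reported by Marth [1990].
    reductions = {"aflatoxin B1": 0.70, "aflatoxin B2": 0.45}

    for toxin, reduced_by in reductions.items():
        print(f"{toxin}: {1.0 - reduced_by:.0%} of the original toxicity remains")

That is, roughly 30% of the aflatoxin B1 toxicity and 55% of the aflatoxin B2 toxicity survive roasting.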

Beans
Raw kidney beans at a level as low as 1% of the diet can cause death in rats within two weeks. Beans cooked at 100C (212F) for 30 minutes, and incorporated at a level as high as 20% of the diet, do not retard growth when tested against casein. (Feeding rats casein instead of beans doesn't affect their growth rate, so casein can be used as a baseline for comparison.) However, when beans which have been cooked at the lower temperature of 70C (158F) for 30 minutes are incorporated, growth retardation is almost as great as that which occurs when raw beans are fed. The small amount of lectin remaining in beans cooked at 70C might be responsible for this effect [McPherson 1990]. However, cooking kidney beans doesn't destroy all of their antinutrients [Grant et al. 1982].


Fava beans. The well-known disease "favism" is caused by consumption of fava beans in genetically
susceptible individuals. Such individuals carry a gene polymorphism (common in some populations from regions where malaria is prevalent) that is thought to protect against malaria, but that also results in severe deficiency of glucose-6-phosphate dehydrogenase (G6PD) [Golenser 1983].

Soybeans. From Liener [1994], soybeans contain some heat-labile protease inhibitors and hemagglutinins.
("Heat-labile" means those susceptible to changes by heat; a hemagglutinin is something that causes red blood cells to clump together.) Soy also contains factors that are relatively heat-stable, though of lesser significance, such as:

Goitrogens: substances that cause goiters, i.e., an enlargement of the thyroid gland.
Tannins: complex plant compounds that are often bitter or astringent.
Phytoestrogens: plant analogues of the hormone estrogen.
Flatus-producing oligosaccharides: carbohydrates of small molecular weight that cause flatulence (gas).
Phytates: compounds which bind minerals, preventing their absorption.
Saponins.
Antivitamins.

From Faldet [1992], heat treatment of soybeans destroys or reduces heat-labile antinutritional factors, improves the digestibility and availability of sulfur amino acids, and increases fat digestibility by nonruminants, but excessive cooking will reduce protein availability. (Note: Ruminants are hoofed, cud-chewing animals such as cattle, sheep, goats, deer, and giraffes, having multiple stomach chambers that are specially adapted for digesting tough cellulose raw.) Thus, there is an optimal heat treatment, which was found to be 120 minutes at 140C (284F), or 30 minutes at 160C (320F).

Grains
The antinutrients here (anti-amylases, phytates) are affected by heating, but phytates require other processing (such as fermentation) for further neutralization, which is still only partial. Also, soaking/germination (sprouting) reduces phytates [Hurrell 1997]. Soaking under optimal conditions (55C, pH 4.5-5.0) can eliminate phytates [Sandberg and Svanberg 1991].

Egg Whites
Raw egg white contains a protein called "conalbumin," which binds iron. Additionally, raw egg white contains avidin, which binds to biotin and can impair metabolism of other B-vitamins. Note, however, that the toxicity of raw egg white should not be overstated: eating 20 raw eggs per day for several weeks would be necessary to create a biotin deficiency.

Mushrooms
The most common commercial mushroom, Agaricus bisporus, causes cancer in mice [Toth 1986]. The dosages required, however, amounted to almost half the animals' total food intake. Concerning poisonous mushrooms: obviously, cooking amanitas (an extremely poisonous variety) won't make them edible, but there are examples of mushrooms which become edible after cooking. Since these varieties are very special and can't be found at the grocery store, we won't expand further.

Potatoes


Potatoes contain solanine and chaconine, which are not hazardous unless large quantities are eaten. They don't accumulate in the body, and are not destroyed by heat.

Spinach and Rhubarb


These contain oxalates, which among other effects inhibit calcium absorption. Again, they are not hazardous unless large quantities are eaten. They don't accumulate in the body, and are not destroyed by heat.

Cyanogenic Glycosides
These are found in lima beans, cassava, and many fruit pits [Beier et al. 1994]. Processing techniques partially destroy cyanogenic glycosides, but poisonings caused by the consumption of large amounts of cassava or fruit pits (including apricot kernels) have been reported. When in contact with stomach acids, cyanogenic glycosides release cyanide--the same active component as in Zyklon B (used by the Nazis in the death camps). So cyanide is indeed toxic in large amounts, but obviously a few apricot kernels will not do much harm.

Taro
Contains some trypsin inhibitors and lectins, which are destroyed by heat [Seo 1990].

Parsnips
These contain toxic psoralens, which are potent light-activated carcinogens and mutagens not destroyed by cooking [Ivie 1981]. Parsnips contain psoralens at a concentration of 40 ppm, and Ivie [1981, p. 910] reports:

[C]onsumption of moderate quantities of this vegetable by man can result in the intake of appreciable amounts of psoralens. Consumption of 0.1 kg of parsnip root could expose an individual to 4 to 5 mg of total psoralens, an amount that might be expected to cause some physiological effects under certain circumstances...
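Ivie's arithmetic is easy to verify, since "ppm" by weight is equivalent to mg per kg; the small sketch below (ours, using only the figures quoted above) reproduces the lower end of the quoted 4-5 mg range:

    # "ppm" (parts per million, by weight) is equivalent to mg per kg.
    psoralen_mg_per_kg = 40  # concentration reported for parsnip root
    portion_kg = 0.1         # the 0.1-kg portion considered by Ivie [1981]

    total_psoralens_mg = psoralen_mg_per_kg * portion_kg  # 40 * 0.1 = 4 mg
    print(f"~{total_psoralens_mg:.0f} mg total psoralens in {portion_kg} kg of parsnip")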

Flavonoids
Flavonoids are a broad class of compounds common in plants and in the human diet. The basic characteristic of flavonoids is that their chemical structure includes what is known as the flavonoid skeleton, that is, a skeleton (core) of diphenylpyrans, i.e., C6-C3-C6, where C stands for carbon and C6 is a benzene ring [Hertog and Katan 1998]. Over 4,000 flavonoids are known, and new ones are still being researched and described [Hollman 1997]. In plants, flavonoids serve a number of functions: as pigments (the color of fruits and flowers is due to flavonoids), antioxidants, sunscreens, etc. Because of the large number of compounds in this class, generalizations about the function of flavonoids are difficult [McClure 1975]. However, the following can be said about certain flavonoids:

Quercetin, a very common flavonoid in the human diet, is known to be mutagenic [Nagao 1981]. However, it is not carcinogenic, and has been shown to have anticarcinogenic properties in tests in vitro; see Hertog and Katan [1998], and Chung et al. [1998].

Flavonoids can inhibit enzyme systems in mammals [Hollman 1997].

The flavonoid phloridzin (and its breakdown products) can inhibit respiration in animal tissues [McClure 1975].

Certain flavonoids can function as antioxidants and help preserve vitamin C [McClure 1975].

Some flavonoids protect against the mutagenic effect of other substances. (Refer to the previous section on Mutagenicity and Carcinogenicity, subtopic "Other Factors Influencing Carcinogenesis.")

Alfalfa Sprouts
Alfalfa sprouts contain approximately 1.5% canavanine, a substance which, when fed to monkeys, causes a severe lupus erythematosus-like syndrome. (In humans, lupus is an autoimmune disease.) Canavanine is an analog of the amino acid arginine, and takes its place when incorporated into proteins. However, alfalfa that is cooked by autoclaving (i.e., subjected to pressure-cooking) doesn't induce this effect [Malinow 1982, Malinow 1984]. Note here that the monkeys were fed semi-purified diets with a canavanine content of 1-2%, versus a typical canavanine content of 1.5% of dry weight (that is, when completely dehydrated) for alfalfa sprouts [Malinow 1982]. Thus, although it would be very difficult for a human to eat enough fresh alfalfa sprouts for the overall diet to contain even 1% canavanine, individuals should be aware of the potential risks, and consume (or not consume) alfalfa sprouts accordingly. (In particular, those rawists who juice sprouts should probably strictly limit or avoid the consumption of alfalfa sprout juice, due to the concentration effect that results from juicing.)
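To see just how difficult that would be, consider the following rough, illustrative calculation. The 1.5% dry-weight canavanine figure is from Malinow [1982]; the 10% dry-matter content of fresh sprouts and the 1.5 kg/day total food intake are our own assumptions, and the comparison is only approximate, since the monkey diets were presumably measured on a dry basis:

    # Rough estimate: fresh alfalfa sprouts needed per day for the overall
    # diet to contain ~1% canavanine, as at the low end of the monkey studies.
    DRY_MATTER_FRACTION = 0.10  # ASSUMED: fresh sprouts are ~90% water
    CANAVANINE_DRY = 0.015      # 1.5% of dry weight [Malinow 1982]
    DAILY_FOOD_KG = 1.5         # ASSUMED total daily food intake
    TARGET_FRACTION = 0.01      # 1% canavanine in the overall diet

    canavanine_per_kg_fresh = DRY_MATTER_FRACTION * CANAVANINE_DRY
    sprouts_kg = (DAILY_FOOD_KG * TARGET_FRACTION) / canavanine_per_kg_fresh
    print(f"~{sprouts_kg:.0f} kg of fresh sprouts per day")  # on the order of 10 kg

Under these (admittedly rough) assumptions, a human would need on the order of 10 kg of fresh sprouts per day, which is clearly implausible for whole sprouts, but less so once juicing concentrates the dose.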

Other Examples
Let's quickly mention a few other examples: pyrrolizidine alkaloids, present in herbal teas, are carcinogenic, mutagenic, and teratogenic (i.e., cause birth defects); gossypol in cottonseed causes abnormal sperm and male sterility, and is a carcinogen; piperine in black pepper causes tumors in mice; capsaicin in hot pepper is a mutagen; allyl isothiocyanate, in oil of mustard and horseradish, is a carcinogen in rats; and quinones and their phenol precursors (in many different plants) have both mutagenic and antimutagenic properties.

Other toxic effects of cooking


Heated Milk Protein
It is possible that heated milk protein may be a factor in atherosclerosis [Annand 1971, 1972, 1986].

Heated Fats

Oxidized fats, oils, and cholesterol. Research reveals that in animal models, oxidized fats, oils, and cholesterol induce higher levels of arterial plaque (i.e., atherogenesis) than do the corresponding nonoxidized fats, oils, and cholesterol [Taylor et al. 1979, Kummerow 1993, Kubow 1993, O'Keefe et al. 1995]. The biochemical processes that make oxidized fats atherogenic are the subject of scientific controversy; however, one suggestion is that the heating of fats, oils, and cholesterol increases the levels of lipid peroxide products. The idea is that the peroxides (in combination with lipids) promote an atherogenic response [Kubow 1993].
In tests feeding high-cholesterol diets to rabbits, the consumption of scrambled or baked eggs produced increases in serum cholesterol of 6-7 times the pre-existing levels, while fried or hard-boiled eggs raised levels by 10-14 times [Pollack 1958]. Cordain [in a posting to the Paleodiet list of 10/9/1997] also reports that his research group routinely induces atherogenesis in test animals (miniature swine) by feeding oxidized fats/cholesterol.

Role of oxidized LDL cholesterol in atherogenesis. O'Keefe et al. [1995, pp. 70, 72] explain the role
of oxidized cholesterol in atherogenesis as follows:


LDL cholesterol must be oxidized or glycosylated (or both) before it becomes atherogenic.(8,9) Oxidative modification of cholesterol occurs by means of oxygen free radical processes. Only after the LDL has been modified (through oxidation or glycosylation) does it activate differentiation and migration of macrophages. The scavenger receptors on the macrophages recognize oxidized LDL (but not unmodified LDL) and allow for subsequent phagocytosis. When the macrophage becomes filled with oxidized LDL cholesterol, it becomes the foam cell that is typically observed in early atherosclerotic lesions... The oxidative modification of LDL cholesterol seems to be the final common pathway in the process of atherosclerosis.

Steinberg et al. [1989] also report that oxidized LDL cholesterol, at high levels, is atherogenic. For a good summary of the atherogenic properties of oxidized LDL cholesterol, see Table 2 in O'Keefe et al. [1995, p. 72], and Table 1 in Steinberg et al. [1989, p. 917]. Here it should be noted that some aspects of the effects of oxidized cholesterol are controversial, in the sense that a scientific consensus has not yet been reached. Readers are encouraged to consult O'Keefe et al. [1995] and Steinberg et al. [1989] for a detailed overview of current knowledge in this field.

The paradox of relatively high cholesterol intake and cooked meats vs. rarity of heart disease in hunter-gatherer groups. It is also worth remarking that many other factors than lipid
peroxides influence the development and/or prevention of atherogenesis, such as the amount of saturated fat, the amount of mono- or polyunsaturated fat, the amount of carbohydrates and the insulin response, etc. Of particular note here is the example of hunter-gatherer societies, where the incidence of heart disease is extremely low (perhaps the lowest that has been seen among human groups), despite the fact that relatively large amounts of cooked meat are consumed. This is in marked contrast to the high levels of atherosclerosis associated with Western diets containing cooked meat. This divergence may be due to the differences between the type of meat (wild game) in hunter-gatherer diets--which in general is quite lean--and modern domesticated meats (wild game has one-fifth the fat, and its fat is one-fifth to one-sixth as saturated [Eaton 1996]), and/or the difference may be due to a range of other factors. Of specific interest regarding the subject of oxidized cholesterol is that while hunter-gatherers eat roughly the same amount of cholesterol (480 mg) as in the modern Western diet [Eaton 1992] (from meat, presumably cooked), their serum cholesterol levels, as measured in five modern hunter-gatherer groups, averaged a very low 123.2 mg/dL [Eaton 1992].

Viewing single factors out of context can be misleading. Thus if cholesterol and/or oxidized
cholesterol are in fact atherosclerotic in effect, then the implication is that there must be something else in the diets/lifestyles of hunter-gatherer groups mitigating or negating this effect. (A good overview of the large divergences between hunter-gatherer diets and the modern Western diet can be found in Eaton [1996]. Major differences are to be found in consumption levels of saturated and polyunsaturated fats, preformed long-chain fatty acids, protein, carbohydrate, phytochemicals, etc., as well as in exercise levels.)

The general point here is to keep in mind that, before attempting to form conclusions that might be premature, it is important to view the role of any one factor in the equation of health in the context of the overall diet rather than in isolation. Depending on the situation, the benefits of a food or class of foods may mean more for the health of the body than whatever associated negatives there may be--or vice versa. That nutritional benefits from foods, whether raw or cooked, unavoidably come at the expense of costs and trade-offs is a central issue that we will return to more than once, in different forms, as this paper proceeds.

Lesson of the Pottenger's Cats experiment: cats are not humans


In the period 1932-1942, Dr. Francis Pottenger conducted a series of feeding experiments on cats investigating the effects of cooked food. The papers published from that research have been compiled, edited, and released as Pottenger [1995]. Pottenger [1946] is relevant as well, and is available as a reprint from the Price-Pottenger Nutrition Foundation.

Dr. Pottenger observed that cats fed a cooked diet developed a number of pathologies, some of which were remarkably similar to certain diseases of civilization, whereas cats eating raw didn't suffer from these problems. It has thus been tempting to blame cooking for all the food-related evils from which we suffer; and in the raw-food movement, this study has been held up as the quintessential paradigm of proof of the perils of cooked food. However, casting Pottenger's experiment in this role suffers from some weaknesses that we shall examine, some of which are fundamental.

Domestic cats today reproduce prolifically on "cooked" commercial pet feeds. One
obvious remark about the Pottenger's Cats study is that domestic cats all around the world have been living and reproducing successfully for numerous generations on cooked domestic foods, or on canned or dry commercial feeds. Some cats are so domesticated that they don't even know what to do when they see a mouse, or will play with it for hours (why bother hunting when you are fed palatable pet food every day?). That is in contrast with Pottenger's cats, who couldn't reproduce beyond the third generation. It is certainly true that domestic cats do not have perfect health, but cat nutrition is as complex as human nutrition, and many factors other than cooking determine feline health.

Was Pottenger's cooked diet detrimental because it was "dead" or simply deficient?
One might argue in response here that commercial feeds are not simply cooked food--they are supplemented as well. But this, then, simply demonstrates it is not that the food is somehow "dead" (as raw-fooders often term cooked food) that is the underlying problem, but rather that the diet fed by Pottenger was deficient in some way. And we shall see later that cooked diets--in humans, at least--are not necessarily more deficient than raw ones. Moreover, one cannot compare pet diets--particularly cat diets--with cooked human diets, which are less monotonous and hence provide a larger variety of nutrients. (Cats are true carnivores, humans are omnivores eating a much wider range of foods.) To put it plainly: cats are cats, humans are humans, and there are significant differences between the two. Such considerations, therefore, suggest that--since domestic cats reproduce prolifically on today's cooked-food pet diets (to the point that everyone is urged to spay and neuter their cats to prevent severe overpopulation)--Pottenger's cats suffered not due to some magically bad toxic effects of cooking, or because the food was "dead," but rather from nutritional deficiencies in the diets fed by Pottenger.

Study not as well-controlled as newer studies. Pottenger's cat studies were done under
limitations that make them less well-controlled than more modern studies. Specifically, many of the cats in the study were donated, and although Pottenger tried to get as much information as possible on the cats' histories [Pottenger 1946], stray cats adopted by people usually come without a reliable health history that covers the cat's entire life. That is, the likelihood that some of the cats in the study were former strays means the study included cats with an unknown health history; i.e., the use of former strays introduces unknown error, in statistical terms.

Another possible confounding factor is the very nature of the diets used; from Pottenger [1946, p. 467]:

We placed an order for raw-meat scraps at the market where the Sanatorium meats were bought; these scraps included muscle, bone, and viscera.

The problem with the above is that it strongly suggests the composition of the diet was not well-controlled, at least not to the level that would be considered essential in modern studies.


While these two criticisms of Pottenger's methodology do not in themselves invalidate the results of the experiments, they do make inferences drawn from them about cat nutrition less than reliable, since the diet could not have been nutritionally analyzed for key dietary constituents known today but unknown then. This puts significant limits on just what conclusions can be drawn from Pottenger's experiments. Most important in this regard, we shall see in the bullet point just below that the experiments did not anticipate the crucial role of taurine in cat diets, a key nutrient for cats with critical implications for any study of cat nutrition.

Pottenger study was done long before the essential role of taurine in cat nutrition was known. Although taurine was discovered in 1838 [DeMarcay 1838, as cited in Huxtable
1992, ref. #134], the modern era of research on taurine did not begin until the late 1960s. See Jacobsen and Smith [1968] for a review of the state of knowledge about taurine at that time, including an interesting discussion of the similarities and differences in taurine synthesis pathways in humans and cats. As well, a comparison of Jacobsen and Smith [1968] with Huxtable [1992] shows the considerable recent expansion of knowledge about taurine, and its importance in mammalian physiology. Recall that the Pottenger cat studies were conducted in the period 1932-1942. Although taurine was known at that time, the level of knowledge was much less than today. In particular, the critical role of taurine in cat nutrition--the fact that it is an essential amino acid for cats--was not discovered until 1975, with the publication of two papers: Hayes et al. [1975a,b] as cited in NRC [1986] and Schaeffer et al. [1985]. Inasmuch as the role of taurine in cat nutrition was unknown at the time of Pottenger's research, the possibility that taurine might be a factor in (or explain) the symptoms observed in their cats was obviously unknown to Pottenger and his research team.

The cat feeding studies were apparently never replicated or confirmed by any other study. It's worth remarking once again, here, that the reliance on old scientific studies that haven't
been updated or confirmed--especially citing a single study without mentioning the context of the rest of the scientific literature on that subject--is a weakness of several other rawist arguments. Sentiment among some raw-fooders seems to be that there must be some secret scientific conspiracy to ignore the Pottenger experiment. But if the experiment has been relegated to the dustbin by science--at least in terms of its applicability to humans--the reason is far more logically the one which follows next.

Cats are not an appropriate experimental animal model applicable to humans. Even if one assumes the study to have been a valid one (as far as it went, at least), the decisive and most fundamental criticism of the validity of the Pottenger's Cats experiment is that cats are not used by researchers as an experimental model for humans, because the results cannot be extrapolated to human beings with any confidence. More common animal models whose results may have greater relevance for humans are mice, rats, and particularly primates, who are omnivores rather than carnivores as cats are.

Differences between cats and humans. Cats, being total carnivores, have special nutritional needs in certain respects when compared to humans (who are omnivores): they require inositol (a B-vitamin) but not vitamin C, and they need more protein (25-30%) than humans, as well as a fair amount of fat (15-40%). Recall that the SAD (standard American diet) is about 15% protein; that one can live (on a calorie-adequate diet) with 10% protein or even less without visible signs of deficiencies; and that low-fat diets may be as low as 10% fat. It is true that cats and humans do share some metabolic similarities in certain respects as well. (See Metabolic Evidence of Human Adaptation to Increased Carnivory on this site for examples.) However, given the significant differences, drawing more sweeping conclusions about human nutrition from experiments on cats cannot be given much credibility. Most crucially, of particular note are differences in taurine requirements:

o Cats lack the ability to synthesize taurine and require it in their diet. One of the more striking differences between humans and cats is that the latter require taurine in their diet, having no ability to synthesize it from precursors. While humans and cats are similar in that both have a reduced ability to synthesize taurine compared to herbivores, the key difference is that cats (who are carnivores) have completely lost the ability and must obtain all they need of it from their diets. (Humans, as omnivores, have retained the ability to synthesize taurine, although it is limited and inefficient compared to herbivores.)

o Heat-processing negatively affects taurine levels in cats. It has been shown [Hickman et al. 1990, 1992, Kim et al. 1996a, 1996b] that cats eating heat-processed foods have a lower plasma taurine concentration. The explanation is probably that Maillard reaction products promote an enteric (intestinal) flora that degrades taurine and decreases recycling of taurine by the enterohepatic route. Excessive secretion of the hormone CCK due to lower protein digestibility might be another reason [Backus et al. 1995]. (Note: "Enterohepatic" recycling occurs when food absorbed in the lower bowel is transported to the liver for storage and/or processing, some of which is released via the bile back into the upper bowel, where it can recycle again.)

o Taurine deficiency induces a number of pathologies in cats, such as retinal degeneration, heart disease, reproductive failure, platelet abnormalities, and developmental abnormalities [Waltham 1993, 1994]. In addition, taurine-deficient female cats have taurine-deficient milk [Sturman 1991], which may explain the worsening of the symptoms observed by Pottenger in the offspring.

Modern research on inter-generational effects of taurine deficiency in cats. The research of Sturman et al. [1986] is of particular interest in relation to Pottenger's cat experiments because it analyzed the effects of taurine deficiency on pregnant cats and their offspring. In this research, two groups of cats were fed a synthetic diet (obviously also thereby a "cooked" one), except that the diet of one group was supplemented with 0.05% taurine. The female cats were put on the diet for at least 6 months before breeding. A wide range of symptoms similar to those Pottenger observed in his cats were observed in the group of taurine-deficient cats in the study by Sturman et al. [1986], as follows:

o Defects in visual pathway (retinal degeneration) of taurine-deficient cats. Pottenger [1995, p. 10] reported "nearsightedness and farsightedness" in the cats fed cooked food. (Presumably Pottenger did not examine the cat retinas as Sturman et al. did.)

o Higher incidence of stillbirth in taurine-deficient cats vs. the control group. Pottenger [1995, p. 40] reports a higher incidence of stillbirth in his cooked-food cats vs. raw.

o Much lower survival rate of kittens born to taurine-deficient cats vs. controls. Pottenger [1995, pp. 12, 40] reports lower survival rates in kittens born to cooked-food cats.

o Kittens born to taurine-deficient cats weighed less than controls [Sturman et al. 1986, p. 658]; compare to Pottenger [1995, p. 40], which reports that kittens from cats on cooked food weighed less than those from cats on raw food.

o Kittens born to taurine-deficient cats exhibit "neurological abnormalities, including abnormal hind limb development..." [Sturman et al. 1986, p. 656]. Similarly, Pottenger [1995, pp. 10, 12, 27-32] notes nervous system problems in cats receiving cooked foods, differences in long-bone length, and differences in calcium and phosphorus content of cat femurs between the cooked vs. raw groups.

Another paper, Novotny et al. [1994], notes that taurine deficiency in cats can cause heart disease. Pottenger [1995] does not remark on the cats' hearts, but does report abnormal tissues in the lungs of the cooked-food cats.

Note that Pottenger's cats (in the main experiments of interest here) were fed 1/3 raw milk and cod liver oil, plus 2/3 raw meat scraps (raw group) or 2/3 cooked meat scraps (cooked group). Cod liver oil has little or no taurine in it, and cow's milk is extremely low in taurine. Thus the only source for Pottenger's cats to obtain adequate amounts of taurine would have been the meat scraps in their diet (cooked or raw).

Taurine deficiency is a plausible explanation for the symptoms observed by Pottenger. Now consider all the information above, i.e., the considerable similarities between
taurine deficiency and the symptoms Pottenger reported for the cats fed his cooked-food diet; the fact that the only source of taurine in the cats' diet was meat; and the fact that heat (cooking) has a negative impact on the bioavailability of taurine to cats. The obvious hypothesis suggested by the above is that the cats in the cooked-food group in Pottenger's research suffered from taurine deficiency. That is, the symptoms of the cooked-food cats were most likely the result of a nutrient deficiency, and not due to "toxins" created by cooking, as suggested by some raw-food diet advocates.

Of course, more than taurine deficiency may be involved here as well; perhaps other deficiencies were present, and the only way to know would be to repeat Pottenger's experiment exactly, but with various supplementation regimes, in order to pin down the true causes. But modern experiments, as cited above, make this all basically a moot point. Certainly at the least, they show cats can get along and reproduce well enough on synthetic cooked-food diets--thus demonstrating definitively that cooking itself is not the issue here; something else is.

Taurine requirements: once again, cats are not humans. Further, information on taurine
in other studies shows that what is valid for cats is not necessarily valid for humans. One interesting thing we know about taurine in humans, for instance, is that vegans who eat no animal products at all have low levels of taurine compared to non-vegetarian humans, whereas conventional (i.e., SAD-diet) omnivores have normal levels [Laidlaw 1988]. There is an obvious implication of this regarding the effect of cooking: if one assumes that most or all omnivores displaying normal levels of taurine are ingesting meats or animal foods that are mostly cooked, this suggests that cooking of animal foods does not cause the same problems in humans (omnivores) as in cats (true carnivores). Certainly it shows that a typical (mostly) cooked SAD diet does not have the same effect on humans as Pottenger's cooked diet did on cats.

For more information on taurine, its role in human nutrition, and differences in taurine levels in humans (i.e., vegans vs. the standard Western diet), see the discussion Taurine, a Conditionally Essential Amino Acid (about one-quarter of the way down the linked page). Finally, some readers may be curious about the estimated taurine requirements of our feline friends (i.e., the domestic cat): NRC [1986] recommends 400 mg taurine per kg of diet for kittens and adult cats, and 500 mg per kg of diet for pregnant female cats.
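For readers who want a feel for what that recommendation implies in practice, here is an illustrative back-of-the-envelope sketch, reading the NRC figure on a dietary (dry-food) basis; the 60 g/day dry-food intake for a typical adult cat is our own assumption, not an NRC value:

    # Rough daily taurine amount implied by a 400 mg per-kg-of-diet figure.
    TAURINE_MG_PER_KG_DIET = 400  # NRC [1986] figure, read on a dietary basis
    DRY_FOOD_KG_PER_DAY = 0.060   # ASSUMED intake for a typical adult cat

    daily_taurine_mg = TAURINE_MG_PER_KG_DIET * DRY_FOOD_KG_PER_DAY
    print(f"~{daily_taurine_mg:.0f} mg taurine per day")  # ~24 mg/day

Under these assumptions, the requirement works out to a few tens of milligrams per day, an amount readily supplied by fresh meat but vulnerable to the heat-processing losses discussed above.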

Raw fish destroys thiamine in cats. As an aside, it is known that raw fish contains an enzyme, thiaminase, which breaks down thiamine (vitamin B1). Thiamine deficiencies may therefore occur if cats are fed raw fish in excess, so we find here a benefit of destroying enzymes by cooking, if cats are fed this food. (This point is primarily of academic interest, of course, since fish were an insignificant part of the natural diet of the wild African ancestors of today's domestic housecats.)

A further point is that Pottenger's study tells us little about cats fed a more varied diet (Pottenger's cats were fed meat, supplemented with a little milk and cod liver oil), or fed
cooked food properly supplemented (in particular with taurine). Even more importantly, it doesn't tell us anything about 100% raw versus mostly-raw diets, which is one of the primary issues of interest to us here, and which will be examined later, in Part 3.

Pottenger's cat study was well-conducted for its day, but does not support the usual rawist conclusions
Although a few of the details of the Pottenger cat study might not meet current research standards, it appears that at the time the work was done, Pottenger's study was probably a good one, perhaps even excellent. Despite the major handicap of lack of knowledge about the role of taurine in cat nutrition, Pottenger had the considerable foresight to hypothesize (pp. 19-20, from the reprint of Pottenger [1946]; italics below as in the reprint):

What vital elements were destroyed in the heat processing of the foods fed the cats? The precise factors are not known. Ordinary cooking precipitates proteins,(7,8) rendering them less easily digested.(9)... It is our impression that the denaturing of proteins by heat is one factor responsible.

In summary. Thus we see that the suggestion in this paper--i.e., that the symptoms observed in Pottenger's studies likely were the symptoms of taurine deficiency--was expressed in a less precise form by Pottenger (the only form possible at the time of his research) as a possible explanation. Inasmuch as the above hypothesis of Pottenger appears to be substantially supported by research done 40-50 years later, that is a tribute to Pottenger and his skills as a scientist.

In this connection, we once again note that the rawist idea that "dead" food was responsible for the study results is not only vague but incorrect. Instead, a specific deficiency of a specific nutrient (whether of taurine or possibly some other nutrient(s)) is the explanation required to account for the cats' problems. It should also be noted, once again, that as cats are not a valid experimental model for humans in critical respects such as taurine metabolism, it is invalid to extrapolate the results as having specific relevance for humans. Thus, while Pottenger's results may have been valid (for cats), the usual rawist conclusions about them (for humans) are not.

Digestive leukocytosis: what a close reading of Kouchakoff reveals


One of the prime cornerstones of rawist ideology is a very old and dated research paper, Kouchakoff [1930]. In this research, Kouchakoff studied the effect of foods on human blood and observed what was termed "digestive leukocytosis": an increase in the number of white blood cells after eating foods heated above a certain temperature, whereas raw, unheated foods did not have this effect. This observation has been widely publicized by raw diet advocates and is often cited as proof that raw foods are "good" whereas all cooked foods are allegedly "bad."

The present section will examine Kouchakoff [1930], as well as a follow-up paper published only in French, Kouchakoff [1937]. This later paper is only rarely cited, if at all, in English-language rawist writings--understandable, given the language barrier, but unfortunate, since as we will see, its omission significantly changes the conclusions to be drawn from Kouchakoff's research overall.

Kouchakoff's experiments on which the phenomenon is based are of questionable validity.


The phenomenon termed "digestive leukocytosis" was discovered by Kouchakoff in the 1920s. However, since it has never been confirmed by subsequent scientists, it is not unreasonable to believe that it may have been due to experimental error. That the raw-foodist claims about leukocytosis rest on a single series of experiments harking back to the 1920s/30s would by itself be enough to raise doubts for any rigorous researcher about the potential oversights such early work might have had. Until these experiments are replicated with the benefit of considerably more sophisticated modern techniques and knowledge about the immune system, it is difficult today to take these results without a large grain of salt.

Experiments apparently not replicated since. The question of why Kouchakoff's experiments were
apparently never replicated is sometimes raised by raw diet advocates. Speculative claims may be made, e.g., regarding the alleged presence of bias among scientists who eat cooked foods (a point-blank claim, which, without specific evidence relevant to these particular claims, could be considered an example of what might be called "dietary racism"); or that Kouchakoff's results are self-evident. Of course, the latter claim is not only speculative, but self-serving.

Two possibilities come to mind here. First, it may be that the work was replicated (whether successfully or unsuccessfully) but unpublished, or published but obscure today, due to the limitations on searching the scientific literature of that period (much of it cannot be searched via computer databases). Second, and perhaps more likely, is that since the leukocytosis observed by Kouchakoff was short-term in effect, it was (and still is) seen as normal, and in particular, it was/is not seen as a pathological reaction, or one with clinical significance. (As will be noted below, there are at least a couple of other examples of short-term leukocytosis that can be observed under conditions most observers would consider "normal.") In other words, the reaction of the scientific community to Kouchakoff's research might have been, in figurative terms, a collective "so what?"

This latter possibility, and reaction, should be considered when one reads rawist accounts of the Kouchakoff research. One can find raw advocates who claim that the Kouchakoff research is of critical importance, and/or supports specific technical claims. However, a careful review of the Kouchakoff paper shows that there is often little or no connection between the actual Kouchakoff research and specific rawist claims allegedly based thereon.

Even at face value, Kouchakoff experiments not an argument against cooked food when some raw foods are also eaten. However, for the sake of discussion, let's look at the Kouchakoff
experiments upon which the raw-foodist claims are based. When we do, what we first notice is that Kouchakoff's experiments are not an argument against predominantly raw diets (versus 100% raw)--or even against predominantly cooked diets that include sufficient amounts of raw food--since according to Kouchakoff [1930]:

It has been proved possible to take, without changing the blood formula, every kind of foodstuff which is habitually eaten now, but only by following this rule, viz.--that it must be taken along with raw products, according to a definite formula.

Leukocytosis was avoided if at least 10% raw foods were also eaten. That is, Kouchakoff found (if his results are to be believed) that when a cooked food is ingested, leukocytosis can be avoided simply by adding about 10% of the same raw food [Kouchakoff 1937, p. 336], or a raw food whose "critical temperature" is higher [Kouchakoff 1937, pp. 332-334]. (Note: "critical temperature" here refers to the specific temperature for a specific food above which leukocytosis occurs.) Therefore, by these results, it is by no means necessary to eat 100% raw or even predominantly raw to avoid leukocytosis; even totally cooked meals would be fine provided cooking temperatures are low enough.
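Purely to make the stated proportion concrete (and emphatically not as dietary advice), here is a trivial illustrative sketch of the "about 10% of the same raw food" correction Kouchakoff describes; the meal and its weights are hypothetical:

    # Kouchakoff's reported rule of thumb: for each cooked food, adding roughly
    # one-tenth of its amount of the same food raw avoided leukocytosis.
    # The meal below and its weights (grams) are entirely hypothetical.
    cooked_meal_g = {"potatoes, boiled": 300, "carrots, boiled": 150}

    for food, grams in cooked_meal_g.items():
        print(f"{food}: add ~{grams * 0.10:.0f} g of the same food, raw")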

Foods cooked below their "critical temperature" did not induce leukocytosis. According to
Kouchakoff [1937, pp. 330-332], foods heated below their critical temperature for 30 minutes don't induce digestive leukocytosis. Critical temperatures vary between 87C (189F) and 97C (207F).

That even water was said to have a critical temperature suggests experimental errors were involved. Even water was said to have a critical temperature (of 87C, or 189F), which is quite surprising, as pure water is hardly toxic. This, of course, strongly suggests Kouchakoff's experiment may have had systematic experimental errors, and thus raises considerable doubt about the validity of the results in general.

Now--still assuming that Kouchakoff's observations are valid, which is somewhat questionable--it is important to understand the meaning of an increase in white blood cells. The naive interpretation of raw-foodists is that cooked food contains foreign substances that the body must fight; and in addition, since the body's immune system is busy with cooked molecules, it can't be fully efficient against germs.


Note that these last assertions demand specific evidence--which probably few if any raw-foodists who espouse this theory would be equipped to supply. That is, such assertions are simply speculation. (A common tactic in raw-foodist thought is to attempt to reverse the burden of evidence by adopting an unproven theory, then demanding that others disprove it; whereas by logical rules of evidence, the burden of proof is on those making speculative claims, not the other way around.) The efficiency of the immune system depends on a number of factors; it is much more than a simple matter of white blood cell counts.

Leukocytosis can occur under normal conditions of health. Next, we note that leukocytosis occurs
in other circumstances than infections [Gabriel et al. 1997]: in particular, following exercise. It has been shown that infection-induced and exercise-induced leukocytosis differ in nature, but the phenomenon is still not well understood. No one will say that exercising is bad because of the subsequent leukocytosis. Similarly, it is known that pregnancy induces leukocytosis [Branch 1992], as well as increased levels of adrenalin. The mechanism underlying this phenomenon and its physiological implications are not known.

Additional Kouchakoff article includes dietary advice for avoiding leukocytosis that includes cooked as well as raw foods. Finally, let's mention that an article of Kouchakoff's published
in French [1937] includes some dietary advice (pp. 314-316) to prevent digestive leukocytosis. As seen below, the recommended diet is not even predominantly raw, and foods such as grilled meat, bread, coffee, and salt are not prohibited. Of course, we express no opinion concerning Kouchakoff's ideas on food selection here; our point is simply to show that, in the eyes of the researcher himself who "discovered" a relation between leukocytosis and cooked food, only minor changes to fully-cooked diets are enough to prevent increases of white blood cells subsequent to meals. Our translation from the French follows:

Here is, along with their preparation method, a short list of the main foods we can use without risk of inducing digestive leukocytosis. For the sake of simplicity, "raw" will designate foods that are unheated, as well as those heated below their critical temperature. Furthermore, to correct the effects of a cooked food with a raw one, we recall that the latter should be added in the proportion of approximately 1 to 10. By "water," we mean ordinary drinking water, such as tap water.

Milk: raw, or heated below 88C (190F); if boiled, add raw milk or cream. No sugar.
Yogurt, curds: allowed.
Tea and coffee: add lemon juice, or water, or raw milk, or cream. No sugar.
Wine: should be diluted with two raw products: water, fruits, fruit juice.
Bread: always whole, with butter.
Eggs: fresh or soft-boiled. The yolk will remain raw and will correct the white.
Butter: fresh, or melted below 91C (196F).
Cheese: all kinds allowed, but should be taken with bread and butter.
Fruits: raw or in salad. If at least two different kinds are used, sugar may be added.
Sugar: avoid as much as possible; replace with honey. Remember that sugar should always be balanced with two different raw products.
Condiments: all kinds allowed (nutmeg, pepper, cinnamon, cloves, etc.), but add to cooked foods only when serving.
Salad: one, or better, several kinds (lettuce, endive, watercress, dandelion). By increasing the number of different components, several cooked foods can be corrected at the same time. Use (cold-pressed) first-choice olive or walnut oil. No peanut oil. No vinegar, but lemon juice. Salt and pepper unlimited.
Vegetables: raw, finely chopped and prepared at the last minute. As with salad, it is better to mix several vegetables (carrots, beets, turnips, potatoes, etc.), which will perform a multiple correction. Eat with a mayonnaise made from first-choice olive oil. No peanut oil. Fresh eggs, salt, pepper, chives to taste. Lemon juice. No vinegar.
Meats: any raw or rare meats; smoked or salt-preserved meats: herring, ham, bacon. Boiled, braised, or grilled meats will be served with a mixed salad or mixed vegetables. Fish can be steamed (trout). Thus, boiled water will be avoided, since steam is simply distilled water, which is neutral for the organism. When serving, add fresh butter, lemon, parsley, chopped onions, etc. Serve grilled meats in a similar fashion. For braised meat, the cooking water will be discarded and replaced with the juice of the same vegetables that are served with the meat (carrots, tomatoes). The meat will be corrected with fresh butter, lemon, parsley, and the vegetables themselves, or by a mixed salad or a vegetable salad.

Note that we are not here recommending a diet based on food choices like what Kouchakoff suggests above--only demonstrating that his research has been taken considerably out of context by extremists, and used to "prove" things that Kouchakoff's work itself does not support (such as that an all-raw diet is necessary, or is the only way, or the best way, to avoid leukocytosis).

Conclusion: assessing the risks of cooking by-products in context


A concise summary of the main effects of by-products formed as a result of cooking is shown in the table below, with a wrap-up and concluding commentary about these effects following.

Cooking By-Product: Dietary MRPs (Maillard reaction products)
  Main effects: Antinutritional. Anticarcinogenic. Risks for vascular system and kidney in diabetics.
  Main source: All cooked or stored foods, especially those high in protein and carbohydrate.

Cooking By-Product: Endogenous AGEs (advanced glycation end-products)
  Main effects: Aging. Risks for vascular system and kidney in diabetics.
  Main source: NOT in food. (Generated in the body by non-food pathway.)

Cooking By-Product: HCAs (heterocyclic amines)
  Main effects: Mutagenic, carcinogenic if dosage is high enough.
  Main source: Meat and fish cooked at high temperatures.

Cooking By-Product: PAHs (polycyclic aromatic hydrocarbons)
  Main effects: Mutagenic, carcinogenic if dosage is high enough.
  Main source: Meat and fish cooked at high temperatures.

Highest temperatures represent the primary risk. It appears from the research about Maillard
molecules, heterocyclic amines, and PAHs that the most dangerous compounds are formed at the highest temperatures, and that the longer the processing time, the more reaction products appear. The main effect of Maillard molecules themselves is that they slightly reduce lysine availability (other effects are probably minor); on the other hand, they have anticarcinogenic properties. One of the arguments found in the book Manger Vrai, by G.C. Burger (founder of "Instinctive Nutrition"), is that light cooking may be more dangerous than severe cooking, because slightly denatured molecules can pass the body's first-line digestive barriers more easily (since they are similar to naturally occurring molecules), and then accumulate in the body because they are unusable. While it's possible the reasoning as to passage through digestive barriers might be true as far as heat-denatured proteins are concerned, no evidence has been found that denatured proteins might be toxic. (Heat-denaturing merely means that a compact protein chain has been unfolded by heat and has become entangled with other unfolded protein chains, not that the components of the proteins themselves have been changed.) On the other hand, Burger's reasoning doesn't apply at all to the Maillard reaction, since:

- Whatever the temperature, Maillard products are notably different from the proteins they are produced from.
- Most, if not all, Maillard compounds also appear naturally, in foods stored at room temperature, or inside the body.
- There is evidence that toxicity increases with temperature.

Cooking probably represents a small risk in overall context, particularly with diets that are predominantly raw already. It seems that, among all the causes of cancer, and all the factors (including
various dietary factors besides cooking) at different stages of its development, carcinogens from cooked food would be likely to play only a modest role, even in the context of the standard American diet ("SAD"). Moreover, with more gentle cooking practices (avoidance of frying or grilling, preference for rare meat and/or steamed vegetables), very little toxicity results from cooking: it should be clear from our analysis that, for example, steamed spinach is not "poison," contrary to what some people claim. Besides, as remarked earlier, since natural foods do contain a certain toxin load of their own, it is very doubtful whether eating 99-100% raw makes much difference compared to, say, 90% raw or even 75% raw, contrary to what many raw-foodists believe.

PART 2: Does Cooked Food Contain Less Nutrition?


In this part, we'll investigate whether raw food is or is not more nutritious than cooked food. This will involve the following considerations:

- What is the effect of cooking on digestibility of various foods? Clearly, a food can have nutritive value only if it is digested; hence, it is necessary to investigate the effects of cooking on digestibility. We'll see that there is no general rule: there are many examples of foods for which cooking improves digestibility, and many examples for which it does not.

- Postulated effects of dietary "food enzymes." One of the main raw-foodist arguments against even low-temperature cooking is that heating above 104F (40C) results in progressive enzyme damage; hence, even if you steam your vegetables, your food is "dead." However, we will contend that such arguments are invalid, and that dietary enzymes have a very minor role in our health, if any.

- Degree of vitamin and mineral losses. It is well known that cooking may result in some vitamin and mineral losses, and we try below to estimate these losses. Not all vitamins are lost by heating; other factors (such as exposure to air or light) may also result in vitamin loss; and heating by itself doesn't destroy minerals (though some minerals may be leached into cooking water). In short, we try to determine how much of a food's vitamins and minerals can be lost by heating.

- Evaluating the pros and cons. As with the question of toxicity assessed in Part 1, the consideration here cannot be reduced to black and white; instead, the various factors must be weighed against each other. We conclude that cooking does not represent huge nutritional losses--but of course, when the choice is given, it is best to eat raw whatever is palatable in that form, and to cook as little as possible when heating is necessary to improve digestibility or taste.

Effects of cooking on digestibility

Starch
It is known that starch gelatinization--a change of structure into a form that resembles gelatin--results from cooking at temperatures higher than 70C (158F), and improves digestibility [Holm et al. 1988, Lee et al. 1985]. Cooking also neutralizes the anti-amylases (found in grains and seeds, and also in some tubers). On the other hand, some resistant (indigestible) starch is formed by cooking. Resistant starch is present [Englyst 1985] in smaller amounts in (dehulled, rolled, steamed) oats than in cornflakes or white bread; even so, about 94% of the carbohydrates are digested. Uncooked oats don't contain resistant starch, so their starch is totally digestible if left long enough in test tubes; but in practical terms it is less digestible than in cooked form. (Any reader not convinced of that should try two comparison meals for themselves, both consisting of 200 grams of oats, one raw and the other cooked.)

Experimental results show cooked starch to be 2 to 12 times more digestible than raw starch. Kataria and Chauhan [1988] provide a direct comparison of starch digestibility in raw vs. cooked
mung beans. Here, starch digestibility was measured in milligrams (mg) of maltose released per gram of food. The data from this study indicate that digestion of mung beans soaked 12 hours yields 25.3 mg maltose/gm; mung beans sprouted 24 hours yield 75.0; mung beans soaked and subjected to ordinary cooking yield 138; while mung beans soaked and pressure-cooked (for 5 minutes) yield 305 [Kataria and Chauhan 1988, Tables 1-3, pp. 54-56]. This is solid evidence that the starch in cooked mung beans is much more readily digested than the starch in soaked or sprouted beans--by a factor ranging from about 1.8 (i.e., 138 ÷ 75.0) to 12 (i.e., 305 ÷ 25.3), depending on differences in methods of preparing raw vs. cooked starch. We thus conclude here that overall, certainly at least in the case of mung beans, cooking greatly improves starch digestibility.

One final point regarding starch digestion needs discussion. Some raw advocates claim that their ability to digest raw starch increases when raw starches are a regular part of the diet. This assertion is based on close examination by the raw vegan of his/her stools (fecal matter; a topic of intense interest for some rawists). While there are reasons to believe that the human digestive system adapts, to a certain degree, to the diet eaten, it appears that there are no studies or experiments quantifying the degree of adaptation in comparisons of raw vs. cooked starch digestion. Furthermore, the limited qualitative anecdotal evidence available does not support a claim that the human digestive system can become 1.8 to as much as 12 times more effective in digesting raw starch--i.e., the magnitude of the effect seen from cooking of mung beans in the above-cited research.
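For readers who wish to verify the arithmetic, the 1.8x-to-12x range above can be recomputed directly from the cited maltose figures. A minimal Python sketch follows (the labels and variable names are ours):

    # Starch digestibility of mung beans, in mg maltose released per g of food,
    # as cited above from Kataria and Chauhan [1988], Tables 1-3, pp. 54-56.
    digestibility_mg_maltose_per_g = {
        "soaked 12 hours": 25.3,
        "sprouted 24 hours": 75.0,
        "soaked + ordinary cooking": 138.0,
        "soaked + pressure-cooked 5 min": 305.0,
    }

    d = digestibility_mg_maltose_per_g
    low_end = d["soaked + ordinary cooking"] / d["sprouted 24 hours"]       # 138 / 75.0
    high_end = d["soaked + pressure-cooked 5 min"] / d["soaked 12 hours"]   # 305 / 25.3

    print(f"cooked vs. best raw preparation:   {low_end:.1f}x")   # ~1.8x
    print(f"pressure-cooked vs. worst raw:     {high_end:.1f}x")  # ~12x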

Protein
We won't try to be exhaustive here, but our intention is to show by means of examples that, depending on the food and on the temperature and duration of cooking, a food can become more, or less, digestible, so that no general rule can be inferred.

Cooking followed by increased protein digestibility


From Oste [1991], many foods benefit from cooking:

Legumes: African yam beans (soaking + cooking/autoclaving), moth beans, mucuna seeds, mung beans, soybeans.
Cereals: wheat.
Seeds: sunflower seed.

Of course, other treatments, such as germination, can improve digestibility as well (at least for some of these foods).


Processing conditions may also alter the food structure on a cellular level, enabling digestive-tract enzymes to gain physical access to the protein. Some native legume proteins resist proteolysis (breakdown of proteins into smaller components), but are degraded after heating.

Destruction of antinutrients of primary importance. But perhaps the major reason for the
improved digestibility that can result from cooking is the destruction of antinutrients, such as protease inhibitors, polyphenols (including tannins and saponins), hemagglutinins, phytates, and dietary fiber. Polyphenols and phytates are, however, not reduced by heat alone. (For instance, soaking, sprouting, and fermentation also reduce phytates.) From Bradbury [1984], the protein digestibility of the aleurone layer and grain coat from raw rice was only 25%, but increased to 65% from cooked rice, due to the disruption of the cellulose cell walls at 100C (212F), which was shown by electron microscopy.

Cooking followed by reduced protein digestibility


From Bach Knudsen et al. [1988], protein utilization and digestibility are decreased by cooking in some varieties of sorghum. In contrast, protein digestibility of pearl millet and corn (maize) decreases only very little (not significantly) after cooking [Ejeta et al. 1987]. From Oste [1991], heating (above 100C, or 212F) decreases meat protein digestibility. Frying chickpeas, oven-heating winged beans, or roasting cereals at 200-280C (392-536F) reduces protein digestibility. Seidler [1987] studied the effects of heating on the digestibility of the protein in hake, a type of fish. Fish meat heated for 10 minutes at 130C (266F) showed a 1.5% decrease in protein digestibility. Similar heating of hake meat in the presence of potato starch, soy oil, and salt caused a 6% decrease in amino acid content.

Do "food enzymes" significantly enhance digestive efficiency and longevity?


The references discussed here include, first: Enzyme Nutrition, by Dr. Edward Howell (Avery Publishing, 1985). (Note: This version of Enzyme Nutrition is claimed to be an abridgement of an earlier book with the same title, said to be 700+ pages long. However, no one we have talked to who mentions the unabridged version has actually been able to locate it; nor have we, so far.) The second reference utilized here is The Status of Food Enzymes in Digestion and Metabolism, also by Edward Howell, originally published in 1946, and republished as Food Enzymes for Health and Longevity (Lotus Press, 2nd edition, 1994).

The food enzyme theory in a nutshell


Howell's theory of "food enzymes" as discussed in the above two books basically proposes the following:

- That the enzymes present in raw foods are of significant help in digesting the foods themselves once they are put into the human digestive system.
- That cooking destroys these "food enzymes," forcing the body to produce more of its own digestive enzymes than would otherwise be necessary to digest the food.
- That the body has a finite lifetime "enzyme potential" for manufacturing digestive enzymes, which is important for preserving health and longevity, and a portion of which is unrecoverably "used up" in producing otherwise unnecessary digestive enzymes each time cooked foods are eaten.
- That enzymes in raw food also carry the "life force," which can be transferred to the body, enhancing vitality and longevity; and that the body must use up some of its own "life force"/"enzyme potential" to compensate whenever cooked foods (which have no life force/live enzymes) are eaten.

In a nutshell, those are the primary elements of Howell's "food enzyme" theory. It is basically a hybrid two-part theory: part hypothetical science, part mystical theory. The first three bullet points above put forward the physiological elements of the theory; the fourth proposes a mystical element that is, ultimately, unverifiable by experimental method, and thus unscientific. In one sense, if the principle of Occam's Razor were employed, the fourth bullet could be pared away from the theory as entirely unnecessary, since it is essentially a redundant duplication (one restated in mystical terminology) of the physiology proposed in the third bullet.

Reference sources for theory are outdated. Although both Howell books proposing the food-enzyme
theory are dated, they are still among the most often-cited references in raw-food circles. In the 1946 book, for example, most scientific references cited are from the 1920s and 1930s. (By the way, isn't it startling that three major raw-foodist references are more than 50 years old--leukocytosis, Pottenger's cats, and Howell's 1946 book?) As we proceed, we'll see several examples where Howell's claims rest on science so outdated (yet still harked back to by his proponents today) that this fact alone invalidates the claim, to say nothing of the logical problems that also riddle much of Howell's theory.

Example of diversionary tactics that may be engaged in by present-day promoters of Howell's theories
Just as relevant to our discussion of Howell's "food enzymes" theory as Howell's own writings, though, is the flavor of the arguments made by raw-foodists who promote his theories today. To begin, let's take as an up-to-date illustration a debate over Howell's enzyme theories that occurred on the Veg-Raw email list (now defunct) in May and June of 1998.

Two commonly encountered logical fallacies. In this debate, the supporters of Howell's theories
engaged in two major logical fallacies that typify the kind of reasoning commonly offered in defense of his theories. First, when asked for credible evidence for the claims of Howell, his proponents asserted that such proof was in fact provided in the unabridged (rare, perhaps privately published) 700+ page version of Enzyme Nutrition--yet no one was able to produce quotes or evidence from it, because no one had actually seen a copy. One might as well claim that the unabridged version has credible proof that the earth is flat; i.e., such claims are pure speculation. Claims that aspire to be scientific (including the enzyme theories of Howell) require credible scientific evidence, not "vaporware" proofs based on evidence that advocates themselves admit they have not seen and cannot produce, even at the very time they claim it exists. Second, the supporters of Howell's theories made the common rawist error of adopting an unproven theory, and then aggressively demanding that others disprove it. By all logical standards of evidence, the burden of proof rests on those who propose a theory to provide evidence for it; a burden the rawists here would apparently have preferred to reverse or escape. Thus, those who advocate speculative claims, such as the ones Howell put forward, are logically required to provide reasonable proof, and as we shall see below, the proof provided is not adequate.

Assessing the main arguments and corollaries of Howell's theory of food enzymes


With that as preface, let's now look at the main arguments by Howell "proving" that food enzymes are important for human health, and the weaknesses of these arguments. It will appear that, in fact, "food enzymes" (meaning enzymes present in foods before they are eaten, as opposed to digestive enzymes produced by the body) play a very minor role in our health, if any.

CLAIM: Enzymes are not merely catalysts. They contain the "life force," which is subtle and unmeasurable. Once the body's enzyme potential has been used up, we die. Therefore, preserving
our enzyme potential will ensure longevity.

COMMENT: First let's note that, in the form it's asserted, the above claim amounts to assuming to be true what instead needs to be demonstrated/proven via evidence (and not simply
assumed). That there is an "enzyme potential" needs to be shown by evidence, yet here it is instead assumed at the beginning without proof. This is unfortunately a common logical fallacy in much of rawist thinking.

That enzymes are held to be the "life force" is mystical, not scientific. If the author starts with
these particular assumptions, one must question the scientific value of his reasoning. One of the most fundamental axioms in science is that the components of a theory must be at least in principle "falsifiable" to qualify as science in the first place. (That is, the components have to be capable of being observed, and in principle testable for truth or falsity by physical observation or measurement.) If one cannot physically observe or measure something, then it is not amenable to scientific testing, and advocates can basically affirm anything they want with regard to it, no matter what experimental results are obtained, since there is no scientific way to confirm or disconfirm something allegedly immaterial. Therefore, if one claims that enzymes contain a "life force" that transcends their measurable physical characteristics, then it is by definition a mystical assertion and not a scientific hypothesis.

Enzymes ARE catalysts, in the broad sense (i.e., they speed up a chemical reaction, or allow a potential-energy barrier to be crossed, and are left unchanged after the reaction has occurred). The mode of action of various enzymes has been described by scientists--there is nothing mysterious here. Anyone not convinced by the last statement is invited to check out a textbook about enzymes in any university library. Enzymes are not the "life force," and the idea that we have a certain "quantity of life force" at birth, and that by eating "living foods" we can increase our life potential (or at least preserve it), is naive and unproven. No one has ever measured any enzyme potential (in the sense that Howell understands it); and, according to modern theories of aging, such processes as cumulative oxidative damage to tissue, the decreasing length of telomeres with each cell division, and other known processes currently under investigation--rather than enzymes--are the most likely drivers of the aging process.

Howell postpones his proof of an "enzyme potential." It should be added that the logical
organization of Howell's books suffers greatly from the fact that he postpones the "proof" that the body has a limited enzyme potential, and some arguments in the first chapters rely on that assumption. Since, as we shall see, his proof of the existence of an enzyme potential is seriously flawed, these arguments become invalidated.

Side note in regard to Kirlian photos. Raw advocates often cite Kirlian photos that compare cooked
and raw foods as evidence of the "life force" in raw foods. Kirlian photos do show something, but there is no evidence that they are somehow photos of a (subtle and/or mystical) "life force." The logical, plausible explanation for Kirlian photos is that they show the effects of a type of corona discharge (the same phenomenon that produces lightning). The power of such a discharge will depend on moisture content (which in humans may vary with stress, e.g., sweating) and other relevant physical factors.


A corona discharge effect depends on ionized gases that result when a small electric current is applied to the object being photographed. Cooked foods have been heated, and heating can cause a loss of ions via steam and compounds dissolving into the water used in cooking (and the cooking water is usually discarded). Hence one would expect Kirlian photos of raw foods to have a stronger corona discharge than similar cooked foods. Even more telling, if Kirlian photos were actually the "life force" instead of a corona discharge effect, one would expect there to be no difference between Kirlian photos taken in a vacuum (where there are no ionized gases to create the corona discharge effect), and photos taken in normal atmospheric conditions. However, Kirlian photos taken of an object in a vacuum show no "aura." Thus we conclude that Kirlian photos do not show the "life force" or "aura," and are not credible evidence in comparing raw and cooked foods. Readers interested in Kirlian photos should check out the links http://www.dcn.davis.ca.us/go/btcarrol/skeptic/kirlian.html and http://www.theness.com/pseudo.html. (Note that the information on Kirlian photography in the second link occurs about halfway down on the page linked to.)

CLAIM: Food enzymes act in the mouth and in the upper stomach, before our own enzymes have begun to digest the food. A few enzymes are left undestroyed by the stomach acids and are
absorbed by the intestine. Thus, eating raw food will preserve the enzyme potential.

COMMENT: The impact of pre-digestion by salivary enzymes is small. Tortora and Anagostakos [1981, p. 628] note: Even though the action of salivary amylase may continue in the stomach, few polysaccharides [carbohydrates] are reduced to disaccharides by the time chyme [the mass of food and digestive fluids in the stomach] leaves the stomach. Also, Tortora and Anagostakos [1981, p. 629] report that 90% of all nutrients are absorbed in the small intestine. Further, digestion in the small intestine relies on bile and pancreatic enzymes, as a large part (but not 100%) of the food enzymes are destroyed in the stomach before the food reaches the small intestine. The above quote, coupled with the 90% absorption figure, suggests by extension that enzymes in raw food are of very limited nutritional significance.

Certainly cooking destroys enzymes in food, but may improve digestibility for other reasons. Indeed, as we saw earlier, cooking sometimes alters the cell structure so that the nutrients become
more accessible to our own body's digestive enzymes (such as by gelatinizing starch), or destroys anti-amylases or anti-proteases. Thus, in many cases, cooked food actually requires fewer enzymes for digestion than raw food.

Digestive enzymes, like any catalyst, are reused/recycled multiple times. Thus, any supposed
"savings" (of energy, or of a proposed "life force") in the production of digestive enzymes to make up for food enzymes lost to cooking (assuming the latter are of much use to digestion in the first place) would be very small and mostly illusory given that enzymes are reused during digestion anyway. (Note: Tortora and Anagostakos [1981, pp. 46-47] discusses reuse of enzymes.)
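To get a feel for why such savings would be negligible, consider enzyme turnover: a single enzyme molecule can catalyze an enormous number of reactions before being degraded. The back-of-the-envelope Python sketch below is ours; the turnover number and active time are illustrative order-of-magnitude assumptions, not figures from the cited references:

    # Rough illustration of catalytic reuse. Turnover numbers (reactions
    # catalyzed per enzyme molecule per second) commonly range from ~1 to
    # ~10,000 or more; the figures below are illustrative assumptions only.
    turnover_per_second = 100   # assumed turnover number for a digestive enzyme
    active_minutes = 30         # assumed time the enzyme stays active in the gut

    substrates_processed = turnover_per_second * active_minutes * 60
    print(f"one enzyme molecule -> ~{substrates_processed:,} substrate molecules")
    # one enzyme molecule -> ~180,000 substrate molecules
    # i.e., the per-molecule "savings" from enzymes supplied by food would be tiny.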

Instances of decreased digestibility from cooking are primarily due to reasons besides enzymes. It should be added that, when cooking does decrease digestibility, it's often not because food
enzymes have been destroyed. There are many examples (see above) of foods whose protein digestibility decreases only at temperatures much higher than 100C (212F)--but obviously, enzymes are already destroyed at 100C. Also, protein availability may decrease (slightly) because of the Maillard reaction, which has nothing to do with enzyme denaturation. So it is not true that cooking necessarily demands more enzyme production by the body; nor is it true that, in the cases where cooked food is less digestible, the effect can be attributed to enzyme destruction. Finally, and more importantly, the idea that the body has a limited enzyme potential is, to say the least, dubious. Enzymes in food are just what they are: a possible help in digestion. Obviously enzymes do indeed help--inside or outside the body: the ripening of fruits, the sprouting of seeds, legumes, and grains, and the aging of meat are examples of important enzyme actions that occur prior to consumption. But some types of food processing can also improve digestion, including cooking in some cases.

CLAIM: Humans have a larger pancreas than herbivores (per unit of body weight). It seems
by comparing different human populations that people eating more heat-treated carbohydrates (i.e., cooked starch) have a larger pancreas and larger salivary glands.

COMMENT: A larger pancreas is not necessarily "bad," and in any case can be explained by simpler factors than postulating overcompensations due to cooking. A difference between
humans and another species may be explained by any of the major factors that differ between the species. That is, the difference in pancreas size between species is not necessarily due to diet (though it might be, depending on specific circumstances); this follows from the principles discussed in Pagel and Harvey [1988]. As for the comparison between different human populations, a larger pancreas and larger salivary glands may result from many factors other than the supposed necessity of producing more enzymes due to cooking. In particular, a simpler and more reasonable suggestion would be that eating more carbohydrates (whether heat-treated or not) increases the secretion of insulin--subject to investigation, of course. (Generally, people eating heat-treated carbohydrates would be likely to eat more carbohydrates, since such carbohydrates are denser in calories, and cooked starch foods are usually softer and easier to eat than raw starch foods.)

Concerning enlarged salivary glands, the first point to note is that Howell never provided any hard
proof that having larger salivary glands is necessarily a pathological condition. We will see below that while certain health disorders are associated with larger salivary glands, i.e.:

  specific health disorder => enlarged salivary glands

that does not imply the converse, which is what the Howell theory asserts (without proof), i.e.:

  enlarged salivary glands => health disorder or bad health

Since size is always measured relative to something else, how do we determine a baseline?
Another relevant point here is that in deciding what is "enlarged," how is the baseline or "normal" size to be determined for comparison purposes? If a person makes no lifestyle changes, yet their salivary glands swell to double their normal size in a few days as a result of a health disorder, then the term "enlarged" can reasonably be used. On the other hand, if one person has larger salivary glands (adjusted for body size) than another person with a different diet, are one of the person's glands "enlarged"? Or are the other person's instead "reduced"? In other words, on what basis does enlargement (or reduction in size, for that matter) due to lifestyle or dietary differences constitute a normal response on the one hand, or a pathological one, on the other? And on yet another hand, might some differences be simply an example of natural (genetic) variation? How can one answer these questions reliably? Without controlled data, you simply do not know.

Controlled data on size of salivary glands is mixed. In relation to salivary glands specifically, this
brings us to the second point for discussion here, which is that the evidence relating their size to diet is mixed. Etzel [1993, p. 136] cites two studies showing that diets high in tannins can increase the size of the salivary glands. Cooking can deactivate some tannins, while increasing the bioavailability of others by concentrating them in cooking broth/water.


Etzel [1993, p. 140] reports that protein deficiency can reduce the size of salivary glands in test animals. He also cites studies (pp. 137-138) showing mixed results of diets that are calorie-restricted: some diets showed decreases, others increases in salivary gland sizes.

What can be concluded? Putting the above information together, since conservative cooking increases
the bioavailability of protein and energy (e.g., digestion of cooked starch is more efficient than raw), the hypothesis that a cooked diet might produce larger salivary glands than a similar raw diet seems plausible. That is, the effect of cooking (if any) on the size of the salivary glands may actually be the effect of increases in the bioavailability of protein, and/or increased absorption of tannins (rather than the lack of enzymes). So what, if any, conclusions can be drawn, then? Although the salivary glands can enlarge due to certain specific health disorders--e.g., sialadenosis; see Rossie [1993]--Howell produces no evidence that the general kind of enlargement he discusses is pathological or harmful in any way. And finally, the most important point here is that there is no credible evidence that links the change in size of salivary glands to the lack of enzymes in cooked foods.

Since cooked starches are actually more easily digested than raw, enzyme production is not the real issue here. Finally, the question of starch digestion is probably of minor importance here, in any
event. In general, cooked starches are easier to digest than raw starch foods; see Kataria and Chauhan [1988] and Bornet et al. [1989] for some specific examples. Note that cooked starch is extremely easy to digest (humans and cattle are easily "fattened" with cooked starch), and if one replaces starch with other foods, there is a need to produce other enzymes anyway. Thus, despite the fact that cooking destroys the enzymes in raw starch foods, the end result (cooked starch) is easier to digest than raw starch. Consequently, if Howell's claim that populations eating more heat-treated carbohydrates have a larger pancreas and salivary glands is true, then as mentioned previously, the simplest hypothesis is that it is explained by the larger total volume of carbohydrates eaten (even though fewer enzymes per unit of volume would be required for cooked starch).

CLAIM: Laboratory rats fed cooked foods have a larger pancreas than rats fed raw foods, especially wild rats. Similarly, wild mice on a raw diet have (relatively) larger brains than domestic
laboratory rats fed cooked foods.

COMMENT: The above claims are made in Howell [1985], but the evidence provided for them is, quite frankly, sloppy and logically invalid. The evidence for the claims regarding brain
size is presented in Tables 5.1-5.3, pp. 75, 77 of Howell [1985]. However, a look at the tables reveals that in every case Howell is mixing data from different researchers; more critically--and this is the point that renders his comparison invalid--he is comparing data from what appear to be different species and/or strains of rat. That is, a difference in brain size between different species (or strains) is not necessarily due to differences in diet (this is a generalization of Pagel and Harvey [1988]). And of course, Howell's claim that the size difference is due to enzymes in the diet is nothing but speculation on his part, in any event.

Howell's data on pancreas size is similarly sloppy and logically invalid. Howell says [1985, p. 82]:

  ...[F]eeding one group of mice a raw diet, and another group the same food cooked (and therefore enzymeless). A reasonable period to complete the job would be two months. The animals would then be dissected and each pancreas weighed. But all you would have to do is read about this laborious experiment; the work has already been done.

Expanding on the above quote, the procedure Howell suggests is to take a group of rats, all raised in the same manner, and randomly extract two samples. Feed each sample the same diet, except that one sample gets the food raw, the other cooked. Then sacrifice the rats and compare organ weights. Although not mentioned by Howell, to avoid confounding caused by the use of different species and/or strains, the comparison of organ sizes must use a single species/strain for all animals and diet(s) that one wishes to directly compare. Such a structure makes some sense (provided, of course, that a single species/strain is used for the studies),
and could potentially allow assessment of dietary effect on pancreas size. But--do the studies cited by Howell meet this standard? Let's examine the data in Howell [1985] carefully.

The data cited by Howell do not meet the above stipulated experimental requirements. The
data in support of the claims of pancreas size differences is given in Tables 5.7-5.9 of Howell [1985, pp. 81, 83].

- Table 5.7 compares the weight of laboratory mice against 8 strains of wild mice. It is highly probable that the wild mice are a different species, so the observed differences may simply be differences by species, and not necessarily related to diet.
- Table 5.8 presents data from Brieger [1937] on the results of feeding rats 3 different types of raw diets. Unfortunately, Brieger [1937], as reported in Howell, provides comparison data only on raw-food diets, and does not provide data on cooked-food diets. Table 5.8 also has data from Donaldson [1924], but we don't know if the rats used by Donaldson are of the same species or strain used by Brieger, hence cannot compare.
- Table 5.9 provides data from a variety of researchers on rat pancreas size. However, none of the studies cited is the raw-versus-cooked comparison that Howell says we need.

Thus the result of a close examination of the data in Howell is to conclude that it is a confused mess, and the data do not meet the criteria specified above. That is, none of the studies cited compares raw vs. cooked diets for the same species/strain of rats; also, the species/strain used is not specified for some of the studies. Thus we note that the data presented by Howell do not support his conclusion.

Other logical/procedural problems with the experimental data cited. The approach of Howell to
the above topic has a number of other problems. First, the laboratory rats used in some of the earlier papers cited by Howell may have been fed "raw grain" [Howell 1985, p. 77] as part of the diet provided them. The term "raw grain" might also include raw legumes, and it is well-known that the enzyme inhibitors in raw legumes can increase pancreas size in rats; Liener [1994] and Pusztai et al. [1995]. Hence any laboratory feed that included raw legumes might cause an increase in pancreas size. Second, and of greater importance, are the following:

It is well known that the digestive system in a wide variety of animals can and does change in size in reaction to changes in diet. This effect can occur rapidly, in as little as a
few weeks (i.e., within the duration of most feeding studies). See Milton [1987] for references on this topic.

Underlying all of Howell's discussion on pancreas size is the assumption that a larger pancreas is "bad" or "harmful" in some sense. However, Howell never provides a credible proof for this critical point.

Thus we conclude that Howell's analysis of pancreas sizes is flawed and of little relevance, and certainly does not prove anything about enzymes. Note: Neither Howell [1985] nor Howell [1946] includes a reference list--yet another serious shortcoming--hence we cannot provide a citation for Brieger [1937] or Donaldson [1924], above.

CLAIM: Germ-free laboratory animals--those raised in a sterile environment consuming sterile food, and who therefore do not obtain any enzymes from non-bodily sources, either from food, or from intestinal bacteria (i.e., "germs")--are not healthy. However, arctic animals,
who do not get enzymes from their intestinal tracts (which happen to be sterile, i.e., germ-free, like the germ-free lab animals) but DO get enzymes from their food, ARE healthy. This demonstrates that the body is unable to produce enough enzymes on its own to sustain health, and so food enzymes are an important factor for human health.

Explanatory Note: By "germs" is meant all microorganisms with which the animal is in contact (such as
intestinal bacteria). A germ-free animal is raised in complete isolation, and is fed sterilized food. Therefore,
of course, it can't have infectious diseases. Howell considered that (intestinal) germs can be a significant source of enzymes (since any living organism contains enzymes), so the only animals not receiving an external source of enzymes are germ-free ones who also eat sterilized food.

COMMENT: The first remark is that the "proof" here is incomplete. The only way to
complete it would be to take arctic animals to the lab and feed them an enzyme-free (but not vitamin-deficient) diet. Some animals might need germs (for reasons other than enzyme production), and others not.

The second remark is that, at that time, germ-free animals were still relatively new, and their special nutritional needs not well-known. Nowadays, they have been extensively studied and
used [Wostmann 1996]. The nutritional requirements of germ-free animals differ from those of normal animals for many reasons. (Germs, i.e., bacteria, can have various effects on digestive elements of the intestinal tract and on its physical conditions; they may synthesize certain B-vitamins, chelate certain vitamins or minerals, etc.) The lifespan of germ-free animals has gradually improved from the beginning of the 20th century until now, with a better understanding of their nutritional requirements. Nowadays, the core of the diet is maintained at 120C (248F) for at least 15 minutes, which no doubt inactivates all enzymes. The resulting diet needs to be supplemented with high-quality protein and vitamins to keep the animals healthy.

In reality, germ-free animals, if properly fed, are healthier overall than their conventional counterparts [Wostmann 1996]. They have a considerably lower incidence of kidney disease, fewer solid
tumors, and live on average 6 months longer; and in general, spontaneous tumors occur somewhat less frequently in germ-free rats than in conventional ones. (To be more precise, by "conventional" is meant non-germ-free, but fed the same synthetic diet as the germ-free animals.) They also reproduce as successfully as their conventional counterparts. Certainly, these comparisons tell little about whether animals fed a natural diet would be healthier or not, but at least two points are proved: (a) you can live reasonably well without food enzymes, at least for a few generations; and (b) it's not true that animals require enzymes produced by germs in order to be healthy. And of course, there is no reason to believe that the natural fate of animals should be to die peacefully, with all organs still functioning perfectly... Humans are idealistic, not Nature. So, ironically, the crucial argument of Howell, which could have been considered strong in his time, is reversed: it shows that animals not receiving exogenous enzymes (from food or bacteria) aren't less healthy, and so food enzymes are not necessary to maintain good health (sure, sure, maybe the 100th generation will become sick... :-) ).

CLAIM: Enzyme secretion is decreased in older people, and in some diseases. Therefore, older people have a practically used-up enzyme potential, and people are sick because of a low
enzyme potential.

COMMENT: Here again, the unproven concept of enzyme potential is used, and asserted with no evidence offered. In older people, all body functions can be impaired, as well as mental
functions. But no one would say that the "thinking potential" has been used up, or that we should exercise our brains less in order to preserve the reserves of thoughts, or "thought potential," that we inherited at birth! The same remark applies to enzyme secretion in disease: secretion is impaired because of the disease, and not the other way around.

CLAIM: Continuous total loss of pancreatic juice by means of an external fistula (duct) is quickly fatal (dogs die after a few days). Therefore, it is important for the body to preserve pancreatic enzymes.


COMMENT: The logic behind the author's claim here is not apparent. Assuming for the
moment it were true that an animal will die when completely deprived of pancreatic juice, it does not at all prove or even suggest that the pancreas isn't able to produce enough enzymes daily on an as-needed basis, as related to type of food eaten. All the above experiment tells one is that SOME pancreatic function is necessary to sustain life. No one argues that point. Beyond this it gives no further insight, and is beside the point in regard to Howell's enzyme theory about "saving" enzymes.

The pancreas secretes other critically important substances besides just digestive enzymes.
Humans secrete a considerable amount of pancreatic juice every day (1.2 to 1.5 liters), and it consists of more than just enzymes: pancreatic juice contains water, select digestive enzymes, sodium bicarbonate, and some salts. The pancreas also secretes important hormones, e.g., insulin and glucagon. For details, see Tortora and Anagostakos [1981, pp. 615, 626]. So the above experiment--by itself--cannot even tell us whether death is caused by the loss of enzymes rather than of something else in pancreatic juice, or whether death is due to physical factors unrelated to enzymes.

Reasoning is typically black-and-white. However, even if one were to assume for the sake of
discussion that it is in fact the enzymes which are responsible, we are still left with the fact that total loss of a function or secretion is a black-and-white discontinuity (seized upon by black-and-white reasoning) that doesn't furnish us with any insight into the actual efficiencies or functional RANGE that an organ operates within under actual conditions.

CLAIM: You have a "bank account" of enzymes from which your withdrawals are limited. Your pancreas makes digestive enzymes at the expense of metabolic enzymes, by stealing "precursors."

COMMENT: Howell reiterates these points several times, but once again never presents credible proof for his claims. Digestive enzymes are made in the parotid glands, submandibular glands,
sublingual glands (saliva), stomach, pancreas, and in the small intestine mucosa and submucosal glands. One wonders how the pancreas (as Howell alleges) can somehow magically prevent the manufacture of metabolic enzymes in the blood, lymph system, or brain of a person. Can the pancreas somehow "vacuum up" all the precursors? Apparently this is supposed to happen by a process whereby the pancreas creates a drain on "precursors" from some common pool--but one which is never located or biologically delineated, and conveniently left vague and purely speculative. An analogy that has been put forth by one raw-foodist supporter of Howell today likens the body to an electronic computer's multitasking CPU which has to split its available processing cycles or energy supply between various "tasks," which are then equated with enzymatic manufacturing processes (or vaguely, "energy use") in the body. But again, the analogy is left conveniently imprecise without delineating the corresponding biological pathways or counterparts that might actually behave this way.

Confusion over "recycling" of enzymes has contributed to flawed claims. The idea that your
body must "rob" materials from metabolic enzyme production in order to produce digestive enzymes is at the heart of Howell's theories. If the claim were true, one would expect the body to be efficient at recycling enzymes, particularly in intact forms. Indeed, one claim occasionally made by today's promoters of Howell's theories is that digestive enzymes are recycled intact in just such a fashion. It is likely that part of the reasoning behind this idea that enzymes can get recycled between very different bodily systems arises out of confusion over just what the well-known "recycling" properties of enzymes actually refer to. As mentioned earlier, enzymes do in fact get "recycled" on a limited basis--though to put it more precisely, what actually happens is that they are immediately reused (multiple times in quick succession) during their specific phase of intended "use"--i.e., say, for the limited period of time in the stomach or intestines during which they are breaking down a specific food substrate.


This well-known, localized, phase-specific recycling is much different, however, than the type of body- or system-wide recycling being proposed by proponents of Howell. More accurately, then, to better distinguish the latter type of "recycling" from the former, what Howell's promoters are proposing here would be better termed a process of "shuttling" enzymes between different enzyme production systems in different parts of the body. Let's examine these claims and find out what they tell us about Howell's theories.

Enteropancreatic circulation of enzymes? In the late 1970s and early 1980s, a few scientific papers
were published claiming that there is an enteropancreatic circulation of digestive enzymes. By the term "enteropancreatic circulation," these papers proposed that excess pancreatic enzymes are absorbed from the intestines into the bloodstream; the enzymes are then circulated through the bloodstream, and resecreted by the pancreas--i.e., they are shuttled around and eventually recycled as a conservation measure. Although this is not identical to the claims of Howell himself, there is some commonality, as this theory suggests that human physiology actively seeks to preserve enzymes, i.e., in Howell's terminology, to preserve the "enzyme potential." Further, being realistic about it, the idea of "enteropancreatic circulation" would probably have to be considered the best modern translation of the form that Howell's dated hypothesis of an "enzyme potential" would need to assume in order to be taken seriously. And it is probably a fair approximation of how the assertions made by his supporters today would have to be rendered in biological form, if they are also serious about translating the idea that enzymes get shuttled/recycled into a meaningful form amenable to real scientific evaluation.

Early studies claimed existence of enteropancreatic circulation of enzymes. The papers of Liebow and Rothman [1975], Gotze and Rothman [1975], and Heinrich et al. [1979], among others, claim to prove the existence of such an enteropancreatic circulation. The efficiency of the circulation in these studies was controversial; Gotze and Rothman claimed 60-90% efficiency, while Heinrich et al. reported only 1%. Occasionally, some of these studies are cited by raw diet advocates in support of the general theories of Howell, and allegedly as evidence that one needs to eat primarily raw/so-called "enzymatic" foods.

More modern radioactive-tracer studies demonstrate little or no evidence of enteropancreatic circulation. A number of subsequent studies (which usually are not cited by raw diet advocates) have found no, or very limited, evidence of enteropancreatic circulation of enzymes.

Studies with dogs. Bohe et al. [1984] injected radioactive-labeled trypsin into dogs, and found
that only 0.15% of the radioactivity was secreted into the dog's pancreatic juices. Levitt et al. [1981] injected radioactive-labeled amylase into dogs, and found that less than 0.02% of the radioactivity was excreted in pancreatic juices. Both of these studies clearly demonstrate, via radioactive tracers, that enteropancreatic circulation in dogs is very limited, and such circulation is not biologically significant.

Studies with rats. Rohr et al. [1981], using radioactive tracers, proved that there is no enteropancreatic circulation of any kind in the rat. In two other studies, Urban et al. [1982] and Urban and Zingery [1982] analyzed rat intestines and found them relatively impermeable to ribonuclease, a major element of pancreatic fluid. This research adds further support (though in itself it is not a direct proof) to the conclusion that there is little or no enteropancreatic circulation in the rat.

A major study that clearly discredited the theory of enteropancreatic circulation was that of Rohr and Scheele [1983], who injected radioactive pancreatic proteins into the bloodstream of rats and studied the absorption. They note [Rohr and Scheele 1983, p. 991] (emphasis below mine):

  The majority of pancreatic proteins (~97%) were taken up by a variety of body tissues, particularly kidney, liver, spleen and lung...


  These studies indicate that pancreatic proteins are removed from the blood circulation by at least three separate pathways: (a) uptake and degradation by a variety of tissues in the body (~97% of injected radioactivity), (b) excretion of intact proteins into urine (1%-2%), and (c) transport of intact proteins into bile (0.3%-0.5%). Transport of exocrine pancreatic proteins from the blood circulation to pancreatic juice could not be demonstrated.
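Taken at face value, the pathway percentages in this passage leave essentially nothing available for recirculation, as a quick tally shows. The Python sketch below is ours; the midpoint figures are simply our reading of the ranges quoted above:

    # Fate of radioactively labeled pancreatic proteins injected into the
    # bloodstream of rats, per the Rohr and Scheele [1983] passage quoted above.
    # Midpoints of the quoted ranges are our own simplification.
    pathways_percent = {
        "tissue uptake and degradation": 97.0,  # "~97%"
        "intact excretion into urine": 1.5,     # "1%-2%"
        "intact transport into bile": 0.4,      # "0.3%-0.5%"
    }

    accounted_for = sum(pathways_percent.values())
    print(f"accounted for by the three pathways: {accounted_for:.1f}%")
    print(f"remainder (upper bound on any recirculation): {100 - accounted_for:.1f}%")
    # ~98.9% accounted for, leaving ~1.1% at most -- consistent with the authors'
    # finding that transport back into pancreatic juice could not be demonstrated.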

Initial studies "proving" enteropancreatic circulation were flawed and invalid.


Scheele and Rohr [1984] and Rohr et al. [1981] provide a detailed critique of the errors in the earlier studies that found an enteropancreatic circulation of enzymes, and explain why the studies are invalid. In particular, Rohr et al. [1981] report that the results of some of the previous studies actually reflect the metabolic breakdown of the radioactive enzyme, followed by reuse of the radioactive tracer protein by the pancreas to make new enzymes (i.e., the earlier studies were misinterpreting their data and/or lacked controls to make sure the enzymes measured were actually exogenous). Scheele and Rohr [1984] also mention the possibility for input of exocrine pancreatic proteins via diffusion; however, such an input process will necessarily support only very low levels of enzyme output.

Enteropancreatic circulation and Howell's theories. Now to bring the discussion back to the
theories of Howell, we note that there is no evidence the body tries to recycle intact digestive enzymes. Instead, "excess" digestive enzymes are absorbed by the body's various tissues. The body has no mechanism for conserving (by recycling) digestive enzymes--a surprising result if one believes that enzymes are the body's "life force" and/or that we have a limited supply or "bank account" of enzymes, as alleged in Howell's theories. For the sake of discussion, however, let's now assume (pretend) that some digestive enzymes are assimilated, broken down via the protein cycle, and the base proteins are then used by the pancreas to make new enzymes. Interpreting the results of Heinrich et al. [1979] as reflecting such a stepwise metabolic processing of enzyme proteins suggests that the recycling of pancreatic (digestive) enzyme protein results in a reuse of at most 1% of the digestive enzyme proteins. (The study of Gotze and Rothman [1975] is unreliable and invalid, hence their higher reuse figures will not be analyzed here; see Scheele and Rohr [1984] for a critique.)

Enzymatic precursors: scarce or abundant? Now let's link this to Howell's theories. The extremely
low levels of enzyme protein recycling/reuse noted in the preceding paragraph indicate, to use Howell's terminology, that the "enzyme precursors" are not scarce but instead are present in great abundance: if precursors were in short supply, one would expect the body to scavenge and reuse them aggressively, and it does not. This, of course, directly contradicts Howell's claim that one must "rob" from metabolic enzymes in order to make digestive enzymes.

CLAIM: Animals in captivity fed a natural (raw) diet are healthier than animals fed a cooked diet similar to the diet humans eat. Therefore, health suffers from the lack of food enzymes.

COMMENT: No one with an interest in dietary issues doubts that a natural diet approximating an animal's wild diet is superior to feeding them a human diet (cooked or not),
and for many reasons unrelated to enzymes.

CLAIM: Raw milk and raw meat have a higher nutritional value than pasteurized milk and cooked meat, as shown by comparing the growth rate of rats. Therefore, food enzymes improve
the availability of nutrients.

COMMENT: Cooking affects a food in numerous ways, not only by destroying enzymes. In
particular, heating produces Maillard molecules which reduce lysine availability (see above). We also have seen that reduction of digestibility can be caused by many other factors than enzyme destruction.


CLAIM: Traditional Inuit ("Eskimos") were healthy, but modern Inuit, who eat cooked food, develop numerous pathologies.

COMMENT: The claim that the traditional Inuit diet was a raw-food diet is simply a legend, not fact, and has no proof to back it up. In fact, early contact reports with the Inuit,
reviewed in the section, The traditional diet of the Inuit: Were they the only raw-food culture? indicate that the traditional Inuit diet was a mixed diet, cooked plus raw. (Sufficient data are not available to determine whether the traditional Inuit diet was a mostly raw-food diet according to the definition used here, i.e., 75+ % raw by weight.) Thus Howell's claim that the Inuit were rawists is dubious at best. However, for the sake of discussion, let us assume that the Inuit diet (at least in some cases) was a raw-food diet in the modern sense. We observe that the claim suffers from the same logical fallacy as the previous two claims: If A is proposed as being caused by B, but can also be, or is already known to be, caused by C, D, E, and F, etc. (which the enzyme proponent completely ignores), one cannot simply pin blame on B (cooking in this instance) by fiat, merely by arbitrarily choosing to completely ignore the many other factors.

Modern Inuit eat Westernized diets with all that that entails, not simply, or only, "cooked" foods. We won't expand much further here, but it is clear that those Inuit (if any) who followed a
predominantly raw diet, who then became Westernized, not only started to eat more cooked foods, but their diets became increasingly Westernized as well (e.g., highly processed foods, refined sugar, etc.). Much modern research has demonstrated that Western diets are harmful due to numerous factors that have little or nothing to do with cooking by itself. For example, cardiovascular disease was almost nonexistent in those following a traditional Inuit lifestyle [Schaeffer 1981]. Possible causal factors include their low-carbohydrate diet (hence low insulin secretion) and a favorable fat profile (polyunsaturated fatty acids from fish). Other evidence indicates that cancer was very rare among the Inuit living their traditional lifestyle; Stefansson [1960], Schaeffer [1981]. Finally, we note that if one does not assume that the traditional Inuit diet was a raw-food diet in the modern sense, then Howell's claims about enzymes in the "raw" Inuit diet become irrelevant.

CLAIM: In nutritional experiments on rats with a synthetic diet, supplemented by the addition of vitamins (relatively enzyme-free) and salts, McKay et al. [1935] observed many pathological conditions such as tumors and diseases of the lungs, kidneys, and genital tract.

COMMENT: Here, Howell fails to mention clearly that the main point of the experiment by McKay et al. was to study the effect of calorie restriction on longevity. Many rats showed
diverse signs of pathology at old age, but obviously some physical degeneration is predictable. A glance at the article shows that the rats were fed a diet of (numbers in parentheses indicate percentage contribution to the diet by weight): casein (40%), starch (22%), sucrose (10%), lard (10%), dried yeast (5%), cod liver oil (5%), salt mixture (6%), and cellulose (2%). The authors considered the diet vitamin-adequate and didn't supplement it with any vitamin mixture; but it is important to note that, at that time, not all the vitamins and essential fatty acids had been discovered yet, and the nutritional requirements of rats weren't well determined. Here again we can see the problems involved when raw-food theory grasps at such badly out-of-date research for support. To get an idea of the state of nutritional knowledge in the 1930s: folate was discovered in 1941 and synthesized in 1948; vitamin B-12 was isolated in 1948.

The diet of these rats was deficient and much less elaborate than what laboratory animals receive nowadays; compare for instance with Weindruch [1986], where rats get complete vitamin and
mineral mixtures, in addition to casein, cornstarch, sucrose, corn oil, mineral mixture, fiber, brewer's yeast, and zinc oxide. It is true that laboratory rats show signs of disease at old age in contemporary experiments, but at least part of the diseases can be explained by the natural degenerative process: no diet can maintain anyone young forever. In addition, a fair amount of the protein is from casein, which can hardly be considered as part of their natural diet, and which actually can cause kidney pathology [Iwasaki et al. 1988], as well as be a source of antigenic dead microorganisms [Wostmann et al. 1971]. (Something is said to be "antigenic" if it triggers the formation of antibodies by the immune system. The purpose of antibodies is to get rid of antigens, which are perceived by the body as "foreign" or detrimental.)

CLAIM: The successes of enzyme therapy (via enzyme supplements) also prove the value of eating raw food to get its enzymes. COMMENT: The fact that enzymes taken as tablets with protective coatings (to avoid destruction by stomach acid before they reach the small intestine) might be of therapeutic value for some individuals (particularly those with inherent or genetic problems with enzyme metabolism) doesn't mean they are helpful for everyone else, or for human health in general. Nor does it mean that the food enzymes that accompany raw foodstuffs have an equivalent effect. One cannot compare the large amounts obtained via enzyme therapy (i.e., enzymes taken in capsule or tablet form) with the enzymes in food itself; most enzymes in food are destroyed by the stomach acids. In formal enzyme therapy, the tablets or capsules that people swallow are soluble only in the small intestine (so that the enzymes are not destroyed by the stomach acids), taken outside meals (to improve absorption), and probably in larger quantities and concentrations than what occurs naturally, since foods contain a mixture of many different enzymes, each at more modest concentrations. Such enzyme therapy via tablets or capsules with protective coatings is no doubt useful for individuals whose bodies lack the capacity to produce certain enzymes at all. Again, however, the (very) small amount of enzymes in food itself that might make it through the stomach into the intestine will not have much of a therapeutic effect for these individuals. Some readers will note there are anecdotal claims that oral enzyme therapy using uncoated tablets (usually chewable) and/or uncoated capsules is effective for some. Such tablets typically contain relatively large amounts of enzymes compared to the amounts in food, and may have a limited effect--if the dose is high enough--before the digestive acids destroy the enzymes in the stomach. However, the possibility that extremely large doses of digestive enzymes, in chemically isolated and purified form, might have an effect when taken with food is really not a substantive argument for eating a 100% raw-food diet, as the amount of enzymes contained in raw foods is small in comparison to the supplements. Further, some enzyme supplements also contain mint oils--and mint has a long history of use in traditional/folk medicine as a stomach-soothing herb. That is, in some cases the effect observed from enzyme supplements may actually be due to the peppermint (or other mint) content of the supplement.

What about the "enzyme synergy theory"?


Another, more recent theory on enzymes in food is provided in Prochaska and Piekutowski [1994]. This theory will be referred to here as the "enzyme synergy theory," and it shares some commonality with the theories of Howell. Prochaska and Piekutowski describe the theory as follows [1994, p. 356]: ...[E]nzymes that occur naturally in foodstuffs can act synergistically with those in the human digestive tract to release the maximum amount of thermodynamic free energy from the food...


Energy is defined in this report as the thermodynamic free energy of the chemical bonds in the foodstuffs... The resulting metabolism of the smallest components (amino acids) results in the release of thermodynamic free energy which drives cellular functioning.

Hypothesis as published is poorly presented. The enzyme synergy theory was published in the journal Medical Hypotheses, and it is only a hypothesis; it is not proven. The hypothesis as presented in Prochaska and Piekutowski [1994] is not very clear. Specifically, defining food energy as the "thermodynamic free energy of the chemical bonds" may be acceptable in theoretical terms, but may be impractical due to:

The very fickle nature of hydrogen bonding in water (note that many raw foods are high in water).

The huge variety of chemical reactions involved in digestion, many of which are multi-stage reactions; we really don't know all the reactions that occur in digestion.

In other words, it may be very problematic to accurately measure total energy as defined above.

Logical fallacy in analogizing from survival of stomach acid by enzyme-inhibitor proteins to other proteins such as food enzymes. The (nominally) strongest support for Prochaska and Piekutowski's theory is provided by the enzyme inhibitors in raw legumes and grains, but the accompanying reasoning is problematic at best. That is, the point is made that such inhibitors are proteins and survive stomach acid, hence other proteins (enzymes) can and will survive the stomach acid as well. Insofar as digestion is a slow and imperfect system, some small amounts of enzymes apparently do survive. However, enzyme inhibitors have a different structure, and the fact that some inhibitors survive stomach acid does not lead to the conclusion that food enzymes are even remotely as robust. In apparently suggesting that raw legumes be consumed, Prochaska and Piekutowski also seem to ignore the important reality that raw soaked (or sprouted) legumes, except for a very few (mung, lentil, adzuki, green peas, maybe chickpeas), have such an unpleasant flavor that one cannot eat them. In contrast, cooking makes it possible to eat a wide variety of legumes. That the enzyme synergy theory suggests one eat raw legumes is of little import when one tastes certain raw legumes and finds them inedible.

The authors narrowly focus on the negative effects of cooking, while ignoring the positive ones that favor a net energy gain. Vitamin loss, protein degradation caused by excessive cooking, and even the formation of resistant starch are mentioned in this regard. At the same time, Kataria and Chauhan [1988] is present in their listing of reference sources, but they fail to mention that the same study authors [Kataria and Chauhan 1988, p. 57] say that "starch digestibility increased more than six-fold as a result of ordinary cooking of soaked mung beans"; and that mung beans soaked 12 hours have a starch digestibility (mg maltose released per gram) of 25.3, while soaked seeds pressure-cooked for 5 minutes have a starch digestibility of 305--over 10 times that of the uncooked ones (Tables 1, 2, pp. 54-55 of Kataria and Chauhan [1988]).

This increase in starch digestibility is very important in real energy terms (calories), as starch is the predominant energy source for most people on this planet (i.e., most veg*n diets, and the diets of less-developed countries, are starch-based). It is hard to see how "energy" as defined by Prochaska and Piekutowski has any real significance, or how it connects to calories. Sotelo et al. [1987] is cited in Prochaska and Piekutowski as evidence that cooking does little to enhance the digestibility of chickpea protein. However, more recent research [Clemente et al. 1998, abstract] indicates that "appropriate heat treatment may improve the bioavailability of chickpea proteins."

Additional muddled logic. Prochaska and Piekutowski discuss research in which chickens were fed oral enzymes, which increased the digestibility of certain grains; they conclude that this provides evidence that enzymes can survive the digestive tract. However, that is not necessarily the case--a more plausible explanation is that the enzymes acted in the stomach before being neutralized by stomach acid, i.e., the enzymes didn't necessarily survive the stomach.

Other doubts. Finally, the theory is based on unpublished writings by Piekutowski. The title of one of these papers--Cancer: Disease of Energy Deficiency--raises doubts as to whether it is a credible scientific paper, or instead the kind of food-faddist writing common in the raw diet community.

Effects of cooking on vitamins


Vitamin loss can be induced by a number of factors. Obviously, losses of vitamins depend on cooking time, temperature, and cooking method. Some vitamins are quite heat-stable, whereas others are heat-labile. As nutrition textbooks such as Kreutler et al. [1987] describe, many factors besides heat can destroy (some) vitamins: solubility in water, exposure to air (oxidation), exposure to light (UV), acid and alkaline solutions, storage losses, etc. Here is a tabular summary:

FACTORS THAT CAN INDUCE VITAMIN LOSS
(Is the substance susceptible to losses under the given condition?)

Nutritional Element   Soluble in Water   Exposure to Air   Exposure to Light   Exposure to Heat
Vitamin A             no                 partially         partially           relatively stable
Vitamin D             no                 no                no                  no
Vitamin E             no                 yes               yes                 no
Vitamin K             no                 no                yes                 no
Thiamine              highly             no                ?                   > 100C
Riboflavin            slightly           no                in solution         no
Niacin                yes                no                no                  no
Biotin                somewhat           ?                 ?                   no
Pantothenic Acid      quite stable       ?                 ?                   yes
Folate                yes                ?                 when dry            at high temp
Vitamin B-6           yes                ?                 yes                 ?
Vitamin B-12          yes                ?                 yes                 no
Vitamin C             very unstable      yes               yes                 yes

Nutritional Element   Acid Solution      Alkali Solution   Other
Vitamin A             ?                  ?                 --
Vitamin D             ?                  ?                 --
Vitamin E             ?                  ?                 contact with iron or copper
Vitamin K             strong acids       yes               --
Thiamine              no                 yes               long cooking in large volume of water
Riboflavin            no                 yes               --
Niacin                no                 no                --
Biotin                strong acids       yes               --
Pantothenic Acid      yes                yes               --
Folate                heat-labile        ?                 oxidizing substances; storage
Vitamin B-6           no                 yes               --
Vitamin B-12          strong acids       yes               --
Vitamin C             ?                  yes               contact with iron or copper

Significance of losses depends on a given food's context in the overall diet. Of course, not all vitamin losses have detrimental consequences, since some vitamins are widely available (such as pantothenic acid). The vitamins in which deficiencies are occasionally observed are: A, D, E, thiamine, riboflavin, niacin, folate, and B-12. Of those, only thiamine, niacin, and folate would be destroyed significantly by excessive exposure to heat and/or water. It also appears from the above that many factors other than heat can destroy vitamins. Recommendations to preserve vitamins include: utilizing foods when fresh; using steaming in preference to boiling; and avoiding overly long cooking times.

Comparison of vitamin levels in raw vs. cooked foods


We also investigated the USDA nutrient database [USDA Agricultural Research Service 1998] for a few foods that were analyzed both raw and cooked. The figures shown represent the amounts of vitamins per 100 grams of food, which, one should keep in mind, may differ from what is actually bioavailable. In addition, most of the time cooking results in a loss of water content in foods, and hence an artificial "concentration" of vitamins which masks the loss caused by heat or boiling in water. Thus, in order to make meaningful comparisons, the second table below shows vitamin values corrected for the effect of any lost water content. Even then, some inconsistencies are unavoidable, since different varieties of a food may have differing vitamin contents, and differing samples may have been used.
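To make the water-correction arithmetic concrete, here is a minimal sketch in Python. The function names are our own, and the carrot figures are taken from the USDA tables below; this illustrates the method only, not the exact procedure used to build the tables.

    def pct_loss_uncorrected(raw_val, cooked_val):
        """Naive percentage loss from per-100g values, ignoring water change."""
        return 100 * (raw_val - cooked_val) / raw_val

    def pct_loss_corrected(raw_val, raw_water_pct, cooked_val, cooked_water_pct):
        """Percentage loss after putting both foods on an equal dry-matter basis.

        Cooking usually drives off water, which concentrates the remaining
        nutrients per 100 g and so masks real losses; rescaling by the
        dry-matter fractions removes that bias.
        """
        raw_dry = 100 - raw_water_pct        # g of dry matter per 100 g raw food
        cooked_dry = 100 - cooked_water_pct  # g of dry matter per 100 g cooked food
        # Express the cooked value per the same amount of dry matter as the raw food.
        cooked_adjusted = cooked_val * (raw_dry / cooked_dry)
        return 100 * (raw_val - cooked_adjusted) / raw_val

    # Vitamin C in carrots (per the USDA figures below):
    # raw = 9.3 mg/100 g at 88% water; boiled = 2.3 mg/100 g at 87% water.
    print(pct_loss_uncorrected(9.3, 2.3))        # ~75%
    print(pct_loss_corrected(9.3, 88, 2.3, 87))  # ~77%: slightly higher once the
                                                 # small water loss is accounted for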


Finally, not all cooking methods are presented here--lighter cooking methods obviously result in smaller vitamin losses.

VITAMIN LEVELS IN RAW VS. COOKED FOODS (uncorrected for water loss)
(Note: Amounts are expressed in units per 100g portion of food. In the table below, "boiled" = boiled, drained, without salt.)

Food                         Water (%)   C (mg)   B1 (mg)   B2 (mg)   B3 (mg)   B5 (mg)
Broccoli, raw                   91        93       .065      .12       .64       .53
Broccoli, boiled                91        75       .055      .11       .57       .51
Beef liver, raw                 69        22       .26      2.8      13         7.6
Beef liver, braised             66        23       .20      4.1      11         4.6
Beef liver, pan-fried           56        23       .21      4.1      14         5.9
Carrots, raw                    88         9.3     .097      .059      .93       .20
Carrots, boiled                 87         2.3     .034      .056      .51       .30
Almonds, dried                   4          .6     .21       .78      3.4        .47
Almonds, dry-roasted             3          .7     .13       .60      2.8        .25
Mung bean sprouts, raw          90        13       .084      .12       .75       .38
Mung bean sprouts, boiled       94        11       .050      .10       .82       .24
Tomatoes, red, raw              94        19       .059      .048      .63       .25
Tomatoes, boiled                92        23       .07       .057      .75       .30
Mackerel, Atlantic, raw         63          .4     .18       .31      9.1        .86
Mackerel, cooked, dry heat      53          .4     .16       .41      6.8        .99

Food                         Water (%)   B6 (mg)   Folate (mcg)   B12 (mcg)   A (IU)    E (mg)
Broccoli, raw                   91         .16          71            0         1,500     1.7
Broccoli, boiled                91         .14          50            0         1,400     1.7
Beef liver, raw                 69         .95         250           69        35,000      .67
Beef liver, braised             66         .91         220           71        35,000      ?
Beef liver, pan-fried           56        1.4          220          112        36,000      .63
Carrots, raw                    88         .15          14            0        28,000      .46
Carrots, boiled                 87         .25          14            0        24,000      .42
Almonds, dried                   4         .11          59            0             0    24
Almonds, dry-roasted             3         .07          64            0             0     5.5
Mung bean sprouts, raw          90         .09          61            0            21      .01
Mung bean sprouts, boiled       94         .05          29            0            14      .01
Tomatoes, red, raw              94         .08          15            0           620      .38
Tomatoes, boiled                92         .09          13            0           740      .38
Mackerel, Atlantic, raw         63         .40           1.3          8.7         165     1.5
Mackerel, cooked, dry heat      53         .46           1.5         19           180      ?

Here are the average vitamin losses, expressed in percentages, both uncorrected and corrected for the effect of water loss:

VITAMIN LOSSES FROM COOKING (uncorrected and corrected for water loss)
(Expressed as average percentage decrease from the value observed in the raw food; negative values indicate an apparent average increase.)

Vitamin                        C    B1   B2   B3   B5   B6   Folate   B12    A    E   Average
Uncorrected for water loss     8    -8   -5   18   12  -11     ?      10     3    9      3
Corrected for water loss      -8    16    3   26   20   -3     ?      18    11   17     11

Overall vitamin losses due to cooking are relatively modest. While there are a few inconsistencies in the above tables, likely due to differing samples, globally we see that, on average, cooking does destroy vitamins, but the consequences are not catastrophic. Average vitamin losses after correction for water loss range from about 10 to 25% in most cases. The vitamin losses also correlate with what the textbook by Kreutler et al. [1987] says, though not precisely; obviously, heat is only one of the many factors which affect vitamin content.
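As an illustration of how such averages could be derived from the preceding tables, here is a small Python sketch for vitamin C (uncorrected). The exact food pairs and rounding the source used are not stated, so minor discrepancies from the published figures are to be expected.

    # (raw mg/100 g, cooked mg/100 g) vitamin C pairs from the first table above.
    PAIRS = [
        (93, 75),    # broccoli, raw vs. boiled
        (22, 23),    # beef liver, raw vs. braised
        (22, 23),    # beef liver, raw vs. pan-fried
        (9.3, 2.3),  # carrots
        (.6, .7),    # almonds
        (13, 11),    # mung bean sprouts
        (19, 23),    # tomatoes
        (.4, .4),    # mackerel
    ]

    losses = [100 * (raw - cooked) / raw for raw, cooked in PAIRS]
    print(f"average uncorrected vitamin C loss: {sum(losses) / len(losses):.0f}%")  # ~8%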


Table illustrates dietary variety is as significant to nutrition as cooking. Finally, given the extreme variability in vitamin content of different foods, there is a strong case to be made that variety is at least as important as cooking practices: someone eating 95% raw fruit is far more likely to be vitamin-deficient than someone eating a 100% cooked diet that includes a variety of foods like liver, green vegetables, etc.

Cooking's effect on bioavailability


Next, the vitamin value of a food is determined not only by the absolute amount of vitamins it contains, but also by their bioavailability, which may be influenced by numerous factors, including cooking. First of all, anti-vitamins present in some foods can be destroyed by heat. (For example, avidin, which in raw egg whites binds to biotin.)

Gelatinization of starch. As mentioned previously, cooking significantly improves the digestibility/bioavailability of starchy foods such as tubers (potatoes, yams, etc.), squashes, grains, and legumes through the process of gelatinization.

Effect of cooking on bioavailability of beta-carotene. Let's mention here the effect of heat on carotenoids [Erdman et al. 1993]. Some plant foods, such as orange or dark green vegetables like carrots or spinach, contain hundreds of carotenoids, some of which are antioxidants and thus protective against cancer. The best-known is beta-carotene, which the body converts into vitamin A. The absorption and transport of carotenoids are quite complex and not yet well understood. Per Erdman et al. [1993], beta-carotene absorption can be as low as 1-2% from raw vegetables such as the carrot. Mild heating, such as steaming, appears to improve the extractability of beta-carotene from vegetables, and also its bioavailability. (Remark: "extracted" means put in contact with digestive juices. Nutrients in fibrous plants are not necessarily easily extracted; and something that is extractable may not necessarily have good bioavailability, i.e., be efficiently usable by the body.) However, additional heating can transform the naturally occurring trans double bonds into cis configurations, which reduces the biological value of beta-carotene. For example, canning of sweet potatoes and carrots results in the conversion of 75% of all-trans beta-carotene to 13- or 9-cis isomers. (Note: Canning involves long periods at high heat, much more than ordinary steaming.) This means that 75% of the naturally occurring molecules of beta-carotene are transformed into molecules having the same chemical formula, but a different shape, which are not usable by the body. Per Johnson [1991], heat treatment of raw carrot juice at temperatures comparable to those of pasteurization and boiling does not change the carotenes, while heating at temperatures used during sterilization results in rearrangement of the carotene molecules and a decrease in total available carotenes (by transforming the naturally occurring trans form into the cis form). Note: sterilization typically uses temperatures of 180C (356F) for a few seconds and kills all germs, whereas pasteurization is done at temperatures below boiling and kills only most germs.

Finally, folate in raw broccoli is only slightly more bioavailable than folate in cooked broccoli [Clifford et al. 1990].
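To see why the trans-to-cis issue matters in practice, here is a toy calculation in Python. The 1-2% absorption and 75% isomerization figures come from the citations above; the carotene content and the "steamed" absorption rate are hypothetical round numbers used only for illustration.

    # Toy comparison: usable (absorbed, all-trans) beta-carotene from 100 g of carrots.
    CAROTENE_MG = 8.0  # assumed beta-carotene content of 100 g raw carrot (approximate)

    def usable_carotene(content_mg, absorption, trans_fraction=1.0):
        """Absorbed all-trans beta-carotene, in mg."""
        return content_mg * absorption * trans_fraction

    raw = usable_carotene(CAROTENE_MG, absorption=0.02)      # 1-2% absorbed from raw carrot
    steamed = usable_carotene(CAROTENE_MG, absorption=0.10)  # hypothetical improvement from mild heat
    canned = usable_carotene(CAROTENE_MG, absorption=0.10, trans_fraction=0.25)  # 75% isomerized to cis

    print(f"raw: {raw:.2f} mg, steamed: {steamed:.2f} mg, canned: {canned:.2f} mg")
    # raw: 0.16 mg, steamed: 0.80 mg, canned: 0.20 mg -- harsh heat gives back most
    # of the gain from improved extractability, while mild heat keeps it.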

Effects of cooking on minerals


Since heating itself does not affect mineral levels, other factors in the diet have more bearing. Here, we shall compare the mineral content of raw and cooked foods. Though minerals are not destroyed by heat, they are usually leached into the cooking water if foods are boiled; steaming is therefore probably the best cooking method with respect to mineral loss. Since minerals remain better preserved than vitamins (especially if steaming is used instead of boiling), factors other than cooking--such as variety in the diet and (perhaps) soil mineralization and fertilization--are more important in ensuring adequate mineral intake. Again, the table below represents mineral content but not bioavailability: some antinutritional factors in plants must be taken into account in this regard, such as phytates (present in unsprouted grains) and oxalates (present in spinach and rhubarb). Another example is iron, which is more bioavailable in heme form (as in red meat) than in non-heme form (in plants).

MINERAL LEVELS IN RAW VS. COOKED FOODS (uncorrected for water loss)
(Note: Amounts are expressed in mg per 100g portion of food. In the table below, "boiled" = boiled, drained, without salt.)

Food                         Water (%)   Ca    Fe    Mg    P     K     Na    Zn    Cu     Mn
Broccoli, raw                   91        48    .88   25    66   320    27    .40   .045   .23
Broccoli, boiled                91        46    .84   24    59   290    26    .38   .043   .22
Beef liver, raw                 69         6   6.8    19   320   320    73   3.9   3.3     .26
Beef liver, braised             66         7   6.8    20   400   230    70   6.1   4.5     .41
Beef liver, pan-fried           56        11   6.3    23   460   360   110   5.4   4.5     .42
Carrots, raw                    88        27    .50   15    44   320    35    .20   .047   .14
Carrots, boiled                 87        31    .62   13    30   230    66    .30   .13    .75
Almonds, dried                   4       270   3.7   290   520   730    11   2.9    .94   2.3
Almonds, dry-roasted             3       280   3.8   300   550   770    11   4.9   1.2    2.0
Mung bean sprouts, raw          90        13    .91   21    54   150     6    .41   .16    .19
Mung bean sprouts, boiled       94        12    .65   14    28   101    10    .47   .12    .14
Tomatoes, red, raw              94         5    .45   11    24   220     6    .09   .074   .10
Tomatoes, boiled                92         6    .56   14    31   280    11    .11   .093   .13
Mackerel, Atlantic, raw         63        12   1.6    76   220   310    90    .63   .073   .01
Mackerel, cooked, dry heat      53        15   1.6    97   280   400    83    .94   .090   .02

We can note in the above table that in a number of cases the values for minerals are higher in the cooked food than when raw (even allowing for differences due to water loss). As with the earlier table on vitamins, some inconsistencies in this regard are unavoidable, since different varieties of a food may have differing mineral contents, and differing samples may have been used. Since one would not expect cooking to increase mineral content (obviously impossible), we can safely assume that such differences are due to such variation in the food samples measured. This in itself is quite interesting, however, for the following reason.

Mineral losses from cooking are so low that they are overshadowed by background variation in nutrient samples. Whatever actual differences there may be between raw and cooked foods with respect to minerals, the data available here show that in many cases they are simply not significant enough to be clearly distinguishable against the background level of variation and sampling error that normally exists between food samples. It appears, then, that the mineral content of a food is probably more variable from sample to sample (according to season, horticultural practice, climate, soil, etc.) than according to whether it is cooked or raw. Hence we see that, under close examination, the alleged huge differences between raw and cooked foods--with respect to minerals, at least--are not in evidence.
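One way to see this from the table itself: compute the raw-to-cooked percentage change for each mineral in a single food and note how often it is positive even after water correction--something cooking cannot cause, and which must therefore reflect sample variation. A minimal sketch in Python, using the carrot rows from the table above:

    # Carrot minerals (mg/100 g) from the USDA table above.
    RAW    = {"Ca": 27, "Fe": 0.50, "Mg": 15, "P": 44, "K": 320, "Zn": 0.20, "Cu": 0.047, "Mn": 0.14}
    BOILED = {"Ca": 31, "Fe": 0.62, "Mg": 13, "P": 30, "K": 230, "Zn": 0.30, "Cu": 0.13,  "Mn": 0.75}

    # Put boiled values on the raw dry-matter basis (raw 88% water, boiled 87%).
    DRY_CORRECTION = (100 - 88) / (100 - 87)

    for mineral, raw_val in RAW.items():
        corrected = BOILED[mineral] * DRY_CORRECTION
        change = 100 * (corrected - raw_val) / raw_val
        print(f"{mineral}: {change:+.0f}%")
    # Losses (Mg, P, K) are mixed with apparent gains (Ca, Fe, Zn, Cu, Mn) -- a
    # pattern better explained by differing samples than by cooking itself.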

Does cooking render minerals "inorganic" or less assimilable?


One claim made by a number of raw "experts" is that "cooking makes all minerals inorganic," i.e., cooking converts all the (organic) minerals in food into an inorganic form. The further claim is then made that the inorganic form cannot be used by the body. This section will examine the validity (or lack thereof) of the first claim, as the second (regarding assimilation of inorganic minerals) depends somewhat on the first.

An old claim, but what is its origin? The idea that cooking converts organic minerals to an inorganic
form is fairly common in rawist circles--one can find natural hygienists, fruitarians, and others making the claim. It is sometimes attributed to Herbert Shelton or T.C. Fry (raw/predominantly raw diet advocates of the past), but apparently neither actually originated the claim. The earliest citation for the claim that could be located for the present write-up is one found in the book Uncooked Foods and How to Use Them, by Mr. and Mrs. Eugene Christian, published in 1904. From pp. 77-78: When we apply it [fire] to our food in the process of cooking, it results in such a change as destroys the elementary plant form, and the mineral elements return to their inorganic condition. Thus it appears that the claim originated long ago, before the days of Shelton or Fry, though the above book could be described as having (approximately) a "traditional" natural hygiene orientation. This suggests the claim may have originated in the 1800s (or earlier), perhaps as part of the nature cure movement, or in the earliest days of the natural hygiene movement. (If any reader is aware of citation(s) regarding an earlier origin for this claim, don't hesitate to forward the information to one of the site editors.)

What is actually meant by "organic" vs. "inorganic" minerals?


The first point to note is that the claim about the difference between "organic" vs. "inorganic" minerals is relatively vague. The terms used are never defined, thus many readers--or raw-fooders themselves--may not know exactly what the claim means. So let's begin by defining terms.

Organic: The standard chemistry definition is that a molecule is organic if it includes at least one carbon atom.

Inorganic: The standard chemistry definition is any molecule that is not organic, i.e., has no carbon atoms at all.

Cooking: Heating food to a minimum temperature. The precise value of the minimum temperature is subject to discussion; details below.

Mineral: (1) Something that is neither animal nor vegetable matter; (2) a crystalline substance formed via inorganic processes.

[Detailed definitions are available via http://www.m-w.com/.] The term mineral is still not as precise as one would like; one might ask whether metals are minerals or not. By definition (1) under the last bullet above, metals are minerals, but by definition (2) they might not be. Because of this fine point, we give counterexamples below for both metals and minerals. Another fine point concerns the value of the minimum temperature in the definition of cooking. A few potential minimum temperatures that come to mind are:

The boiling point of water: 100C (212F).

The temperature at which foods are said to become "leukocytic" per Kouchakoff (the limitations/problems inherent in this approach are discussed elsewhere in this paper): approximately 90C (194F), though this temperature is said to vary with the particular food in question.

The temperature at which foods are steamed (but not pressure-steamed): this varies with the conditions under which the foods are steamed, but may be near 100C (212F).

Thus one can argue for a definition of "cooked" in the range 90-100C (194-212F). For convenience here, we will adopt 95C (203F) as the standard for cooked food: at this temperature, food is "cooked" according to criteria that many raw-fooders share: most enzymes are degraded, and the food is leukocytic if one believes old research by Kouchakoff [1930, 1937].

Restating the claim in more precise language


We can now clarify the claim: Cooking food--heating it to 95C (203F)--will cause chemical changes to any/all organic molecules (present in the food) that include or are chemically bound to minerals, such that all the minerals in the organic compound(s) are converted into inorganic form (i.e., inorganic compounds and/or free ions). The term "mineral" as used here can follow either definition above. At this point most readers with even a limited knowledge of organic chemistry are probably chuckling with mirth, as the claim is, quite frankly, ridiculous. However, let's continue to take it seriously, and thoroughly assess it.

Counterexamples
Because the claim alleges that ALL minerals are converted to inorganic form, all that is needed to disprove it is a counterexample of an organic compound that includes a mineral (or metal) that is at least partially heat-stable at 95C (203F), i.e., can be heated to 95C and does not completely break down. A few select counterexamples follow.


Metals

Cobalt is a metal, hence a mineral by definition (1) above, but not necessarily by definition (2). Cobalt is an essential part of cobalamin, a compound better known as vitamin B-12. Herbert et al. [1984] report that vitamin B-12 was heated to 200C (392F) for 6 days, with only 15% loss. That is, 85% of the vitamin B-12 survived the heating. This is a counterexample to the claim above, since if the claim were true, 100% of the B-12 should have degraded due to loss of cobalt.

Copper: For an example from the plant world, Neumann et al. [1995] discuss how the plant Armeria
maritima binds a heavy metal (copper, from natural copper in the soil near a copper mine) into heat-stress proteins within the plant, which are stable and extracted for analysis at 95C.

Non-metallic minerals

Sulfur: Two important sulfur-containing amino acids, methionine and cysteine (both found in many plant foods; see Giovanelli [1987]), survive cooking to a large extent. See Clemente et al. [1998] and Chau et al. [1997] for two research papers reporting the survival of these amino acids with cooking. Note that although there is some loss of sulfur-based amino acids in cooking, the claim that cooking makes minerals inorganic, if it were true, would require (nearly) 100% loss of methionine and cysteine. As that does not happen, these two common amino acids are counterexamples to the claim. (Note: the claim, if it were true, would require the consumption of at least some raw (rather than cooked) protein, as otherwise all the sulfur-based amino acids would be lost, including methionine, an essential amino acid.)

Additional examples

Phytates and tannins are common antinutrient compounds in certain plant foods. Phytate contains phosphorus (a mineral, though not a metal), and also forms complexes with metal ions. Phytates and phytic acid-metal ion complexes are only partially destroyed by heat. Similarly, various tannins are known for forming complexes with minerals/metals, especially iron, and for being at least partially heat-resistant. We conclude, in this case, that cooking makes part, but not all, of the minerals inorganic.

Burden of proof

Without evidence, claims are simply speculation. The above counterexamples neatly disprove the claim. However, one might argue that disproof is not even required here, because the promoters of the claim have never put forth any credible proof FOR the claim--they are simply speculating. Strictly speaking, the burden of proof is on those who make the claim, and in this case, no proof has ever been presented. To understand this point, consider what credible proof for such a claim would look like. Basically, it would be a voluminous database whose entries would show:

Chemical compound name.

Specification of the minerals the compound contains--type(s) and number of atoms of each mineral.

Specification of the foods that contain the compound--with reference citations.

Reference citations proving the compound degrades at (or below) a specific minimum cooking temperature, into inorganic compounds/ions and organic compounds that do not contain any minerals.

The database should contain all known organic compounds that contain mineral atoms, in all foods of interest.
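For concreteness, here is a sketch of what one entry in such a database might look like; the field names and the example values are hypothetical, chosen only to mirror the requirements listed above.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MineralBearingCompound:
        compound_name: str                   # chemical compound name
        minerals: dict                       # mineral symbol -> number of atoms in the compound
        found_in_foods: list                 # foods that contain the compound
        food_citations: list                 # references documenting its occurrence
        breakdown_temp_c: Optional[float]    # temperature (C) at which it degrades; None if heat-stable
        breakdown_products: list             # what it degrades into (inorganic compounds/ions, etc.)
        breakdown_citations: list = field(default_factory=list)

    # Illustrative entry only -- drawing on the B-12 counterexample above:
    b12 = MineralBearingCompound(
        compound_name="cobalamin (vitamin B-12)",
        minerals={"Co": 1},
        found_in_foods=["beef liver", "mackerel"],
        food_citations=["USDA Agricultural Research Service 1998"],
        breakdown_temp_c=None,  # survived 200C for 6 days with only 15% loss [Herbert et al. 1984]
        breakdown_products=[],
        breakdown_citations=["Herbert et al. 1984"],
    )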

Obviously, developing such a database would be a very large research project. However, instead of the above credible proof, what the raw-fooder receives/offers as "proof" are the unsupported/unscientific assertions of alleged raw "experts."


Old, speculative, and/or unscientific claims often accepted as "raw truth" without evidence or proof. Now consider the apparent age of the claim, and that it has been not only uncritically accepted but actively promulgated down to the present day by raw "experts," despite the apparent lack of even a semblance of legitimate scientific proof. Consider the credibility of the raw diet "experts" who promote as "fact" what is really unsupported speculation on their part. What does this suggest about their credibility on other points, especially if they claim that their raw diets are cure-alls, will work for everyone, are "ideal," etc.? The above is just one example of the burden of proof--when you hear raw "experts" claiming that "spices are toxic," "wheatgrass juice is toxic," etc., the burden of proof is on those making the claims. Also, due to the prevalence of crank science in raw circles, it is wise to check any "proofs" offered up by the "experts," if any are given. An interesting situation highlighted by the issue of burden of proof is that some of the allegedly "scientific" raw vegan "experts" accept the claim above with no proof whatsoever, while simultaneously angrily demanding "proof" for every criticism of their diet. Is that irony, hypocrisy, or both?

Rationalizations to defend the claim


As convenient rationalizations are easily spun when idealistic theories are at issue, two potential rationalizations to watch for here are:

RATIONALIZATION: The counterexamples cited above were not heated high enough. If
you heat something hot enough, it will turn to ash and the minerals will then be inorganic.

REPLY: This is an attempt to change the claim. The claim is that cooking converts minerals to inorganic form, not that incineration converts minerals to inorganic form. It is true that
given enough heat, carbon can be driven off and one gets an inorganic molecule. However, there is a big difference between cooking something via baking, and heating it to the point that the item is incinerated. As mentioned above, phytate is an antinutrient that contains phosphorus. Phytate is common in wheat, and sufficiently heat-stable that significant quantities of phytate can and do survive the bread-baking process [Buonocore et al. 1977]. Hence phytate provides a counterexample to the claim.

RATIONALIZATION: The claim is not ALL minerals, but only that MOST minerals are converted to inorganic form by cooking. REPLY: This is really speculation; once again, the burden of proof is on those who make this claim. Without a comprehensive database of organic compounds in foods that contain minerals, and
information on the temperatures at which they break down into inorganic forms, one cannot even say "MOST."

Absorption of organic and inorganic minerals in the human body


It is appropriate to briefly comment on the topic of minerals in the human body. The human body includes both:

Organic minerals, e.g., hemoglobin contains iron; many amino acids and proteins contain
sulfur; as well as

Inorganic minerals, e.g., salt--sodium chloride--in the blood, lymph, and many other body
fluids.


Obviously, you would die without hemoglobin (organic iron) and salt (inorganic sodium, a metal/mineral). Additionally, the body can certainly use inorganic iron; see Fomon et al. [1995], Abrams et al. [1996], and Cook and Reddy [1995] for experimental verification. Thus we note that the body needs and/or can use both organic and inorganic minerals, and the raw-foodist claim that the body cannot use any/all inorganic minerals is simply nonsense. Once again, the burden of proof applies to those who make this claim.

Are inorganic minerals "toxic"? Some raw-fooders make the extreme claim that all inorganic
minerals are "toxic." (And note here that, as with the failure to define "organic" vs. "inorganic," advocates are usually also quite sloppy in their use of the word "toxic.") Such claims are an example of the narrow, binary thinking common in the raw community. The fact that a substance (e.g., a specific inorganic mineral) is toxic in isolation and in huge doses does not mean it is toxic at the levels encountered in a particular food, nor does it mean the food is "toxic." One should beware of the raw "experts" who see toxins everywhere, except, of course, in the very few foods they promote.

Conclusion on the nutritiousness of raw vs. cooked food

Considering trade-offs rather than spurious black-and-white divisions. We see from the many considerations above that there is no clear-cut conclusion that can be stated with confidence about the question of raw vs. cooked foods as a whole. However, if we break the question down into simpler aspects, there are several general observations that can be made. First, the two major overarching considerations are:

Amount of nutrients in a food (the potential "benefit"). First, virtually all foods contain more nutrients in the raw state. On the other hand, the differences are not very great: roughly 10-25% for most vitamins, and negligible (almost zero) for minerals.

"Cost" to obtain the nutrients. However, there are also digestibility, antinutrients/toxicity, and bioavailability to take into consideration when assessing how many nutrients can actually be assimilated from a particular food. Cooking can affect these considerations positively or negatively, depending on the circumstance.

"Net value" of a food depends on assessing the cost/benefit trade-offs. Putting the two above
points together means that one must consider the cost/benefit trade-offs--that is, the nutrients present vs. the "cost" to get them--the latter determined by both absorbability and antinutrient concerns.

Fruits can and do contain tannins, which are potent enzyme inhibitors and can cause unwanted side-effects when consumed in large quantities [Mehansho et al. 1987, as cited in Etzel 1993]. However, the levels of tannins in ripe, juicy fruits are generally low, and since cooking decreases the vitamin content while having only limited impact on tannin content, fruits are better eaten raw.

Meat. The organs of many animals (as well as human blood and urine) contain proteinase inhibitors [Vogel et al. 1968], which suggests that raw (muscle) meats may contain some proteinase inhibitors as well. Overall, however, whether cooked or raw, meat is generally easy to digest, especially when considered in terms of the high bioavailability of a range of nutrients (iron, zinc, vitamin A, taurine, protein, essential fatty acids). Raw meat can also contain parasites. On the other hand, the cooking of meats creates substances such as HCAs with potential carcinogenicity, though this is probably a concern primarily with high-temperature cooking. Aged or cured meats are said to be particularly easy to digest, and are prized in some cultures. Hence the trade-off, raw vs. cooked, is not so clear for meat; more research would be desirable here.

With vegetables, it depends on the specific item in question. One consideration here is that the bioavailability of beta-carotene is generally low from raw vegetables and is improved by steaming (in carrots, for example). However, one must also weigh this against cooking's effect in decreasing other vitamin levels, though since that effect is relatively modest (perhaps a 10-25% loss) in most cases, the decision is not always clear-cut.

Virtually all grains, legumes, and starchy tubers (potatoes, sweet potatoes) have a better net value when cooked. (Sprouting is another possibility, but it may also greatly change nutrient composition. For example, while on a dry-weight basis [i.e., measured after complete dehydration] sprouting of wheat increases protein content (and also vitamin content), on a wet-weight (fresh) basis protein content considerably decreases, as does caloric value. The situation is difficult to evaluate because of somewhat limited available data on sprouts. In general, though, the basic picture is that while sprouting may increase the digestibility of a range of nutrients, it also increases water levels, which dilutes their concentration [Vanderstoep 1981].)

Nuts can contain trypsin inhibitors, lectins, saponins, and phytate; see Makkar et al. [1998] for a specific example. Conservative cooking may increase protein digestibility by deactivating heat-labile antinutrients (trypsin inhibitors, lectins). Sprouting of nuts (when feasible) should have an effect similar to that observed with grains and legumes, but nutritional data on sprouted nuts are not available to confirm this. Overall, however, it is unclear at what levels trypsin inhibitors, lectins, saponins, and phytate occur in nuts generally, and thus whether they constitute much of a problem, considering that most people seem to handle raw nuts well, with some exceptions of course. The antinutrient question with nuts is an area we hope to resolve in more detail at a future date, at which time an update and/or new section for this paper will be provided.

No one-size-fits-all answer. The nutritiousness of a given food may or may not be improved by cooking, or only by certain methods of cooking. There may be an ideal temperature, or an ideal cooking time.

Caloric considerations on high-bulk all-raw diets. An additional issue is the difficulty on all-raw vegan diets of getting enough calories due to bulk. (See The Calorie Paradox of Raw Veganism for an examination of this problem, and why so many raw-fooders end up emaciated or hungry all the time.) Cooking coarse veggies makes them softer and easier to eat in quantity. While this might make overeating easier, it also allows one to get more nutrients. Most vegetables provide, on a caloric basis, relatively little nutrition, and thus require consumption of very large amounts daily if one bases one's diet primarily around them. It is hard to eat nutritionally significant amounts of coarse veggies raw--but it's easier when cooked.

Cooking's role given safety issues under modern conditions. In our modern day and age, with less-than-ideal livestock practices, there can be obvious safety considerations with eating meats raw (even though the nutritional content and bioavailability are higher), due to concerns about parasites or bacteria. Thus, some compromise may be necessary. (Also, as we will see in Part 3, hunter-gatherers typically cook most of their meats, and have excellent general health.)

Raw idealism about nature vs. real-world practicalities encountered by actual hunter-gatherers. Some will say that if a food cannot be eaten raw, it should be avoided. However, as we will see in Part 3 with hunter-gatherers living in the wild, this may restrict the diet to the point of unsustainability. Eating totally raw is not necessarily realistic even under more "natural" conditions of living off the land, when one's life depends on using what is available in one's immediate environment.

Cooking in the context of other "simple," "natural" processing techniques. Finally, to put cooking in a broader perspective, many other processing techniques have been used by humans to improve digestibility, such as soaking, sprouting, fermentation, aging, leaching of toxins and antinutrients, acid or alkaline treatment, etc.

Cooking is sometimes helpful--overdoing it isn't. Obviously, whenever a food can be eaten raw with better results, doing so is preferable and to be recommended.


PART 3: Discussion: 100% vs. Predominantly Raw


Controversy within raw-food circles. In light of the first two parts, it should be clear that some benefits (although not perfect health) can be expected from increasing the amount of raw food compared to the average Westerner's diet. However, one of the main controversies in raw-food circles is whether eating 100% raw constitutes an improvement over a "predominantly raw" diet. While a general answer is difficult to give because of the number of other parameters (what, specifically, would you cook? what would you eat raw?), we claim that, in general, going from predominantly raw to 100% raw does not constitute a clear improvement, and even that including some cooked food is often beneficial.

Why did humans begin cooking? The first section, mostly speculative, discusses the reasons why humans started cooking their food.

What kind of cooking? Since the word "cooked" is very imprecise and can refer to steamed kale as well as potato chips or chocolate cake, we try to clarify what kind of mixed (raw/cooked) diet might be most prudent, and examine whether some forms of cooking should be avoided.

How does cooking affect the overall diet? Then, we analyze in more depth how the balance of nutrients is affected by cooking, with regard to minerals, etc.

Cooking practices among hunter-gatherers and reputedly healthy traditional peoples. Observing that there are no raw-food cultures in the world, we give an overview of the cooking practices of peoples who reportedly enjoy far better health than those in industrialized countries.

Assessing what Instinctive Eating and Natural Hygiene say about cooking. Obviously, a number of different raw-food trends exist. Instinctive Nutrition holds that the diet requires 100% raw food for the food-selection instinct and satiation mechanism to work efficiently; and some Natural Hygienists hold that, since cooking is unnatural, we should eat 100% raw, according to the strictest interpretation of the principles of Natural Hygiene. We discuss these arguments in turn, and we claim that they have some serious weaknesses.

Weight of anecdotal evidence in the raw-food community. Finally, we review some anecdotal evidence: we suggest that many of the improvements experienced by people starting a raw-food diet can be explained by numerous factors unrelated to whether their food is heated or not. We also examine what happens when a long-time raw-foodist returns to a partially cooked diet.

Why does Homo sapiens eat cooked food?


This section is primarily speculation, but the exercise is well worth it. One might wonder: if cooking is so bad, why did humans start cooking their food in the first place? If you asked people in the street, the answers would be "to kill germs," "to kill parasites," "to tenderize and improve digestibility," "to improve taste," or, more naively, that cooking is simply part of what makes humans civilized and superior to animals.

A Natural Hygienist would object that germs are not dangerous in the context of a robust, healthy, and
naturally fed body--that cooked food is "dead" and has lost most of its nutritive value, and that cooking is unnatural and unnecessary.

An Instincto will say that, once humans started cooking, say, a sweet potato, the quantity they ate was much higher than usual, because the instinctive "stop" mechanism only works well with the original, raw foods we were exposed to during evolution, and is subverted by cooking. Thus, since their metabolisms became overloaded, raw sweet potatoes tasted less appealing the following morning, and the only way to get enough satisfaction from food was to continue cooking. In a sense, humans became prisoners of a vicious cycle of their own making.


What likely happened in reality? While more specific details about the earliest beginnings of fire use will probably always be speculative, there are still a few things that can be said. Goudsblom [1992, pp. 12-23] provides a grounded view of the plausible stages of increasing human familiarity with, and eventual control over, fire, recapitulated here in the following bullet points:

First natural wildfires. Evidence for naturally occurring forest fires goes back as far as
evidence for forest vegetation itself--about 350 million years, with such fires (as with wildfires today) likely being ignited by lightning and/or volcanic eruptions. While fire in nature is often viewed by modern-day humans as mainly destructive, and this is certainly true of its initial effects, the longer-term effects of wildfires in a primitive natural setting can be positive. Some plants and trees, for example, are "pyrophytes" that depend on fire to propagate and spread successfully, and many of them provide food and/or shelter for animal species.

Typical responses to wildfires by predators and other animals as guide to early primitive human relationship to fire. Based on present-day observations of how animals
react and respond to wildfires, which can be used as a behavioral baseline, one can infer that early humans would have exhibited at least as sophisticated responses to it. Typically, predators move in soon after a fire to forage for food among the charred or partially burnt remains. Ruminants later visit to lick at the ashes (for salt), and in general, mammals visiting the site appear to enjoy its warmth at night. Goudsblom refers to these types of behavior as "passive" use of fire. It may also have been at this stage that humans would first have begun to appreciate not just the different and perhaps appealing taste of fired food, but more importantly its effects in preserving meat for later consumption when it would otherwise spoil if not soon eaten--a survival advantage. The transition from "passive" to "active" use of fire. However, perhaps the most interesting and, of necessity, speculative question is how the transition occurred from the initial stage of such passive, opportunistic use to more "active" control and deliberate use of fire. Here, it can be supposed that at first, as with other animals, humans were wholly dependent on intermittent, fortuitous encounters with wildfires in gaining familiarity and experience. At some point, however, poking around like other predators in the aftermath of a still-smoldering fire for food, but utilizing a stick for a tool in typical human fashion, they would have noticed that branches used in this way might reignite the fire. From there it is not hard to imagine the dawning recognition that fire could be kept burning and transported from place to place for safekeeping, prior to the stage of being able to create fire directly. (Side note: This basic scenario obviously is what provided the ideas for the underlying plotline of the 1981 movie Quest for Fire.)

Where later controlled use of fire is concerned and can be documented by science, the most reliable studies seem to indicate that humans did not start using fire consistently until about 400,000-500,000 years ago (see discussion in Fire and Cooking in Human Evolution), perhaps to drive predators away and keep warm, but may not have used it yet at that time to cook food on a regular basis. It's commonly accepted in the paleoanthropological community that where the less equable environments of the temperate zones are concerned (and particularly where the harsher climates of higher latitudes or altitudes are concerned), fire probably would have been essential for warmth and to thaw out meat that had become frozen. From there it likely would not be much of a leap to proceed to the next step of beginning to cook food.

Speculative scene around an ancient campfire. In any event, it's not difficult to imagine that, once fire use became widespread, one evening, when a family gathered for dinner around the fire--with some individuals tossing into it various objects (like branches), or the occasional piece of passed-over food to see what might happen--one of them got the idea of retrieving a piece of meat thrown into the fire earlier. Humans are very curious by nature, so it is difficult to imagine that cooking arose purely by accident; experiments like this must commonly have been made.


Coming back to our prehistoric individual warming themselves by the fire: the eland steak, now cooked rare, began to give off a delightful aroma of roasted meat, much stronger than what they were used to with raw meat. Perhaps they were a bit fearful the food might have been altered in some way that made it dangerous, but on the other hand, their senses told them that this piece of meat was decidedly attractive. With some hesitation, they tried a little of it. The family then went to sleep, and, at dawn, apparently no ill effects had appeared. After a number of such hesitant experiments, by themselves and by others in the group, it was concluded that cooking improves taste and is not harmful, and they decided to continue from then on. Obviously, they didn't think at all about germs, since Pasteur wasn't born yet.

Was cooking an evolutionary "error"? This story may or may not be valid, but an important question remains: was cooking a "mistake"? Aside from any biochemical questions, this is the primary question that those who align themselves with "philosophical naturalism" (the position that what is natural is inherently good and to be emulated) must answer. Children will put their fingers in the fire, but as they burn themselves, they learn from their mistake and don't do it again. Cooking doesn't produce ill effects, at least in the short term, so humans might not have been aware of their "mistake" yet, and it is tempting to conclude that, now that we have grown up as a species, we should realize it is time to stop playing with fire.

Enhanced survival is the test of adaptation. In principle this argument makes a certain amount of sense on the face of it, but it overlooks the fact that cooking renders many inedible foods edible, and thus confers a significant adaptive advantage in an uncertain and wild environment, since the range of the diet is extended. We will see below that the success of the !Kung San in their marginal environment can be attributed in part to the use of fire and other types of rudimentary food processing, without which life would be much more difficult, perhaps impossible.

Idealist views of nature contradicted by examples of foods obtainable by actual hunter-gatherers. It has been claimed by numerous raw-foodists that humans originally started cooking because, as they migrated out of the tropical African climate in which the species began, fruits became unavailable in winter, and the only way to eat a sufficient amount of food or meat was to cook it. Again, this argument might make some sense if one could point to at least a few tribes in tropical countries who consume predominantly raw fruit, but this is simply not the case. In actuality, there are no tribes in the tropics who eat this way, for the simple reason that fruit is not as abundant or easily available to human foragers as raw/fruitarian advocates seem to suppose. See Hawkes et al. [1982] for a good discussion of how "optimal foraging theory" applies in the case of the tropical rainforest Ache hunter-gatherers of Paraguay, how it does a good job of predicting and explaining the composition of their diet, and why fruit constitutes only a modest part of it. Hunter-gatherers, even in tropical environments (like the Ache and some Aborigine tribes), eat a fair amount of meat, which is usually cooked, and also cook tubers. Only the traditional Inuit/Eskimos seem to make a point of regularly including some raw animal foods in their diet (and even then, raw foods constitute only a part of it).

Cooking in light of optimal foraging theory. The most reasonable explanation for why hunter-gatherers--the best and most primitive examples we have of anyone living as close to nature as possible with only the most rudimentary kind of "technology"--cook some of their food can be formulated in terms of "optimal foraging theory." This branch of study (which has been developed to explain and predict the feeding behaviors of animals in general) states that organisms tend to optimize the energy expended in acquiring food against the energy and nutrients available in the food. In other words, if a food is easily collected and edible raw, then it will generally be eaten that way; but some foods that are inedible in their raw state can provide a high energy or nutrient yield, such that gathering, processing, and cooking them represents a significant gain for the time and effort spent, compared to eating only raw foods. This is true in particular for certain tubers and root vegetables, some of which are partially edible raw but become much more bioavailable after cooking, and are concentrated sources of energy. It's also worth noting that smoking, drying, or cooking meats can allow them to be preserved and more fully utilized without waste or spoilage, which is important in a wild environment where a regular supply of meat is not assured. On the other hand, foods that are difficult to collect and require too much time to prepare and cook would be unlikely to be eaten, period.
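As a toy illustration of the optimal-foraging arithmetic, consider the Python sketch below; all the numbers are invented purely for illustration, and real analyses such as Hawkes et al. [1982] are far more detailed.

    def return_rate(kcal_per_kg, digestibility, kg_gathered, gather_hours, process_hours):
        """Net kcal obtained per hour of total gathering + processing time."""
        usable_kcal = kcal_per_kg * kg_gathered * digestibility
        return usable_kcal / (gather_hours + process_hours)

    # Same tuber patch; cooking costs extra time but raises starch digestibility.
    raw = return_rate(kcal_per_kg=900, digestibility=0.3, kg_gathered=4,
                      gather_hours=3, process_hours=0)
    cooked = return_rate(kcal_per_kg=900, digestibility=0.9, kg_gathered=4,
                         gather_hours=3, process_hours=1)

    print(f"raw: {raw:.0f} kcal/h, cooked: {cooked:.0f} kcal/h")
    # raw: 360 kcal/h, cooked: 810 kcal/h -- even after paying the time cost of
    # cooking, the digestibility gain more than covers it for this kind of food.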

Natural "Garden of Eden" simply a myth. The conclusion of this is that the cooking of some foods,
by saving time and effort and extending the range of the diet, would have enhanced survival in a significant way, even in more supposedly "ideal" or tropical environments, but especially in areas where edible raw foods are scarce. That lack of ease in obtaining food occurs only in temperate zones and higher latitudes is not true is shown by the examples of the Australian Aborigines [O'Dea 1992], the Bushmen [Bicchieri 1972 and below] as well as many others [Bicchieri 1972], including the tropical rainforest Ache of Paraguay [Hawkes et al. [1982]. (Note that even in the case of the Ache hunter-gatherers of Paraguay, one of the few examples of a primitive people that have been extensively studied who subsisted in a dense rainforest habitat--the type of environment considered ideal by raw-fooders--significant amounts of their food were cooked; and fruits, a typical raw-foodist staple, were not as easily obtainable compared to other foods in their diet [Clastres 1972, see esp. p. 156; also Hawkes et al. 1982, and Hill et al. 1984].) The Garden of Eden is a myth--or, if you like, doesn't exist on Earth. The bottom line then, is that--however cooking may have gotten started--it's use conferred significant (evolutionary) survival advantages, or it would not have eventually become part of the regular repertoire of that most opportunistic of all animal species: human beings.

What kind of combined raw/cooked diet? Making intelligent choices


The point that the issue of raw vs. cooked foods is not an easy or black-and-white question has been made in a number of ways by now in this paper. Implicit in the recognition that the issue is not black-and-white is the resulting consideration of trade-offs. In this regard, there are two primary trade-offs to assess when deciding whether to make certain foods a part of one's diet, and whether to cook them or not; we'll examine them here.

Trade-off #1: Should we cook to neutralize toxins/improve digestibility of potentially valuable foods?
We have seen that cooking sometimes destroys antinutrients or toxins, so that a previously toxic food becomes edible. However, cooking often leaves some toxins undestroyed, and the result of cooking is that we eat much more of that food than we would in its raw state. An excess of cassava, for instance, can result in serious poisoning by cyanogenic glycosides; other examples are numerous, and much could be said against grains. (Antinutrients in grains, such as phytates, which bind minerals, can cause rickets and pellagra if grains constitute a large enough portion of the diet. See the site article The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance for more information and references.)

"Optimal foraging" in the modern supermarket? Cooking proves to be useful for modern huntergatherers (and as we have said, would presumably also have been so for prehistoric hunter-gatherers once they had developed the level of acumen necessary to utilize fire) so as to enhance chances of survival. However, nowadays in the twentieth century, with all sorts of imported foods widely available, it may be that cooking loses its importance. On the other hand, the range of foods available in our supermarkets and health-food stores is implicitly intended to be sufficient for people using a stove, but might not be adequate for those who don't, since many of the tubers (i.e., potatoes) and vegetables (turnips, kale, etc.) are barely edible raw; and modern, highly bred fruits are excessively high in sugar compared to their wild counterparts.


So, raw-fooders end up with a diet which differs considerably from what they would be able to obtain under natural conditions (assuming this term has any meaning for humans that all could agree on). So even if "optimal foraging theory" doesn't apply anymore here, cooking and accepting a (very mild) natural toxin load (like solanine from potatoes) might help achieve a more balanced diet and the net result could be positive, as we shall see.

No "perfect" food or set of foods enabling avoidance of all toxins. While it certainly makes sense
to limit the consumption of foods that are inedible raw--all other things being equal (an important condition, because sometimes they aren't)--the presence of natural toxins shouldn't be a deterrent, unless they are present in hazardous levels. There is no perfect food. One role of the liver is to eliminate the toxic constituents in order to be able to utilize food sources for nutrients even if they are accompanied by some level of antinutrients or toxins, and the nutrients would be difficult to come by otherwise.

Trade-off #2: Will an all-raw diet require excessive bulk to obtain sufficient nutrition?
In principle, raw foods can provide all the necessary nutrients (except for the thorny issue of B-12 in vegan diets), especially if a variety of foods is utilized, including sprouts, nuts, organ meats, seaweed. Obviously, there is no black-and-white answer, given the extreme variability of raw-food diets. But while no one claims that cooking is an absolute necessity, in practice, a balanced raw diet is rather difficult to achieve.

Idealism vs. real-world practicalities. Some of the reasons why all-raw diets can be impractical to implement include:

The calorie problem in strict raw vegan diets. Regarding deficiencies specifically, if raw foods that are more concentrated in protein and fat (e.g., nuts, avocados, coconuts) are avoided or minimized for whatever reason, raw vegetarian diets often lead to emaciation, since these foods would otherwise be the main dense-calorie sources in an all-raw diet. (See The Calorie Paradox of Raw Veganism for an in-depth examination of these problems; a rough quantitative sketch of the volume problem also follows at the end of this section.) Alternatively, to get sufficient calories, one may be forced to eat large quantities of sweet fruit, which can eventually lead to long-term problems with sugar metabolism and/or deficiencies: while sweet fruits are high in certain vitamins and minerals (such as vitamin C, B-6, nicotinamide, and potassium), they are very low in others (vitamin D, B-12, biotin, calcium; fruits also have a low Ca/Mg ratio). (Note, by way of contrast, that although the levels of some minerals, e.g., iron, are low in milk, such minerals are very bioavailable and readily absorbed.)

Bioavailability concerns can necessitate higher, unsustainable levels of intake. Since nutrients in some cooked foods have better bioavailability or edibility than in the raw version (some root vegetables, tubers, grains, etc.), if one avoids such foods just because they are not palatable raw, one can be forced into eating huge quantities of lower-bioavailability raw foods, which ends up making one's life revolve around food in an unbalanced way. (Again, see The Calorie Paradox of Raw Veganism for the practical problems imposed by the volume of food that can be required.)

Raw diets often lack nutritional variety under modern conditions because of the relatively narrow range of palatable (raw) foods available in today's supermarkets, which can lead to boredom and/or deficiencies. Despite the fact that modern supermarkets bring us foods from all over the world, these are often insipid and tasteless varieties that stand up to shipping and storage well, but may be unappetizing or otherwise unattractive or inedible to someone as concerned about food quality as a raw-fooder.

Binge-eating due to unmet needs or caged desires. If such boredom and lack of variety is an issue, periodic binge-eating on non-raw or processed foods can become a problem, because of the decreased range of taste pleasures one may experience on an all-raw diet. Despite what the more zealous 100%-raw-food advocates may often claim, some people find the all-raw diet not nearly as appetizing as "advertised." This can be true even after a lengthy period of adjustment to eating raw, and even when eating a raw diet composed of high-quality organic produce.

Sacrifices in other important areas of one's life. One can attempt to avoid the problem of insipid supermarket produce by seeking out sources of tastier, higher-quality organic or farmers' market foods, or by participating in CSAs (Community Supported Agriculture cooperatives), for example, where one works on the farm in exchange for a share of the food. However, these avenues take more time or money than some people have, may force one's life to become more centered around getting food than one would like, and can create social isolation from others not "into" the same thing. (Others, of course, enjoy all this.) Social isolation can be a particular problem if one insists on eating all-raw all the time: bringing your own food to parties, or declining invitations to restaurants or family get-togethers, can have a negative impact on your social life.

Excessive mental preoccupation. A point in favor of eating some cooked food: a less extreme diet can help one think in a less extreme way, and thus gain mental balance. Some readers might not like the term "extreme," which has negative connotations. By "extreme," we simply mean "very different from what is commonly practiced by other humans." Extreme difference or originality can be a sign of genius, or of mental imbalance, or neither. We are not claiming that all people eating 100% raw are imbalanced; in fact, some are successful. But it is a fact of experience that excessive preoccupation with dietary purity often results in fanatical attitudes, or in ways of thinking that can spoil your life.

Important distinction to be made between less-than-100%-raw diets vs. SAD/SWD. In closing this section, we emphasize once again that there is a huge difference between predominantly raw diets that include some gently cooked items (no processing, no salt, no sauces, no frying or use of vegetable oil in cooking) and the "standard American (cooked) diet" (SAD), alternatively also called the "standard Western diet" (SWD).
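As promised above, here is a rough quantitative sketch of the bulk/volume problem (a minimal illustration in Python; the caloric densities are approximate per-100-g figures supplied here only for illustration):

```python
# How much food must be eaten to reach 2,000 kcal/day at various caloric
# densities? Densities are approximate per-100-g values (illustrative only).
KCAL_PER_100G = {
    "romaine lettuce": 14,
    "broccoli (raw)":  34,
    "grapes":          69,
    "avocado":        160,
    "dates (dried)":  282,
}

TARGET_KCAL = 2000
for food, kcal in KCAL_PER_100G.items():
    kg_per_day = TARGET_KCAL / kcal / 10  # (2000/kcal)*100 g, expressed in kg
    print(f"{food:16s}: {kg_per_day:5.1f} kg/day")
# Leafy greens alone would require roughly 14 kg/day -- physically impossible;
# calorie-dense foods bring the daily volume into a feasible range.
```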

Misrepresentations by extremists. One of the more inexcusable, often tacitly implied misrepresentations in the typical extremist raw-foodist rationale is the tendency to indiscriminately lump in most any cooked-food diet with the SAD. Or to irrationally claim one doesn't get the bulk of the benefits unless one achieves the magic 100%-raw number, whereupon the benefits suddenly manifest themselves. On the contrary, it is possible to get most, if not all, the benefits that raw food confers if one eats a predominantly raw rather than all-raw diet, without suffering from the inconveniences and potential nutritional downsides.

How is the balance of nutrients one obtains affected by eating partially cooked?
Let's compare a few foods. We give the mineral composition for standard 100-gram portions. (Notes: dates are dried, with 22.5% moisture; broccoli is boiled, without salt, drained; potatoes are boiled, without skin.)

SAMPLE FOODS IN A MOSTLY-RAW VEGAN DIET (nutrient values in mg per 100g portion)
Chemical abbreviations for the elements listed in the table are: Ca = calcium, Fe = iron, Mg = magnesium, P = phosphorus, K = potassium, Na = sodium, Zn = zinc, Cu = copper, Mn = manganese.

FOOD        Ca    Fe    Mg    P     K     Na    Zn    Cu    Mn
Grapes      14    .3     5    10    191    2    .04   .04   .72
Avocado     11   1.0    39    41    599   10    .42   .26   .23
Dates       32   1.1    35    40    652    3    .29   .29   .30
Broccoli    48    .9    25    66    325   27    .40   .04   .23
Romaine     36   1.1     6    45    290    8    .25   .04   .64
Potato       8    .3    20    40    328    5    .27   .17   .14
Of course, not all minerals are equally important; the most important ones are calcium, magnesium, iron, and zinc. Copper deficiencies also occur occasionally. Now, let's examine the composition for 100-calorie portions rather than by weight. (Analysis by energy--calories--makes more sense when comparing foods eaten largely for their energy content, like avocados or potatoes.)

SAMPLE FOODS IN A MOSTLY-RAW VEGAN DIET (nutrient values in mg per 100 calories)
Chemical abbreviations for the elements listed in the table are: Ca = calcium, Fe = iron, Mg = magnesium, P = phosphorus, K = potassium, Na = sodium, Zn = zinc, Cu = copper, Mn = manganese.

FOOD        Ca    Fe    Mg    P     K      Na    Zn    Cu    Mn
Grapes      21    .44   7.5   15    290     3    .06   .06   1.1
Avocado     6.8   .62   24    25    372    6.2   .26   .16   .14
Dates       12    .43   13    14    230    1.1   .10   .10   .11
Broccoli    170   3     86    210   1,000  94    1.4   .15   .79
Romaine     260   7.8   43    320   2,100  57    1.8   .26   4.5
Potato      9.6   .37   24    48    400     6    .32   .20   .17
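The relationship between the two tables is a simple rescaling: divide each per-100-g value by the food's caloric density and multiply by 100. A minimal sketch in Python (the caloric densities used are approximate values supplied here for illustration, since the tables do not list them):

```python
# Convert a nutrient value from a per-100-g basis to a per-100-kcal basis:
#   mg per 100 kcal = (mg per 100 g) / (kcal per 100 g) * 100
# The caloric densities below are approximate, for illustration only.

def per_100_kcal(mg_per_100g, kcal_per_100g):
    return mg_per_100g / kcal_per_100g * 100

# Calcium in romaine: 36 mg/100 g at ~14 kcal/100 g
print(round(per_100_kcal(36, 14)))    # ~257, consistent with the ~260 above
# Calcium in dates: 32 mg/100 g at ~275 kcal/100 g
print(round(per_100_kcal(32, 275)))   # ~12, matching the table
```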

Including some cooked food may diversify the diet and reduce narrow dependencies. We see
that among the most dense foods calorie-wise (avocado, dates, and potatoes), avocados and potatoes are approximately equivalent as to mineral richness per 100 calories and beat dates on most points. Further, potatoes are well-digested and can therefore advantageously replace dates as a source of carbohydrates, allowing one to reduce the sugar load and excessive fruit intake. They also allow one to reduce avocado consumption: "addiction" to avocados is rather common, especially among raw vegans. In fairness, perhaps we should not label this dependence on avocados an "addiction," since it may well be that one quite naturally, if unconsciously, turns to avocados for calories and/or fat in the diet when it is otherwise low in these. And one can argue that such an addiction is not so big a deal. (Better to be "addicted" to avocados rather than, say, marijuana, of course.) Nevertheless, (over)eating the same thing every day is unlikely to be a very healthful practice (either physically or psychologically).

Utilizing some cooked food to supply dense calories can help resolve the high-volume intake problem. Sweet fruits like grapes might constitute a reasonably good source of energy, but are less concentrated than dates, avocados and potatoes, and therefore cannot supply the bulk of calories unless an unreasonable volume is ingested, potentially resulting in flatulence, bloating, and/or watery stools. People on high-fruit diets will rarely admit that such things happen--but talk to enough ex-fruitarians and you'll find that they can, and if they do, tend to worsen over time. We also note that green vegetables are an excellent source of minerals, and induce a low glycemic response. However, few would be willing or able to chew and swallow one pound of raw broccoli every day (think about the mountain gorilla, which spends 40% of the day chewing); therefore, some cooking allows one to increase mineral intake from such foods even if some modest amount is lost in the cooking itself.

Is "what's possible" necessarily optimal or most practical? Note: sprouts and animal foods were
not included in the table above for comparison. However, our intention is not to prove that boiled potatoes are necessary; in fact, they are not if the sources of nutrients are diversified enough, and if most of the calories are supplied by non-fruit foods. The point of the above table is simply to demonstrate that some diets, which are 100% raw and high in fruit, can be improved by the addition of some cooked foods.

How diversifying the diet with animal foods can affect nutrient balance
Now, let's have a look at vitamin content for the same plant foods outlined just previously, plus the following animal foods: pork (fresh, loin, top loin [chops], boneless, separable lean and fat, cooked, broiled), beef (composite of trimmed retail cuts, separable lean and fat, trimmed to 1/4" fat, all grades, cooked), mung sprouts, beef liver (raw). Note these abbreviations in the charts below: B1 = thiamine, B2 = riboflavin, B3 = niacin, B5 = pantothenic acid, B6 = pyridoxine, B9 = folate, B12 = cobalamin. "A" below refers (in animal food) to actual vitamin A (retinol), or in plant food to beta-carotene, which, assuming it is extractable, can be converted by the body into vitamin A. (As discussed earlier in this paper, the extractability of beta-carotene is usually very low in plant food unless cooked to release it from the fibrous plant matrix; or high-speed mechanical juicing may also serve the same function.)

SAMPLE FOODS IN A MOSTLY-RAW DIET THAT INCLUDES MEAT (nutrient values per 100g portion)
The measurement units for vitamins B1, B2, B3, B5, B6, and E are milligrams. B9 and B12 are in micrograms, and A is in IUs (International Units).

FOOD         C     B1    B2    B3    B5    B6    B9    B12   A       E
Grapes       4     .09   .06   .30   .024  .11   3.9   0     100     .34
Avocado      7.9   .11   .12   1.9   .97   .28   62    0     612     1.3
Dates        ?     .09   .1    2.2   .78   .19   13    0     50      .1
Broccoli     75    .05   .11   .57   .51   .14   50    0     1,388   1.7
Romaine      24    .10   .10   .5    .17   .05   136   0     2,600   .44
Potato       7.4   .10   .02   1.3   .51   .27   8.9   0     ?       .05
Pork         .3    .85   .31   5.0   .70   .38   8     .69   6       ?
Beef         0     .08   .21   3.6   .35   .33   7     2.44  ?       .2
Mung beans   13    .08   .12   .75   .38   .09   61    0     21      .01
Beef liver   22    .26   2.78  13    7.6   .94   248   69    35,346  .67

...and for 100-calorie portions:

SAMPLE FOODS IN A MOSTLY-RAW DIET THAT INCLUDES MEAT (nutrient values per 100 calories)

FOOD         C     B1    B2    B3    B5    B6    B9    B12   A       E
Grapes       6     .10   .09   .4    .04   .17   5.9   0     150     .51
Avocado      4.9   .07   .07   1.2   .60   .17   38    0     380     .81
Dates        ?     .03   .04   .8    .28   .07   4.7   0     18      .04
Broccoli     270   .20   .40   2.1   1.8   .50   180   0     5,000   6.1
Romaine      170   .71   .71   3.6   1.2   .33   990   0     18,000  3.1
Potato       8.9   1.2   .02   1.6   .61   .32   11    0     ?       .06
Pork         .13   .37   .14   2.2   .31   .17   3.5   .30   3       ?
Beef         0     .03   .07   1.2   .12   .11   2.3   .79   ?       .07
Mung beans   43    .28   .40   2.5   1.3   .29   200   0     69      .03
Beef liver   15    .18   2.0   9.1   5.3   .66   180   48    25,000  .47

Again, we note that potato fares pretty well compared to dates and avocados. Next, we comment one by one on each vitamin:

Vitamin C: Obviously not a problem for raw-fooders.

Vitamin B-1 (thiamine): Sprouted wheat (.22 mg per 100 g) and sprouted lentils (.23 mg) are good sources, but pork and beef liver are the richest. Not everyone will eat these foods raw.

Vitamin B-2 (riboflavin): Present in sprouted legumes and wheat, as well as green leafy vegetables and liver.

Vitamin B-3 (niacin): Mainly present in meat, fish, organs, and nuts (almonds: 3.4 mg per 100 g), but also, to a lesser extent, in sprouted wheat.

Vitamins B-5 (pantothenic acid) and B-6 (pyridoxine): Usually widely available; deficiencies are rare.

Vitamin B-9 (folate): Liver, beans, green vegetables, avocados, eggs (yolk: 146 mcg per 100 g), and nuts (almonds: 59 mcg).

Vitamin B-12 (cobalamin): Liver, other animal foods.

Vitamin A: The best source is liver, but vitamin A is present in the form of beta-carotene in orange vegetables (carrots) and dark green vegetables (spinach). However, for recent research indicating that the bioavailability of beta-carotene in plants may be lower than previously believed, see Vitamin A and Beta-Carotene, under the subsection "Key Nutrients vis-a-vis Omnivorous Adaptation and Vegetarianism" within the article Comparative Anatomy and Physiology Brought Up to Date elsewhere on this site.

Vitamin E: Rich sources are vegetable oils, nuts, dark-green leafy vegetables, organ meats, seafood, eggs, and avocados.


Finally, vitamin D didn't figure into the table above, but this important vitamin (for calcium transport) is present in animal fats (mainly fish fats). It is also synthesized when the skin is exposed to sunlight/ultraviolet light.

Moral of the story: Diversity in nutrient sources is at least as important as raw vs. cooked considerations, if not more. The conclusion of the above is that no food is rich in all vitamins, except
perhaps liver (some people may argue that it also has a "superior" content of toxins--but that's an issue for another discussion), and that a diversified diet is important--probably more important than preserving every milligram of vitamins by eating 100% raw.

Example of improving nutrient intake through diversification with specific key foods

Effect on mineral intake. Now let's take an example to illustrate the effect of variety in improving nutrient intake. Suppose the menu of the day consists of 1,000 calories of avocados, 1,000 calories of dates, 300 grams of lettuce, and 200 grams of carrots. (This is obviously an oversimplified menu; we use it here simply for the sake of avoiding too-lengthy calculations.) Note that we don't recommend eating exactly this way! What follows is just for illustrative purposes.

SAMPLE MENU FOR ILLUSTRATIVE PURPOSES (effect on mineral intake)
1,000 calories each of avocados and dates, plus 300g (10.6 oz.) romaine lettuce, 200g (7 oz.) carrots.

FOOD       Ca    Fe     Mg    P     K      Na    Zn    Cu    Mn
Avocado     68   6.2    240   250   3,720   62   2.6   1.6   1.4
Dates      120   4.3    130   140   2,300   11   1.0   1.0   1.1
Romaine    108   3.3     18   135     870   24   .75   .12   1.9
Carrots     54   1.0     30    88     646   70   .40   .09    .3
TOTAL:     350  14.8    418   613   7,536  167  4.75  2.81   4.7

Note the importance of lettuce for calcium. Now, what happens if we supplement with 300 grams (10.6 oz.) of broccoli?

FOOD         Ca    Fe     Mg    P     K      Na    Zn    Cu    Mn
Prev. meal   350   14.8   418   613   7,536  167   4.75  2.81  4.7
+ broccoli   144   2.7     75   198     975   54   1.2    .12   .7
New meal     494   17.5   493   811   8,511  221   5.95  2.93  5.4
GAIN:       +41%  +18%   +18%  +32%   +13%  +32%  +25%   +4%  +15%


One may argue that supplementing with anything will increase mineral intake--which we won't deny. The point of the above is to illustrate the mineral-richness of green vegetables, and that replacing some fruits with some cooked green vegetables results in a net gain.
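The arithmetic behind the gain figures above is just portion-weighted summing. A minimal sketch in Python reproduces the calcium numbers; the per-100-g calcium values come from the mineral table earlier in this section (the carrot figure is derived from the menu table), while the gram portions for the 1,000-kcal servings of avocado and dates are approximations based on assumed caloric densities:

```python
# Reproduce the calcium gain from adding 300 g broccoli to the sample menu.
# Per-100-g calcium values are from the mineral table earlier in this section;
# avocado/date portions (1,000 kcal each) are approximate gram equivalents.
CA_MG_PER_100G = {"avocado": 11, "dates": 32, "romaine": 36,
                  "carrots": 27, "broccoli": 48}
PORTIONS_G = {"avocado": 625, "dates": 364, "romaine": 300, "carrots": 200}

def total_calcium(portions):
    return sum(CA_MG_PER_100G[food] * grams / 100
               for food, grams in portions.items())

base = total_calcium(PORTIONS_G)
new = total_calcium({**PORTIONS_G, "broccoli": 300})
print(f"{base:.0f} mg -> {new:.0f} mg  (gain {100 * (new / base - 1):.0f}%)")
# -> roughly 350 mg -> 490 mg of calcium, a ~41% gain, as in the table
```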

Effect of diversification on vitamins. We won't repeat the analysis of the above foods for vitamins,
but it is clear that broccoli is a good source of vitamin C, folate, and beta-carotene. Instead, we will investigate what happens when the sample meal above is supplemented with 50 grams (1.8 oz.) of beef liver--a very modest amount. Note: the following is not to convince anyone that liver is necessary for health, but to illustrate how variety can have a significant impact on the nutrient content of your meals.

SAMPLE MENU AS ABOVE but supplemented with beef liver instead of broccoli (effect on vitamin intake)
1,000 calories each of avocados and dates, plus 300g (10.6 oz.) romaine lettuce, 200g (7 oz.) carrots, 50g (1.8 oz.) beef liver.

FOOD      C     B1    B2     B3     B5     B6     B9    B12   A       E
Avocado   49    .7    .7     12     6.0    1.7    380   0     3,800   8.1
Dates     ?     .3    .4     8      2.8    .7     47    0     180     .4
Romaine   72    .30   .30    1.5    .51    .15    408   0     7,800   1.3
Carrots   28    .3    .18    2.8    .60    .44    42    0     84,000  1.4
TOTAL:    ?     1.6   1.58   24.3   9.91   2.99   877   0     95,780  11.2
Liver     11    .13   1.39   6.5    3.8    .47    124   34    17,673  .34
GAIN:     ?     +8%   +88%   +27%   +38%   +16%   +14%  /     +18%    +3%

We thus see the dramatic effects of adding a tiny supplement (50 grams = 1.8 oz liver), especially on B vitamins, including of course B-12, but also vitamin A, which is considerably more bioavailable in that form than in the form of beta-carotene as it appears in vegetables.

In summary, for this section:

Cooked starches advantageously replace raw (non-animal) concentrated sources of energy on all points: vitamins, minerals, digestibility/bioavailability, sugar load.

Dark green vegetables (which are easier to eat cooked) are an invaluable source of minerals.

To obtain an adequate vitamin intake, variety is what matters most. It's certainly possible, though more difficult, to have a varied raw diet; but incorporating some cooked vegetables may help.

Cooking practices of hunter-gatherers


Though hunter-gatherers cook, degenerative diseases are rare. In general, the scientific evidence available about hunter-gatherers who follow their traditional ways (or did follow them, since most traditional hunter-gatherers have now been acculturated) shows that globally they enjoyed good health, with remarkably low rates of degenerative diseases--perhaps even the lowest on the planet. (This will be discussed following our look below at some of the eating and cooking practices of hunter-gatherers; a pointer will also be given to more extensive information about disease incidence available elsewhere on the site.) We cannot review the voluminous hunter-gatherer literature exhaustively here. However, a few examples examined in some depth will give an idea of the subject. Covered here are the Australian Aborigines, the San Bushmen of the Kalahari desert in Africa (with primary emphasis on the well-known !Kung tribe), and the Inuit ("Eskimos").

Australian Aborigines
The diet of Australian Aborigines has been extensively studied by O'Dea. The most detailed information exists for the Aborigines of Northwest Australia. During a two-week period, intake of various foodstuffs was measured [O'Dea, 1984]. Animal food contributed 64% of total energy intake. Main staples were antelope kangaroo (36%), freshwater bream [a fish] (19%), and yams (28%). All other listed foods accounted for only 17% of total energy intake.

Aboriginal cooking practices. The following information about cooking methods of Australian
Aborigines is quoted from http://online.anu.edu.au/Forestry/fire/ecol/as34.htm: Cooking fires usually suited the food to be prepared. They were controlled by a range of techniques such as using different types of timber, twigs or leaves. Hot stones were ample to fry Bogong moths; small banks of coals suited marsupial rodents; somewhat larger, specially shaped hearths baked cakes, cooked tubers, and leached toxins from various foodstuffs. Kangaroos were usually cooked where they were killed and required larger, temporary fires. The cooking proceeded in stages--the carcass would be singed on both sides, then removed and scraped clear of fur, gutted and thrown back into the coals for deep roasting [Pyne 1991, p. 89]. Heated stones were useful to open hard fruits and explode Acacia seeds. Cockles (mollusks)--consumed by the tens of millions--were prepared for eating by heaping the shells into piles, then topping the mound with a small fire, which heated the valves sufficiently to pop them open without the need for breakage [Pyne 1991, p. 89]. Perhaps the most sophisticated cooking fire was that made on a layer of clay or seaweed and carried in the bottom of the bark canoes used by fishing parties [Nicholson 1981, p. 63]. The ever-handy firestick carried by the Aborigines ensured that cooking fires could be manufactured when and where required.

Food cooked by the Aborigine

Overview: cooking may be necessary to render the available foods edible. Detailed information about plant foods consumed by Aborigines can be found at http://osprey.erin.gov.au/anbg/aboriginaltrail.html, from which the bullet-point information below is quoted. Of the many plant foods consumed by the Aborigine, some were eaten raw, others cooked. For some plants, information about food preparation is not available. The list below is not exhaustive, but provides a good overview of the way plant foods are processed, and shows that, in some cases, cooking is necessary to render the food edible.

Alocasia macrorrhizos (Cunjevoi). [New South Wales, Queensland] The swollen stems are starchy and fibrous, but are poisonous if eaten raw, causing the mouth and throat to swell, sometimes fatally. The Queensland Aborigines repeatedly roasted and pounded the plant to remove the poison. Cunjevoi is an Aboriginal name from southern Queensland.

Araucaria bidwillii (Bunya Pine). [Queensland] When this tree is mature it will bear large green cones, and inside each scale of the cone will be found a hard-shelled nut about 5 cm [2 inches] long. These nuts were such a popular food that tribes came from hundreds of kilometres around the Bunya Mountains in southern Queensland to feast on them. Particular trees were considered to be the property of certain Aboriginal families, but everyone was invited to share the delicious nuts, which are not unlike chestnuts when roasted in the fire. Although found only in Queensland, Bunya Pines have been planted in the southern Australian states, and the nuts may sometimes be bought in Sydney markets. They can be boiled or roasted.

Dendrobium speciosum (Rock or King Orchid). [New South Wales, Queensland] The swollen stems were beaten to break up the fibre and then cooked on hot stones.

Doryanthes excelsa (Gymea Lily). [New South Wales] The flowering stems grow up to 4 meters [13 feet] high, but were cut when young, about 0.5 meters [20 inches] long and thicker than a man's arm, and roasted. The roots were also roasted and made into a sort of cake.

Macrozamia spp. (Burrawangs). [New South Wales, The Northern Territory, Queensland, Western Australia] The seeds of these and other cycads are borne in a large cone and have an orange outer coat. [Note: A cycad is a palmlike, cone-bearing evergreen tree native to warm regions.] They are poisonous, but the Aborigines knew how to treat them to remove the poison, and so take advantage of the large amount of food provided by a single plant. One of the ways was to cook the seed, break it up, and then soak it for up to three weeks in running water. In Western Australia, only the outer red part was eaten, after treatment by washing and burying.

Microseris lanceolata (Murnong or Yam-daisy). [New South Wales, South Australia, Tasmania, Victoria, Western Australia] This small perennial plant was the favourite food of the Aborigines of central and western Victoria, and was also eaten in South Australia and New South Wales. It has a radish-shaped tuber, which is renewed each year. In the spring the plant forms a yellow flower-head like a dandelion, and in the summer the leaves die off and the tuber becomes dormant. The tubers were cooked in baskets in an earth oven, producing a dark sweet juice which was much liked. Once a common plant, Murnong became scarce due to grazing by sheep.

The San (Kalahari desert, Africa)


The San are better known popularly as the "Bushmen." Food is available in the Kalahari desert, and people spend an average of 32.5 hours a week in food procurement outside of camps [Tanaka 1980, p. 77]. The figure of 32.5 hours does not include time spent in camp preparing food for consumption (i.e., cooking), or in-camp manufacture and maintenance of tools needed for hunting. Here are a few quotations from Tanaka [1980]: p. 38: A fair portion of the San's food is eaten raw, but most of it is cooked. Those foods eaten raw include berries, green vegetables, plants providing moisture, and some of the root foods. The melons and roots are more usually eaten cooked.

pp. 38-39: Cooking may consist of pan frying or of baking directly in the fire. The pans are all iron,
acquired relatively recently through trade. Before the introduction of iron pans, it is thought that the people probably cooked their food either directly over the fire or in the middle of the hot ashes. Animal flesh is never eaten raw. When meat is cooked in a pan, it is simmered for over an hour in a little water (for which melon pulp is often substituted); when it is so tender that the sinews will fall apart, it is usually crushed in a mortar. The San have no salt and use no seasonings; on rare occasions, however, a little antelope fat will be added to improve the flavor. When a pan is not available, the San may bury a large piece of meat in the embers or hot sand and leave it to bake for about an hour and a half. In addition, thin pieces of meat can be cooked on top of the fire. As for the plants, melons are stewed; when a pan is not available, they are buried in hot embers or ashes as in the case of meat. When the flesh of fruit is steamed through, the rind is discarded and the rest ground in a mortar as a gruel. All roots (except Raphionacme burkei) are cooked over the fire or in the ashes. The Bauhinia petersiana beans and the berries of Ochna pulchra are also cooked in the ashes, but as these are about 1 cm [3/8 inch] in diameter and are hard to separate from the ashes and sand, a sieve woven of grass is used to separate the beans or berries.

p. 148: Fire is widely used, but its main purpose for man is to cook food. The San now use pans to cook
melons and meat, but their traditional way of cooking is to use direct fire.

Staples of the !Kung diet


The !Kung San, a subcategory of the San reviewed above, became a cause celebre in anthropological circles during the 1960s and 1970s [Lewin 1988], and as a result were one of the most extensively studied of all hunter-gatherer tribes. The following information about their diet comes from Lee [1979]. A distinguishing characteristic of the diet of the !Kung is the superabundance of the mongongo nut, which is a very high-fat food (at 80% fat), and which constitutes over 1/3 of their diet. The !Kung's main staples, their description, and their consumption are described below.

Diet consists of both raw and cooked foods, based on what is available in the environment.
As we shall see, some items are consumed raw in their natural state, while others are cooked and mixed. Also, keep in mind that while the plant foods presented below are those most commonly eaten, the !Kung exploit over 200 plant species, most of which are less palatable. In addition, it should be pointed out that the overwhelming majority of the calories in the !Kung's diet come from the mongongo nut and meat. The caloric and protein levels in the !Kung diet were recorded during the winter of July-August 1964:

MAIN STAPLES OF THE !KUNG SAN'S DIET (in July/August 1964)
(Special note: The figures in the chart below represent a seasonal high in mongongo nut consumption [high in fat at 80% of calories], and therefore should not be taken as representative of the average macronutrient composition of the !Kung's diet.)

FOOD                % of Diet by Weight   Weight (g)   Protein (g)   Calories per person per day
Meat                31%                   230          34.5          690
Mongongo nuts       28%                   210          58.8          1,365
Other plant foods   41%                   300          3.0           300
TOTAL:              100%                  740          96.3          2,355


Note that the !Kung are rather short and thin (about 160 cm and 50 kg, or 5'3" and 110 lbs for males), so for their size they get adequate amounts of calories.

Macronutrient ratios (seasonal high in fat). A quick calculation based on the above table gives the following percentages of macronutrients by calories:

Protein: 16%
Carbohydrate: 14%
Fat: 70%

(Technical note: In the above calculations of macronutrient ratios, we used the fact that meat has negligible carbohydrate, and a table from Lee [not reproduced here] giving the composition of the mongongo nut. We also assumed, to keep things simple--and which should not affect accuracy much, if at all--that the "other plant foods" contain no fat, since vegetables rich in fat are usually rich in protein, and these foods supply only 3 grams of protein.)
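The calculation can be reproduced directly from the July/August table. Here is a minimal sketch in Python under the same simplifying assumptions as the technical note, plus the stated figure that the mongongo is roughly 80% fat by calories (used here in place of Lee's composition table, which is not reproduced):

```python
# Reproduce the macronutrient percentages from the July/August staples table.
# Assumptions (per the technical note): meat carbohydrate is negligible,
# "other plant foods" contain no fat, and mongongo is ~80% fat by calories.
KCAL      = {"meat": 690, "mongongo": 1365, "other": 300}  # per person per day
PROTEIN_G = {"meat": 34.5, "mongongo": 58.8, "other": 3.0}

total_kcal   = sum(KCAL.values())           # 2,355 kcal
protein_kcal = sum(PROTEIN_G.values()) * 4  # 96.3 g protein * 4 kcal/g
fat_kcal = (KCAL["meat"] - PROTEIN_G["meat"] * 4) + KCAL["mongongo"] * 0.80
carb_kcal = total_kcal - protein_kcal - fat_kcal  # the remainder

for name, kcal in (("protein", protein_kcal), ("carbohydrate", carb_kcal),
                   ("fat", fat_kcal)):
    print(f"{name}: {kcal / total_kcal:.0%}")
# -> protein 16%, carbohydrate 14%, fat 70%, matching the figures above
```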

Of course, the proportions vary depending on the time of year, and the above reflects a seasonal high in mongongo consumption (and thus in fat as well, since the mongongo is 80% fat); but overall, meat and mongongo nuts constitute perhaps 35% and 40%, respectively, of the total calories on average. Based on these additional figures, a more representative average yearly intake of the above foods, plus a breakdown of protein, carbohydrate, and fat proportions in the diet, would be as follows.

MAIN STAPLES OF THE !KUNG SAN'S DIET (estimated yearly averages based on factoring out seasonal highs)
(Figures assume approximately 35% meat, 40% mongongo nuts, and 2,400 kcal/day. While these figures may not correspond exactly to yearly averages, they are probably close to reality.)

FOOD                % of Diet by Weight   Weight (g)   Protein (g)   Calories per person per day
Meat                27%                   280          42            840
Mongongo nuts       14%                   148          41            960
Other plant foods   58%                   600          6.0           600
TOTAL:              100%                  1,028        89            2,400

Note: The first numeric column does not add up to 100% due to rounding. Yearly average macronutrient ratios by calories, based on the above (adjusted) table, are:

Protein: 15%
Carbohydrate: 25%
Fat: 60%

The !Kung San's main plant foods

Plentiful, but life would be difficult if all-raw consumption were attempted. We shall see that the !Kung's foods are indeed plentiful and nutritious, but that without processing (such as roasting to facilitate mongongo nut cracking, burying wild oranges, roasting some roots), food would be more monotonous, less palatable, and life certainly more difficult in general. The material below has been condensed and paraphrased from Lee [1979].

The Mongongo

Description: The mongongo is a highly nutritious fruit and nut that constitutes the main staple in the diet of the !Kung Bushmen. Indeed, the nuts represent over 1/3 of their total calories, and are available almost all year long. The fruit of the mongongo is composed of five layers:

1. The fruit skin, which is not consumed: it is removed and discarded.
2. The green or red fruit flesh, which has a dry and spongy texture. The taste of the fruit flesh is similar to a date, although it is not as sweet as the date varieties that are the standard in international commerce.
3. The outer shell of the nut, which is very hard and difficult to crack. (The fact that the nut is so difficult to crack has prevented commercial cultivation of the mongongo.)
4. A thin inner shell (1 mm, or 0.04 inch).
5. The nut kernel, which looks like a small hazelnut (except that it is skinless), and breaks easily into halves. The taste is similar to that of cashews or almonds that have been dry-roasted. With long roasting, the nut develops a flavor similar to an aged cheese.

Consumption: Under normal climatic conditions, the mongongo season begins when the fruit first ripens
and falls to the ground in April. After the fruit flesh has been consumed, the nuts are roasted, cracked, and eaten. By August, and lasting until approximately November, the fruit flesh has dried, and has been partially eaten by insects (the nut kernel is still okay at this point). Despite the insect predation, some (dried) fruits are edible after soaking and cooking; the insect-damaged fruits are roasted to burn off the damaged fruit flesh, and the nuts are cracked. From November to March, the fruit flesh is gone--eaten by insects--and only clean nuts are available. The flesh of the mongongo fruit was eaten whole and raw in the past, but now it is cooked in an iron cooking pot for 20 minutes. The nuts are roasted about 5 minutes in a mixture of coals and a small pile of dry, loose sand; then they are cracked. They can be eaten whole, or pounded in a mortar, or mixed with a variety of vegetable or animal foods. The mongongo nut is an excellent source of protein (28% by weight) and energy (654 calories per 100 grams), as well as magnesium.

Baobab

Description: The seedpod of a very large tree. The seedpods are 10-15 cm long, 80-200 gm in weight, and have a dry pulp with 20-30 seeds. The composition of the pods (by weight) is: 22% pulp, 31% seeds, 47% waste. The pods are in season in the period May-September.

Consumption: In immature pods, the seeds and pulp are eaten together. With mature pods, the pulp is
pounded to remove the seeds; after removing the seeds, the dried pod pulp is pounded to produce a flour. The flour is then used to make pudding or drinks. The fruit has a pleasant flavor but is acidic. The seeds are roasted and consumed. The baobab (nut and fruit) contains 14% protein (by weight), and is an excellent source of magnesium, calcium, potassium, phosphorus, thiamine, and vitamin C (213 mg per 100 grams of pulp).

Vegetable ivory palm (!Hani)

Description: The spherical seed (5 cm in diameter) of a palm tree; consists of four layers: (1) an outer skin (inedible); (2) edible fruit pulp (3-5 mm); (3) a nutshell (inedible); (4) an extremely hard nut approximately 15 mm in diameter. Early in the season, in June, about 33% of the total weight consists of the fruit pulp. Later in the season (October), the seeds dry out and the fruit pulp proportion falls to 25%.

Consumption: The skin is peeled off and the fruit is pounded to remove it from the nutshell. The fruit
pulp may be eaten raw, as-is, or ground into a coarse meal. It is always eaten raw, unsalted, and never mixed with water. It may be eaten with baobab fruit (flour). The flavor of the fruit pulp is similar to dates. A dwarf form of the vegetable ivory palm exists, although it may actually be a different palm species. The dwarf form yields a large, edible palm heart. The palm heart is roasted in a pit and eaten.

Marula nut

Description: Season: March-October. The oval seed, about 2.5 cm long, consists of a skin, a juicy pulp, and a hard shell enclosing a small kernel. The juicy pulp has a wonderful flavor; however, the nutmeat is the most important part of the food.

Consumption: The taste of the nut is superior to that of the mongongo, but the nut is much smaller, and the nutmeats are extracted with difficulty using a long thorn. The nuts are often eaten as-is, but they can be used as a substitute for mongongo nuts in recipes.

Wild orange

Description: 10 cm diameter, 438 gm average weight. Lee [1979, p. 482] describes it as follows:

The fruit looks superficially like a large Sunkist orange about 10 cm in diameter. But the rich orange-colored rind is hard and woody, and the pulp inside is quite unlike an orange, consisting of 30 lozenge-shaped pips surrounded by a sticky brown pulp.

The fruits are in season from September-December. However, the common custom is to collect unripe fruits by knocking them down with sticks or snagging them with a probe. The unripe fruits are then buried at a depth of 0.5 meters in the ground, where they remain for approximately one month. Burying the fruit speeds up ripening and protects the fruit from insect attack. The wild oranges are popular with other local tribes, and are a trade commodity.

Consumption: The fruit is cut with a knife and the pulp eaten out-of-hand or with a spoon. The fruit is
sweet and has a nice fragrance. The seeds are discarded; only the pulp is consumed. There is another form of the wild orange; it is not buried but roasted. When tree-ripened, it can be eaten in the raw state. The !Kung understand that unripe wild oranges are unsuitable as food. Lee [1979, p. 482] reports: The !Kung older people caution the children to never eat either species [of wild orange] unripe, saying that doing so will make them vomit.

Sour plum

Description: The oval fruit, about 2 cm long, consists of a soft, juicy, astringent-tasting skin, a layer of acidic orange pulp, and a seed (inedible). The fruits are collected in season (December-February) directly from the bushes; fruits that have fallen to the ground are ignored.

Consumption: The fruits are eaten raw, peeled and/or unpeeled. They may be crushed and eaten as pulp.
The seeds are roasted and used for medicinal purposes. Despite the high acidity of the fruit, it is eaten alone and not mixed with other (less acidic) fruits.

Berries (Grewia species)


Description: There are two major species: Morethlwa and Mokomphata. Their size is comparable to that
of a pea, and they consist of a thin, edible skin; an edible orange pulp; and a tiny, inedible seed.

Consumption: If ripe and fresh, they are eaten raw as-is. Later in the season, after they have dried and
become stringy, they are pounded with water in the mortar, and the seeds removed. The pulp may then be eaten as a pudding, or a drink may be made from the mixture. The "grewias" are always eaten raw and alone. There can be side-effects to eating these berries. Lee [1979, p. 484] reports: The !Kung consume large quantities of the berries, including the pits. The latter are passed intact through the digestive system and expelled in massive wads in the feces. One of the hazards of eating Grewia in large quantities is the danger of fecal impaction. According to the !Kung, people have died from this condition though we never observed such a case in our studies.

Wild mango
One of the best-tasting of the !Kung plant foods. The fruit is about the size of a cherry (~3 cm). It consists of an inedible rind, an edible orange stringy pulp similar in taste and appearance to the mango, and a hard, inedible seed. The peel is discarded and the pulp is eaten from the seed. The fruit is eaten alone and never mixed with other foods.

/Tan root
The root is shaped like a yam, with a thick skin and fibrous white flesh. The roots lie 25 to 60 cm below ground, and usually weigh 1 to 2 kg, although large roots may weigh as much as 10 kg. The roots are extracted using digging sticks, and it is a very strenuous process that may take 20+ minutes for a large root. (Some of the soils in the area the !Kung inhabit are compacted, making digging very difficult.) The root is always eaten roasted, never raw. It can be eaten alone (it is delicious) or mixed with other foods. The !Kung report that the root can cause diarrhea or stomachache if eaten raw.

!Xwa water root


It is an important source of water. One may eat the pulp, or squeeze the liquid out to eliminate the bulk. However, the technique of squeezing !xwa to get water is only 63% efficient. The !xwa has a sweet, pleasant flavor. !Xwa is a starchy food and may be eaten with mongongo nuts. Its juice is used as a water source when water is scarce.

Sha root
A popular food. It is edible raw, but is usually eaten cooked (roasted). The entire root is eaten. It has an excellent flavor and can be eaten by itself, but is usually mixed with other foods.

Tsin bean
Lee [1979, p. 487] reports that, "The tsin bean is the second most important food of the !Kung in the southern part of the Dobe area and in Nyae Nyae." The seedpod of a vine, it has an inedible shell but contains edible seeds. The tuber of this plant, known as "n//n," is also edible. Immature beans are collected in January, and the beans are peeled and roasted. After April, the beans are mature and the shells hard. Beans are often roasted in-shell by burying batches in hot ashes. The beans are cracked and eaten whole; they are said to have a good flavor. The shelled beans might be ground up and water added to produce a soup or pudding.

Tsama melon
Tsama melons grow abundantly in the central and southern Kalahari desert. They are a major food and water source. The melons are round, pale green or yellow in color, and weigh around 1 kg. They are easy to find and collect. The seeds are edible and are eaten roasted. The melon flesh is white, hard, and more bitter than that of the domesticated watermelon. A large melon might be sweet in the center, although the degree of sweetness depends on the pollination parent.


A related melon found in the Kalahari is the "bitter" melon, dcha, which is always cooked to improve its taste.

The traditional diet of the Inuit (Eskimos): Were they the only raw-food culture?

Stefansson's firsthand early-1900s accounts of unacculturated Inuit
The explorer Vilhjalmur Stefansson spent several years with the Inuit ("Eskimos") of Northern Canada and Alaska in the early 1900s, speaking their language and eating the same food. Although by that time some Inuit had already partially adopted the Western diet and lifestyle, those living in remote areas had not yet met any Westerners. Stefansson [1913] relates his first encounter with the Dolphin and Union Straits Inuit, and provides a firsthand account of the traditional diet and lifestyle of an unacculturated group of Inuit. Stefansson reports that the Dolphin and Union Straits Inuit still used stone-age technology to procure and process their foods, and that he was the first Westerner to make contact with them. They spoke the same dialect as the Mackenzie River Inuit, which enabled Stefansson to interact with them, since he had previously lived 3 years with the Western Inuit groups and spoke the language.

Report of the Inuit diet on first contact. In regard to the diet of the unacculturated Dolphin and Union Straits Inuit, Stefansson [1913, pp. 174-178] reports:

My host was the seal-hunter whom we had first approached on the ice (...). [His wife] boiled some seal-meat for me, but she had not boiled any fat, for she did not know whether I preferred the blubber boiled or raw. They always cut it in small pieces and ate it raw themselves; but the pot still hung over the lamp, and anything she put into it would be cooked in a moment. When I told her that my tastes quite coincided with hers--as, in fact, they did--she was delighted. People were much alike, then, after all, though they came from a great distance. She would, accordingly, treat me exactly as if I were one of their own people come to visit them from afar...

When we had entered the house the boiled pieces of seal-meat had already been taken out of the pot and lay steaming on a side-board. On being assured that my tastes in food were not likely to differ from theirs, my hostess picked out for me the lower joint of a seal's fore leg, squeezed it firmly between her hands to make sure nothing should later drip from it, and handed it to me, along with her own copper-bladed knife; the next most desirable piece was similarly squeezed and handed to her husband, and others in turn to the rest of the family....

Our meal was of two courses: the first, meat; the second, soup. The soup is made by pouring cold seal blood into the boiling broth immediately after the cooked meat has been taken out of the pot, and stirring briskly until the whole comes nearly (but never quite) to a boil. This makes a soup of thickness comparable to our English pea-soups, but if the pot be allowed to come to a boil, the blood will coagulate and settle to the bottom...

Comments, clarifications, and conclusions. A few clarifications on the above, from Stefansson [1913]. The fuel used to boil the seal meat was seal oil. Stefansson describes an important cultural practice among the Inuit: families that had seal meat to eat shared their surplus with the families that did not. (Food sharing is a common cultural--and an important survival--practice among hunter-gatherers.) As the above represents first contact with an unacculturated group of Inuit living their traditional lifestyle, and the evidence indicates that blubber (animal fat) was eaten raw by the Inuit but seal meat routinely cooked, we conclude that the Inuit were not 100% raw. Whether they met the standard used in this paper (and elsewhere in the raw community)--75+% raw foods by weight, to qualify as "raw-fooders"--is uncertain; this is discussed further below.

More reports on Inuit diets from early contacts

Point Barrow, Alaska, 1881-1883. As part of the International Polar Year, a group of people from the U.S. Army lived among the Inuit in the Point Barrow area of northern Alaska. The group included John Murdoch, an anthropologist, and his report includes discussion of the Inuit diet. Murdoch reports (as quoted in Stefansson [1960, p. 120]) that:

...Food is generally cooked... Meat of all kinds is generally boiled... Fish are also boiled but are often eaten raw... The women keep a supply of cooked food on hand for anyone to eat...

Bering Sea, Alaska, 1896. Stefansson [1960, pp. 49-50] discusses the early contact by J.H. Romig, M.D., who visited the Inuit of the Bering Sea region of Alaska in 1896 and found them living a traditional lifestyle. Romig reports (as quoted in Stefansson [1960, p. 50]):

Their food was cooked mostly by boiling, and was rather rare; they ate as well, especially in winter, raw frozen fish and raw meat.

Labrador, Canada, 1902-1913. S.K. Hutton, M.D. spent several years in the early 1900s among the Inuit of Labrador in eastern Canada. Hutton is quoted in Stefansson [1960, p. 152] as reporting:

[C]ookery holds a very secondary place in the preparation of food--most of the food is eaten raw and the diet is a flesh one.

The above quote might be interpreted as suggesting that the diet of the Labrador Inuit was indeed a "raw-food diet" in the modern sense of the term, albeit a raw flesh diet. But--is that really the case? Elsewhere in Stefansson [1960], Dr. Hutton is quoted as reporting [Stefansson 1960, p. 57]:

...Plain raw flesh is the Eskimo's favorite food... Other flesh foods, less important because less plentiful than the seal's flesh, are walrus meat, caribou meat, bear, fox, and various birds. These are eaten raw or cooked. Fish is the staple food during the warmest part of the year. Trout and cod are to be had in plenty and are eaten either fresh (raw or boiled) or dried without salt.

No reports of 100% raw diets among Inuit. Thus we see that even the Labrador Inuit consumed
cooked foods and were not 100% raw. Stefansson [1960, p. 68] reports that the Labrador Inuit were "the greatest raw-flesh eaters of the whole Eskimo world," but that consumption of cooked foods was higher in other areas, especially among the Copper Inuit (his first contact report above), the Mackenzie Inuit, and the Inuit of northern Alaska.

Were any Inuit groups "raw-fooders" in the modern sense of the term?

What percentage qualifies someone as a "raw-fooder"? The term "raw-fooder" in the modern sense, and as used in this paper, denotes an individual whose diet is on average 75+% raw foods by weight. Some of the more extreme rawists insist on requiring a 100% raw diet before they will call someone a "raw-fooder." We note from the above that none of the Inuit groups encountered followed a 100% raw diet, and thus by that definition they cannot be referred to as rawists. The question then follows, for those of us who use the more realistic definition of 75+% raw: Did the Inuit groups mentioned above meet the 75+% level, such that they could reasonably be called raw-fooders? The answer is unknown at present. The write-ups in Stefansson [1960] do not provide percentage estimates (raw vs. cooked) of the Inuit diets. In some cases, Stefansson quoted from reports written long ago, which one might be able to locate with some effort. It is unlikely that such reports include data on raw vs. cooked-food percentages, but one would have to examine the original reports to be certain. Other material cited (Romig, above) was from personal correspondence with Stefansson. Again, there is no percentage data, and no way to check for it unless Stefansson's personal papers are archived in some university library.

Level of evidence available doesn't permit exact determination as to percentages. Thus, although the evidence available shows that the Inuit diet included both raw and cooked foods, and in the case of the Labrador Inuit was "mostly raw" (paraphrasing the Hutton quote above), we don't know whether the diet of any Inuit group met the 75+% raw-by-weight definition. Thus we cannot say the Inuit were rawists, and we cannot say they were not. All we can say is that they had a mixed raw and cooked diet, with the percentage of cooked food varying by Inuit group and by season.

Why has the picture of the Inuit/Eskimos as a totally raw culture persisted if the evidence has never supported it? It is worth mentioning here that for some raw-food diet advocates (perhaps instinctos, perhaps others), the Inuit have served as perhaps the only potentially credible example of a pre-modern raw-food diet culture (the word "Eskimo" reportedly means "they eat their meat raw" in the language of the Algonquin tribe [Stefansson 1960, p. 67]). What is obvious, given the above, is that the idea that the Inuit eat some of their meat raw has somehow been incorrectly transformed and reported as meaning that they eat all or nearly all of their meat raw. Ironically, however, the above picture, which has been available for many decades now, shows the idea of the Inuit as a raw-food diet culture to be simply another unsupported assumption that dies hard--one that lives on for no other reason than a lack of motivation to take the time to look for, and make use of, the verified evidence actually at hand.

Comments on hunter-gatherers' health

Cooking appears to have little bearing on hunter-gatherer health. Further examples would not much change the above picture of the widespread use of cooking by hunter-gatherers. Suffice it to say that hunter-gatherers don't eat all-raw diets--not even predominantly raw ones. The above examples of the Aborigines, !Kung, and Inuit are quite typical of the hunter-gatherer literature, which suggests that there has never been, anywhere, a known culture with a 100%-raw diet. Thus, proponents who aggressively promote eating a 100%-raw (vegan) diet are promoting an unproven concept--or, if you prefer, an outdated and anachronistic concept that requires one to go back before the use of fire, ignoring the evolutionary changes since that time. In summary, it can be noted that, as far as plant foods go at least, much of hunter-gatherers' cooking is done to neutralize native toxins in raw plants, so that they can make use of what's actually available to them in their environment. They don't use particularly gentle methods of cooking, either (which doesn't mean the raw part of their diet has no importance). And yet they have low cholesterol, and extremely low rates of heart disease and other degenerative diseases [Eaton 1996, 1985].

Other protective lifestyle factors. It should be noted that hunter-gatherers enjoy many factors protective against cancer, such as high antioxidant and fiber intake; virtually no exposure to pollution, chemicals, pesticides, or preservatives; and a healthy lifestyle in general. Similarly, they enjoy many protective factors against cardiovascular disease [Eaton 1996, 1985]. At the least, what we can learn from the diet of hunter-gatherers is that the diseases of civilization were not born with the advent of fire.

Very low incidence of degenerative disease. Okay, so hunter-gatherers have what appears to be a protective diet and lifestyle--but are they disease-free? A glance at Eaton [1996, 1985] shows that the incidence of diabetes and heart disease is extremely low, and that their cholesterol levels are astonishingly low, ranging from roughly 105 to 145 mg/dl in most cases [Eaton et al. 1988], especially given the high levels of (cooked) meat in their diet. They do suffer from some infectious diseases, and their lifespan is intermediate between that of developed nations and Third World agricultural nations (which, given the lack of medical care, indicates the superiority of hunter-gatherers' diets compared to a basic agricultural diet). It is important to understand that while hunter-gatherers suffer from more infectious disease than those of us living in modern sanitized conditions, so also do animals eating their native diets suffer from infectious diseases. Diet is but one factor in susceptibility to infectious disease. In terms of long-term health, a better way of determining whether a diet is "good" in such circumstances is to compare the incidence of degenerative diseases such as cardiovascular disease or cancer. (Average life expectancy figures by themselves can be heavily influenced by factors not necessarily directly related to diet, such as infant mortality, death by accidents, etc. See Dunn [1968] for a discussion of the influence of such factors on mortality in hunter-gatherers.)

Cancer rare. The following is from "Food, nutrition and the prevention of cancer: a global perspective,"
World Cancer Research Fund [1997, p. 35]. It has often been said that cancer was rare among gatherer-hunter and pastoral peoples living in remote parts of the world, such as the Himalayas, the Arctic and equatorial Africa, when these were first visited by explorers and missionaries [Williams 1908, Bulkley 1927, Schweitzer 1957]. A summary of these early accounts can be found in Cancer Wars [Proctor 1995]. Such accounts have been taken to mean that cancer was generally rare in early history. The African explorer, Dr. David Livingstone, suggested that cancer is a "disease of civilisation" [Maugh 1979]. Practically nothing is known about rates of cancer until careful records were first kept in Europe in the eighteenth century. These suggest that, historically, cancer might have been a relatively uncommon disease. The above brief synopsis is presented only in summary fashion here, and only hints at the known evidence in regard to the rarity of cancer in hunter-gatherers noted by Western explorers and anthropologists at first contact. For a more in-depth look at some of these accounts, refer to Hunter-Gatherers: Examples of Healthy Omnivores elsewhere on the website.

Other factors protective against cancer in the hunter-gatherer lifestyle. Let's add that diet/cooking is not the only factor that can contribute to cancer. Recall that humans are exposed to many natural carcinogens [Ames 1990], and that smoking is quite common among hunter-gatherers [Bicchieri 1972]. In addition, carcinogens themselves are not the only factors in the development of cancer; many other aspects of lifestyle may play a significant role. For hunter-gatherer women, late onset of menarche (16 years old), having a first child at a relatively young age (19.5 on average), long duration of breast-feeding (2.9 years per child on average), a large average number of children (6), earlier menopause (47 years old), and exercise throughout life, along with dietary habits, all suggest that the incidence of breast, uterine, and ovarian cancers was very low in comparison with rates seen among women in modern industrialized countries [Eaton 1994].

Cooking practices of supposedly healthy peoples cited by vegetarian lore


The Hunza, Vilcabambans, and Georgians/Abkhasians. In another realm, with less rigorous information available, vegetarian lore has it that the people of Vilcabamba (South America), Hunza (Pakistan), and Abkhasia (formerly Soviet Georgia) were long-lived and enjoyed good health. (Note for clarity: Abkhasia is a region within [former Soviet] Georgia--but Abkhasians and Georgians are two different ethnic groups, with different languages. Claims that refer to long-lived "Georgians" are actually references to Abkhasians. This follows from information in Benet [1974, pp. 9-15].) The claims of health/longevity may or may not be true, but in addition to looking at the scientifically documented examples of hunter-gatherers, it is worthwhile for the sake of discussion to examine whether these other traditional peoples, supposedly much healthier than people everywhere else in the world, do or don't have healthier cooking practices.

Varying reports and possible exaggerations make meaningful evaluation difficult. A number of books have been written on these peoples [Davies 1975, Sidky 1995, Shahid 1979, Rodale 1948, Benet 1974]. It is clear that they do not enjoy perfect health, and it is possible that longevity has been exaggerated in some cases. For example, claims are sometimes made that Georgians/Abkhasians have been known to take the name of their father or grandfather to escape military service, thereby corrupting the (old Soviet Union) census records with false ages. However, Benet [1974, pp. 14-15] discusses the considerable lengths to which Soviet medical research teams (studying the aged in Abkhasia) went to secure reliable age data, and the extensive efforts made in the 1959 Soviet census to obtain reliable ages from elderly Abkhasians. In preparing this write-up, we found no credible documentation to support the claims that Abkhasian age data were corrupted by draft-avoidance. Until such evidence is presented, it seems best to regard the claims of widespread draft-avoidance as unsupported, dubious, and possibly insulting to Abkhasians.

The overall health picture: problems with acute and infectious diseases but very low incidence of long-term degenerative disease. Despite the above potential concerns, reviewing a few aspects of health in these societies may be of interest. An interesting discussion of the health and diet of these peoples can be found in Schmid [1997]. The incidence of diseases such as cancer, heart disease, diabetes, and high blood pressure among these populations is very low. On the other side of the ledger, mortality in infancy and early childhood is high among Vilcabambans (as it is among hunter-gatherers; this is not necessarily a strike against their diet, however, since it is likely attributable to social conditions), and losing teeth at an early age is rather common. According to Clark [1956], Hunzans suffer from a variety of problems, including malaria, dysentery, worms, impetigo, goiter, dental decay, rickets, and tuberculosis. Again, as with hunter-gatherers, given the less sanitary living conditions that often prevail in poorer, developing countries, it is long-term disease conditions that should perhaps be of the most concern. Vilcabambans, Hunzans, and Abkhasians do indeed seem to have a remarkably low incidence of long-term degenerative diseases, but they don't enjoy perfect health. According to Schmid's analysis, Hunzans are less healthy than Georgians/Abkhasians because they are deficient in animal foods (note that rickets is said to be a problem for the former), with Vilcabambans intermediate. One may or may not agree with that conclusion, but it is interesting to note, in the context of our discussion here, that the supposedly healthiest group (the Georgians/Abkhasians) cooks more, and consumes more animal products, than the Hunzans.

Note: One problem that may be a significant source of confounding error in interpreting observations about the health of these peoples is the timeframe from which the various observations come. As with the Inuit--about whom legends are rife, and whose health began deteriorating with increasing Westernization--it can be similarly difficult to disentangle myth from fact where these other peoples are concerned.

The Hunza
Hunzans grow various sorts of cereals such as barley, wheat, millet, and buckwheat (with which they bake bread), as well as a variety of vegetables and root vegetables. But fruits constitute a very important part of their diet, especially apricots. They eat very little meat (due to scarcity), but consume dairy products (including ghee). They don't overcook their food due to lack of fuel. The vegetables are boiled in covered pots, but only a small amount of water is used at a time.

Vilcabambans
Vilcabambans eat root vegetables, maize, beans (such as soya beans), milk, eggs, green vegetables, a kind of cabbage, marrows (vegetables of the same family as zucchinis), pumpkins, and fruit. They cook most of their food.


Georgians/Abkhasians
The following material is summarized from the discussion on the topic in Benet [1974].

Staple plant foods. A cornmeal mash, known as abista, is a major staple, as are other cornmeal dishes. Abista may be eaten at each meal of the day (i.e., 3 times/day). Nuts are heavily consumed, and are used for flavoring foods in place of butter. Wild chestnuts are abundant in the region, and the nuts are a staple food in winter. Pickled vegetables and lima beans are also popular; from Benet [1974, p. 24]:

Vegetables may be served cooked or raw, but are most commonly pickled. A favorite dish eaten almost every day is baby lima beans, cooked slowly for many hours, mashed and flavored with a sauce of onions, green peppers, coriander, garlic, and pomegranate juice. With rare exceptions, vegetables are preferred one of two ways: raw, or cooked in very small amounts of water...

Fruits. Fresh fruit is available seasonally in Georgia, and the Abkhasians eat large amounts. Pears are commonly cooked to produce a thick syrup that is used as a sugar substitute. Abkhasia is a major wine producer, and grapes are a popular food; Benet [1974, p. 25] claims that "a man may eat fifty kilograms [of grapes] in a single season." The Abkhasians grow pomegranates, which are used as a basting sauce for meats. They also dry fruits for storage and later consumption. Citrus fruits are popular as well. A local spice mixture, adzhika--made of locally grown, mostly pungent/bitter plant foods mixed with nuts--is used in place of salt. Abkhasians also eat some wild plants, most notably barberry, Berberis vulgaris.

Staple animal foods. From Benet [1974, p. 25]:


The Abkhasians eat relatively little meat--perhaps once or twice a week--and prefer chicken, beef, lamb, and kid [juvenile goat]. The meat is always freshly slaughtered and either broiled or boiled for a minimal amount of time... Not more than two or three eggs are eaten a week, and these are either boiled or fried... Fat from meat and poultry is not used at all, and butter very seldom.

The Abkhasians dislike fatty meat and typically trim off as much fat as possible before eating. Honey, an insect-processed product, is also consumed. Although Abkhasians eat little meat, they do consume considerable amounts of dairy: 1-2 glasses per day of matzoni, a locally produced fermented milk product similar to buttermilk. Additionally, they typically cut up goat cheese and cook it in their meals of abista, the cornmeal mash that is a major staple of their daily diet. The Abkhasian diet is 74% milk + vegetables (note: unfortunately, the source here [Benet 1974] does not give the exact split between milk and vegetables within that 74%), and provides about 73 grams of protein, 47 grams of fat, and 381 grams of carbohydrate per day. The elderly consume about 1,900 calories/day, which is on the low side.

Comments on cooking and longevity

Hunzans, Vilcabambans, Georgians aren't raw-fooders. It appears from the above that these supposedly long-lived peoples do eat some raw food, but are not even predominantly raw, and don't use particularly gentle methods of cooking, except perhaps the inhabitants of Hunza. It has been argued that these populations live long in spite of cooking; that if they ate raw, they would live much longer--why not as long as Methuselah (969 years)?--and that what is now considered a long life is in fact ridiculously short compared to human longevity potential.

Actual facts turn out to be more mundane than the mythical reality. While we don't consider such arguments worth much discussion (they are simply speculation--the burden is on the claimants to provide actual evidence), let's review the observations and evidence already in hand. Given current scientific knowledge about the theories of aging; given that no non-human mammal lives much beyond 100 years; that, from experience, no raw-food eater looks incredibly younger than their age; that no raw-food eater has ever beaten longevity records (even though the pool of raw-fooders is small, if the diet were so incredibly good we should still see a high percentage of them approaching those records); and that, since hunter-gatherer populations can learn their environment and reproduce successfully despite shorter lifespans than people in industrialized countries, there is no selection pressure for living much beyond 60 years in good health--given all these arguments, it seems highly unlikely that any human, raw-fooder or not, could live much beyond the current longevity records.

Are Instinctive Nutrition and cooking completely incompatible?


Does cooking thwart instinct? A common instincto myth is that even the slightest amount of cooked food--since its intake is not regulated by instinct--induces metabolic imbalances, disrupts the normal functioning of instinct, and decreases the pleasure experienced with raw food; therefore, predominantly raw (as opposed to all-raw) diets don't work. Yet anecdotal evidence from others eating natural-foods-type diets, plus the experimental data available, suggests:

The claims are exaggerated. Provided cooked foods are consumed whole, cooked conservatively, and with minimal mixing, these claims do not hold. Unrestricted instinct itself is not infallible. Pure, unalloyed instinct, on its own, is not as efficient as claimed: experience shows that, left unrestricted, even instinctos tend to overeat sweet fruits and neglect other foods. Thus, even on 100% raw diets (conditions under which instinct should theoretically be unperturbed), instinctos often use many "rules" to keep instinct within limits.

Experiments with self-selected foods (some cooked) by infants result in satisfactory nutrient balance. Without any rules, and with the inclusion of some cooked foods, the instinct of young children is quite effective at selecting foods and obtaining a satisfactory balance of nutrients [Davis 1928]. Here, perhaps, it could be said that cooking improves the taste of some foods with a good nutrient profile that would otherwise be avoided in their raw state.

Research on self-selected foods in infants


Food preferences have been widely studied since Clara Davis's pioneering work [Story et al. 1987]. We emphasize here that highly processed foods such as candies or ice cream should be excluded; but even when some cooked foods and some non-"original" (non-Paleolithic) foods like whole grains and dairy are given, the result is quite satisfactory, at least in terms of nutrient intake. We quote here from Story et al. [1987]:

In the 1920s and 1930s, the pediatrician Clara Davis conducted pioneering studies, now considered classic, and published at least 12 papers on the selection of diets by infants and young children (Davis 1928, 1934, 1938, 1939). In the first study (Davis 1928), three infants (7-9 months old) were involved, two for six months and the third for one year. In 1939, Davis reported in much less detail the results of a study involving 12 more children over a period ranging from 6 months to 4.5 years (Davis 1939). The research protocol was the same for both studies. Of the 34 foods offered, 90% of the energy intake for all three infants was derived from 14 foods. Of these, 9 were preferred by all three infants (bone marrow, milk, eggs, banana, apples, oranges, cornmeal, whole wheat, and oatmeal). Bone marrow was the largest single source of calories (27%) for one infant, whereas milk provided the bulk of calories for the other two (19 and 39%). All three infants shared a low preference for 10 vegetables, as well as for pineapple, peaches, liver, kidney, ocean fish, and sea salt. These foods constituted less than 10% of the total energy intake.

NUTRIENT BALANCE IN SELF-SELECTED FOODS BY 3 INFANTS
(Intake of nutrients as a percentage of the [1980] RDA during 173 days of self-selected diets in three infants. Values shown are the average for all three infants.)

FOOD CONSTITUENT     PERCENT OF RDA
Energy                    131
Protein                   367
Vitamin A                 365
Vitamin C                 404
Thiamine                  193
Riboflavin                361
Niacin                    152
Vitamin B6                258
Folacin                   465
Vitamin B12               890
Calcium                   150
Phosphorus                336
Magnesium                 355
Iron                       88
Zinc                      164
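As a quick cross-check of the table (illustrative only; this little Python sketch and its variable names are ours, not part of the original study), here is code that flags any nutrient falling below 100% of the RDA:

# %-of-RDA figures transcribed from the table above.
PERCENT_OF_RDA = {
    "Energy": 131, "Protein": 367, "Vitamin A": 365, "Vitamin C": 404,
    "Thiamine": 193, "Riboflavin": 361, "Niacin": 152, "Vitamin B6": 258,
    "Folacin": 465, "Vitamin B12": 890, "Calcium": 150, "Phosphorus": 336,
    "Magnesium": 355, "Iron": 88, "Zinc": 164,
}

# Any nutrient under 100% of the RDA is a shortfall.
shortfalls = [name for name, pct in PERCENT_OF_RDA.items() if pct < 100]
print(shortfalls)   # ['Iron'] -- the lone exception discussed in the text below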

With the exception of iron, intakes equaled or exceeded the RDA for every nutrient examined. Milk, though it accounted for 19% to 39% of total calories, supplied at least 75% of the RDA for half the nutrients listed in the table above. For the instincto concerned about possible iron deficiency, recall that milk is not a good source of iron; the best source is red meat. Also, if it is true--as instinctos like to observe--that cooking usually allows one to eat more of a food than one otherwise would, then cooking might encourage greater consumption of iron-rich foods, thus increasing iron intake.

Iron nutrition in infants is unique. A note here regarding iron in mother's milk, and iron nutrition in infants: while the iron content of breast milk is low enough that it would otherwise be deficient, its bioavailability is high enough to prevent deficiency [Lonnerdahl 1984]. Normal infants are also born with a store of iron, inherited from the mother, that lasts through the first 4 months of nursing [Stekel 1984].

MACRONUTRIENT RATIOS IN SELF-SELECTED FOODS BY INFANTS
(Expressed as a percentage of total calories)

FOOD CONSTITUENT    Infant #1    Infant #2    Infant #3
Protein                 25           17           23
Fat                     37           21           38
Carbohydrate            38           62           39
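Again purely as an illustration (our own sketch, not from Davis or Story et al.), a quick sanity check that each infant's macronutrient percentages sum to 100% of calories:

# Percent-of-calories figures transcribed from the table above.
MACROS = {
    "Infant #1": (25, 37, 38),   # (protein, fat, carbohydrate)
    "Infant #2": (17, 21, 62),
    "Infant #3": (23, 38, 39),
}

for infant, split in MACROS.items():
    print(infant, sum(split))   # each line prints 100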

Note on inadvisability of extrapolating from needs of infants to needs of adults. One cautionary note regarding the above information about food-selection preferences in infants: while the data do suggest that the instincts expressed by infants are adequate to supply them with sufficient nutrition, the results should not be taken as indicative of appropriate nutritional balance for adults. The nutritional requirements of infants are distinctive, and change considerably during and after weaning. (See "Is Protein Toxic?" on this site [not yet available] and the subsection on "The Perils of Extrapolating from Infants to Adults" in Section III of Fruit is Not Like Mother's Milk for more information on this point.)

Clarifications regarding instinct and cooking


Next, instincto purists will probably have the following remarks in mind:

Q: But if a food is heat-denatured, like honey or dates, I invariably overeat it.

A: Note that we are not suggesting it is profitable to cook all foods, only certain ones. As we have seen, cooking's effects are not the same on all foods. Even with 100% raw food, instinct by itself may encourage overeating when a virtually unlimited supply is available and one does not have to first take the time to seek out the food in its natural environment--where, as hunter-gatherers find, its supply may be limited. The instincto theory that taste/smell should be our sole guide to proper foods, even in a natural environment, is in any event speculative and unproven.

Q: But any food in excess is poison. With honey and dates, I would exceed my body's needs.

A: There are many reasons to think that the average instincto eats too much fruit anyway (since modern fruit contains too much sugar), regardless of whether it is raw or cooked. Again, in reality, the aftereffects of eating a food are also important in determining which foods, and how much of them, are safe to eat--not simply how they taste. Hunter-gatherers, for instance, certainly pay attention to more than taste and pleasure, particularly aftereffects, and regulate the consumption of select foods via taboos. Heinz and Maguire [197X] evaluated the level of knowledge of the !ko Bushmen of Botswana. They found that the !ko not only were cognizant of the relevant economic uses and toxicity of the local plants, but had also acquired considerable knowledge of plant taxonomy. Tanaka [1980, p. 80] reports that the San Bushmen use plants for healing, i.e., as medicinal herbs. Considering that the survival of many hunter-gatherers depends, at least in part, on their knowledge of plant foraging, it is not surprising that they have a well-developed knowledge of plants and their effects when consumed as foods. Overconsumption of even raw fruits can result in blood-sugar problems, mineral imbalances/deficiencies, and digestive weakness. One may actually be "poisoning" oneself ("promoting nutrient imbalance" would be more accurate) every day by overconsuming fruits, even in their raw state. Cooking of non-sweet foods, however, may allow one to increase their consumption, and thereby moderate fruit intake.

Q: But meat is carcinogenic, especially cooked meat. And everyone knows that Guy-Claude Burger's wife Nicole died of cancer because she used to eat too much (raw) meat. We should never eat meat more than once a month, and never combine it with anything else, so that the immune system can work against the dangerous proteins that might enter the bloodstream!

A: As we have seen, hunter-gatherers, who cook over open fires and eat considerable amounts of meat (50-65% of calories on average), have an extremely low incidence of cancer. Without a wider base of instinctos to study--who in any event eat diets that vary in some significant respects from those of traditional hunter-gatherers--it will remain difficult to pinpoint why a few instinctos seem to have had trouble with meat consumption, or to draw hard and fast conclusions.

Anecdotal evidence of raw vs. cooked diets in the raw-food community


Does Natural Hygiene require 100% raw?
One longstanding controversy within the Natural Hygiene movement is whether being a Natural Hygienist and eating cooked food are compatible. The common argument from those interpreting the question most strictly is that "no other animal on the planet cooks its food," and thus eating cooked food is an unnatural practice.

How to define "natural" with respect to humans? First of all, we would remark that there is no
good definition of "natural" for humans, since they have been doing "unnatural" things since--well, since they appeared, since humans ARE tool-users. (It's part of what defines us as a species, and marked the inception of our genus, Homo, beginning 2 to 2.5 million years ago.) And cooking has been used for a sufficient length of time so that at least some discussion concerning its naturalness is justified.

Weighing naturalism and pragmatism in the balance. Secondly, and more important than trying to be ever more natural, the concept of Natural Hygiene should be a practical guide for our actions, not an absolute one that makes us slaves to a doctrine. Some unnatural practices may be useful (wearing clothes in northern Canada, surfing the Web and exploring sites related to health), while others may well not be. (An example would be taking antibiotics in any but the most serious circumstances: even conventional doctors now acknowledge that the vast majority of acute illnesses are self-limiting, and that the body's own immune system can handle most infections fine anyway. It is also well-acknowledged now that anything more than minimal use of antibiotics breeds eventual resistance to them in quickly evolving bacteria. So we are not saying all Hygienic advice about "naturalness" fails to make sense; it is blanket pronouncements that are problematic.)

What do real-world results among practicing Natural Hygienists and raw-fooders tell us? Eating a significant amount of fresh or raw food is important, but 100% raw is not necessarily better (based on attempting to assess the experience of many Natural Hygienists and other raw-fooders, given the lack of any official studies). In fact, given that it often significantly restricts the diet, it can actually be worse than eating some cooked food. The best we can do here is to quote a couple of paragraphs from a recent article in the ANHS (American Natural Hygiene Society) publication Health Science--"Much Ado about Raw Food," by James Michael Lennon, the society's executive director, on p. 12 of the March/April 1998 issue:

ANHS does not recommend a totally raw-food diet [i.e., 100% raw]. Experience has shown that people typically fare poorly over a long period of time on such a diet, and as a practical matter, it is extremely hard to implement. No one claims that eating a totally raw-food diet is absolutely impossible. But there is no credible evidence to show that a whole-food, plant-based diet that is entirely uncooked is more healthy than one that includes conservatively cooked vegetables and starches. By contrast, the diet that is recommended by most raw-food advocates is excessively high in fat and sugar, two factors that have been associated with a variety of health problems.

In the same article, Lennon also cites Dr. Alan Goldhamer of the Center for Conservative Therapy, a fasting facility. Goldhamer's comments echo the message of the site article The Calorie Paradox of Raw Veganism, i.e., that diets based on low-calorie-density raw vegetables require the consumption of large amounts of food (a bulk problem), while diets based on sweet fruit imply the ingestion of large amounts of sugar on a daily basis. Lennon's article also cites Dr. Ralph Cinque, a prominent Natural Hygiene health professional, who reminds us that Herbert Shelton (the founder of Natural Hygiene in modern times) did not advocate 100% raw, but instead served both cooked and raw vegan foods to the people at his retreat center (more accurately: to the people who were not fasting).

And what about Living Foods?

Looking back, pioneers in the movement report frequent backsliding.


Some proponents of the "Living Foods" diet advocate eating a 100% raw diet. (Living foods is a raw vegan diet that emphasizes raw sprouted seeds, fermented foods, dehydrated foods, blended foods (raw soups), and wheatgrass juice or other high-chlorophyll foods.) However, it appears that following a living foods diet in a strict manner is not easy--extensive anecdotal evidence indicates that backsliding is frequent among those who attempt the diet long-term.

Results on the Ann Wigmore program in retrospect. The following comments come from the article "Contributions to the Evolution of Live Food Trend," by Elliot Rosen (L.O.V.E. publications), part of a promotional circular that Viktoras Kulvinskas--a prominent and pioneering figure in the Live Foods movement, and involved in the management of Ann Wigmore's program in its earlier years--distributed widely in early 1998:

Very few folks stuck to the cleansing program [at the Rising Sun Christianity of Rev. Ann Wigmore], with poor consequence, usually return of the disease. Those who stuck to it, ran into problems, eventually low energy with underweight. Some who went excessively on the buckwheat and sunflower greens developed the "tingles"--which was associated with supersensitivity to heat as well as a low red blood cell count. There were others, who, due to poor knowledge of food preparation, ended up eating foods that were more suitable for compost, many developed over-acidity due to excess consumption of the "rejuvilac" and other live sour foods. They were pioneers and exploring a new potential.

Recent changes to Hippocrates program implicitly acknowledge past problems. In another article from the above circular, Kulvinskas discussed changes that have been made to the regimen followed at Hippocrates Health Institute, one of the most well-known Live Food health retreats. By implication, his observations acknowledged the difficulties people had had with the program. From the article "Anecdotal healing experience with Super Blue Green Algae":

Hippocrates Institute has initiated the use of the Super Blue Green Enzymes and the Algae (...) about three years ago, and the healing speed improved on their live food program. Also, when HHI clients and folks had dietary backslides, they were not wiped out for days because of the radical change; hence, they stuck to the good diet at least 80% of the time, and made consistent progress in their healing. Eventually, they also lost all the cravings for low vibrational foods, and never felt deprived on the Hippocrates meal plan.

Report from Viktoras Kulvinskas about his own problematic history. Viktoras Kulvinskas himself had a long history of eating disorders (bulimia), due, he says, to a dysfunctional childhood. Regardless of how the eating disorder may have originally begun, however, it was only after he relinquished his all-live-food regime and reintroduced some cooked foods that his eating-disorder behavior ceased. From Kulvinskas and Lahiri [1997, pp. 116-117]:

Even when I was eating only live food, I would eat huge meals that would leave me feeling bloated and fatigued. I found from experience that if all this food stayed in my stomach, it would ferment and bring on headaches and sciatic pain. So, for relief, I made myself vomit. Eventually, vomiting became a daily practice. I fell into eating more and more cooked and junk food. The problem became more disabling, both physically and psychologically, for I was living a dual life. In public, I would give health talks and lifestyle consultations. In private, I was a junk food eater and bulimic. The hypocrisy of it all was destroying me from the inside out.

Viktoras reports that the Super Blue Green Algae helped him overcome his eating disorder. His diet is currently at least 90+% raw, but he notes [Kulvinskas and Lahiri 1997, pp. 119-120]:

However, I do not have an addictive relationship with food in order to maintain this diet nor do I feel deprived on this very functional nutritional program. I feel comfortable, as I see fit, to enjoy cooked sweet potatoes and sprouted bread. I occasionally eat beans, rice or millet and take extra enzymes when I eat these cooked foods.

Transcending magical thinking about the properties of raw food

Initial improvements on raw food. Many people first coming to raw food appear to experience dramatic improvements, although most individuals eventually find they are not able to sustain the raw diet continuously for long periods of time. (We emphasize that this last point is important to keep in mind as a significant limitation of the potential of a totally raw-food diet for long-term benefit.) In any event, the classical explanation for the positive results obtained is that raw diets are "cleaner"--that they rid the person of their major source of intoxication.

Most beneficial effects of raw-food diets are likely attributable primarily to factors unrelated to whether the food is raw or not. It may well be that detoxification is a factor in the improved health of those first switching to a raw-food diet (particularly for those who have previously been eating very highly processed diets, or who have been on long-term drug regimens). But keeping things in perspective, it is very likely just one factor among others, and it is not necessarily the case that an all-raw diet will contain significantly fewer toxins than a diet that is predominantly raw and partially cooked.


While it has been demonstrated earlier in this paper that cooked food (and more particularly, food cooked at high temperatures) contains some harmful substances, raw foods, too, may contain a certain number of toxic substances. Quantifying the differences between the two is not easily done at present; but it is far more likely that numerous other factors besides uncookedness are sufficient to explain the majority of the improvements attributed to raw food.

By going raw, one is not simply ceasing to cook one's food; one is simultaneously changing the amounts, types, and proportions of foods and nutrients in a profound way--and those changes do not, in themselves, depend on the food being raw. A raw-food diet is just one way among others to achieve them. Some of the relevant factors here are:

Character of fats. Raw-fooders usually eat less saturated fat and more unsaturated fat than the average American, which is beneficial for the blood-cholesterol profile. Even instinctive eaters--who are often concerned with eating leaner cuts of meat to begin with--still tend to eat less meat than the average American, which may cut their fat consumption even more. (Especially when starting a raw diet, many people are reluctant to try raw meat.) Raw-fooders who follow instinctive eating also eat more fish (high in omega-3s). The fats that raw-fooders in general eat, even when they eat relatively high amounts of fat, often come from avocados and nuts, and are thus predominantly unsaturated. Trans-fatty acids (margarines and hydrogenated oils)--a widespread ingredient in many of the processed foods the average American eats freely--are completely eliminated in raw diets. These fats are known to interfere with omega-3 absorption, and are also now coming to be recognized as perhaps the most disease-promoting of all the fats.

Phytochemicals and antioxidants. Quite obviously a raw diet, particularly a vegetarian one (with certain exceptions, such as the macrobiotic diet), is going to be high in fruits and vegetables. Consequently, regardless of whether the diet is raw or not, a diet that predominates in fruits and vegetables will be high in phytochemicals, antioxidants, and other related health-promoting, disease-preventing substances currently receiving wide study in the nutritional community.

Fiber. The average American diet is very low in fiber. Raw-food diets will automatically be high in both soluble and insoluble fiber because of the inclusion of vegetables, fruits, nuts, etc. Soluble fiber is known to lower cholesterol. Insoluble fiber improves bowel movements and is protective against colon cancer. Note again, however, that a diet similar to a raw diet in terms of the array of foods eaten, but with some of them cooked, may well be lower in fiber than an all-raw diet, yet still much higher in fiber than the average American diet. Raw diets are just one way that higher levels of fiber can be achieved.

Absence of added salt. With very few exceptions (perhaps a little sea salt here and there for a small minority), raw-fooders add no salt to their food. Salt is thought to increase blood pressure (although this is controversial), and is also associated with asthma, stomach and nasopharyngeal cancer, calcium loss, and increased stroke mortality independent of the effect on blood pressure.

Near or total avoidance of certain potential "problem" foods. Grains, which were not a part of the human diet for its 2-million-year history until the last 10,000 years, and can be tolerated raw only with difficulty, are avoided or minimized by raw-fooders. This may be significant, since recent Paleolithic-diet research is implicating grains in promoting hyperinsulinemia, and in increasing the risk of autoimmune diseases through molecular mimicry. (For more on the issue of hyperinsulinemia/insulin resistance, see Insulin Resistance: The Carnivore Connection Hypothesis of Miller and Colagiuri [1994]. Autoimmunity and molecular mimicry in connection with grains are discussed in The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance.) While this research is still controversial outside the field of Paleolithic diet research, if corroborated it would indicate that overconsumption of grains could be a significant factor behind some of the diseases of civilization. Grains are completely excluded by instinctos, and minimized by many other raw-fooders. When raw-fooders eat grains at all, they generally eat them in sprouted form (in which case the gluten and phytate content is reduced, though not eliminated), and usually in smaller quantities.

Dairy is another food that was not a part of the human diet until very recently--about 6,000-7,500 years ago [Simoons 1988]--and is avoided by most raw-foodists. (70% of the world's population [Mogelonsky 1995] is still lactose-intolerant after weaning and cannot drink milk without risking digestive distress.) Whole milk is also high in saturated fat, and even low-fat forms of milk may be problematic in terms of autoimmune response. Indeed, there is increasing evidence that exposure of infants to cow's milk protein can cause Type I (insulin-dependent) diabetes mellitus, characterized by beta-cells (the cells of the pancreas that synthesize insulin) being attacked by antibodies to milk protein. Similarly, a relationship between autoimmunity and coronary heart disease was discovered long ago [Davies et al. 1969]. It has also now been shown that antibodies to the protein xanthine oxidase in cow's milk cross-react with and attack the membranes of arterial endothelial cells (the layer of cells lining the inside of arteries) [Bruder et al. 1984]. Since lesions to these cells are also characteristic of the first stage of atherosclerosis, the role of an autoimmune response in cardiovascular disease seems increasingly likely.

Elimination of processed foods. Refined sugar and refined flour can be harmful for several reasons, including a high glycemic index and low vitamin content. Also, simply by default, their presence in the diet crowds out other, more nutritious foods. Any potential harm from artificial preservatives, colorings, flavorings, pesticides, or fungicides is avoided or minimized, since most raw-food eaters eat at least partially organic, and the synthetic chemical load is thereby lowered.

Increased attention to other diet and health factors. People who switch to a raw-food diet often pay attention to other aspects of a healthier lifestyle, such as:

o Drinking water rather than soft drinks or alcohol.
o Trying to live less stressful, more "natural" lives.
o Getting more sleep.
o Exercising regularly.
o Trying not to overeat. (Although this may backfire on more extreme raw diets and lead to binge-eating.)
o Avoidance of prescription drugs in favor of natural healing methods, which reduces the potential for physiological impairment due to the cumulative side-effects of long-term drug use.

Additional point: Redefinition of detrimental symptoms as "healthy" detox


While we do not wish to denigrate the real results people obtain from eating a raw-food diet, the subject of perceptual issues is worth discussing in relation to just how successful raw-food diets are over the long term. We do not deny that people can experience dramatic results, at least in the beginning. At the same time, however, particularly over the long term, people often show strong signs of ignoring persistent, problematic symptoms on their diets, viewing them through rose-colored glasses as "continued detoxification."

The subtle influence of creeping fanaticism in biasing perception. Raw-food diets are among the most extreme, in the sense that they are not widely accepted socially. Although not all raw-food eaters will admit it, a totally raw diet is not an easy one for most people to sustain. Few "reasonable" people will go overnight from hamburgers and Coke to eating 100% raw, certified-organic food without motivation so high that an almost religious faith in the diet has taken hold, sometimes to the point of fanaticism or zealotry. Nevertheless, we believe the itemized list of reasons above is sufficient to explain most cases of improvement on raw food. But leaving that aside, it is worth mentioning that the stories one hears cannot simply be taken at face value without a certain amount of probing and cross-checking where possible.

When "bad" symptoms are seen as good things. The rationale behind a raw-food diet (and some other alternative diets) is unusual in its ability to consciously acknowledge symptoms that other people would call bad, yet at the same time blithely reinterpret them as good ("detoxification," etc.). In other words, symptoms and conditions that most people on conventional (non-raw) diets would describe (or define) as "illness" are labeled "detox" when they occur on a raw diet. Some raw-fooders have not been "ill" or "sick" for "years," but have frequent "detox episodes." This is a type of self-deception that is far too common in the raw community, and is also a form of denial of reality. Some raw "experts" actively encourage this denial by using "detox" as the first line of defense when problems arise on the diets they promote. The point here is that one needs to be skeptical and ask questions when trying to assess just how well someone (perhaps oneself) is doing on a raw-food diet. There are a number of potential signs and symptoms one can utilize in attempting to gauge this. For more information on this topic elsewhere on the site, see The Psychology of Idealistic Diets, particularly the subsection on Symptoms of "Failure to Thrive" on Raw and/or Vegetarian Diets.

Effects of reintroducing some cooked food after a long raw period

Are failures to digest "heavier" foods symptomatic of new sensitivity to "unfit food," or instead of compromised digestive capacity? Even after having read the many observations and arguments advanced in this paper, many raw-fooders who are experiencing problematic symptoms may still fear including any cooked foods in their diet again. Stories are legion in the raw-food community, particularly among fruitarians, of those who reintroduced cooked food such as baked potatoes or rice back into their diet (or even "heavier," sometimes-forbidden raw foods such as nuts) and experienced digestive difficulty, or found that the food passed through the intestinal tract little digested. Such results are interpreted by the most vocal raw-food advocates as evidence that a truly healthy body will not tolerate unfit food, and will reject it. We would like to offer a much different explanation. As discussed briefly elsewhere on the site (see Does strict fruitarianism accelerate B-12 deficiency? partway down the linked page), the reason may instead be digestion that has been seriously weakened by a prolonged period on a diet too high in fruit.

Long-term overconsumption of fruit may lead to sluggish bowel function. Ronald Cridland, M.D., a natural hygiene practitioner with lengthy experience caring for natural hygiene patients on raw-food diets, addressed this issue in a brief (question-and-answer) response published in the American Natural Hygiene Society's Health Science magazine ("Tired of Singing the Cooked-Food Blues" [Cridland 1998]):

There is a tendency for a person on a diet of all raw foods to overeat on fruits, causing a potentially harmful high-sugar diet, which tends to be deficient in vitamins and minerals. (...)


(...) Sugar is fairly stimulating and gives a false sense of high energy. I have seen many patients try to subsist on a high-fruit diet, and many feel quite well for about two years, despite some initial weight loss. But after that, they begin to experience low energy, immune problems, skin problems, and fatigue. Many of these patients are sleep-deprived. Because of the stimulating effect of a high-sugar diet, they mistakenly feel they can get by on much less sleep. Consequently, they experience the symptoms of sleep deprivation, which include fatigue, poor immune function, allergies, depression, and sluggish bowels.

High-fiber diet and stimulative effect of excessive sugar/fruit can mask exhaustion of bowel function. Cridland goes on in the above-quoted passage to explain that the problems 100%-raw vegans experience when reintroducing cooked foods into the diet are often the result of an accumulated fatigue that is, in effect, a sleep deficit. Indeed, he suggests that reintroducing some cooked foods (and thereby eliminating excessive sugar) may help the individual re-experience his or her normal energy level, as opposed to the stimulative effects of a high-sugar raw diet, which is ultimately exhausting and depleting. (It should also be noted that many individuals eventually experience chronic, and/or frequent and intermittent, fatigue on long-term 100%-raw vegan regimes anyway, whether they reintroduce cooked foods or not.) Cridland suggests that the high fiber content of a raw diet, in this context, can be a confounding factor (perceptually) that prevents recognition of the sluggish bowels that can eventually occur on long-term (high-sugar) raw diets. He suggests fasting to provide rest for the digestive system, and claims that roughly 2 years on a vegan diet, plus adequate sleep each night, will help the system return to normal.

Of course, other individuals coming off an all-raw diet, especially one that was not overly high in fruit, may have had the opposite problem: low energy all along from a diet predominating in low-calorie-dense foods. Such individuals often find that the extra calories they get from denser cooked foods improve their energy and feeling of well-being in fairly short order. Some people will claim, of course, that at least three, or maybe even five or six, raw generations are necessary to become truly healthy; and that if you feel better when eating a diet that includes some cooked foods, it's because the cooked foods are blocking the detoxification process. There is apparently no limit to denial! :-)

In summary

Vitamin and mineral content in raw vs. cooked foods. The vitamin content of raw foods is higher than that of cooked foods, though by relatively modest amounts that appear to range from (roughly) 10-25% greater in most cases. The difference in mineral content between raw and cooked foods appears to be negligible. (Cooking itself does not appear to compromise mineral content; however, when water is used in cooking, a small amount of minerals may leach into the water.)

Cooking can increase bioavailability of some nutrients. As discussed in earlier sections, conservative cooking greatly increases the bioavailability/digestibility of starch. Bioavailability of beta-carotene can be very low from raw vegetables such as carrots, but is improved by steaming. Some proteins are better assimilated after cooking, although the digestibility of others may be somewhat negatively impacted. Cooking also neutralizes many antinutrients, and thereby may increase the bioavailability of a number of minerals (e.g., zinc, iron) in many plant foods.

Cooking creates some toxins, neutralizes others. All plants contain at least some amount of "nature's pesticides." There is no such thing as a toxin-free diet. Within a normal range of consumption, toxins resulting from conservative cooking techniques can be safely handled by the body's normal mechanisms, and do not seem to increase the incidence of degenerative diseases.

There are no known all-raw cultures on the planet--nor apparently, with the possible lone exception of a few Inuit groups, even predominantly raw ones--including the healthiest traditional cultures, and including the most primitive of hunter-gatherers. Note: The Inuit ("Eskimos"), to whom legend has often attributed the eating of all of their meat raw, were found upon early or first contacts by the noted explorer Stefansson and other explorers to eat some fat (i.e., blubber) raw; some groups ate large amounts (though not all) of their meat raw; but other groups cooked much if not most of their meat.

Use of fire goes back tens if not hundreds of thousands of years into prehistory. The most reliable evidence suggests that fire was initially controlled by humans (for warmth and protection against predators) approximately 400,000-500,000 years ago, with widespread cooking having been practiced for at least the last 40,000 years, possibly longer.

Substituting cooked starches for overly high fruit consumption improves the diet. Including cooked vegetables and/or starches considerably improves the nutrient profile of a (vegetarian) raw-food diet that predominates in fruits, which, although abundant in phytochemicals and in certain select vitamins and minerals (C, B-6, nicotinamide, potassium), are on the whole low in vitamins and minerals (particularly vitamin D, B-12, biotin, and calcium), and high in sugar.

Long-term success on all-raw vegan diets is rare. While increasing the percentage of raw food in the diet appears to improve the health of people who have been eating the standard Western diet, few people, even the most enthusiastic of adherents, have been able to maintain an all-raw diet over the long term, and for reasons beyond mere social pressure. Among those who have, there is a high prevalence of emaciation.

Most people who eat vegetarian diets maintain better health over the long term when including a portion of cooked foods compared to going all-raw.

In the end, as with many other dietary issues, the question of raw vs. cooked foods comes down to idealism vs. realism. Which is more important: maintaining a philosophy--or maintaining your health? That's the bottom line. --Jean-Louis Tu

What Every Vegan Should Know About Vitamin B12


by Stephen Walsh, Ph.D., Trustee of The Vegan Society [U.K.], and other members of the International Vegetarian Union science group (IVU-SCI)

Very low B12 intakes can cause anaemia and nervous system damage. The only reliable vegan sources of B12 are foods fortified with B12 (including some plant milks, some soy products and some breakfast cereals) and B12 supplements. Vitamin B12, whether in supplements, fortified foods, or animal products, comes from micro-organisms. Most vegans consume enough B12 to avoid anaemia and nervous system damage, but many do not get enough to minimise potential risk of heart disease or pregnancy complications. To get the full benefit of a vegan diet, vegans should do one of the following:

1. eat fortified foods two or three times a day to get at least three micrograms (mcg or µg) of B12 a day; or

2. take one B12 supplement daily providing at least 10 micrograms; or

3. take a weekly B12 supplement providing at least 2000 micrograms.

If relying on fortified foods, check the labels carefully to make sure you are getting enough B12. For example, if a fortified plant milk contains 1 microgram of B12 per serving, then consuming three servings a day will provide adequate vitamin B12. Others may find the use of B12 supplements more convenient and economical. The less frequently you obtain B12, the more B12 you need to take, as B12 is best absorbed in small amounts; the recommendations above take full account of this. There is no harm in exceeding the recommended amounts or combining more than one option. Good information supports vegan health; pass it around. If you don't read another word about B12, you already know all you need to know. If you want to know more, read on. This information sheet was prepared by Stephen Walsh (Stephen_walsh@vegans.fsnet.co.uk), a UK Vegan Society trustee, and other members of the International Vegetarian Union science group (IVU-SCI), in October 2001. This information may be freely reproduced, but only in its entirety (the list of endorsers may be omitted).
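To make the label arithmetic concrete, here is a minimal Python sketch (the function and its names are our own illustration, not part of the original information sheet) of working out how many servings of a fortified food per day reach the 3 mcg fortified-food target:

import math

def servings_needed(mcg_per_serving, daily_target_mcg=3.0):
    # Servings per day needed to reach the fortified-food target;
    # the text advises spreading them across the day, a few hours apart.
    return math.ceil(daily_target_mcg / mcg_per_serving)

print(servings_needed(1.0))   # 3 -- the plant-milk example in the text
print(servings_needed(0.5))   # 6 -- at this point a supplement may be simpler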

Vitamin B12 and Vegan Diets

Lessons from history


B12 is an exceptional vitamin. It is required in smaller amounts than any other known vitamin: ten micrograms of B12 spread over a day appears to supply as much as the body can use. In the absence of any apparent dietary supply, deficiency symptoms usually take five years or more to develop in adults, though some people experience problems within a year. A very small number of individuals with no obvious reliable source appear to avoid clinical deficiency symptoms for twenty years or more. B12 is the only vitamin that is not recognised as being reliably supplied by a varied wholefood, plant-based diet with plenty of fruit and vegetables, together with exposure to sun.

Many herbivorous mammals, including cattle and sheep, absorb B12 produced by bacteria in their own digestive systems. B12 is also found to some extent in soil and plants. These observations have led some vegans to suggest that B12 was an issue requiring no special attention, or even that it was an elaborate hoax. Others have proposed specific foods, including spirulina, nori, tempeh, and barley grass, as suitable non-animal sources of B12. Such claims have not stood the test of time.

In over 60 years of vegan experimentation, only B12-fortified foods and B12 supplements have proven themselves as reliable sources of B12, capable of supporting optimal health. It is very important that all vegans ensure they have an adequate intake of B12, from fortified foods or supplements. This will benefit our health and help to attract others to veganism through our example.

Getting an adequate amount of B12


National recommendations for B12 intake vary significantly from country to country. The US recommended intake is 2.4 mcg a day for ordinary adults, rising to 2.8 mcg for nursing mothers. The German recommendation is 3 mcg a day. Recommended intakes are usually based on 50% absorption, as this is typical for small amounts from foods. To meet the US and German recommendations, you need to obtain sufficient B12 to absorb 1.5 mcg per day on average. This amount should be sufficient to avoid even the initial signs of inadequate B12 intake, such as slightly elevated homocysteine and methylmalonic acid (MMA) levels, in most people. Even slightly elevated homocysteine is associated with increased risk of many health problems, including heart disease in adults, preeclampsia during pregnancy, and neural tube defects in babies.

Achieving an adequate B12 intake is easy, and there are several methods to suit individual preferences. Absorption of B12 varies from about 50%, if about 1 mcg or less is consumed, to about 0.5% for doses of 1000 mcg (1 mg) or above. So the less frequently you consume B12, the higher the total amount needs to be to give the desired absorbed amount.

Frequent use of foods fortified with B12, so that about one microgram of B12 is consumed three times a day with a few hours in between, will provide an adequate amount. Availability of fortified foods varies from country to country, and amounts of B12 vary from brand to brand, so ensuring an adequate B12 supply from fortified foods requires some label reading and thought to work out a pattern that suits individual tastes and local products.

Taking a B12 supplement containing ten mcg or more daily provides a similar absorbed amount to consuming one mcg on three occasions through the day. This may be the most economical method, as a single high-potency tablet can be consumed bit by bit. 2000 mcg of B12 consumed once a week would also provide an adequate intake. Any B12 supplement tablet should be chewed or allowed to dissolve in the mouth to enhance absorption. Tablets should be kept in an opaque container. As with any supplement, it is prudent not to take more than is required for maximum benefit, so intakes above 5000 mcg per week should be avoided, despite the lack of evidence for toxicity from higher amounts.

All three options above should meet the needs of the vast majority of people with normal B12 metabolism. Individuals with impaired B12 absorption may find that the third method, 2000 mcg once a week, works best, as it does not rely on normal intrinsic factor in the gut. There are other, very rare, metabolic defects that require completely different approaches to meeting B12 requirements. If you have any reason to suspect a serious health problem, seek medical advice promptly.
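As a rough illustration of this absorption arithmetic, the Python sketch below interpolates between the only two data points the text gives (about 50% absorption at 1 mcg or less, about 0.5% at 1000 mcg or more). The interpolation itself is our own crude assumption, not a physiological model, so the outputs are illustrative only, not clinical guidance:

import math

def absorbed_mcg(dose_mcg):
    # Anchor points from the text: ~50% absorbed at <=1 mcg, ~0.5% at >=1000 mcg.
    if dose_mcg <= 1.0:
        return 0.5 * dose_mcg
    if dose_mcg >= 1000.0:
        return 0.005 * dose_mcg
    # Between the anchors, interpolate the absorbed *fraction* in log-log space
    # (an assumption on our part).
    t = math.log10(dose_mcg) / 3.0   # 0 at 1 mcg, 1 at 1000 mcg
    log_frac = math.log10(0.5) + t * (math.log10(0.005) - math.log10(0.5))
    return (10 ** log_frac) * dose_mcg

# Average absorbed B12 per day for the three recommended regimens,
# against the ~1.5 mcg/day absorption target derived above:
print(3 * absorbed_mcg(1.0))        # option 1: 1 mcg x 3 via fortified foods -> 1.5
print(absorbed_mcg(10.0))           # option 2: one 10 mcg supplement daily -> ~1.1
print(absorbed_mcg(2000.0) / 7.0)   # option 3: 2000 mcg once weekly -> ~1.4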

Symptoms of B12 deficiency


Clinical deficiency can cause anaemia or nervous system damage. Most vegans consume enough B12 to avoid clinical deficiency. Two subgroups of vegans are at particular risk of B12 deficiency: long-term vegans who avoid common fortified foods (such as raw-food vegans or macrobiotic vegans), and breastfed infants of vegan mothers whose own intake of B12 is low.

In adults, typical deficiency symptoms include loss of energy, tingling, numbness, reduced sensitivity to pain or pressure, blurred vision, abnormal gait, sore tongue, poor memory, confusion, hallucinations and personality changes. Often these symptoms develop gradually over several months to a year before being recognised as due to B12 deficiency, and they are usually reversible on administration of B12. There is, however, no entirely consistent and reliable set of symptoms, and there are cases of permanent damage in adults from B12 deficiency. If you suspect a problem, then get a skilled diagnosis from a medical practitioner, as each of these symptoms can also be caused by problems other than B12 deficiency.

Infants typically show more rapid onset of symptoms than adults. B12 deficiency may lead to loss of energy and appetite and failure to thrive. If not promptly corrected, this can progress to coma or death. Again there is no entirely consistent pattern of symptoms. Infants are more vulnerable to permanent damage than adults. Some make a full recovery, but others show retarded development.


The risk to these groups alone is reason enough to call on all vegans to give a consistent message as to the importance of B12, and to set a positive example. Every case of B12 deficiency in a vegan infant or an ill-informed adult is a tragedy, and brings veganism into disrepute.

The homocysteine connection


This is not however the end of the story. Most vegans show adequate B12 levels to make clinical deficiency unlikely but nonetheless show restricted activity of B12 related enzymes, leading to elevated homocysteine levels. Strong evidence has been gathered over the past decade that even slightly elevated homocysteine levels increase risk of heart disease and stroke and pregnancy complications. Homocysteine levels are also affected by other nutrients, most notably folate. General recommendations for increased intakes of folate are aimed at reducing levels of homocysteine and avoiding these risks. Vegan intakes of folate are generally good, particularly if plenty of green vegetables are eaten. However, repeated observations of elevated homocysteine in vegans, and to a lesser extent in other vegetarians, show conclusively that B12 intake needs to be adequate as well to avoid unnecessary risk.

Testing B12 status


A blood B12 level measurement is a very unreliable test for vegans, particularly for vegans using any form of algae. Algae and some other plant foods contain B12-analogues (false B12) that can imitate true B12 in blood tests while actually interfering with B12 metabolism. Blood counts are also unreliable, as high folate intakes suppress the anaemia symptoms of B12 deficiency that can be detected by blood counts. Blood homocysteine testing is more reliable, with levels less than 10 µmol/litre being desirable. The most specific test for B12 status is methylmalonic acid (MMA) testing. If this is in the normal range in blood (less than 370 nmol/L) or urine (less than 4 µg/mg creatinine), then your body has enough B12. Many doctors still rely on blood B12 levels and blood counts. These are not adequate, especially in vegans.
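A minimal sketch applying the cutoffs just quoted (the function and its names are our own illustration; the thresholds are those stated above):

def b12_markers_ok(homocysteine_umol_l=None, blood_mma_nmol_l=None,
                   urine_mma_ug_per_mg_creatinine=None):
    # True only if every marker supplied is within the ranges quoted above.
    checks = []
    if homocysteine_umol_l is not None:
        checks.append(homocysteine_umol_l < 10)            # desirable: < 10 umol/litre
    if blood_mma_nmol_l is not None:
        checks.append(blood_mma_nmol_l < 370)              # normal: < 370 nmol/L
    if urine_mma_ug_per_mg_creatinine is not None:
        checks.append(urine_mma_ug_per_mg_creatinine < 4)  # normal: < 4 ug/mg creatinine
    return bool(checks) and all(checks)

print(b12_markers_ok(homocysteine_umol_l=8.2, blood_mma_nmol_l=210))   # True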

Is there a vegan alternative to B12-fortified foods and supplements?


If for any reason you choose not to use fortified foods or supplements, you should recognise that you are carrying out a dangerous experiment--one that many have tried before, with consistently low levels of success. If you are an adult who is neither breast-feeding an infant, pregnant, nor seeking to become pregnant, and wish to test a potential B12 source that has not already been shown to be inadequate, then this can be a reasonable course of action with appropriate precautions. For your own protection, you should arrange to have your B12 status checked annually. If homocysteine or MMA is even modestly elevated, then you are endangering your health if you persist. If you are breast-feeding an infant, pregnant or seeking to become pregnant, or are an adult contemplating carrying out such an experiment on a child, then don't take the risk. It is simply unjustifiable.

Claimed sources of B12 that have been shown through direct studies of vegans to be inadequate include human gut bacteria, spirulina, dried nori, barley grass and most other seaweeds. Several studies of raw-food vegans have shown that raw food offers no special protection. Reports that B12 has been measured in a food are not enough to qualify that food as a reliable B12 source. It is difficult to distinguish true B12 from analogues that can disrupt B12 metabolism. Even if true B12 is present in a food, it may be rendered ineffective if analogues are present in amounts comparable to the true B12. There is only one reliable test for a B12 source: does it consistently prevent and correct deficiency? Anyone proposing a particular food as a B12 source should be challenged to present such evidence.

A natural, healthy and compassionate diet


To be truly healthful, a diet must be best not just for individuals in isolation but must allow all six billion people to thrive and achieve a sustainable coexistence with the many other species that form the "living earth". From this standpoint the natural adaptation for most (possibly all) humans in the modern world is a vegan diet. There is nothing natural about the abomination of modern factory farming and its attempt to reduce living, feeling beings to machines. In choosing to use fortified foods or B12 supplements, vegans are taking their B12 from the same source as every other animal on the planet--micro-organisms--without causing suffering to any sentient being or causing environmental damage. Vegans using adequate amounts of fortified foods or B12 supplements are much less likely to suffer from B12 deficiency than the typical meat eater. The Institute of Medicine, in setting the US recommended intakes for B12, makes this very clear: "Because 10 to 30 percent of older people may be unable to absorb naturally occurring vitamin B12, it is advisable for those older than 50 years to meet their RDA mainly by consuming foods fortified with vitamin B12 or a vitamin B12-containing supplement." Vegans should take this advice about 50 years younger, to the benefit of both themselves and the animals. B12 need never be a problem for well-informed vegans. Good information supports vegan health--pass it around.

Further information:
Dietary Reference Intakes for Thiamin, Riboflavin, Niacin, Vitamin B6, Folate, Vitamin B12, Pantothenic Acid, Biotin, and Choline, National Academy Press, 1998, ISBN 0-309-06554-2; http://books.nap.edu/books/0309065542/html/306.html#pagetop

Vitamin B12: Are You Getting It?, by Jack Norris, Vegan Outreach, 2000; available on the web: http://www.veganoutreach.org/health/b12.html

Homocysteine in Health and Disease, ed. Ralph Carmel and Donald W. Jacobsen, Cambridge University Press, 2001, ISBN 0-521-65319-3

Editorial and Technical Notes by Tom Billings


This section copyright 2001 by Thomas E. Billings; all rights reserved. These notes are not part of the above article; their purpose is to provide relevant background information. Thanks to Stephen Walsh and the contributing members of IVU-SCI for making this article available to the public. See the Site Contributor listing for Stephen Walsh for a disclaimer regarding the use of his material on Beyond Veg.

We chose to exclude the list of endorsers. We did this because a few veg*ns regard Beyond Veg as controversial, and we don't want the endorsers to be harassed by individuals who may be upset because the endorser's name is mentioned on this site.

Regarding the above claims that the vegan diet is natural: here at Beyond Veg, we assert that the natural human diet is defined by evolution, and a wide range of evidence indicates that humans evolved on a diet that includes both plant and animal foods, i.e., a non-vegan diet. From this viewpoint, vegan diets are (at best) a strict restriction of the natural diet, i.e., may be considered (potentially) incomplete in some sense. At worst, strict vegan diets based on grains and legumes may be seen as agriculture-based diets, significantly different from evolutionary hunter-gatherer diets.

The term plant milk appears above. This is a reference to milk analogues or substitutes made from plant foods, e.g., soymilk, rice milk, almond milk, etc.


Units: weight. µg = mcg = micrograms = 10^-6 grams = 1 millionth of a gram. mg = milligram = 10^-3 grams = 1 thousandth of a gram.

Units: solution strength. The basic unit here is molarity, the number of moles of vitamin B-12 (cobalamin) per liter of liquid solution. A mole is Avogadro's number (6.022 x 10^23) of molecules of the compound of interest. We have here: µmol = micromoles = 10^-6 moles = 1 millionth of a mole; nmol = nanomoles = 10^-9 moles = 1 billionth of a mole. For a more detailed discussion of moles and how they are used to measure concentration of solutions, see a basic chemistry text, e.g., Oxtoby and Nachtrieb [1986].
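To make the molarity units concrete, here is a minimal sketch of converting the thresholds quoted earlier into mass concentrations (the molar masses are approximate values we supply for illustration; they are not from the article):

```python
# Convert molar concentration to mass concentration: mass/L = molarity x molar mass.
# Molar masses below are approximate and supplied for illustration only.

HOMOCYSTEINE_G_PER_MOL = 135.2   # approximate molar mass of homocysteine
MMA_G_PER_MOL = 118.1            # approximate molar mass of methylmalonic acid

# 10 umol/L homocysteine (the desirable upper bound quoted earlier):
homocysteine_mg_per_l = 10e-6 * HOMOCYSTEINE_G_PER_MOL * 1000    # ~1.35 mg/L

# 370 nmol/L MMA (the blood normal-range upper bound quoted earlier):
mma_ug_per_l = 370e-9 * MMA_G_PER_MOL * 1e6                      # ~43.7 ug/L

print(f"~{homocysteine_mg_per_l:.2f} mg/L homocysteine, ~{mma_ug_per_l:.1f} ug/L MMA")
```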

References for notes:


Oxtoby DW, Nachtrieb NH (1986). Principles of Modern Chemistry. Saunders College Publishing, Philadelphia, Pennsylvania.

The Calorie Paradox of Raw Veganism


by Tom Billings

Copyright 1999 by Thomas E. Billings. All rights reserved.

Introduction: The Paradox in a Nutshell


The calorie paradox for raw vegan foods can be stated succinctly, as follows:

The raw vegan diet promoted as the "party line" does not satisfy minimum calorie requirements in its strict form. Active healthy adults (i.e., in Western countries) need, on average, at least 2,000 calories of energy per day from their diet. However, the foods that usually comprise the raw vegan diet, with the exception of avocados, nuts, dried fruit, and sweet fruit juices, have relatively low calorie density. That is, they provide only limited calories per pound or kilogram of food. Hence, to acquire the 2,000 calories needed per day, raw vegans who follow the "party line" and limit their consumption of avocados, nuts, and dried fruit need, as we will see, to: (a) consume very large amounts, each day, of the lower-calorie-density foods (raw vegetables, fruits, legume sprouts), or (b) ingest significant quantities of sweet fruit juice (which has the side-effect of excess sugar consumption).

It should be noted that actual calorie requirements will vary by individual according to age, gender, size (weight), level of physical activity, special situations (e.g., growing children, pregnancy and lactation, etc.), and other factors. The figure of 2,000 calories is used as a standard in this paper because it is a relatively conservative minimum for adults in most Western countries. Of course, as requirements vary, individuals who wish to have a more precise estimate of their calorie requirements might want to consult Recommended Dietary Allowances (National Research Council, 1989, National Academy Press, Washington, D.C.) for details.

What are the predictable and/or actual problems that occur when individuals do try to follow the "party line" strictly? Two questions arise here which characterize the raw vegan calorie paradox. The first is whether it is possible to attempt option "a" above without losing excessive weight and eventually becoming emaciated on the (relatively) modest quantities of plant foods some advocates claim to get by on (compared to what would actually be required). The second is whether the amounts of fruit juice necessary under option "b" are advisable or even sustainable in the long run (due to the very high sugar content) without risking serious health problems.

Compelling insights into raw diets to be gleaned from analyzing the "calorie paradox." After first laying to rest an outdated but still-popular rationale for why some people believe that calories don't matter and are irrelevant, we'll then take a look at the caloric values for each of the classes of food types permissible in a raw vegan diet, and the amounts of each type of food necessary to meet daily caloric requirements. The resulting figures provide compelling insights into the various types of eating patterns seen among raw vegans and demonstrate why these commonly seen patterns inevitably exist. Finally, these same insights and figures also provide hard criteria by which to assess the credibility and truthfulness of individuals claiming to have bypassed the raw vegan calorie paradox. The discussion here of the calorie paradox is inspired by the discussion of the topic in The Natural Hygiene Handbook (1996, American Natural Hygiene Society, pp. 46-47), although they don't use the terminology "paradox." Careful consideration and evaluation of the raw calorie paradox provides valuable insights into raw veganism, and goes a considerable way toward explaining why so very few people succeed or thrive, long term, on 100% raw vegan diets, and also why so many raw vegans are extremely thin/emaciated.

Calories ARE Relevant: A Critique of Herbert Shelton on Calories


Some raw-fooders will tell you that they don't believe in calories or the "calorie theory." If an emaciated raw-fooder is making this claim, however, perhaps they ought to start paying more attention to these calories they don't believe in--after all, those who consume adequate calories, in general, are usually not emaciated! :-)

Definition of calorie. Calories are a measure of heat energy: the amount of heat required to raise 1 gram (1 ml) of water 1°C in temperature. Food calories are actually kilocalories, i.e., 1,000 calories per the preceding definition. The discussion in this paper deals solely with food calories. As heat potential and energy potential are biochemically related, food calories are a measure of the latent or potential energy in the food, and hence provide an estimate of (or proxy for) the energy that you can derive by eating and digesting the food.
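For concreteness, a quick sketch of the unit relationships (the 4.184 joules-per-calorie conversion factor is the standard definition, not a figure from this paper):

```python
# Food "calories" are kilocalories: 1 food calorie = 1,000 small calories.
# 1 small calorie = 4.184 joules (standard definition).

daily_kcal = 2000                       # the working minimum used in this paper
small_calories = daily_kcal * 1000      # 2,000,000 cal
kilojoules = small_calories * 4.184 / 1000
print(f"{daily_kcal} food calories = {small_calories:,} cal = {kilojoules:,.0f} kJ (~8.4 MJ)")
```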

NOTE: Those who recognize the calorie as a measure of the energy available in a particular food may want to skip to the next section if they do not need further convincing of this elementary fact of nutritional biochemistry.
The basis for common raw-fooder criticisms of calories dates back at least to Herbert Shelton (the principal founder, in modern times, of the natural hygiene movement), and perhaps even earlier (to Arnold Ehret, who promoted what would be known today as a fruitarian-type diet). To illustrate the logical weakness of these objections, let me summarize Shelton's points and comment on them. (The Shelton material below is summarized from chapter 5, "Calories," in The Science and Fine Art of Food and Nutrition (The Hygienic System: Volume II), 1984, sixth edition--first edition was 1935; Natural Hygiene Press, pp. 90-97.)

Shelton (summary): Calorie requirements are based on an inappropriate standard. Calorie requirements were derived by Voit of Germany, based on what people actually eat. This is inappropriate because:

It's better to base it on what (Shelton thinks) people should eat.

The requirements tell people to overeat, because the average person actually overeats (and the requirements are based on average, or non-rawist, people).

Reply/Comment: Calories are based on reality. From the point of view of establishing useful estimates of energy requirements, you must consider what people actually eat. Regardless of whether average people may overeat or not, a caloric standard based on sedentary, emaciated raw vegans will not be appropriate (and probably not adequate) for a heavy meat-eater who earns a living via hard physical labor. The standard must serve the people, and not the other way around, as Shelton appears to imply or suggest. As for Shelton's second point above, those who indulge in overeating are usually ignoring a standard based on conventional diets. People who ignore a standard derived from a conventional diet will also ignore a standard based on raw veganism (an extreme diet, in the view of many who follow conventional diets).

Modern methods for measuring calories are technically advanced. Note also that the methods for measuring energy needs (calories) have advanced considerably since the time of Voit (and Shelton). For more information, see: Recommended Dietary Allowances, 10th edition (chapter 3, "Energy," pp. 24-38), National Research Council, 1989, National Academy Press, Washington, D.C.; and "Energy Needs: Assessment and Requirements," by Yves Schutz and Eric Jequier, chapter 5 (pp. 101-111) of Modern Nutrition in Health and Disease, vol. I, 8th edition, edited by Maurice E. Shils et al., 1994, Lea & Febiger, Philadelphia.

Shelton (summary): Calories are merely a system of "fire-box dietetics"! Valuing foods by calories alone leads to a system of "fire-box dietetics" in which refined high-calorie foods are regarded as more valuable than lower-calorie, whole natural plant foods like vegetables and fruits. Humans cannot live exclusively on a diet of refined foods; fruits and vegetables are needed in the diet as well.

Reply/Comment: Calories are not the whole story. Shelton is partially correct here. Humans cannot live well exclusively on refined foods, and we do need vitamins and minerals, for which vegetables and fruits are an excellent source. Back in Shelton's time, it appears that some conventional medical writers overemphasized the value of calories. However, that is not true today; calories are recognized as only one of the many factors that determine the value of a food--other factors include vitamins, minerals, amino acids, fatty acids, and so on. Inasmuch as Shelton's criticism here is a reaction to now-outdated views extant during his time, we observe that Shelton's criticism is no longer important.

Shelton (extract/summary): Calories are the least important nutritional factor. From p. 91: "The human body is more than a mere furnace or fire box into which we must continue to shovel fuel. The fuel value of food [calories] is the least valuable thing about it." Shelton goes on to say that calories measure only the combustible part of foods. The portion ignored--ash--is vitally important as it contains the mineral content. Calories also ignore acid and alkali, and most high-calorie foods are acid-forming. Animal tests show that low-calorie foods (whole, unrefined plant foods) sustain life, while exclusive feeding of high-calorie refined foods kills the animals. Low-calorie foods are high in vitamins and minerals.

Reply/Comment: Calories are one of multiple, important factors. Calories are certainly not the only measure of a food's value; instead they are only one of many factors to measure and consider. Here Shelton is grossly overemphasizing that calories are only a partial measure, and this may lead the reader to the false conclusion that calories are worthless and can be ignored. This part of Shelton's writings borders on demagoguery, in my opinion. There is an additional irony here. Shelton argues against calories on the grounds that the calorie model is a narrow measure, as it does not include other factors (vitamins, minerals). The irony is that Shelton's argument itself is narrow--one cannot disregard calories and focus exclusively on other factors like vitamins and minerals. To make this point clear, if we simply note that one cannot live on vitamins and minerals alone, then if we were to apply the same type of reasoning here, we see how the bogus "logic" of Shelton would suggest that vitamins and minerals are also worthless.

Shelton (extract): Calories depend on digestive strength. From p. 94: "We may add that the value of any food to the individual is partly determined by its digestibility and by the individual's present nutritive needs and powers of digestion and assimilation."

Reply/Comment: The absorption of any/all nutrients depends on digestive strength. In context, the preceding remark by Shelton was made concerning the limitations of the calorie measure. However, the remark as stated has more general application as well: the same kind of limitations apply to vitamins, minerals, amino acids, fatty acids, and so on. As such, it is not a criticism of calories (alone), but a statement of a constraint that applies to all the "values" our bodies try to extract from the foods we eat. Again, narrowly applying such an analysis only to calories while ignoring its application to nutrients across the board borders on demagoguery.

Shelton (extract): From p. 95: "A table giving the caloric values of different foods tells us merely how much heat can be produced in the laboratory by burning these foods. Such tables are fairly accurate indexes to the fuel values of the foods listed..."

Reply/Comment: The above, as with much of Shelton's presentation on calories here, is potentially ambiguous. By fuel value, does he mean the energy value available to the body from eating the food, or is he discussing testing food for fuel value only as one would assess the value of gas, coal, oil, or wood in a furnace? The latter seems unlikely, so is Shelton contradicting himself here in now agreeing that calories do in fact give close estimates of a food's available energy?

Shelton (extract): Calorie requirements are highly variable, hence dubious. From p. 95: "The amount of heat and energy required by various individuals varies so greatly with the conditions of sex, climate, occupation, age, size, temperament, etc., that food values based on the calorie standard are of no practical value."

Reply/Comment: Shelton's claim is illogical and overemphasizes individual differences. Let me restate the above, with a few minor changes, to illustrate the defective logic: "The amount of vitamins required by various individuals varies so greatly with the conditions of sex, climate, occupation, age, size, temperament, etc., that food values based on the RDAs (recommended daily allowances) of vitamins are of no practical value." Once again, Shelton is overemphasizing differences due to individual variability, and his demagoguery here may lead readers to the false conclusion that calories should be ignored. Shelton appears to be trapped in the binary thinking mode that is a very serious limitation of the natural hygiene approach.

Shelton (extract): Eating for calories can only lead to trouble. From p. 96: "It should be easily seen that a system of feeding based on the caloric or fuel value of foods must inevitably lead to mischief."


Reply/Comment: Today, no one advises eating solely for calories: other nutrients are also important. If one eats only for calories, Shelton's remark is relevant. However, no rational person would try to live on a diet of, say, only cheesecake and cookies (no matter how good they taste). At the present time, no reputable nutrition expert would suggest eating ONLY for calories; instead, reputable nutrition experts recommend eating a variety of foods to get adequate vitamins, minerals, fatty acids, protein, AND calories. You need all of those factors--you need vitamins, minerals, fatty acids, protein, AND you need the energy from food that the calories are a measure of. Once again, Shelton's comments here are outdated.

Calories are a useful measure of energy. Over and above the calorie standards suggested for individuals, the calorie is a useful measure. It provides the best estimate of the energy one can get from a particular food. Calories are used to measure not only the energy quotient of a food, but also individual basal metabolic rates, as well as the bodily energy required for various daily physical tasks. They are not just a measure of food energy, but of biochemical energy--which also applies to the energy required for bodily tasks and functions that no one can escape. You need energy from food to survive and thrive. Contrary to some rawist fantasies, food is not just a source of building materials--it is also an energy source. We know that without energy, a pile of building materials cannot be transformed into a building. Similarly, the human body cannot build--or live--without energy. In summary, the calorie is not only a convenient, but a useful--and tested--model of the reality of our energy needs.

Breatharianism: energy from the ether, or just hot air?

No confirmed breatharians (or yetis either). In closing this section, it is worth mentioning that before Shelton's time, the fruitarian extremist Arnold Ehret (note: I am a former follower of Ehret) also popularized the idea that you did not need to eat food to get energy. In more modern times, the term "breatharian" has been coined to describe a person who supposedly gets energy strictly from the sun or from "prana" in the air and does not need to eat. The problem, however, is that there are no scientifically confirmed cases of breatharianism (no surprise there; as yet, there have been no confirmed sightings of any yetis either).

Have your cake and eat it anyway. Even worse, a recent advocate of breatharianism was found to be a fraud (he admitted he did sometimes eat hamburgers, but claimed that since those are not real food, this showed he was actually living on air ;-) ); and a currently active advocate of the practice advises her followers that they can be breatharians and eat whenever they want to, for fun or for social reasons. Quite frankly, that is suspicious and ludicrous--it's like saying you can be a raw vegan but eat meat anytime you want to (and implicitly, in any quantities), for fun or for social reasons.

The Paradox: Quantity of Raw Vegan Food Required for Sufficient Calories
Let us begin our examination of the paradox with a table that shows the quantity of each food or food type needed by itself (i.e., a mono-diet of one food or type of food) to provide 2,000 calories per day. Please note that this is for comparison purposes only; it is not suggested that you try to live on only one food, or food category.

Daily Food Quantities Needed (in a Mono-Diet) to Provide 2,000 Calories


Note: calories below are for raw foods, except where indicated. Kcal/Lb. and Lbs. Req'd are English-system figures; Kcal/100 gm and Kilos Req'd are metric.

Food Category              Kcal/Lb.*  Lbs. Req'd   Kcal/100 gm*  Kilos Req'd  Info Source

VEGETABLES
Common Veg. (avg.)           100.00     20.00          22.05         9.07     ANHS

SWEET FRUIT
Sweet Fruit (avg.)           300.00      6.67          66.14         3.02     ANHS
Sweet Fruit (blend of 4)     216.05      9.26          47.63         4.20     Milk paper
Sweet Fruit Juice            540/qt.     3.7 qt.      570.62/lt.     3.5 lt.  USDA
Dried Fruit                1,170.3       1.71         258.00         0.78     USDA

NEUTRAL FRUIT
Cucumbers                     58.97     33.92          13.00        15.38     USDA
Tomatoes                      86.18     23.21          19.00        10.53     USDA

LEGUMES
Sprouted (avg.)              444.53      4.50          98.00         2.04     USDA
Mung Bean sprouts            136.08     14.70          30.00         6.67     USDA
Cooked (avg.)                580.61      3.44         128.00         1.56     USDA

STARCHES / GRAINS
Wheat, Sprouted              898.13      2.23         198.00         1.01     USDA
Grains / Tubers (cooked)     500.00      4.00         110.23         1.81     ANHS

FATTY FOODS
Avocados                     800.00      2.50         176.37         1.13     ANHS

NUTS & OILY SEEDS
Soaked / Sprouted          1,250.0       1.60         275.57         0.73     estimate
Dry                        2,500.0       0.80         551.15         0.36     ANHS

*Liquid measures are in quarts and liters. For details on data sources and entries above, see Appendix 2.
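As a sketch of how the "Req'd" columns in the table are derived--daily quantity = 2,000 kcal divided by calorie density--here is a short calculation (densities are taken from the table; the code itself is only illustrative):

```python
# Daily quantity of each food needed, in a mono-diet, to reach 2,000 kcal.
# Calorie densities (kcal/lb) are from the table above.

DAILY_KCAL = 2000

kcal_per_lb = {
    "Common Veg. (avg.)": 100.00,
    "Sweet Fruit (avg.)": 300.00,
    "Cucumbers": 58.97,
    "Sprouted legumes (avg.)": 444.53,
    "Wheat, sprouted": 898.13,
    "Avocados": 800.00,
    "Nuts, dry": 2500.00,
}

for food, density in kcal_per_lb.items():
    lbs = DAILY_KCAL / density          # pounds required per day
    kg = lbs * 0.4536                   # convert pounds to kilograms
    print(f"{food:25s} {lbs:6.2f} lb  ({kg:5.2f} kg)")
```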


Let's now review and evaluate the data in the table.

Vegetables only--the "gorilla diet"

Massive bulk of food required. Eating vegetables and greens only could be called the gorilla diet, because (mountain) gorillas are folivores and eat a diet that is predominantly leaves and vegetative matter, although they also eat smaller amounts of fruits, roots, and insects. (Note: gorillas are not vegans--they deliberately eat insects, if a modest amount.) Lowland gorillas are foli-frugivores and eat more fruit, and more insects, than mountain gorillas.

If you try to live on only raw vegetables, you will need to eat 20 pounds (9.1 kg) per day, just to satisfy your calorie requirements. That's a lot of food to eat each day! Consider the time required to eat (and excrete) so much food on a daily basis. Indeed, you would be forced to lead a life parallel to that of the gorilla--a life that consists primarily of eating, excreting, and sleeping. (It's no coincidence that many raw-foodists seem to be primarily focused on the topic of food and eating it for much of their day.)
The human body is not that of a folivore, although we do have some (limited) adaptations for it. The noted comparative anatomy expert, D.J. Chivers, reports that the human gut (digestive system) is that of a faunivore (meat-eater) with some of the adaptations of a folivore. (References: Cambridge Encyclopedia of Human Evolution, 1992, Cambridge University Press, pp. 61, 64; see also D.J. Chivers and P.J. Langer, The Digestive System in Mammals: Food, Form, and Function, 1994, Cambridge University Press, p. 4.) In other words, we have the mixed set of adaptations of an omnivore/faunivore, so although we are not purely or primarily folivores, we can certainly eat raw vegetables. [Note: Chivers uses a slightly different definition of the term "omnivore," and suggests instead that the human gut should be described as that of a faunivore, rather than omnivore.]

Diuretic effect of common green vegetables. A diet of exclusively raw vegetables provides challenges over and above the volume problem. Many common vegetables are diuretic: asparagus, celery, dandelion greens, fennel bulb, onion, parsnip, and parsley. (References: The Encyclopedia of Herbs and Herbalism, ed. Malcolm Stuart, 1981, Crescent Books, pp. 148, 154, 161, 235-236, 270-271. Also see The Complete Medicinal Herbal, by Penelope Ody, 1993, Dorling Kindersley, pp. 37, 59, 103.) The ultimate result of a diet high in diuretics can be chronic dehydration--which can cause serious problems in and of itself. This comment also applies to vegetable juices, specifically green juices (which usually have a celery juice base and/or include other diuretics). Side note: The reference above by Penelope Ody reports that many sweet fruits are diuretic: oranges (p. 49), strawberries (p. 60), apples (p. 77), raspberry (p. 93). Those readers who have followed diets high in sweet fruits have probably experienced the excess urination so common on such diets. This leads us directly to the next topic...

100% sweet fruit diets--are you sure that you are getting enough sugar? :-)

From the table, we see that one needs to eat 6.7-9.3 pounds (3.0-4.2 kg) of fruit daily to satisfy calorie requirements. Further, the 9.3 pound (4.2 kg) figure is for the "4 sweet fruit" blend, on a net (edible portion only) basis. Assuming an average waste factor (peels, inedible seeds, cores) of 28.5% (per USDA Handbook--see Appendix 2), we divide the net fruit consumption estimates by 0.715, to find that one needs to eat 9.3-13 pounds (4.2-5.9 kg) of whole fruit per day to get the required net weight. That's a lot of fruit to eat! Further, the "4 sweet fruit" blend is slightly over 10% carbohydrate by weight, nearly all of which is sugar. Applying this to the weight of fruit required indicates that one would be eating ~0.67-0.93 pounds (300-420 gm) of sugar each day. Needless to say, that explains why many fruitarians experience the symptoms of excess sugar consumption (frequent urination and thirst, fatigue, sugar highs/blues, etc.). Also, it seems reasonable to regard a diet in which sugar is the overwhelmingly predominant calorie source (little fat, starch, or protein) as a form of "sugar addiction."
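The arithmetic in the preceding paragraph, sketched out (the waste factor and sugar fraction are the figures given in the text; the code is merely illustrative):

```python
# Sugar load on an all-sweet-fruit diet, per the figures given in the text.

net_fruit_lb = (6.67, 9.26)    # edible portion needed for 2,000 kcal (from the table)
waste_factor = 0.285           # peels, inedible seeds, cores (USDA average, Appendix 2)
sugar_fraction = 0.10          # the "4 sweet fruit" blend is ~10% carbohydrate (mostly sugar)

for net in net_fruit_lb:
    whole = net / (1 - waste_factor)   # whole fruit needed to yield the net weight
    sugar = net * sugar_fraction       # sugar actually ingested
    print(f"net {net:.2f} lb -> whole {whole:.1f} lb, sugar {sugar:.2f} lb/day")
```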

Fruit-juice diets are not a long-term solution


The table shows that fruit juices are high in calories and are a concentrated food: you can satisfy your calorie requirements with 3.7 quarts (3.5 liters) of juice. On the surface, it appears that juices are a solution to the calorie paradox. However, consuming this amount of juice would give you a very large dose--approximately 1.066 pounds (0.483 kg)--of sugar each day. Such a large quantity of sugar may cause negative side-effects, particularly in the long run. Some fruitarian "experts" argue that you should juice your food, to avoid the calorie paradox. While juices are natural, there is much to consider before basing one's diet on fruit juices. Note that:

Chimps practice a crude form of juicing, called wadging. (This means that a food may be crushed with their palate, the juices sucked out, and the leftover pulp spit out.) However, this constitutes a very small portion of their food intake; their diet also includes whole fruit supplemented by leaves (the two largest components), and insects, meat, and other foods (which make up the remainder). Extremists like to point out that other animals generally don't cook their food. Can the fruit-juice diet advocates name (and document) even one wild primate that lives on a diet that is predominantly fruit juice (juice specifically, and not the fruit flesh--some primates are frugivores)?

Juices are fractionated foods. Juices are natural in that they are extracted from whole foods. They are, however, fractionated (partial) foods--something that raw-food advocates normally argue against on general naturalism principles. Juices lack fiber, and one major blender manufacturer reports that the pulp left over from juicing is more nutritious than juice.

Electric juicers are UNnatural. Electric juicers, required if one is to consume large amounts of juice, are no more natural than electric stoves. Extremists like to say that you were not born with a stove on your back. This is true, and you were not born with a juicer on your back (or in your mouth), either! :-) (Note: Of course, the human mouth can juice soft fruits, but it is slow and inefficient. Also, extensive contact with fruit acid is known to erode tooth enamel.)

The above are mostly theoretical challenges to the fruit-juice diet. Of more practical relevance are the following concerns:

Very few rawists live on fruit juices alone, or even diets in which fruit juices are predominant. There are very few, if any, credible success examples for long-term (5+ years) fruit-juice diets.

Sweet fruit juice may cause sugar/insulin spikes. Juices are more rapidly absorbed by the body than whole foods. Sweet fruit juices are very high in sugar (~1.05 pounds of sugar per day, per the table). It is very easy to drink juices (too) quickly, especially sweet fruit juices. The result of this is that sweet fruit juices may produce sugar highs and insulin spikes that are more extreme than the reaction caused by eating whole fruits. (See The Diabetic's Total Health Book, by June Biermann and Barbara Toohey, 1992, Jeremy P. Tarcher/Perigee Books, pp. 88-89.) Your pancreas and liver may be able to handle this in the short run, but are such extremes good for you, multiple times a day, over the long run?

Effects of fruit juices aggravate the hunger, craving, and bingeing problems already present in fruitarian diets. Besides causing "sugar shock" when consumed in excess, fruit-juice diets are not satisfying. You drink juices and you are hungry a short time later. Fruitarianism is notorious for cravings and binge-eating; trying to live on an exclusive diet of rapidly absorbed, high-sugar fruit juices will only make the problem worse.

A diet of only fruit juices may be deficient in protein, even if one uses the low-protein requirements advocated by those promoting crank science theories that allege to "prove" protein is "toxic," in the sense that the metabolic by-products of protein digestion are allegedly harmful. (Such theories are based on an irrational, pathological fear of "toxins" and an amazing ignorance of the reality that the human body is well-equipped to dispose of the metabolic by-products of protein digestion. See "Is Protein Toxic?" [not yet available] for an in-depth examination of the various nonsensical but influential theories--in fruitarianism/rawism--about protein.) To date, such theories are usually based on the protein content of whole fruit, not juices. The protein content of fruit juice may--or may not--compare to that of the underlying fruit (nutritional analysis data are necessary to establish this point). Note that protein deficiencies are rare or nonexistent, even in fruitarian diets. Three possible explanations: (1) the body can adapt to low-protein diets within limits; (2) those on fruit-juice diets (over-)consume large volumes of juice and get adequate protein anyway (along with plenty of that important nutrient, sugar! :-) ); or (3) the fruit-juice diet advocates get their protein via cheating and binge-eating. My opinion: the latter, #3, is the most plausible explanation.

Note: No doubt the advocates of fruit-juice diets will tell you that cravings and hunger are simply a sign of detox, and will go away once you are "pure" enough. Speaking from my direct experience as a fruitarian--8+ years, including ~2 years on 100% fruit--I can advise you that cravings do indeed go away for a while. However, after some time on a fruit diet, your "old friends," cravings and hunger, can and probably will return--stronger than ever, and missing you after being gone so long! :-)

To summarize: Fruit-juice diets are theoretical, not practical, and are not an effective long-term solution to the calorie paradox. Also, please note that juices can be and are used in therapeutic diets, and can be consumed in moderation as part of a diverse diet. I do not wish to demonize fruit juices (or any other food) here. Used properly, juices (including fruit juices) can promote health; but trying to live on a diet that is predominantly sweet fruit juices is not a good idea.

Dried fruit--concentrated sugar


The table shows that dried fruits are a concentrated food: only 1.71 pounds (0.78 kg) will supply daily caloric requirements. Because they are high in sugar, this amount of dried fruit will provide ~1.1 pounds (0.5 kg) of sugar. Once again, the sweet fruit diet--whether fresh, juiced, or dried--has sugar as the primary calorie source. Trying to live on a diet of only dried fruit is not a good idea. Dried fruits are notorious for causing flatulence. The concentrated sugar may, in some individuals, promote or aggravate reflux--digestive acids coming up out of the stomach. Dr. Stanley Bass, as cited in Survival into the 21st Century, by Viktoras Kulvinskas, reports that consumption of dried fruit may aggravate hemorrhoids as well. (Note that the Survival... book is, in some sections, a compilation of fruitarian folklore. It's interesting that a book that glorifies fruitarianism also mentions some of the potential problems of the diet.) So, dried fruit as an occasional treat may be okay for some people, but making it the center of your diet is a bad idea.

Summary Table: Rough Estimates of Total Daily Sugar Consumption on Different Sweet-Fruit Diets

DIET                        Daily Sugar Consumption
                            Pounds      Kilograms
Sweet Fruit (blend of 4)     0.93         0.42
Sweet Fruit Juice            1.07         0.48
Dried Fruit                 ~1.10        ~0.50

Note: the above numbers are based on assumptions regarding USDA data and may slightly overestimate the total sugar. See Appendix 2 for details.

Neutral fruit does not solve the paradox/sugar problem


Some fruitarian "experts" advocate consumption of neutral fruit--primarily cucumbers and/or tomatoes--as the solution to the problem of excess sugar that is prevalent in modern hybrid fruits. The table above shows that such a solution might work ONLY in the short run. To get one's daily calorie requirements from cucumbers, one would have to consume a massive quantity each day--33.9 pounds (15.4 kg). Cucumbers. To get the caloric value equivalent to one pound (or kg) of sweet fruit, one needs to eat 4-5 pounds (1.8-2.3 kg) of cucumbers. Substituting cucumbers or tomatoes for sweet fruit will cause substantial weight loss, unless one eats a MUCH larger amount of cucumbers or tomatoes to get the same number of calories. Many people find that cucumbers are a "heavy" food, and it is very difficult to eat as little as 2.2 pounds (1 kg). Hence, the idea of consuming 33.9 pounds (15.4 kg) in one day is ludicrous. Further, reliance on cucumbers as the predominant food in one's diet may promote (the equivalent of) anorexia nervosa in the long run (some anorectics live on a diet of cucumbers and celery). Tomatoes are not much better than cucumbers. One would need 23.2 pounds (10.5 kg) per day to satisfy calorie requirements. Additionally, many people are sensitive to raw tomatoes and cannot eat them in quantity without negative (and unpleasant) reactions. Tomatoes are generally acidic, and 23.2 pounds (10.5 kg) of even the low-acid varieties might supply excess acid.

Notes on other neutral fruits:


Most neutral fruits are very low in calories, and it is hard to eat them in quantity, as they can be coarse, heavy, not very appetizing, and/or cause side-effects. Zucchini (courgette) provides only 14 cal/100 gm = 63.5 cal/pound; eggplants (aubergine) are 26 cal/100 gm = ~118 cal/pound; yellow summer crookneck squash are 19 cal/100 gm = ~86 cal/pound; okra is 38 cal/100 gm = ~172 calories/pound. Sweet peppers (25 cal/100 gm = 113.4 cal/pound) are a popular neutral fruit; however, raw sweet peppers are notorious for causing gas--both stomach gas (belching) and flatulence. Basing your diet on sweet peppers (and/or other neutral fruits) would be very difficult indeed.

Some fruitarian "experts" can't tell the difference between a neutral fruit and an acid fruit. Classified as neutral by some fruitarian proponents are the following acid fruits:
lemons (29 cal/100 gm = 131.5 cal/pound), limes (30 cal/100 gm = 136 cal/pound), cranberries (49 cal/100 gm = ~222 cal/pound). However, a warning applies here again, which the fruit advocates often don't bother to tell you: consuming large amounts of acid fruits may cause serious damage to your tooth enamel. It's unfortunate when so-called "experts" can't tell the difference between acid and neutral fruits, and fail to warn readers of the risks of consumption of large amounts of acid fruits. Sweet corn (86 cal/100 gm = ~390 cal/pound) is another food that may be classified as a neutral fruit by some. However, sweet corn is 19.02% carbohydrate, including substantial sugar, and a

410

more appropriate classification for sweet corn is in the sweet fruit category. (Modern hybrid sweet corn can be very high in sugar.)

Some fruitarian "experts" don't distinguish between a juicy neutral fruit and an oily fruit, and may classify avocado and olives as neutral fruits. However, under the scheme used
in this paper, avocados (and olives) are considered to be fatty fruits.

Legumes--sprouted and/or cooked


The calorie table shows that 4.5 pounds (2 kg) of sprouted legumes per day will supply calorie requirements. However, note that the most popular sprouted legume, the long mung bean sprouts one finds in supermarkets, supplies a mere 30 calories per 100 gm, which works out to 14.7 pounds (6.7 kg) to satisfy daily calorie requirements. Sprouted legumes, other than the long mung bean sprouts one finds in supermarkets, tend to be chewy and fibrous, and they produce satiety (i.e., satisfaction, or feeling "full") quickly. This makes eating 4.5 pounds (2 kg) per day a very difficult task indeed. The requirement to eat 14.7 pounds (6.7 kg) of mung bean sprouts brings one back to the gorilla diet discussed earlier. So, while sprouted legumes appear to be a solution to the calorie paradox, in reality they are not a total solution.

Cooked legumes are much easier to eat, and cooking legumes disables more of the antinutrient factors present in legumes than does sprouting. (Hence cooked legumes may be more readily digested than raw sprouted ones.) The table shows that "only" 3.4 pounds (1.6 kg) of cooked legumes are required to satisfy calorie requirements. Again, the satiety-producing properties of high-protein foods may make it a challenge to consume even this amount of cooked legumes.

Sprouted grains and starch foods

Is sprouted wheat an answer to the calorie paradox? The calorie table shows that one needs to eat a "mere" 2.23 pounds (1.01 kg) of sprouted wheat each day to satisfy calorie requirements. On the surface, that appears to be a possible solution to the calorie paradox. However, a closer look at the numbers raises some concerns about bulk. The USDA handbook (8-20, p. 101) reports that 1 cup of wheat sprouts weighs 108 gm. That works out to a volume of 2.34 quarts (2.21 liters) of wheat sprouts required to supply 2,000 calories (see the sketch below). That's a lot of sprouts to eat in one day!

Major hurdles: bulk and/or cloying super-sweetness. Further, the USDA handbook does not specify how old the sprouts are. Wheat sprouts that are 1 day old are very chewy, especially if the wheat is high-gluten. Wheat sprouts that are a little more than 2 days old are usually super-sweet. In fact, 2+ day-old wheat sprouts could be described as "sickeningly sweet," and it is extremely difficult to eat more than a small amount of such sprouts. The bottom line here is that, despite the lesser weight of wheat sprouts required to supply 2,000 calories, the characteristics of the sprouts (chewy and/or super-sweet) make it very difficult to eat the amount required to satisfy daily calorie requirements. Options that one can use to increase consumption of grain sprouts are to make raw sprout breads and/or milk analogues. However, the milk analogues aggravate the bulk/volume problem, and sprouted breads tend to be very heavy foods and require considerable effort to make. It would be difficult to make and eat a large amount of sprouted grain bread each and every day.
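The bulk arithmetic above, as a quick sketch (USDA figures as cited in the text; the cup-to-milliliter conversion is the standard US value):

```python
# Volume of wheat sprouts needed for 2,000 kcal, per the USDA figures cited above.

kcal_per_100g = 198.0      # sprouted wheat (from the calorie table)
grams_per_cup = 108.0      # USDA handbook 8-20, p. 101
ml_per_cup = 236.6         # standard US cup

grams_needed = 2000 / kcal_per_100g * 100        # ~1,010 g of sprouts
cups_needed = grams_needed / grams_per_cup       # ~9.4 cups
liters_needed = cups_needed * ml_per_cup / 1000  # ~2.21 liters (~2.34 quarts)
print(f"{grams_needed:.0f} g = {cups_needed:.1f} cups = {liters_needed:.2f} liters")
```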

Gluten and other problems with wheat. Additionally, wheat (like many other grains) contains gluten, a protein that is one of the more common allergens for certain individuals. Wheat and other grains were not part of humanity's evolutionary diet, as they have been available in significant quantities only since the inception of agriculture 10,000 years ago; thus, it should not be surprising that they may be allergenic for some subpopulations or individuals. Wheat also contains significant amounts of other antinutrients (e.g., alpha-amylase inhibitors). While cooking may neutralize some antinutrients in grains, these concerns do raise the issue of whether certain individuals have adapted to diets that are grain-centered.

To summarize: sprouted wheat is not the answer to the calorie paradox because it presents bulk problems, and for some, allergy problems. For those individuals who can digest wheat, it can be part of the solution. However, by itself it is not a viable long-term solution to the paradox. Note: the analysis here is limited to sprouted wheat because that is the only sprouted grain for which nutritional data are provided in the USDA handbook.

Cooked starch is generally more digestible than raw starch. The raw starch of sprouted grains/legumes is not digested as effectively as cooked starch. Starch is a polymer made up of glucose molecules. The heat of cooking makes starch more digestible; the process is called gelatinization. When starch is heated to the range 55-75°C (131-167°F) in the presence of water, the starch granules swell and the crystalline structure degrades (starch granules are estimated as 20-50% crystalline per x-ray diffraction studies). Amylose is released during gelatinization. With sufficient heat, the granule structure is broken down. This increases surface area and makes the starch more accessible to digestive acid and enzymes, i.e., more readily digested. Starch foods are one of the foods that are more sensible to eat cooked rather than raw. Those raw-fooders who claim that cooked starch (or cooked food, in general) is poison are disseminating false information. Sprouting may reduce the levels of certain antinutrient factors (e.g., phytates), but that does not completely solve the underlying problem of the lower digestibility of raw starch. (References for the preceding: starch entry, McGraw-Hill Encyclopedia of Science and Technology; also Bornet et al. (1989) "Insulin and glycemic responses in healthy humans to native starches processed in different ways: correlation with in vitro alpha-amylase hydrolysis," American Journal of Clinical Nutrition, vol. 50, pp. 315-323.)

Cooked starches
The calorie table shows one can obtain daily calorie requirements from a mere 4 pounds (1.8 kg) of (cooked) starch. Many raw starch foods are unappetizing, and are hard (or impossible) to eat raw, hence cooking is sensible. In contrast, many cooked starch foods are soft and easy to eat. Given that starch foods are so common and cheap, is it any wonder, then, that most of the people in the world (e.g., in lesser-developed countries) have a diet centered around cooked starch? This reality, plus the limitations of sprouted grains, shows that those extremists who claim the world can go 100% raw RIGHT NOW are suffering from delusions.

It should also be mentioned here that, like any other food, some of the cooked starches present problems if one tries to live on them exclusively. Grains and legumes are not part of humanity's original (evolutionary) diet, and consequently they often contain elements that present challenges when consumed--e.g., gluten and phytates in grain, and/or other antinutrients in legumes. Grains and tubers are primarily energy (starch) sources--if one tries to live exclusively on them, one will likely develop nutritional deficiencies (pellagra, scurvy, etc.). Cooking reduces, but does not eliminate, phytates--a limitation on diets that become too heavily centered around grains as the staple. (See The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance for more on this issue.) To summarize, cooked starches/grains are a significant option for resolving the calorie paradox. The main caveat here is not to overcompensate or center the diet too heavily around them.

Fats--avocados and nuts


Only 2.5 pounds (1.1 kg) of avocado meets daily calorie needs, and ~1.6 pounds (0.7 kg) of soaked/sprouted nuts, or 0.8 pounds (0.36 kg) of dry nuts, meets calorie needs. Despite the "party line" that avocados and nuts are heavy foods and should be consumed sparingly, they are a major staple for many raw-fooders, and the primary calorie source for many. In his book Long Life Now, Lee Hitchcox reports that the old Hippocrates diet (per the book The Hippocrates Diet, by Ann Wigmore) is 26% fat by calories--due to avocados and nuts. Readers should note that the current Hippocrates Diet (see Living Foods for Optimal Health, by Brian Clement, for details) is different from the old, and the Ann Wigmore Institute in Puerto Rico promotes a different diet as well (see the book Rebuild Your Health, by Ann Wigmore, for details).

Side note: The current Hippocrates Diet is two large glasses of green juice per day, plus sprouts, salads, and up to 25% cooked food. Hippocrates suggests ~100% raw for healing. In Rebuild Your Health, Ann Wigmore advocates a diet centered on blended foods (energy soup, smoothies), fermented foods (raw sauerkraut, rejuvelac), and wheatgrass juice.

Is a diet of sweet fruit and avocados the answer to the calorie paradox?
Presumably, some fruitarians may claim that the answer to the calorie paradox is to consume a diet that consists primarily of sweet fruit and avocados. This may even be presented as the "middle path," and anecdotal evidence may be cited to claim that such a diet is common among fruitarians. However, there are potential risks in such a diet. It turns out that avocados are the richest known source of a unique sugar, d-manna-sorbitol, that has a very unusual property: the sugar can inhibit the body's production of insulin and produce a temporary diabetes-like condition. See the section on combining sweet fruit and avocados in Fruit is Not Like Mother's Milk (about two-thirds of the way down the linked page) for additional information and references on this topic. Thus, a diet of sweet fruit and avocados is a diet in which the two major calorie sources are (a) fruit sugar, which requires insulin for metabolism, and (b) fat from avocados, which may inhibit your body's ability to produce insulin. This suggests that one should be careful about the timing of eating avocados vis-a-vis sweet fruit. Further, it may partially explain the incidence of diabetes-like symptoms (particularly excessive urination) reported by fruitarians. (My personal experience as a former fruitarian was that excessive urination can occur on sweet fruit alone, as well as on sweet fruit plus avocados.)

Is mono-eating the answer? Of course, some fruitarian advocates might claim that the problem of timing the consumption of avocados versus sweet fruit is easily solved by mono-eating. Unfortunately, the reality of a diet high in sweet fruit is that sugar (and fat) cravings are quite common, as one can easily get habituated to sugar (sugar may be eaten for psychological reasons, even if there is no physical addiction), and sweet fruits are generally very low in fat and essential fatty acids. What can happen on such a diet (and this is written from direct personal experience with such a diet, including experimenting with mono-meals of avocado) is that one eats a mono-meal of avocados and feels temporarily full. Then about an hour or so later, one starts to feel hungry again, only this time for sweet/sugar rather than fat. So, you eat sugar in the form of sweet fruit--precisely at the time your body is digesting the avocado, and your ability to produce insulin (needed to metabolize the fruit sugar) may be impaired/inhibited. (In the short run, or as an occasional event, metabolic dysfunction in this respect might not be a problem; however, as a long-term habit, it may be risky.) Additionally, the lack of credible success stories of individuals who strictly follow a 100% raw vegan, mono-eating regime on a long-term basis is yet another reality check against mono-eating (of avocados and sweet fruit) as a possible solution to the paradox.


An alternative is a diet of nuts (fats) and sweet fruit (sugar). Such a diet may work better than a diet of avocados and sweet fruits. However, such a diet violates the strict form of the raw "party line" that one should sharply limit nut consumption.

REALITY CHECK #1: Stomach Capacity vs. Food Required


If you are, or want to be, a 100% raw vegan who avoids the concentrated-calorie foods per the "party line," then you must get your daily 2,000 calories by some combination taken from the following categories. (Amounts given are the total of each category BY ITSELF that would be required to equal 2,000 calories.)

Raw Vegetables: 20 lb or 9.1 kg
Sweet Fruit: 6.7-9.3 lb or 3-4.2 kg
Cucumbers: 33.9 lb or 15.4 kg
Tomatoes: 23.3 lb or 10.5 kg
Sprouted Legumes: 4.5 lb or 2 kg
Sprouted Wheat: 2.2 lb or 1.0 kg

Eating a blend of the above, for example an equal share of calories from each of the 6 foods listed, would involve eating--daily--approximately 15.3 pounds (7.0 kg) of food, as the sketch below illustrates. Needless to say, that's a lot of food! Some readers would of course opt for mostly sweet fruit, which is a lower weight (8 lbs, or 3.6 kg net, average per day).
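A quick sketch of that blend arithmetic (the per-food figures are the "by itself" amounts listed above; the code itself is merely illustrative):

```python
# Weight of an equal-calorie blend: each of the 6 categories supplies 1/6 of
# the 2,000 kcal, hence 1/6 of its solo "by itself" weight listed above.

lbs_for_2000_kcal = {
    "raw vegetables": 20.0,
    "sweet fruit": 8.0,        # midpoint of the 6.7-9.3 lb range
    "cucumbers": 33.9,
    "tomatoes": 23.3,
    "sprouted legumes": 4.5,
    "sprouted wheat": 2.2,
}

total_lbs = sum(weight / 6 for weight in lbs_for_2000_kcal.values())
print(f"~{total_lbs:.1f} lb (~{total_lbs * 0.4536:.1f} kg) per day")   # ~15.3 lb (~7 kg)
```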

To get an idea of how challenging (some might say unrealistic) eating such a large amount of raw vegan food/fruit per day is, do the following exercise.

Compare: size of stomach vs. volume of food required. Bring your hands together to form a loose cup--little fingers and sides of the hands (below the little fingers) touching. Have the other fingertips touching as well. The volume that would fit in your hands is related to your frame size, and your stomach size is related to your frame size as well. The (average) volume of your stomach (unstretched) is approximately twice the volume of your cupped hands. Now think about the volume of the food mass you expect to eat: the 8 pounds (3.6 kg) of fruit (or 20 lbs/9.1 kg of vegetables, or 33.9 lbs/15.4 kg of cucumbers..., etc.), each and every day while on the "perfect" raw vegan diet. Finally, compare the two volumes--see if you can get an idea how many (unstretched) stomachfuls the "party line" 100% raw vegan diet will require. Even if you have a larger-than-average stomach, and/or your stomach can stretch by a factor of two or three, the food mass above still represents a lot of meals (or else much-longer-drawn-out meals, if fewer in number). Think how many times you must eat each day to ingest that volume, and the time involved in chewing and digesting such an amount of food. Remember to allow at least 1-2 hours between meals for the food to (start to) digest. Or conversely, if one eats gargantuan-sized meals to keep the number of meals down, generally one's stomach will have to stretch considerably and be trained into accepting such large volumes.

Effect on one's daily experience and mental outlook. Also note here that, while it is not unheard of to eat such volumes of raw food if one is determined enough--at least with fruit-heavy raw diets--one has to question the resulting obsession with food and eating that tends to result. The few examples of such people who have truthfully come forward with their actual daily personal habits on this score, in my experience, confirm the repercussions of the calorie paradox:

Huge volumes of food eaten;


Ongoing or quickly resurfacing hunger (despite the large amounts of food eaten) as a central aspect of one's daily experience; which usually leads to:

Either lots of time spent eating, or lots of time thinking about it when not eating, due to the mental effects that hunger has on mood and mental imagery.

By now, if you are like most people, you will probably concur that eating such a large amount of food each day is unrealistic, and you will eat some concentrated foods because you can't, or don't want to, spend so much of your day eating or obsessing over it!

REALITY CHECK #2: Excreting the Food Mass

What goes in must come out. Think of the food mass mentioned in reality check #1. So you ate it all--and it must come out as well, as urine and feces. How many bowel movements would that be? Depending on the person, perhaps as much as one per meal, both due to the large food volume, and because raw vegans rarely suffer from constipation. The amount of urination is also usually inconvenient--I remember when I was on 100% fruit (and eating quite a lot of avocados as well as sweet fruit) that frequent urination was a real hassle and inconvenience. (The fact that many sweet fruits are both high in water content and diuretic makes the situation problematic.) Pardon the extremists for not telling you about this--reality makes it harder to sell idealistic dietary dogma. :-) A 100% raw friend on a vegetable-based diet (very large volumes) said the diet disabled her and made her a shut-in--she needed to urinate every five minutes or so.

REALITY CHECK #3: Using the Paradox to Assess Credibility of "Experts"


Calories provide a potential "truth serum" for assessing the adequacy of reported diets. If an alleged dietary "expert" is promoting his or her diet as the best, or even "perfect," diet, ask what their daily diet consists of. Try to get the "expert" to specify types and quantities of foods consumed. If clear details are provided, you may be able to use the information in the calorie table here to check their diet, and their credibility. A few hypothetical examples will illustrate.

1. An "expert" claims to live on nothing but (on average) 6-8 pieces of sweet fruit per day.

2. An "expert" claims to live on nothing but 1-2 liters of green juice, plus a relatively small amount (1-2 kg) of fruit per day.

3. An "expert" claims to have a diet in which cucumbers and/or tomatoes are the predominant food, with some sweet fruit and green vegetables, and to never/rarely eat avocados, nuts, or dried fruit.

Compare: reports of calorie consumption vs. realistic requirements. Note that the above are summary descriptions; additional detail would be required for a formal assessment of "expert" #3 above. Once data on quantities are available, use the averages from the calorie table earlier in this paper to estimate the total calories consumed per day (a rough sketch follows below). Looking at "expert" #1, it is clear that they claim to thrive on a diet that is far below starvation levels. The only logical inference one can make here is that they are a fake. "Expert" #2 also appears (assuming the sweet fruit eaten is juicy, not dried fruit) to claim to thrive on a diet that is below the normal minimum calorie levels. Hence one suspects that "expert" #2 is an anorectic and/or a fake like #1.
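Here is what such an estimate might look like for "experts" #1 and #2 (the per-piece fruit weight and the green-juice calorie density are our illustrative assumptions, not figures from this paper):

```python
# Rough calorie check on "experts" #1 and #2, using the calorie table's averages.
# Piece weight and green-juice density below are illustrative assumptions.

SWEET_FRUIT_KCAL_PER_100G = 66.14   # from the calorie table
GREEN_JUICE_KCAL_PER_L = 120        # assumed; green juices are low in calories
GRAMS_PER_PIECE = 150               # assumed edible weight of one piece of fruit

# Expert #1: 6-8 pieces of sweet fruit per day.
expert1 = [n * GRAMS_PER_PIECE / 100 * SWEET_FRUIT_KCAL_PER_100G for n in (6, 8)]

# Expert #2: 1-2 liters of green juice plus 1-2 kg of sweet fruit per day.
expert2 = [liters * GREEN_JUICE_KCAL_PER_L + grams / 100 * SWEET_FRUIT_KCAL_PER_100G
           for liters, grams in ((1, 1000), (2, 2000))]

print(f"Expert #1: ~{expert1[0]:.0f}-{expert1[1]:.0f} kcal/day")   # far below 2,000
print(f"Expert #2: ~{expert2[0]:.0f}-{expert2[1]:.0f} kcal/day")   # still below 2,000
```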


As for "expert" #3, an assessment will depend on the amount of sweet fruit consumed--the only apparent source of significant calories claimed in the diet. Such a diet probably does not provide enough calories to escape the calorie paradox, unless #3 eats a huge amount of cucumbers and tomatoes (unlikely), or eats lots of sweet fruit (which seems like a contradiction--the usual reason for eating cucumbers is to limit sweet fruit consumption).

When an "expert" claims to thrive on a diet whose calorie content is inadequate. For
discussion purposes, let us assume that you have captured a so-called "expert" in the calorie paradox. What are the possible explanations here?

Some mystical calorie source? Do they get their calories from a secret source? Is this secret source something mystical--perhaps some higher spirit? (I.e., the so-called breatharians; a reminder, by the way--there are no scientifically verified cases of breatharianism.)

Binges and "exceptions." Do they get their calories from a more conventional (and more plausible) secret source--namely binge-eating, exceptions, or "cheating"? (Very common in raw veganism.) Here the "expert" is not being completely truthful about his or her diet.

Does the "expert" claim that calories are irrelevant? Some emaciated fruitarians make this claim--needless to say, such individuals ought to be paying more attention to calories. Note also that under special circumstances, one may be able to survive on less than 2,000 calories per day--examples include those suffering from anorexia, and some CRAN ("caloric restriction with adequate nutrition") folks. However, anorexia is a mental illness with severe physical consequences (extended malnutrition, sometimes terminating in death); caloric restriction, like any other intense regime, may promote food (and self) obsessions; and the calorie restriction movement has not yet established a credible scientific record in humans.

Is the "expert" really, in effect, an anorectic? It may be possible under some (unusual) conditions to live on somewhat less than 2,000 calories per day--the body, within certain limits, can adapt to low-calorie diets. The precise value of such a lower limit is, of course, variable, and will depend to some degree on the individual. But the fact that people occasionally die from starvation in anorexia proves that there is a lower limit. Also, anorectics and emaciated fruitarians may need less energy, as an emaciated body has less muscle and requires less energy to maintain or move. (Even at rest, muscle requires energy to maintain.) And of course, those who are sedentary will need fewer calories than those who are active--and such sedentism often goes along with restricted diets. (Just ask a range of people on such restricted diets how much exercise they are getting, and you will discover that many of them just don't do very much.) Thus, the "experts" who wish to use this type of explanation (who are often emaciated when caught up with; or if not, often found not to be following the diet they promote) are caught in another trap: having to admit that for their caloric needs to be so low, they will have been on such a restricted diet for long enough that their bodies have likely adapted to their low-calorie regime in much the same manner as those with chronic anorexia nervosa--not a healthy situation to be in, either mentally or physically.

"Experts" fibbing about their actual intake is the most rational explanation for mismatches. In my opinion, the second explanation above (not being completely honest about the diet--which usually entails rationalizing not only to others, but to oneself) is the most logical one, with the third explanation (anorexia) as a possibility also, in the more extreme cases. Many so-called "experts" are in clear denial of reality regarding the theoretical basis of their diet; it seems intuitively reasonable that they would be in denial of their own behavior ("cheating"/binge-eating) as well. Thus, in light of these considerations, the calorie paradox--when you can apply it--provides a powerful, sobering check on the credibility of some of the so-called "experts" who promote their "ideal, perfect" diets.

Lower limits on daily calorie expenditures in starvation diets. There are some interesting research results concerning lower limits on calories per day as they relate to weight loss. The results of Thomson et al., as cited and analyzed in Grande et al., from a study of 10 women on a starvation regime, found that the women, on average, lost weight equivalent to 1,480-1,970 cal/day, depending on which type of tissues one assumes the women lost. Further, some of these women were extremely sedentary during the starvation period. (Reference: Grande et al., "Body Weight, Body Composition and Calorie Status," in Modern Nutrition in Health and Disease, edited by Robert S. Goodhart and Maurice E. Shils; 6th edition, 1980, Lea & Febiger, Philadelphia.) Grande et al. also note (p. 32): "There is great variability among outpatients in regard to body weight response to dietary change, partly because of differences in activity habits, partly because the truth about dietary intakes is not always easy to discover." The latter point--that the truth about dietary intakes can be very difficult to ascertain--applies especially when dealing with so-called "experts" who claim to thrive on extreme diets.

Physical effects of starvation diets. Hoffer (writing in Shils et al.) describes an experiment in the 1940s with young men who followed a diet of only 1,600 cal/day for 24 weeks. On average, the men in the experiment lost 23% of their body weight. (Reference: "Starvation," by John Hoffer, chapter 56 (pp. 927-949) of Modern Nutrition in Health and Disease, vol. II, 8th edition (a later edition than the reference in the above paragraph), 1994, edited by Maurice E. Shils et al., Lea & Febiger, Philadelphia.) Thus, the above two research papers suggest that the figure of 2,000 cal/day used in this paper is in all likelihood a very conservative average for active people.
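For readers who want to see the kind of back-calculation that produces figures like the 1,480-1,970 cal/day range above, here is a minimal sketch. The tissue energy densities are common textbook approximations assumed for illustration; they are not taken from the cited papers, and the 5 kg over 14 days example is hypothetical.

    # Translating observed weight loss during total starvation into an
    # implied daily energy expenditure. If intake is ~zero, the energy
    # spent must have come from body tissue.
    KCAL_PER_KG = {
        "adipose tissue": 7700,  # ~87% lipid; textbook approximation
        "lean tissue": 1000,     # mostly water and protein; far less energy-dense
    }

    def implied_kcal_per_day(kg_lost, days, tissue):
        """Daily energy expenditure implied by the weight lost."""
        return kg_lost * KCAL_PER_KG[tissue] / days

    # Hypothetical example: 5 kg lost over 14 days of fasting.
    for tissue in KCAL_PER_KG:
        print(tissue, round(implied_kcal_per_day(5, 14, tissue)), "kcal/day")

The spread between the two outputs (~2,750 vs. ~360 kcal/day) shows why the estimate "depends on which type of tissues one assumes the women lost": real losses are a mix of both tissue types, which is how intermediate figures such as 1,480-1,970 cal/day arise.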

REALITY CHECK #4: What is Your Primary Calorie Source?


If you stop and think about what you eat every day, the calorie table presented earlier will give you an idea of what your primary calorie sources are. You may be eating a lot more fat (via avocados and nuts, or even the oil in salad dressing) than you expect. (Raw fat is not as bad as cooked fat, and may be healthy for many. Still, you might not want a very high-fat diet in the long-term.) Alternately, you might be getting most of your calories from sugar--fruit sugar. Another calorie source is cooked foods, especially cooked starches. Some may find cooked starches to be preferable to a diet in which raw fat (or fruit sugar) is the predominant calorie source. Finally, some rawists try to follow an extreme, puritanical diet that provides inadequate calories (or nutrition), and then they actually get their calories via binge-eating or "exceptions" (cheating on the diet).
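As a rough way to answer this question for your own diet, the following sketch applies the standard Atwater energy factors (9 kcal/g for fat, 4 kcal/g for carbohydrate and protein) to a day's macronutrient totals. The example day's gram figures are hypothetical, chosen to resemble an avocado- and nut-heavy raw vegan menu.

    # Estimating where a day's calories come from, using the standard
    # Atwater factors. The example gram totals are hypothetical.
    ATWATER_KCAL_PER_G = {"fat": 9, "carbohydrate": 4, "protein": 4}

    def calorie_shares(grams_by_macro):
        """Return (percent share per macronutrient, total kcal)."""
        kcal = {m: g * ATWATER_KCAL_PER_G[m] for m, g in grams_by_macro.items()}
        total = sum(kcal.values())
        return {m: 100 * k / total for m, k in kcal.items()}, total

    # Hypothetical avocado/nut-heavy raw vegan day.
    day = {"fat": 120, "carbohydrate": 250, "protein": 40}
    shares, total = calorie_shares(day)
    print(f"total: ~{total:.0f} kcal")
    for macro, pct in shares.items():
        print(f"  {macro}: {pct:.0f}% of calories")

For this hypothetical day, the output shows roughly half of all calories coming from fat--exactly the kind of result that surprises many raw-fooders.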

Why 100% raw vegans eat so much avocado and nuts, or overeat sugar (fruit)
Given the above, we can now say that raw vegans eat significant amounts of avocados and nuts for two primary reasons:

They NEED the calories provided.

They NEED the fat (essential fatty acids).

Your body knows it needs energy and essential fatty acids--even if the ego is deluded by dietary dogma suggesting you can live on just cucumbers, or green juice, or even sweet fruits. This is why many raw-fooders experience strong cravings for fat, usually in the form of avocado (or, alternatively, nuts). As for sugar: fruitarians must eat a large volume of fruit each day to satisfy their calorie requirements, and in so doing, they ingest a large amount of sugar as well. Sugar is addictive and promotes cravings--like a junkie who can't get enough drugs, the fruit-eater finds it hard to get "enough" sugar. Further, a significant amount of willpower is required to resist the frequent sugar cravings, which often leads to food (and self) obsessions (very common in fruitarianism).

The calorie paradox explains why prominent (raw) organizations endorse/allow cooked (starch) foods
The American Natural Hygiene Society (ANHS) and The Hippocrates Institute both suggest a maintenance diet that includes or allows for cooked foods. In both cases, (clean, basic) starch foods are included in the suggested cooked foods. I cannot and do not speak for the ANHS or Hippocrates; however, some reasons for their suggestions are as follows:

Solves the calorie paradox by providing calories that are lacking in more puritanical raw vegan diets, and avoids the dependence on sugar or fat that characterizes most 100% raw vegan diets.

Cooked starch foods do not promote cravings as strongly as sugar does.

A combined raw/cooked diet often works in the long run, while 100% raw vegan diets most often do not work in the long run.

(References: The Natural Hygiene Handbook, p. 47; also Living Foods for Optimal Health, by Brian Clement.)

What raw vegans can do to partially mitigate the calorie paradox


There are a few things that raw vegans can do to reduce the impact of the calorie paradox. A partial list of these mitigations is as follows.

Include avocados as a regular part of your diet. To minimize possible problems from the insulin-inhibiting sugar in avocados, consume them separately from sweet and/or starchy foods. (Avocados are very popular with raw vegans; the calories here primarily come from fat.)

Consume calorically significant amounts of nuts and oily seeds, dry or sprouted, as a regular part of your diet--e.g., 1-day sunflower seed sprouts, sesame sprouts, almond sprouts, etc.

Make milk substitutes/analogues from nuts and oily seeds, to increase your calorie intake. (The calories here are primarily from fat.)

Consume raw sprout breads, to increase your consumption of grain sprouts. Recipes for sprout bread can be found in a number of raw (recipe) books. (The calories here are primarily from starch.)

Adopt a combined raw + cooked diet, where the cooked food includes some starch foods (an easily digested source of calories) and/or some cooked legumes. Note here that one should just ignore the extremists who promote crank mal-nutritional claims that cooked foods, starch, and/or protein are poisonous. (The calories here are primarily from starch.)

If you eat sweet fruit, consume it in moderation. Modern fruit is so high in sugar that some authors (e.g., Brian Clement of the Hippocrates Institute) make the sensible recommendation that sweet fruit juices be diluted with water before consumption. Although sweet fruits are higher in calories than a number of other raw vegan foods, their high sugar content may promote cravings if consumed in excess. (The calories here are primarily from sugar.)

Blend coarse vegetable foods into raw soups; e.g., Ann Wigmore's energy soup or the blended salads suggested by Dr. Stanley Bass. This helps you to eat more of these foods. Note: this has limited utility calorie-wise, but is nutritious and worthwhile in its own right, as it provides ample vitamins and minerals. Also, you can increase the calorie level of blended foods to significant levels by including avocado or soaked/sprouted nuts. (The calories here are primarily from protein.)

Ultimately, the solution to the calorie paradox can only come by consuming, in some manner, the calories required. Assuming one is a vegan, the majority of these calories will come from
fat (avocados, nuts, tahini), sugar (fruit), or starch (cooked, or raw, perhaps as sprouts). Those who adopt 100% raw exclude, in theory, cooked starch as an option, but such people often end up eating cooked starch (and junk) anyway via binges. If that describes you, perhaps you should consider facing the situation and deliberately including some cooked starch in your diet. After all, isn't it better to eat some cooked food and be honest about it than to claim the "honor" of the ideal of 100% raw while you binge-eat in secret?

Ranking the workability of various 100% raw vegan diets


Based on the calorie data here, plus extensive anecdotal evidence of individuals attempting 100% raw or predominantly raw diets in the real world, the following is an ordered list of 100% raw diets, from most likely to least likely to succeed.

100% raw diets that are most likely to be successful:


A diet in which raw fat (avocados, nuts) is the primary calorie source, and where sweet fruit has a very minor role.

A diverse diet in which raw sprouts (nuts/seeds and grains) are the predominant calorie source.

A raw diet in which starchy tubers are the predominant calorie source (though raw tubers are considered unappetizing by many).

100% raw diets that are most likely to fail:


A diet based on cucumbers and sweet fruit (this may, in effect, constitute anorexia--though the motivations are much different than "traditional" anorexia, of course--and/or the "expert" may be lying about his/her diet).

A diet that is predominantly sweet fruit.

Readers should be aware that 100% raw vegan diets have a dismal record of failure in the long-term. Surprisingly, mixed diets (i.e., raw plus cooked) have a better record of success, in the long-term, than do 100% raw diets.

Please don't overeat!


Some extremists may attempt to (falsely) characterize this paper as promoting overeating. The whole point of this paper, however, is that if one tries to be 100% raw vegan, then one must eat some concentrated foods, or one will end up (by default) overeating the lower-calorie-density foods. This point follows from the existence of lower limits for calorie consumption and from the calorie density of each type of food. Strategies for mitigating the impact of the paradox have been provided here, as well as information promoting the realistic attitude that one must consume some concentrated foods as part of a balanced, diverse diet in order to satisfy calorie requirements without overeating.
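To see the volume problem in numbers, here is a minimal sketch computing how much of each food would be needed to reach the 2,000 kcal/day baseline by itself. The calorie densities are rough illustrative assumptions (kcal per 100 g), not exact USDA values.

    # The bulk of food required to hit a daily calorie baseline from a
    # single food. Densities are rough illustrative assumptions.
    KCAL_PER_100G = {
        "lettuce": 15,
        "tomato": 20,
        "sweet fruit": 55,
        "avocado": 170,
        "nuts": 600,
    }

    TARGET_KCAL = 2000

    for food, density in KCAL_PER_100G.items():
        # kcal divided by (kcal per 100 g) gives 100-g units; /10 converts to kg.
        kg_needed = TARGET_KCAL / density / 10
        print(f"{food}: ~{kg_needed:.1f} kg/day to reach {TARGET_KCAL} kcal")

On these assumed densities, lettuce alone would require over 13 kg per day, tomatoes about 10 kg, and sweet fruit over 3.5 kg, while nuts would require only about a third of a kilogram--which is the calorie paradox in a nutshell.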


Postscript: Foods Not Mentioned


Protein: Protein certainly can provide calories, as your body will burn protein for fuel (energy) when
appropriate, but protein is the body's least-preferred calorie source. However, when the more optimal fuels--carbohydrates (sugar, starch) and fats--are not available in adequate quantities, the body can and will increase its normally moderate use of protein as fuel. Also, the high-protein vegan foods are already discussed above (legume sprouts, nuts, seeds). As the protein content of these foods is a small part of their total calories, and the foods are already included in the analysis, it was not necessary to discuss protein separately.

Acid Fruits: The common acid fruits--grapefruit, pineapple, kiwi--are high enough in sugar that they can
also be considered sweet fruits. The other acid fruits--lemons, limes, kumquats--are usually consumed in small quantities and can be ignored for our analysis here.

Oils: Oils are pure fat, 9 calories per gram. Many raw-fooders claim they avoid oils, so they were not
included in the analysis. Despite this, oils can be used in moderation, as a part of a diverse raw diet, and are a high-calorie food.

Animal Foods: Though this article has been oriented toward raw vegan individuals, it is worth
mentioning one of the reasons why the human metabolism requires a certain quota of concentrated foods. The evolutionary evidence from Paleolithic diet research suggests a primary reason is the adaptation of the human gut to increased percentages of denser animal food that occurred in the diet after the split from the common evolutionary ancestor that humans share with our primate cousins. While there are also similarities between human and ape guts, of course, this divergence represents one of the key differences between ours and theirs. Given that vegans eschew animal foods, some type of concentrated food must be provided in the diet in place of the amounts of concentrated animal food that the human gut originally evolved to handle and became dependent on for efficient energy metabolism. (See the discussion Co-evolution of Increased Human Brain Size with Decreased Size of Digestive System in the postscript to Part 1 of the Paleolithic Diet vs. Vegetarianism interviews [about halfway down on the linked page] for more about this point.)

The suggestion here that raw vegans should consume some concentrated foods to get sufficient calories is confirmed by real-world experience from the traditional diet of one of the more well-known hunter-gatherer groups, the Australian Aborigines. In "Traditional diet and food preferences of Australian Aboriginal hunter-gatherers," by Kerin O'Dea (Phil Trans R Soc Lond B, 1991, vol. 334: pp. 233-241), O'Dea notes (p. 238): "Nevertheless, it is significant that in a diet that was generally characterized by its low energy density, the foods most actively sought and most highly prized were those that had a high energy density. Clearly this was an important survival strategy."

Finally, readers with an interest in animal foods might want to read the article "Nature and variability of human food consumption," by D.A.T. Southgate (Phil Trans R Soc Lond B, 1991, vol. 334: pp. 281-288). Table 6 in Southgate, p. 286, is similar to the table in this article, but includes a wide array of animal foods as well. [Please note that this article was written (excluding, of course, this paragraph) some months before I saw the article by Southgate.]

In Closing: Advice to Readers


Please don't be afraid of raw vegetables, fruits, cucumbers, tomatoes, or legume sprouts because of this paper. Don't try to live on only nuts, avocados and/or wheat sprouts because of this paper. Instead, choose a DIVERSE diet that includes raw vegetables, fruits, sprouts, nuts, avocados, seeds, and if you are open to the possibility, other foods as well (e.g., cooked foods, raw dairy, raw honey, and even animal foods if you have no objections). The message of this paper is that diversity in diet is an important part of the solution to the calorie paradox, and that food phobias or food obsessions (common problems in raw) are to be avoided. In closing, since so many rawists are deeply concerned about food issues, it seems appropriate to remind readers of some basic common sense, as follows:

Raw veganism is just a diet, nothing more. Many people are very healthy on other diets--including those that contain cooked foods and/or animal products.

Remember that your health is more important than idealistic raw dogma (much of which is false anyway). Follow raw-foods dogma ONLY as far as it supports good health for you. Discard any and all aspects of raw veganism that do not support this goal.

Finally, as many raw vegans become deeply entangled in idealistic dietary dogma, remember: At all times, in all circumstances, the diet must serve you. Never let dietary dogma dominate you--when that happens, the diet is eating you, rather than you eating the diet!

APPENDIX 1: Primary Calorie Sources for Raw Vegans--Summary


The following are generalizations based on the nature of raw vegan diets as commonly practiced. Diet varies by individual, so the list below should be considered as a general description, and not a prescription.

100% raw veganism, when it works:
o Living-Fooder: fats (avocado, nuts, tahini) and starch (sprouts).
o Fruitarian: sugar (sweet fruits) and fat (avocados).
o Natural Hygienist: varies per individual style of natural hygiene--ranges from the fruitarian model to the living-foods model.

100% raw veganism, when it doesn't work:
o Whatever foods you are consuming when binge-eating/"cheating"/making "exceptions."

75% raw + 25% cooked, veganism:
o Cooked starches and fats (avocado, nuts, tahini).

APPENDIX 2: Details of Table and Calculations Sources


ANHS: Calorie/pound data are from The Natural Hygiene Handbook.

USDA: Calorie/100 gm data are from The Composition of Foods (Handbook series) from the U.S. Department of Agriculture. The following Handbooks were used in this paper: 8-9 (1982), 8-11 and 8-12 (both 1984), 8-16 (1986), 8-20 (1989).

Milk article: Fruit Is Not Like Mother's Milk, with accompanying documentation, available on this site.

The "4 sweet fruits" blend is an average of: apple, mango, orange, watermelon.

Details


Note that the calorie data for the "4 fruits blend" is net--edible portion only, excluding waste. Per the USDA handbook, the 4 fruits have the following waste percentages (inedible peel, seeds, etc.): apple 8% (core and stem only), oranges 27%, mangos 31%, watermelon 48% (rind, seeds, cutting loss); the average is 28.5%. This gives an adjustment factor of 1 - 0.285 = 0.715, used in the text (not the table) to give a total raw weight for the 4-fruit blend. Reference: USDA Handbook 8-9, pp. 23-24, 179, 168, 283.

Sweet fruit juice is the average of grape (61 cal/100 gm, 14.96% CHO), orange (45, 10.40%), and pineapple juice (56, 13.78%); reference: USDA Handbook 8-9, pp. 139, 184, 232. The average is 54 calories/100 gm; the average carbohydrate content (mostly sugar) is 13.05%. 1 cup = ~250 grams of juice, so 1 quart weighs very close to 1 kg (within 1%).

Dried fruit is the average of: apricots (238 cal/100 gm, 61.75% CHO), figs (255, 65.35%), prunes (239, 62.73%), and seedless raisins (300, 79.13%); reference: USDA Handbook 8-9, pp. 52, 104, 249, 257. The average is 258 cal/100 gm; carbohydrate content (mostly sugar): 67.24%.

Neutral fruit section: calorie data come from USDA Handbook 8-11, pp. 155, 178, 183, 241, 283, 394, 401, 450; also USDA Handbook 8-9, pp. 89, 152, 158.

Note on sugar estimates: the vast majority of the carbohydrate in sweet fruits is in the form of sugars. However, the USDA data provide only total carbohydrates--no breakdown of sugars. The total percentage of carbohydrates is used as an estimate of the sugar content of sweet fruit in the calculations for this paper. The effect of this is that the sugar estimates may be slight overestimates. Please keep this in mind when considering the sugar consumption estimates.

Sprouted legumes are the average of: mung beans (30 cal/100 gm), lentils (106), peas (128), soybeans (128); reference: USDA Handbook 8-11, pp. 64, 218, 272, 382. The average is 98 calories/100 gm. The length of the sprouts is not supplied in the USDA handbook; age and length of sprouts may cause the actual nutritional composition to vary.

Cooked legumes are the average of: mung beans (105 cal/100 gm), lentils (116), peas (118), soybeans (173); reference: USDA Handbook 8-16, pp. 106, 92, 108, 130. The average is 128 calories/100 gm.

Sprouted wheat data are from USDA Handbook 8-20, p. 101.

The estimate for soaked nuts/seeds is based on the ANHS numbers. The assumption is that sprouting time will be short, so calorie composition does not change much, and one can divide the calories in half to account for increased weight due to absorbed water. (As nutritional analyses for sprouted nuts are not available, the use of estimates is necessary.)

The ANHS estimate of 2,500 calories per pound for nuts may be a bit low, but it is used here anyway. The figure of 2,500 calories per pound also covers oily seeds. Per USDA Handbook 8-12, pp. 114, 118, 131, we have pumpkin seeds (541 cal/100 gm), sesame (573), and sunflower seeds (570). This gives an average of 561.3 cal/100 gm, or ~2,546 cal/pound. For select nuts, the same USDA Handbook (pp. 24, 85, 89, 100) lists almonds (589 cal/100 gm), peanuts (567--peanuts are a legume but are eaten like nuts), pecans (667), and English walnuts (642). This gives an average of ~616 cal/100 gm, or ~2,795 cal/pound.
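As a check on the arithmetic above, here is a minimal sketch reproducing the waste adjustment; the 3 kg edible-portion requirement in the example is a hypothetical figure for illustration.

    # Reproducing the 4-fruit waste adjustment described above, using the
    # waste percentages quoted from USDA Handbook 8-9.
    waste = {"apple": 0.08, "orange": 0.27, "mango": 0.31, "watermelon": 0.48}

    avg_waste = sum(waste.values()) / len(waste)   # = 0.285
    adjustment = 1 - avg_waste                     # = 0.715, as in the text

    edible_kg = 3.0                                # hypothetical edible weight needed
    total_raw_kg = edible_kg / adjustment          # whole-fruit weight to obtain it
    print(f"average waste: {avg_waste:.1%}; adjustment factor: {adjustment:.3f}")
    print(f"{edible_kg} kg edible portion requires ~{total_raw_kg:.2f} kg of whole fruit")

The printed adjustment factor matches the 0.715 used in the text, and the example shows how waste inflates the whole-fruit weight one must actually handle and eat.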

--Tom Billings

Idealism vs. Realism in Raw Foods


by Tom Billings


Copyright 1997 by Thomas E. Billings. All rights reserved.

Idealism vs. Realism: A Comparison


The raw-foods movement is split into a number of factions, and it seems that no one agrees with anyone else. Some raw-fooders are very idealistic--to the point of extreme dogmatism--while others are very pragmatic and open. The purpose of this article is to contrast and clarify two different approaches to raw (vegan) diets that one might encounter: idealists versus realists.

Based on long experience in the raw movement, it is my observation and opinion that many of the problems of the raw movement have their roots in excessive idealism. A little bit of idealism in raw foods can be good--one can argue that those who try the diet on spiritual or environmental grounds are engaging in idealistic behavior. However, an excess of idealism can lead to a number of problems, some of which are serious. Let us therefore examine idealism and realism in the context of raw-food diets, on an issue-by-issue basis (below).

The material below labeled "I," for Idealistic, represents a summary and synthesis of views one frequently encounters in the movement. The material below labeled "R," for Realistic, is a snapshot of my own views, which are in an ongoing process of growth. I have written this so that others can see the current discussion within the raw movement in a clearer light. (Please note that I am not the only realist; there are many realists in the raw movement.)

Views Of Nature
I: Nature is simple, nature's laws are simplistic, and nature is perfect.

R: Nature is a highly complex system; it is a system of tradeoffs. Our knowledge of nature's laws is limited. Nature is not perfect--some animals wage war against each other, they sometimes kill each other during mating, and there are many other natural events considered imperfect. (Note: nature is not interested in anyone's dogma; nature simply IS.)

I: Nature is your friend; nature wants to help you.

R: Actually, nature is impersonal. One can personalize nature in positive or negative ways, both equally valid. Positive: nature wants to help you, via your birth. Negative: nature wants to help you die, via diseases, predation, starvation, natural disasters. Observe that wild animals rarely die of old age; does their "friend" nature want to help them live to a ripe old age?

I: A raw, vegan diet is the natural diet for humans.

R: Humans are natural omnivores, per: (1) comparative anatomy analysis, (2) evidence of ape diets, (3) the fossil record, (4) every hunter-gatherer society ever known on this planet. Note that, despite the above, there are sufficient, powerful, spiritual and ethical reasons to be a vegetarian.

I: All animals of one species have the same diet, and so should humans!

R: Within a given species, diets can and will vary per habitat and territory. If a food is not present in the habitat or territory, the animal cannot eat it. This is also true of that intelligent, highly adaptable species--humans. Witness the wide variation in diet among hunter-gatherer societies, and across nations and cultures. Intelligent adaptation is the key to survival of our species. Omnivorous humans can eat a broad range of foods, and survive in habitats where a frugivore/vegan could not--e.g., the Arctic, Tibet. It should be noted that it is due to our omnivorous adaptation skills that we survived as a species, and we are alive now, so we can debate the fine points of diet.

Theory of Diet
I: Cooked food is poison.

R: Humans may have evolved the genetic ability to handle some cooked food--the evidence is inconclusive at present. However, some foods are easier to digest when raw, others when cooked (e.g., starch foods like potatoes). Simple logic suggests one should eat foods that are agreeable to you, in the form that is easiest to digest. Some raw foods should be avoided: raw rhubarb, kidney beans; many cooked foods should be avoided: fried foods, heavily salted foods, broiled foods, etc. Diet is not so trivial that it can be reduced to inaccurate, simplistic slogans.

I: A 100% raw, vegan diet is the best diet for everyone.

R: Actual experience with 100% raw vegan diets shows that they may assist healing and health in the short run, but can be problematic in the long run. Long-term 100% raw vegans are very rare, as very few people manage to stay on the diet long-term. There's a lesson here, if you are open to receive it. Note that vegan diets that are less than 100% raw, say 75-90% raw, are more common and are often less problematic than 100% raw.

I: A raw diet can cure any/all diseases, and bestow perfect health.

R: Wild animals die of disease, despite eating a raw, natural diet. So too, human rawists can and do get sick. A raw foods diet does not guarantee excellent health: there are no guarantees in life. Perfect health cannot even be defined; ask anyone who makes such a claim to give an objective, comprehensive definition of perfect health.

I: If a raw diet doesn't work for you, it's your fault, because the diet is natural and perfect!

R: There are no perfect diets; we live in an imperfect world. If you have made a reasonable, sincere effort at a raw diet, and it doesn't work for you, then common sense suggests that you should change diets. The diet must serve you, not the other way around!

I: A 100% raw diet is very special, in every way!

R: Consider the glorious 100% raw-food diet, and its profound effects on your life. Study this sublime health system, at length. Then, ask yourself this question: in its deepest essence, what really, truly, is this supreme health system of a 100% raw diet? Answer: It's just lunch! (and other meals as well...) Seriously, the most sensible way to view raw diets is to regard them as potential tools for good health. Like any other tool, they can be helpful if used correctly, and may be harmful if used incorrectly.

I: A 100% raw (vegan) diet is the goal to work towards.

R: Good health, not 100% raw, is the goal to work towards. Good health and a 100% raw (vegan) diet aren't necessarily the same, a lesson that some rawists learn the hard way. Don't obsess on a specific % of raw in your diet. Instead, find the diet, the % of raw, that works for you, in that it supports good health for you. (Note: some people who adopt raw specifically for healing may need to follow a nearly 100% raw diet for some time, as part of their healing program.)

Applications
I: Common motivations (negative) here include fear of mucus, fear/hate of cooked foods and those who consume them, fear of protein foods, and obsessive fear of aging or illness. Many idealists also have positive motivations as well.

R: You should have (only) positive motivations: enhance health, healing, spirituality, environment, and so on. The realist avoids negative motivations. Motivations driven by fear/hate can lead, in the long run, to serious mental or emotional problems (e.g., obsessive fear can turn the diet into an eating disorder similar to anorexia nervosa).

I: Idealistic expectations: cure of all illness, total immunity from disease, greatly enhanced health, extended lifespan; in effect: the "promised land."

R: Realistic expectations: raw food diets are famous for their health enhancement and healing effects. However, one must try and see for oneself, as results are not guaranteed.

I: Any problems you experience are due to detox ("you're not pure enough"). Ignore the problem, redouble your efforts to comply with the ideal diet, and the problem will go away. (Important: see ** below.)

R: Raw-fooders can and do get sick. Problems should be addressed promptly. For serious/acute conditions, one should consult a health professional for advice (and possible treatment), as soon as possible. Dietary changes, which are not in line with the idealistic diet, may be necessary. Don't sacrifice your health on the altar of rawist dogma! **Danger: the idealist position above is potentially harmful to your health.

I: Dietary dogma is VERY important in life; it is part of your self-identity. Among idealists, guilt and low self-esteem may occur if one cannot follow the perfect diet. Also, egoism, pride, and feelings of superiority may occur when one manages to perform the discipline required for the perfect diet. (Perfectionism)

R: Diet and dietary dogma are not very important. A realist is not upset when he or she backslides, nor does he/she develop a big ego when the diet is followed successfully for a long time. Making perfectionism a part of your dietary dogma is a very bad idea. If you allow perfectionism to be dominant, then the raw foods diet can, figuratively, eat you (when it should be the other way around).

I: Idealists may get very upset if you criticize their perfect diets, as it is such a big part of their self-identity.

R: If you don't like my diet, that's fine. I hope you find one that you like and that works for you.

I: Idealists frequently look at diet in binary terms: cooked vs. raw, vegan vs. non-vegan, with raw/vegan = good, cooked/non-vegan = bad.

R: Realists recognize that a binary model does not yield an accurate view of food/diet. Instead of a black/white (cooked/raw) classification, food is figuratively seen in different shades of gray. Each food, whether cooked or raw, vegan or non-vegan, has properties, and the effects of that food may range from very good to neutral to very bad, depending on the type of food, and your condition. Some relevant questions regarding food are: can you digest the food?, how do you react to it--positive, neutral, or negative?, and so on.


Summary Evaluation
I: Simplistic, inaccurate view of nature and/or false versions of nature's laws.

R: Accurate view of nature; respects that we don't fully understand the real laws of nature.

I: Raw vegan diets are perfect; follow the diet, and you too can be perfect, just like us idealists!

R: There are no perfect diets. Idealists are not perfect. Instead, their idealism blinds them and they often cannot see the diet (or themselves) as they really are.

I: Idealists are in denial of reality, and in denial of nature.

R: A realist accepts reality, nature, and life--as they are, without the blinders of dogma or idealism.

Comments
It is worth mentioning that idealism and realism are not a binary classification of raw-fooders, as one can simultaneously hold some idealist and some realist views.

The Effects of Excessive Idealism In Raw-Food Diets


An earlier version of the preceding section was published on the Internet in August of 1997. That article generated a number of replies that praised idealism in a general context. In those replies, idealism was romanticized and depicted as being a major factor in the advance of human societies. There is certainly some truth to those claims. We all are realists in some ways, idealists in others. That's life, or as the realist would say, that's reality.

However, idealism has a darker side. While one can say that Buddha, Christ, and Da Vinci were idealists, one can also say that Pol Pot, Mao, Hitler, and Stalin were idealists as well--idealists of a very different kind. The question then becomes: what is the balance of idealism and realism that is best for each person?

As mentioned earlier, it is my view that many of the problems of the raw movement have their roots in excessive idealism. Let's now consider some of the damage that excessive idealism can cause, in the context of raw-food (vegan) diets, as follows.

If one firmly believes that all illness is due to toxemia, and all health problems are detox, then one may ignore health problems--refuse to seek medical advice or care--when treatment/care are urgently needed. This has hurt quite a few rawists.

One may accept severe emaciation because the weight lost is all "mucus" (sort of an "evil demon" to some idealists), or it's your "cooked fude body" (as it is called by some). Someone with anorexia nervosa accepts emaciation gladly, as the weight lost is all "fat." Can you see the similarity here?

One may accept chronic health problems such as fatigue, lassitude, loss of sex drive, or symptoms of excess sugar consumption (excess urination, thirst, other symptoms) as signs of ongoing detox. The excuse is that "you're not pure enough."

The emphasis on purity/toxemia, and the general obsession with the body and food, are examples of a very UNhealthy mental attitude towards food, and life in general. Food should sustain you and nourish you, rather than rule your life. If food rules your life, the diet is eating you, rather than the other way around!


The high incidence of eating-disorder behaviors in rawism (binge eating, eating in secret, lying about eating) also reflects a mentally UNhealthy state. If one incorporates the diet into one's self-identity, then finds that the diet doesn't work, one can end up living a lie. An illustration of this would be someone who becomes a fruitarian, tries a (100%) fruit diet, then finds that the diet does not work "as advertised." The result is that he/she may end up living a lie: claiming to have a (100%) fruit diet, while binge-eating in secret. This is the fate of some of the "role models" and "experts" of rawism, but you will never get them to admit this in public.

The obsession with the body (driven by the idea that one can be perfect through diet alone) often causes one to neglect mental and spiritual health factors, to one's long-term detriment. The desire to be perfect, via a perfect, 100% raw diet, can increase the social isolation associated with raw diets, thereby increasing emotional stress.

Idealists promise perfect health and cure from all diseases, via a raw-food diet. Of course, no diet can meet such unrealistic expectations. The result is that people try raw, find that it doesn't work per their expectations, and then leave the diet and never come back.

For those who fall prey to fecal obsessions, or who abandon their families for the "raw path" (i.e., abandon their families for a "different lunch"), the most plausible explanation seems to be excess idealism. An excess of idealism is also one explanation for those who become hostile zealots, attacking anyone who dares tell the truth about their not-so-perfect diets. The ends justify the means, if you let idealism blind you to the harm you are causing with hostility, and/or by attacking cooked-food eaters, those who eat meat, or those with different raw-food diets. (Of course there are other potential explanations here: mental illness/dysfunction, perhaps?)

The Potential Effects of Increased Realism In Raw-Food Diets


I am convinced that increasing the level of realism in raw (vegan) diets, and decreasing the level of idealism, will alleviate some of the above problems, as follows.

More people will seek the advice of health professionals when serious problems arise, rather than ignoring the problems. (Helps protect people from harm.)

Freed from the tyranny of the golden cage of the "ideal 100% raw diet," social interaction with the world is easier and less tense. You can eat (clean) cooked food once in a while, and it really won't hurt you. Your diet will serve you, rather than letting your diet control every other aspect of your life.

New rawists will have more realistic expectations, and will realize that problems may occur on the diet. Given realistic expectations, fewer people drop out, and hence the number of successful long-term rawists increases. (In particular, the number of long-term 75-90% raw people will increase.)

Rawists with serious eating-disorder behavior problems would likely be advised to seek professional help, rather than being told to "stay the course" on their idealistic diets. (A raw diet is an excellent cover story for someone with an eating disorder.)

Most raw-food zealots would change their ways or fade into obscurity (i.e., the lunatic fringe).

In the long run, a more realistic approach could make rawism more acceptable, and accessible, to the masses. It could eventually lose its reputation for being a form of food-faddism. Raw diets can go mainstream, but that might happen only if they are promoted in an open, honest, realistic manner. Idealistic promises--and cute slogans--won't work in the long run.

So, before praising idealism and dismissing realism, consider their applications and effects in raw-food diets. I hope that you will agree with me: that the raw movement needs MORE realism and LESS idealism.

Recommendations


Many years ago, when I was getting into raw foods, I was very naive and idealistic. I learned--the hard way (see my bio for the unfortunate details)--that idealism is not a good approach. I encourage others to try rawism, but to do so in a realistic way, and to avoid the traps of dogma and idealism. The long-term outcome of your "raw experiment" may depend on which approach you choose, and your attitude.

Also, I should address the fact that some raw-fooders will cling to idealism, despite its serious flaws, because it appears to be more positive, and gives them (false) hope. If someone came to you with a pill, and said that it was natural, and that if you took the pill it would cure all diseases, grant perfect health, and lengthen your lifespan, then you would dismiss that person as a charlatan, a fraud, a snake-oil peddler. Why then do you believe it when someone tries to "sell" you a diet with the same false claims? Let's ignore those in the raw movement who are afflicted with excessive idealism (and who are in denial of reality), and approach raw foods with our eyes open, with full acceptance of reality.

Finally, it is not my objective here to insult or attack non-zealous idealists. You have the right to run your life as you see fit. I am only asking you to open your eyes, and see the risks of excessive idealism when it is incorporated into diet. The combination of a highly restricted diet and excessive idealism can be hazardous to your health. I wish you good health, and good luck with your diet!

--Tom Billings

The hidden complexity in simple dietary idealism

Rationalizing failures gets complex quickly. As far as simplicity vs. complexity of detail goes, certainly the ideals and principles behind plant-based diets founded on philosophical naturalism are simple. But what is overlooked is the complexity involved at an entirely different level: how one is forced into never-ending, detailed rationalizations to account for such things as the many long-term failures on such an "ideal" diet, and:

why the failures don't count against the diet;
why they can be ignored;
why they mean almost the opposite of what it would appear they do based on ordinary common sense;
why they demonstrate rather than refute the principles behind the diet;
how the few exceptions of long-term success are not really exceptions, but instead the ideal everyone can achieve.

And so on and so on--endless contortions and complexities to explain away the obvious. Beyond just explaining failures on such diets, there are further complexities (which we'll cover a little later) that are necessary to support some of the premises of utopian dietary thought itself.

Are the explanations really simple, or just familiar? It is important to note that such complexities or rationalizations only seem as simple to us as they do because they are so familiar, and oft-repeated like mantras. Familiarity--what we become habituated to with repetition and practice--can easily blind us to how complex or convoluted some of our "simplicity" may really be. Familiarity is a double-edged sword: it is necessary to understand things, yet its pervasive and normally unseen role is what gives it the power to blind us to the obvious, if we get too emotionally comfortable in it and stop looking and asking questions.

Note: If one is really interested in looking at it, the low success rate mentioned just above is actually fairly simple to observe on almost any email list devoted to the ideal of raw-foodism, if one sticks around long enough to see the patterns. Most of the posters fall into one of two camps. One camp, usually (though not always) the biggest, asks for lots of advice or talks about the problems they are having on the diet or in sticking to it without lapses. The other camp--people who are apparently doing well and giving advice based on their success, such as it is--have in most cases been on the diet only a relatively short time (a few years or less, just as often months or less). The remaining people, who have been on the raw-food diet long-term (say, five years or more) without significant lapses or on-and-off-the-wagon episodes, and who also say they are successful (giving them the benefit of the doubt), do exist but are more rare. This is before getting into further considerations of what criteria might be used to assess the reliability of the claims, though. (For example, it's not unheard of for some who count themselves as successes to report weight levels that would be classed as anorexic. Others report menus that, if true, would be well below caloric starvation levels, which strains credulity.) Here, it's what you don't see--lots of long-term successes--that speaks volumes by its absence.

The complexities of dietary utopianism go beyond just having to explain failures

Conundrums in idealistic dietary thought itself. Beyond the problems of accounting for the significant failure rate, there are numerous examples of conundrums confronting the philosophy of idealistic dietary thought itself that have to be ironed out via complicated explanations to justify them. Here we'll give the flavor of them through a few brief examples, which we will follow up later with a more in-depth look. Some of the issues that have to be explained with convoluted rationales are:

Why the evolutionary record of meat-eating in humans doesn't really mean we could have adapted to that way of eating, even though it is maintained that our mostly
vegetarian progenitors, the ape family, are adapted to their way of eating, which was itself a big change in diet from the predominantly insectivorous primates who preceded the apes. (Most seem to forget or are not familiar enough with the details--the actual evidence--to realize the latter was the case.)

Asserting that significant physiological evolution in dietary adaptations in humans couldn't have happened, while tacitly accepting other adaptations that occurred, such as the tripling in size of the human brain. (This is what is called by another writer elsewhere on this site "the brain can evolve but the stomach cannot" line of reasoning.)

Similarities between apes and humans are discussed, but not the differences. Here the
problem is making much of any similarities between humans and apes where items relevant to diet are concerned, but discounting or not being interested in the differences.

Having to explain why the ideal diet is more difficult for most people than other diets,
rather than easier.

Why good results are good for our diet, but not the other guys'. Why if the diet you
believe in makes you feel good that's good, but if any other diet makes you feel good that's deceptive, and you'll pay later.

And conversely, why bad results are bad for their diet, but not ours. If any other diet
makes you feel bad that's obviously bad, but if the ideal diet makes you feel bad that's good, because it's only detox.

Being told to trust your own experience, but to believe in what the diet gurus say instead if your experience doesn't agree, because discordant results don't necessarily mean what
you think.


The differences between various vegetarian diets are important, but all omnivorous diets are pretty much the same. Here one broadly lumps any diet with meat in it into one allegedly detrimental class, with each diet regarded as ultimately little different than any of the others. Meanwhile, hairs are split over the differences between various vegan or vegetarian diets, and those differences are seized upon as very important in distinguishing which diet is best.

We'll go into a few of these, as well as other examples, in somewhat more depth later, where we'll compare and contrast this more complex pattern--which makes it necessary to keep "putting out fires" in one's logic after the fact--with its alternative: acknowledging the messy facts of life ahead of time and dealing with them more simply right up front.

Dietary utopias and the tar baby of science

Can there be a harmonious relationship between simplicity and complexity? As we have said, there is no escaping at least some complexity. The only real choice concerns the manner in which you want to deal with it. What would be optimum is for the overall scheme or logical "container" into which the complexities fit to itself be reasonably straightforward to grasp, so that the complexities are acknowledged and consciously "corralled" in areas that put them out into the open, where they can be dealt with more straightforwardly.

Choice: deal with complexity in science or have it infect your overall logic. Hopefully one wants to do this without the surrounding logical container being so oversimplistic or unrealistic that it generates additional complexities of its own in the form of fallacies, hidden contradictions, justifications characterized by convoluted reasoning, and so forth. In the view being outlined here, the overall conceptual approach is simpler because the complexity gets put where it more logically belongs: in the procedural realm of weighing the scientific details and evidence that support the simpler, higher-hierarchical-level concepts presented. Put another way: in this approach, less complexity is found in the reasoning; the complexity is instead dealt with fully in the area of science, where it has its rightful place. With the more idealistic diets, scientific complexity is avoided, but at the expense of more complex reasoning that becomes necessary to avoid or rationalize the evidence that puts one's ideological tenets into question (whether humans were originally vegetarian or ate meat, etc.). In the end, convoluted justifications that are generated to maintain a static belief system in the face of the progressive discoveries of science make things more complex than if the science were just dealt with up front.

Trying to simplistically deny science is emotional instead of rational. In one way or another,
proponents of overidealistic vegetarian diets that are based on philosophical naturalism are being forced to start dealing with scientific details because of all the attention and information that's becoming available on early human evolution and meat-eating and the dietary behavior of primitive hunter-gatherers. In response, because there is getting to be so much evidence having to be explained away, there are some people who prefer to take the path of avoiding the complexities almost entirely and try to do end runs around them in the form of dismissing the science almost altogether. This is typified by the epitome of oversimplistic responses: emotional slurs that don't even attempt much reasoning but just label science as "cooked," "brain-damaged," "scientism," etc.

The tar baby of simple-minded attempts to deflect science. In general, though, the attempt to
avoid scientific findings is a tar baby that sucks one into even more complexity, as science continues to advance and more things have to be rationalized in order to keep pushing them away, unless one is going to be a complete ostrich. Usually, rather than meeting the complexities on their own terms, oversimplistic attempts to deal with them are made that only further entangle one and compound the complexities.


Those who may have previously ignored scientific research become concerned to explain why it doesn't really say what it does. But when questions are asked by someone who is actually familiar with it, the conspicuous shortcomings in their knowledge are revealed and the self-created trap closes shut. Those who proudly avoid or jeer at peer-reviewed journals don't hesitate to comment on the finer scientific points in research they haven't read, which often results in statements that are soon found to be contradicted by items in the research they didn't hear about. Because little attempt is made to understand the details of the research, unknowing assertions are made about it that dig even deeper logical holes to reason one's way out of later. Before, the complexities were too complicated to wade into and too irrelevant to bother with. Now they are easily interpreted and the meaning clear to see for all but the scientists, few of whom understand the real implications of their own research, which are usually the opposite of what they think.

The culinary art of butchering science. Others who have long seen the headlights of science coming
have kept busy constructing convoluted dietary "proofs" not seen anywhere in the halls of science, to justify the jungle diets of apes for savannah-dwelling humans. Compatriots point others to the proofs in justification for their diets while admitting they can't really follow the reasoning themselves. And at the same time the chef concocting the proofs is chopping, dismembering, and transforming the meaty research of science into a fruity-smelling salad of no-longer-recognizable ingredients that even most of their invited guests find hard to chew on, the real scientists whose research they are butchering up for others are labeled hacks.

Complexity: can't live with it, can't live without it. To add another degree of complexity and
contradiction, it's common to simultaneously maintain while doing all of this that the science really isn't necessary, in any event, because anybody should be able to see what the truth is without it. This conflicted, schizophrenic stance toward the complexities of science is the end result of a process where it's believed that one side (simplicity) should be able either to do without, or to dominate, the other (the details). But like a marriage with an overbearing partner, one keeps finding it doesn't make a happy marriage. The implicit attitude is almost as if one were saying: Complexity--you can't live with it, and you can't live without it. You think you want to get a divorce one moment, but later you want to make up again. It takes a relationship between equally respected partners to make a harmonious, happy union.

In the end, dietary idealism generates more complications and ignores simpler explanations

Accepting "the messy reality" is simplest. We all know that reality can be messy, but we often deny
it. "Things have to be more simple than that!" we plaintively exclaim. Ironically, however, an approach that consciously embraces the necessity of dealing with some complexity as a co-partner (including especially the unexpected messiness we often run into in the real world, not just the complexities of science) ends up being more simple overall than a more utopian approach. Why? Because it doesn't generate unnecessary baggage having to explain that messiness or complexity away. (The numerous examples addressed in the section below will clarify this.) Complexity is not gotten rid of or attempted to be stamped out, but it assumes a more natural relationship of coexistence with its partner simplicity that's more harmonious. Doing this, however, requires a different schema that repackages the simplicities and complexities into different areas than before. Because of this initial unfamiliarity it might at first seem more complex. However, to see what is simple about it requires only a small shift in one's underlying mental stance or "mindset."

"What is" vs. "what ought to be." What is this shift? If you're really interested in health rather than
just identifying psychologically with a diet or dietary philosophy that you believe might get you there, you become more interested in "what is." It's the simple difference between beginning to care more about "what is" than what one thinks "ought to be." But this doesn't mean giving up worthy goals. It means redefining them.

What is the REAL goal? In pragmatic realms like health, presumably the REAL goal you want is the
end result (good health), not the method you use to get there. But if you care more about what you think "ought to be" (100% veg, 100% raw, 100% instincto, etc.) rather than "what is" (the results you and others are getting, as well as what science suggests about what might work) then you are forced to come up with justifications why what ought to be isn't what is :-) , whenever there is a conflict between the two. Because you don't really have a choice when "what ought to be" is more important to you than "what is." Trying to save a specific conception of "what ought to be" by explaining away "what is" is the core dynamic that starts making things more complex.

The shift from utopianism to pragmatism simplifies reality. However, when we relax our grip on "ought to be," focus on looking at "what is," and make that our priority, explaining the sometimes messy realities becomes a lot simpler. If our real, underlying goal is "health" rather than "being 100% blah-blah-blah," then seeing "what is"--and what it is right now--gives us a better handle on how that knowledge might help us reach our underlying goal of "health" in the future. The concern becomes "what is" working or "what is" not working for yourself and others, "what is" actually reliable evidence (from both your experience and others', as well as from science), and the details that support it or not.

Forgoing knowledge of "what is" for a "what-ought-to-be" identity. When you get sidetracked
into wanting to be an identity (100% this or 100% that) more than you care about knowing what the results of things actually are (what your experience and others' really suggest; what scientific research suggests), that is the point at which "what is" gets sacrificed to "what ought to be." We say we want to know the truth, but in the realm of diet it's easy to get seduced into merely wanting to be an identity instead. Therein lies the complexity.

How the most idealistic view ends up being more complex: examples
The framework of looking at "what is" by accepting the world in its messiness, whether in the realm of personal experiences or scientific investigation, is actually a demonstrably less complex one in terms of conceptual approach. Let's now explore a number of examples to illustrate. Ask yourself the following questions, and see if the answers you would give are as simple as you think.

Does the diet work well or not, and why? If one takes the utopian blinders off, it is
relatively easy to observe the messy reality, as we mentioned previously, that 100% raw diets work long-term for few, and 75%+ strict fruitarian diets work for even fewer. (For the latter diet, perhaps even no one when all the facts come in, though we won't try to argue that here, because it's not important to the general point.) This is not too difficult to observe. Even though it's a "messy" reality, it's still, all in all, a pretty simple one to grasp. What becomes complex is having to explain why, if these diets are supposedly ideal, so few succeed in the long term.

Should exceptions to the rule be made the goal? In explaining why long-term failure turns
out to be the rule rather than the exception on these diets, it's simpler to conclude the diets just don't work well. Instead, though, in true "what-ought-to-be" fashion, that fact is turned upside down and the exceptions are made the goal for others, with the few fortunate ones becoming minor-celebrity role models. However, since the preponderance of individuals who try these diets end up either abandoning them eventually, or seriously modifying them and doing better on other diets long-term, it becomes complex to have to explain why failure is not the fault of the diet but of something else. To speculate that it's solely the fault of detrimental factors in modern life that almost everyone on any diet is exposed to--such as impure air or water, stress, or eroded topsoil and less-than-ideal produce--doesn't simplify things, make them any clearer, or explain them better. It only makes the explanation less rational.

Are people the same or different? It is simplest to observe that people are different in their
responses to diet, sometimes considerably so (that's the messy reality), and thus the idea that there is one single ideal diet is questionable. To say everybody is more or less the same but they respond differently to diets anyway leaves one with some real explaining to do.

Should you judge your own adequacy, or instead the diet's, by the results you get? It
is a bit more difficult, but still relatively simple, to observe that people are often more emotionally invested in the philosophy behind a diet (their "what-ought-to-be," psychological identity) than in the results they actually achieve ("what is"). And very often they prefer to interpret results, bad or good, as supporting the philosophy, while they automatically interpret discrepancies as calling into question the adequacy of their efforts--rather than the other way around: using discordance between results and philosophy to question the philosophy. Yet evaluating the philosophy by the results is at least as simple as judging oneself by them. And as time passes, it becomes increasingly so, if concerted efforts are made and a trend toward success is not in evidence.

Should an ideal diet be more difficult or easier than other diets? Here's another simple
idea related to the previous one. Which is more straightforward: to say that success on a diet that is ideal should be counterintuitively difficult, or that an ideal diet should be easier to succeed on than other diets? This consideration applies equally whether the person tries hard but still can't stick to the disciplines (implying the disciplines are not very livable or the diet isn't satisfying), or does stick to them but isn't getting good results (implying the diet simply doesn't work).

How about negative symptoms on a diet? What do they mean, and what's the simplest way to interpret them? Even granting that "detox" exists, if one is eating an ideal
diet that presumably results in detoxification, then symptoms that really are detox won't persist indefinitely and should diminish over time. If the symptoms don't diminish, then it's logical, and simpler, to view the continued symptoms as bad. It is more complex to have to explain why continuing bad symptoms mean good things.

Are chimps humans? Should we copy their diets? Another straightforward observation
that guides much of the thinking here: Chimps and bonobos and gorillas are not humans. (Really. :-) ) If one is going to prescribe human diet based on extrapolations, then it makes more sense and is more straightforward conceptually to do so based on prehistoric human diet now that such data is available. To assert that even Paleolithic-era humans were culturally duped into eating nonevolutionary foods--and so we should therefore look at the next-closest animal's diet for guidance--is a more complex presumption not based on any supporting human evolutionary evidence or bona fide logic.

Is there really a "perfect" diet, or does any diet involve tradeoffs? Another fairly
simple observation or assertion here--that there is no such thing as a perfect food or perfect diet--is one that most of us who have pursued the ultimate, ideal, natural diet may find emotionally difficult to conceive of at first. It may seem to be only something designed to puncture people's idealism and motivation to improve their diet. For those who believe evolution should produce absolutely perfectly adapted creatures, it just doesn't seem right. But the observation is tied to a more useful unifying principle: the idea of competing tradeoffs (which is in fact the coin that evolutionary adaptation deals in).

Nutritional costs and benefits. One problem in viewing foods as either (a) toxic, or (b) clean,
with the ensuing pursuit of perfectionistic dietary purity this entails, is that it usually leads to narrowing the diet to the point of long-term unsustainability. The simple idea of tradeoffs avoids this trap by first taking into account that any food necessarily involves a metabolic cost (in digestive energy and transient "toxemia" or digestive waste) to get its nutritional benefit. Second, this idea also says that you cannot assume that a nutritional intake robust enough to achieve dietary sufficiency will automatically coincide with the most toxin-free diet of which you could conceive. It doesn't necessarily follow. The trap of eating only the so-called "cleanest" foods ignores that they often lack essential nutrients that can only be gotten from other foods, which may entail more of a digestive cost in either metabolic waste or energy. (Indeed, it is elementary that if there were any "perfect food" without such tradeoffs, then we would need to eat only that perfect food and no others.)

No free lunch: the messy reality. In simple terms, this is similar to the saying, "You don't get
something for nothing," or "There is no free lunch." Instead, the ideas presented here suggest that you go for the best overall balance of nutritional cost/benefit. Which will help you eat the best diet possible for you, but it isn't going to meet idealistic standards of "the perfect diet." That's the messy reality.

Do shades of gray or does black-and-white thinking more accurately describe reality? A unifying principle related to the previous one that conceptually cuts across much of the
analysis and approach taken here is that it's more useful, accurate, and informative to look at varying "shades of gray" instead of viewing things through black-and-white schemes. While on one level looking at shades of gray may seem to add a degree of complexity that black-and-white schemes don't have, the benefit is that it doesn't falsely classify things as "all one way" when they are really some of both (or some of three or four things), thus better reflecting reality. The huge downside of black-and-white schemes is that they often generate fallacies and lead to unacknowledged or hidden complexities and absurdities. For example, if you classify foods as either "live" or "dead," it's now your job to explain how it is that most of the civilized world lives to the age of 75 primarily on dead food, while most raw-foodists can't live on a pure, all-live-food raw diet for more than just a few years before bailing out and having to add a few dead foods to their diet too.

A little complexity up front is better than a lot more in the end. Here, we would rather
have a little bit of extra complexity if it takes the real world into account better, because in the end this avoids the thorny fallacies of binary black-and-white logic that can't see anything in between. So the straightforward principle here is that if simplicity distorts or obscures reality rather than helping to elucidate it, then it's better to take the complexities into account. And it's simpler to do that up front rather than be forced to deal with them later when they sneak in through the back door.

How often can we actually be certain? Finally, it is also apparent in the approach taken here that at times--sometimes much more often than we would like--we may not be very certain about any particular detail. It is this uncertainty that generates most of the complexities of the type addressed in depth on this site, the ones that require scientific research to try to answer.

These, then, are some of the core simplifying principles that underlie our approach here. In most cases, they are demonstrably simpler than the utopian approach; and in the minority of cases where they are slightly more complex, it's because there is very good reason: dealing with the complexities consciously eliminates still other complexities and problems that arise if we try to ignore them. Which in the end is actually still simpler. Of course, that might not be as simple as we would prefer in an ideal world, but we're talking here about this world--the real one.

Is simplistic certainty in the face of uncertain knowledge smart?


A concluding observation that follows directly from the last principle mentioned just above is to ask: Is it not a considerably better strategy in life to remain uncertain when we can't be sure about something, and to be fully alert as a consequence, rather than to be erroneously certain? And to be aware of the details that make us cognizant of just where the uncertainties lie, rather than to live in a dubious or false certainty that makes us oblivious to certain crucial things which would otherwise be obvious?

The fruit of false certainty. One critical example of this, as we have said, is that most people don't succeed long-term on utopian, totally raw or 75%+ fruit diets. And they run considerable risk of damaging their health if they pigheadedly persist past the point where things start going downhill for them. If one wants to talk about things that are unnecessarily complex or detailed, then the kind of rationalizing that obscures an observation like this is one of them. Because this is something relatively simple to see, which could save some people literally years of futile experimentation and/or potential long-term health consequences.

Complexity's role. What is complex to understand and explain (as is attempted on this site) are the specific underlying reasons why or whether a so-called ideal diet does or doesn't exist, and what that diet might be, along with all the scientific or research-based evidence needed to truly answer the question. (Unless you are willing to just say "people are different"--or its opposite, that "people are not THAT different"--and leave it at that, of course.) And that's another conflict here: dietary idealism would like to keep things simple, but if you also want bona fide evidence that is scientifically verifiable, you have to embrace scientific methods, which involve detailed research. That's of course not so simple, because such evidence has to be concerned with the actual biology, anthropology, evolution, genetics, and verifiable clinical results of things--none of which can be reduced to the kind of simple philosophies (simplistic or oversimplistic would be more accurate) that dietary utopias are.

The real issue is this: How hard do we really want to look at things, and how much "truth" do we want to risk? Perhaps even to the point of realizing we have to live with some uncertainties as part and parcel of life? This is the implicit price of being free from oversimplistic dogmas that don't serve us well--and of being able to respond, even if imperfectly, to a more clearly perceived if somewhat uncertain reality--and it is well worth paying. Far from being a bane, it is the uncertainties in life that actually make it an interesting challenge, worthwhile, and even exciting. The very tentativeness of uncertainty leaves you open, primed, and flexible, whereas being too certain dulls awareness and makes you closed and rigid. Uncertainty is a spice that makes you eager for new information that might even change the way you think, and it can repeatedly open up new vistas of discovery. Once you get a real taste of the process--how it stretches you to use all aspects of yourself in equal measure, including logic, intuition, and the feedback of results, plus every useful tool available, including detailed evidence--you simply can't go back.

Functional and Dysfunctional Lunch-Attitudes


by Tom Billings
Copyright 1998 by Thomas E. Billings. All rights reserved. Contact author for permission to republish.

Are you eating the right foods with the wrong attitude? Is it possible to spoil your good diet with a bad mental outlook? To become unhealthy in spirit even if sound in body? Judging by the troubles people on alternative diets often seem to have obsessing over food or maintaining a balanced approach to its role in the rest of their life, it often appears to be so.

The objective of this paper is to encourage readers to explore their attitudes toward diet and food, i.e., one's mental relationship with diet/food. Some of the common attitudes one may encounter in the raw/vegan movement are defined and briefly described. We begin with a look at functional (positive) attitudes, after which we'll compare and contrast them with dysfunctional (negative) attitudes.

The motivation for this paper is to help raw-fooders (and conventional vegans) recognize dysfunctional attitudes in themselves and others, and to thereby assist in the process of moving toward a more functional attitude. Not only can this be important to help improve your own relationship with food, but with the proper attitude, those who want their diets to be examples for others won't be so likely to end up alienating them--something that unfortunately happens as often as not.

FUNCTIONAL ATTITUDES
Lunch-Mindfulness
Lunch-mindfulness occurs when one has a sufficient, but not excessive, awareness of the food that one consumes. One should have sufficient awareness to have a nutritious, healthy diet that is suitable for your body, conditions, and circumstances, but without obsessing on the details of diet. Such a diet can be raw, cooked, vegetarian, or non-vegetarian--the principles underlying lunch-mindfulness are not dependent on the type of diet. In lunch-mindfulness, the diet is your servant. Here, one eats to live, rather than lives to eat. (In contrast, you are the servant (slave) of dietary ideology under the dysfunctional dietary attitudes discussed further below.)

The practice of lunch-mindfulness lies between hedonism (ignoring your health, eating only for pleasure) and excessive discipline (strictly controlling your food intake at all times, often in disregard of your health). Lunch-mindfulness is moderation or the middle path: paying enough attention to food to have a good, healthy diet, but not so much that one is obsessed with food. One way to characterize lunch-mindfulness is as "appropriate" discipline, where appropriate means discipline that is driven by intelligence rather than dogma, and applied with restraint, in carefully measured amounts.

Because most of us live in a society that has a generally hedonistic approach to diet (and life), lunch-mindfulness and the middle path may appear to be "too disciplined" to those with a hedonistic bent. On the middle path, social situations may arise where deciding how much discipline to apply is a non-trivial question. (Such is life--I can only suggest that one think carefully regarding such decisions.) Ultimately, the point of lunch-mindfulness is that one has just enough discipline to take care of oneself, without making dietary ideology/discipline into a god. Lunch-mindfulness is based on the idea (common sense, really) that your health is more important than dietary ideology/dogma, and that excessive discipline is as much a problem as no discipline (hedonism).

One real-world complication that most of us face when trying to practice lunch-mindfulness is the common habit of eating for emotional reasons. When we eat for purely emotional reasons, we are not applying "appropriate discipline." Learning to eat mindfully and to control/avoid emotion-based eating can be a real challenge. There is no one solution to this problem, but some of the things that may help are:

Finding your spiritual center (yoga, other spiritual practices).
Being a witness of the mind when the emotions come up--analyze the emotions and your response (accept the emotions, try to control just the response).
Avoiding foods/situations that trigger the emotions, and
Most important of all: Whatever happens, don't judge or condemn yourself--love yourself as you are right now.

These issues also illustrate the important point that lunch-mindfulness is more of a process than a result--it is a learning process that we engage in throughout our lives. Like other learning processes, problems and challenges may arise, and we may experience reversals on the path.

The motivations for the practice of lunch-mindfulness can vary significantly according to individual circumstances and spiritual philosophy. A few select reasons to practice lunch-mindfulness are given as follows:

A spiritual person might regard eating as a form of meditation, and/or a form of worship. Such worship could be to the divine fire (digestive fire) within, or, as one example of possible Christian views, one might consider the body to be the temple of God, and only clean food should be put into the temple.
A non-spiritual person might view eating as a potentially healthy pleasure, and try to eat delicious but nutritious food (without obsessing on it).
Another person might seek to minimize the effort required to achieve a good healthy diet.

The approach to lunch-mindfulness will vary with the individual, but the common elements present in lunch-mindfulness are moderation, positive motivation and attitudes toward diet, and the critically important realization that there is far more to life than what is on your lunch plate.

Examples:

A person (who has positive attitudes and motivation) following a 100%-raw diet as part of a healing program.
A person following a (nearly) 100%-raw diet with positive attitudes and motivation, in a non-obsessive way, because the diet works well for them.
A person following a 75-95% raw diet because it supports good health for them, and fits into their lifestyle.
A person following a 75-95% cooked diet for the same reasons as above.

Refining the Scope of Lunch-Mindfulness


Some readers will note that the definition of lunch-mindfulness above does not mention one's awareness of the effect diet has on the environment, or the claims of spiritual "superiority" made regarding certain diets. The exclusion of these is deliberate, for the reasons given below.

Lunch-mindfulness can be incorporated into one's spirituality (as in the examples above), but it becomes dysfunctional (hence is no longer lunch-mindfulness) if one begins to think their diet makes them superior to others. Further, spiritual views on diet vary significantly; what appeals to one might not appeal to others. For example, some Christian denominations recommend vegetarianism, while others (e.g., certain small fundamentalist groups) condemn vegetarianism. For widest generality, it is best to exclude specific spiritual criteria from the definition.

Certain advocates of veganism are quick to claim that their diet is more environmentally efficient than other diets, ignoring that their claim is logically dubious: it is based on a comparison of large-scale, non-sustainable production of plant vs. animal foods. The problem with such an approach is obvious: foods, whether plant or animal, should be produced in an environmentally sustainable way. If foods are produced in a sustainable manner, is the amount of resources used of great significance? (My view: it is not; in particular, others have the right to live--and to use the earth's resources for their food production, whether plant or animal.)

An additional complication here is that promoting environmental efficiency as a virtue puts one on the proverbial "slippery slope." In effect, the idea usually becomes that "it's good to use the earth's resources to make plant food for me, but it's bad to use the earth's resources to produce animal foods for you." Another way to state that argument: "I have my plant (vegan) food--you should eat vegan or starve!" The latter quote is not very compassionate, but then genuine compassion, in my opinion, is rare in the vegan movement--where compassion is often used as a weapon against those "murderous" meat-eaters (at least, that's what the "compassionate" extremists suggest meat-eaters are). Personally, I strongly disagree with the extremists.

Showing concern for the environment when choosing your diet can be laudable. However, allowing such concern to put one in the position of advocating that others cannot (or should not) use the earth's resources for their food, or condemning others because their lunch uses more of the earth's resources than your lunch uses, is not laudable (such attitudes are reprehensible, in my opinion). Condemning others for their use of the earth's resources for legitimate food production is a clear illustration of placing dietary dogma above the rights of others--it is a dysfunctional lunch-attitude, to be avoided.

DYSFUNCTIONAL ATTITUDES

Lunch-Righteousness


Lunch-righteousness is a delusion--a false sense of individual superiority (self-righteousness) that is based on the quality/type of one's diet (their lunch) versus the quality/type of others' diets. That is, one has a "better" diet than others, hence one believes the delusion that he or she is "superior" to others.

Typical Examples:
An ethical vegan who believes that he or she is "more compassionate" than a meat-eater.
A puritanical raw-fooder who believes that he or she is "purer" than those "poor souls" who eat "dead" (cooked) food.

Extreme Example:
Raw-food zealots who engage in some of the following behaviors: hostility, threatening others, plagiarism, intellectual dishonesty, and/or promoting the dietary equivalent of racism, because they are under the delusion that their 100%-raw vegan diet makes them "superior," and that the ends justify their negative means.

It should be mentioned here that some raw zealots promote lunch-righteousness by actively spreading the false myth that a 100%-raw vegan diet makes one "superior" to those who eat cooked foods. Not only is this false, but it is a form of bigotry.

In her book, To Eat Flesh They Are Willing, Are Their Spirits Weak? (1996, Pythagorean Publishers), philosopher Kristin Aaronson nicely summarizes the moral trap of dietary self-righteousness (p. 18):

No one who feels morally superior ever is, for the simple reason that [self-]righteousness is itself a moral taint. We may feel better, feeling that we are better; but the better we feel we are than others, the worse--and worse off--we will be. We can be corrupted by a good thing, by too much of a good thing, by taking a good thing much too seriously.

Note: material in brackets above [ ] is my own explanatory note.

438

Lunch-Identification
Lunch-identification occurs when one closely integrates their dietary philosophy (i.e., their lunch philosophy) into their self-identity. Here one strongly identifies with their lunch; in figurative terms, the lunch eats you, rather than you eating the lunch.

Typical Example:
A person with lunch-identification is easily upset when his/her diet is criticized, challenged, or questioned, because he/she interprets criticism of the diet as personal criticism.

Extreme Examples:
Zealots lashing out with hostility against all who dare to challenge their delusions about their supposedly "perfect, natural, ideal diet."
Raw zealots denying the humanity and/or existence of people with different diets--e.g., referring to them as "mutants," or saying that an existing race/group of people will go "extinct" because they don't have the "ideal" diet. Calling others "mutants" and implying that they are less than fully human is the language of bigotry--the same as practiced by hate groups. (See Note 1 at end--the word mutant can be used in a civil, non-hostile way.)

Lunch-identification and lunch-obsession (below) are two aspects of a newly described eating disorder: orthorexia nervosa. See the article Health Food Junkie, by Steven Bratman, M.D., reprinted from the October 1997 issue of Yoga Journal magazine, on this site for details.

Lunch-Obsession
Lunch-obsession occurs when one has a pathological obsession with the details of their diet (i.e., their lunch). Such an obsession may focus on the amount of food eaten (eating disorders: anorexia nervosa, bulimia), the type of food eaten (e.g., ethical vegan zealots), and/or the quality of the food consumed (eating disorder: orthorexia nervosa--dietary purists). Lunch-obsession occurs when one expends great effort in trying to conform to the strict discipline or requirements of an "ideal, perfect" diet. In the context of "ideal" diets, if lunch-obsession continues long enough, and one succeeds with the "ideal" diet, lunch-identification and lunch-righteousness typically develop as well. The result is an individual whose diet is stunningly dysfunctional to their mental health. In my opinion, certain raw-food zealots are prime examples of this phenomenon, and are severely mentally unbalanced.

Note that conforming to strict religious rules about diet is not necessarily lunch-obsession, as religions are (presumably) based on love, and love is not a pathological emotion. Also, one might need to pay considerable attention to food for a short period (the length of which will vary with the individual) after switching diets. Such attention is not considered lunch-obsession if the attention declines substantially after one gets habituated to the new diet.

Typical Examples:
Someone following a 100%-raw vegan diet for negative (pathological) reasons (i.e., fear of cooked food, mucus, protein) discovers that a "raw" dish just eaten actually included a cooked ingredient. The wanna-be 100% purist then feels "impure" and/or "sinful," and may seriously consider doing a fast as a form of penance. This is the kind of nonsense that can occur when one has the negative motivations promoted by dietary purists.


An ethical vegan eats a prepared food without checking each ingredient for the presence of (horrors!) animal products. The vegan learns after eating it that it contained a tiny amount of animal products, and feels guilty and impure.

Extreme Example:
Lunch-obsession can be considered an eating disorder (mental imbalance) when it has a significant negative impact on a person's life: orthorexia nervosa (see article mentioned above).

REFLECTIONS ON LUNCH-ATTITUDES
Remember that the motivation for presenting these definitions is to give us tools to examine our relationship with food. All of us likely have a blend of both functional and dysfunctional attitudes in our psyche. Our task, then, is to work to increase the functional and to decrease the dysfunctional.

Think back to when you first got into raw/living foods (or conventional veganism) and experienced a big improvement in your health. You were probably very enthusiastic then, and your personal attitude towards food was probably more dysfunctional than it is now. Over time, you have (or probably will) move to a more functional attitude. The point, of course, is that we can--and do--change.

As zealots are criticized here, it should be mentioned that zealots also have a choice: they can remain as examples of dysfunctional attitudes, or they can choose to change their ways and embrace a functional style. That is, it is not the goal of this paper to divide people into a good "us" (functional) vs. the "bad" (zealots and dysfunctional). [Side note: dividing people into "us" (good vegans) vs. "them" (bad non-vegans) is a characteristic feature of the vegan movement--unfortunately.] Instead of division, I encourage all (including the zealots) to discard their dysfunctional attitudes, and work toward a positive attitude, one that is guided by genuine compassion.

NOTES
1. Some raw writers use the term "mutant" to refer to abnormal elements in the body--e.g., mutant sugars, mutant enzymes. Such usage is perfectly legitimate and non-hostile. What is hostile is to say that another person is a mutant, and inferior, because their diet is different.

2. Thanks to those who reviewed earlier drafts of this paper for their suggestions, which have, hopefully, helped to make this paper more "mentally digestible."

LUNCH-ATTITUDES: SUMMARY OF TERMINOLOGY

FUNCTIONAL
Lunch-Mindfulness: A positive approach--cultivate!

DYSFUNCTIONAL
Lunch-Obsession: Consider counseling, and remember: the diet must serve you, not the other way around!
Lunch-Identification: Consider counseling, and repeat the affirmation: I am more than just my lunch; others are free to choose a different lunch. :-)
Lunch-Righteousness: Consider counseling, and do a self-analysis: can you eat your way to perfection, heaven, or enlightenment? How do you actually treat those with different diets?

I hope you found the material above to be interesting. Good luck with your diet and health! --Tom Billings

Satire.
The Morality of Human Omnivorousness
by Wardolfski
...who wishes to be protected from the wrath of the vegetarian community for reasons that will soon be apparent.

BROUGHT TO YOU AS A PUBLIC SERVICE MESSAGE BY THE OMNIVORAN COUNCIL FOR EQUAL-OPPORTUNITY EATING.

Copyright 1998 by Ward Nicholson. All rights reserved. Contact author for permission to republish.

Our semi-scriptural topic for the day is the golden quotation: "God grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference."
Hi out there everyone! It's a great day for drinking carrot juice, isn't it? :-)

You know, at first I was going to write a big, long, humdingin' philosophical treatise here, complete with some sobering riffs on the self-help saying for the day quoted above, the one the 12-Step programs use. Some academically impressive intellectual stuff about how rational it obviously is to eat meat when humans are born omnivores in the first place. And then I thought, golly gee, what's the use, Aunt Gertrude? If a person doesn't get it, they just don't get it.

With all due apologies to the 12-Step folks--who have a philosophy of human helplessness I don't really agree with much in the first place--still, I am happy to concede the quotation cited above is one of the least illogical things about their rap. (Hey, you sure ain't gonna tell me I'm a failure and always will be, and the only way out is to remain tied to the umbilical cord of 12-Step philosophy for the rest of my life. Sheesh.) In fact, I'd even go so far as to say the 12-Steppers' quote above is actually a darn pithy summation of what the spiritual life is really all about. All wrapped up in a nice little nutshell, too, that you won't forget if, say, you had way too much watermelon for breakfast, your stomach hurts, you have to pee real bad, and you're finding it hard to think straight from all that addictive blood sugar rushing to your head. I mean, talk about too much of a good thing. But then, finding a balance instead of going to extremes by having too much of any one thing (such as, fer instance, plants, plants, and even more plants) is not exactly something vegetarianism is really known for.

But back to that 12-Step saying. It's pretty simple: You accept what you are, you only worry about the things you can change, and you don't waste your time trying to change things that just are the way they are because... well, because they are that way naturally. Anyway, that's what the spiritual path is all about the way I see it.

Now of course, there are some people who continue to believe, in spite of the overwhelming prehistoric evidence to the contrary, that humans are really born vegetarians. (Actually, it's due to ignorance about it, I'd have to say, in all rancor. :-) ) Or that all of us should go stark raving veggie anyway, even if it ain't natural, just because our poor little ol' "inner child" can't stand it that some other critter should have to die so we can live. Like happens in many other places in the animal kingdom among critters we don't bother to take the time to condemn for just being themselves.

Well, bury your head in the sand if you want to. Or better yet, chuck it down a termite hole and let a chimpanzee fish it back out along with the tasty bugs he's tryin' to extract for his lunch. Either way, you're only fooling yourself. The human evolutionary past and resulting genetic programming that makes us natural omnivores adapted to eating a certain amount of meat (as well as murderin' innocent veggies for our dining pleasure too, while we're at it) continues to live with us.

And don't forget that omnivorism is a pretty democratic way to eat too, being the unbiased, equal-opportunity selection of that which we shall employ for our gustatory management objectives, whether that happens to be animals OR plants. That's how I heard Hulk Hogan put it once, anyway. :-) (And I gotta ask, what IS it that vegetarians have against plants, for that matter? They can be so MEAN to them sometimes with their juicers and Vita-Mixers and such. :-( )

So I say, go ahead and be a vegetarian if you want to. It's a free country, it's your choice, and it doesn't bother me. But don't get all hot 'n bothered if the rest of us would rather accept our true natures. (The appropriate Zen koan for you to ponder being: "What is the sound of one bone crunching?") The rest of us just wanna get on with living and trying to figure out how to make the world work better by accepting who we are in the first place instead of denying it, or trying to transcend it, when that may not be such a good idea--for our physical health, anyway. At least not my health, 'cause I've tried it both ways before, folks. (Can anybody say LOSS OF L-I-B-I-D-O?
:-) )

Now don't jump to conclusions on me, y'all vegetarians out there. I'm not saying it's wrong to be a vegetablearian, just like I don't think it's wrong to be a meatatarian either. And actually, ya wanna know the real truth? I just really don't think it's that much of a big moral deal in the first place. Kinda like standing there at the gas pump trying to decide whether to put regular or premium in your tank, and what the cost is gonna be today vs. last week or last month, and if it's worth it or not.

And by way of analogy, sure, when you're weighing the ecological costs, maybe the way things are getting to be these days, a vegetarian diet would use the planet's resources more efficiently. So much more efficiently, in fact, that we might possibly be able to cram 10-plus billion people onto the planet instead of just 5 or 6 billion. I mean, let's go for a new all-time record everybody, and crowd even more animal species off the planet than have already been bumped off by our modern "green revolution" that plows over half the planet's arable land, all the while destroying other animals' habitats while supplying us with plant crops. Crops that have always enabled humans to overpopulate and infest the planet like the rabbits that chew on my lawn greenery, and later get splatted flat in the middle of the road in front of my house by blunt rolling metallic projectiles. ...whoops, I guess I got a little carried away with the carnage and killing there, folks. Ahem, sorry... :-)

So anyway, I suppose you could get me to agree it might be a good idea for practical reasons for us to consume all sorts of vegetables and even some slimy algae or even plant a bunch of acres of that green-barley stuff (Hallelujah!) as a big part of what we eat. You know, do it all in the interest of going on a Diet For a Small Planet, and Lappe it up Moore to the best of our ability. But THEN you're ALSO gonna try to tell me that something like that which we're doing just as an expedient because we were forced to, or even just because we wanted so that we could be do-gooders with visions of saving the planet--somehow it all of a sudden becomes an act that awakens my true human compassion, turning me into a morally Good Person, and that I was a morally Bad one before? And that those who won't give up meat are morally benighted for doing what they are biologically designed to do, quite possibly in the best interest of their own health? Sorry, I DON'T THINK SO, Bubba!

And I'll admit there are some other good questions vegetarians ask, too. Like, gee, why don't we stop making chickens sit or stand around in their own poop in cramped little cages plopping out eggs a mile a minute like gumball dispensers, only to have their heads chopped off before their time? And who really wants to see baby cows locked into tiny stalls they barely fit into (so they can't move forward or backward or lie down at all) and force-feed them milk until they're slaughtered for veal? It IS cruel, even to an unrepentant carnivore like myself. Do you really think an omnivorous malcontent like me wouldn't rather see those animals live out their natural lives first before they get to become my dinner? :-) Gimme a break, pal. I mean, hey, I have a certain amount of compassion too. But all the same, it's a man-eat-chicken world out there, so remember, I'll still be there after they bite the dust to salvage the carcass anyway.

And think about it: What's gonna happen if you just let the carcass of a dead animal sit there and rot anyway, huh? I mean, SOMEBUDDY'S gotta eat the thing, right? Why should we be so different than the other omnivorous animals who would step right in to eat it if we didn't?

This vegetarian immorality and hubris of wanting to take it upon themselves to redesign the balance of nature knows no bounds. If vegetarians got to create creation in their own image and had their way, there'd probably be no carnivores or omnivores at all. Whenever an animal croaked, the carcass would just sit there rotting on its way to becoming fertilizer for plants, and stinking up the place. No more lions and tigers and bears (oh my!). No sirree! Just a Wizard-of-Oz fairytale existence where everything is Emerald-City green, populated with plants and squeaky, smiley, feel-good munchkins stunted by generations of eating nothing but green, green, and even more green. And all you'd have to do is click your ruby-slippered heels three times, and you'd be transported to the fantasy world of your dreams where nothing in nature ever kills anything else at all.
But back to the real world. Doesn't it seem just a bit strange that where killing for food is concerned, few condemn the revered Native Americans and all their past buffalo hunting, who are held in such high esteem that movies like "Dances with Wolves" portraying them have won Academy Awards, with even the vegetarians crying crocodile tears while eating their popcorn in the front rows of theaters everywhere? Somehow I guess the fact the Indians killed the buffalo with some sense of compassion about the poignancy of its (and their own) place in the overall scheme of things is an example of conscientious human behavior that really isn't so hard to conceive of after all, huh?

And speaking of compassion, what about giving our very own domesticated urban animals a chance at more natural lives? What about compassion for our feline carnivore friends, for instance? Seriously, in addition to the good things vegetarians do, there are also some pretty stupid things ya gotta admit they try to do too. Like trying to force their cats to become vegetarians, and even believing they have succeeded when good ol' Tiger seems fat and happy at the very same time the house is completely mouse- and rat-free without any traps or D-Con poison having been set around the house for months on end. DING DONG! HELLOO-O! Is there anybody HOME upstairs when Tiger is downstairs in the basement having fun ripping them mieces to pieces?! I mean, let's get REEALL, sister!

But hey, folks, we live in a crazy world, and these are crazy times. That's what happens when the world gets overpopulated with humans (even vegetarian ones). And there ain't no easy way out of it until we start thinning the human herd. People forget none of this would have ever happened if we'd stuck with hunting and gathering instead of inventing agriculture and overproducing plants and people both.

So next time you feel the urge to tell someone else they should become a vegetarian like you, quit passing around the same old plate of tired ethical hors d'oeuvres. We've all had our fill of all that, and it ain't satisfyin'. What most people will always be hungry for and secretly crave is something more accepting of basic human nature--not trying to deny themselves by remaking a species of omnivores into herbivores. Try sinking your moral teeth into meatier issues like that one, and don't miss the main course.

Now may I please be excused from the table?

Selective Myths of Raw Foods

Not a mere debunking piece, here's a plea for honesty and accuracy rather than myth-mongering in promoting alternative diets, and for putting practical results before dogma.

by Tom Billings
Copyright 1997 by Thomas E. Billings. All rights reserved. Contact author for permission to republish.

Special note to readers. This article was written in 1997, before the Beyond Veg site was even a
concept, and has changed very little since then. It is not up to current site standards, and we advise readers (below) that scientific discussions can be found in other site articles. This article remains on the site because it does provide a concise introduction to some of the numerous myths that are prevalent in rawism.

Something else you should know is that this article was written shortly after an acrimonious conflict with some quite hateful/dishonest fanatics. Because of this, the tone of the article is considerably sharper than that of other site articles (an apparent side-effect of the dispute). Please keep this in mind if you find that the tone is not pleasing, and do check out other site articles for a better view of the site's overall style.

Also, if some of the "myths" here seem extreme or bizarre to you, be aware that one can indeed find rawists preaching a number of the myths here as "eternal health truths" or as "true" science. A complete rewrite of this article is planned, but it may be some time before that is accomplished, as other articles have higher priority.

Finally, for a discussion of just how negative the behavior of some extremists can be, see Raw Vegan Extremist Behavior Patterns: The Darker Side of Rawism for actual examples of certain raw vegan diet guru behaviors.

What & Why

444

This article briefly lists, and briefly debunks, some of the more common myths found in the raw-foods movement. The presentation here is a summary; much more could be said regarding each myth.

(Scientific documentation for much of the information covered here can be found in other articles on Beyond Veg. This introductory treatment is intended to be a brief capsule presentation only.)

Indeed, for each myth, one can find raw-fooders who are emotionally attached to the myth, who will defend the myth long after it has been discredited. Readers are thus warned that they may find portions of this article to be challenging, or even disturbing.

The reasons for debunking these myths can be illustrated with an analogy. You have been given an old field in which you want to plant crops. The field is full of pernicious weeds. In order to plant your crops, you must dig out and dispose of the weeds. The field is your mind and lifestyle; the weeds are the myths that cloud your perception and make your experience in raw foods less desirable and less successful. The crops to be planted are the honest approach to raw foods--and to your life--that we seek to promote here.

In the following, Myth is abbreviated as M, Reality as R.

M: A raw-foods diet will give you perfect health.
M: A raw-foods diet will give you a perfect body.

R: The term "perfect health" cannot even be defined. Health has many aspects, and we cannot make it well-ordered in an objective way. That is, we cannot objectively say who is healthier: someone with a SAD diet, poor physical health but good mental health, or a fruitarian zealot with good physical health but who is hostile/mentally ill. We can only be subjective about such things, and I would say that a peaceful meat-eater is healthier than a hostile fruitarian zealot! (Why? Because it is easy to detox the body, hard to detox the mind.)

Of course, we cannot tell what a "perfect body" is, because no such body exists on the planet. Perfection is a theoretical construct, an unknown ideal. Personally, I would ignore anyone promoting the idea that rawism will make anything about you "perfect." (See also the section for the next myth.)

M: A raw-foods diet will cure any/all diseases, and/or prevent any/all diseases.
M: Animals in the wild never get sick because they eat a natural, raw diet.

R: Let us consider the latter myth first. That myth will be addressed further later; however, it was mentioned repeatedly in a recently published rawist book. For the record: Animals can and do succumb to disease; it is a major cause of mortality. Studying animal diseases is a major specialty area in biology. One can go to a good university library and find numerous books on the subject. Additionally, there is the simple fact that there is such a thing as "veterinary science," which would not exist if animals never got sick. The people who repeat this myth ad nauseam, in my opinion, are simply trying to get you to believe a lie by repeating it often enough. Some quick examples of animal diseases: hoof-and-mouth disease killing bison in the U.S. and wildebeest in Africa; bubonic plague in rodents; tick fever and Lyme disease in deer/antelope; and so on.

Now to the former myth: If animals living in the wild eating a natural diet die of disease, why should human rawists be any different? Our diet, even if raw, is "less natural" than the animals' (they eat wild foods, we eat cultivated foods). Also, there is the reality that raw-fooders can and do get sick. I have seen many rawists suffer from illness, and cases can be read in the Natural Health Many-to-Many and in some issues of the Health & Beyond newsletter.

M: You will live longer on a raw-foods diet.

R: Well, we certainly hope so! On a more serious note, where are the many (thousands) of long-time raw-fooders above the age of 100 who would be evidence that the claim might have some truth in it? They literally don't exist. The longest-lived societies on this planet are not raw-fooders; in fact, the longest-lived societies are not even vegan. There is a lesson here for the raw-foods and vegan egos, if we are open to receive it. Also, we have the example of some long-time raw-fooders passing away "before their time"--T.C. Fry and Herbert Shelton.

One factor that is more relevant than longevity (assuming some minimum longevity, i.e., that one survives to adulthood) is the quality of life. It is in improving the quality of one's life that raw diets can be very helpful.

M: A raw-foods diet will improve your mental health.

R: In the short run, improvements in mental health (or attitude) are sometimes noted. However, it is my opinion/observation that the 100% raw diet is correlated with serious mental problems in the long run. There are a few (very few) long-time 100% raw people who are mentally healthy. However, it is my opinion/experience that in the 100% raw category one finds: full-scale eating disorders and/or extensive eating-disorder behavior; unbelievably hateful zealots; lunatics; and people who appear to display lesser mental impairments such as emotional fragility (just challenge their diet and you may see this first-hand), juvenile behavior (denial of aging), extreme environmental views, turning their diet--or 100% raw--into a pseudo-religion, and so on.

The topic of why so few 100% raw people are mentally balanced has been discussed on the Raw-Food email list. It is my opinion/conclusion, based on personal experience and observation, that a raw-foods diet does not improve mental health by itself; rather, it simply brings your emotional/mental problems to the surface, where all the world can see them and you can work on them. This is a difficult (and unpleasant) experience for some. The bottom line here: If you follow 100% raw for a long time (years), I suggest that you take active steps to preserve your mental health.

M: Natural hygiene always works!

R: Sarcastic reply: Yes, and you also believe in the Easter bunny and Santa Claus? Seriously, those who believe this need a reality check. No system on this planet is perfect. Nature itself is imperfect--a source of considerable irritation for raw dogmatists, who try to explain nature in simplistic/idealistic ways. I would ask those who believe this myth to make up their minds: Is natural hygiene a science, or a religion? The proof that natural hygiene doesn't always work can be found in the pages of the Natural Health Many-to-Many, in the Health & Beyond newsletter (6/94 and 1/97 issues), and in the case of Herbert Shelton, who suffered from Parkinson's disease for 10 years before he passed away. Note also that most other health systems are humble enough to admit that some patients are incurable (often due to their attitudes).

M: Apes are fruitarians (or peaceful vegans).
M: Fruitarianism (or veganism) is our natural diet.

R: The above myths were debunked by Ward Nicholson in a superbly researched interview in Health & Beyond. Also, there are two email lists on the Internet that address this topic--the Paleodiet and Paleofood lists. The reality is that all the large primates are omnivores, as they all eat insects and some eat flesh. Chimpanzee society is marked by violence: war, incest, murder, cannibalism. So much for "peaceful vegan" chimps! (A more accurate description would be "occasionally violent, omnivorous chimps.") The fossil record clearly shows that our prehistoric ancestors were omnivores; they ate both plant and animal foods. They were not vegans, fruitarians, or even vegetarians.

P.S. (1) Some fruitarians are in denial regarding the above, and some try to counter it with misinformation. Be skeptical of counter-arguments, and look up/check all references cited in counter-arguments! (2) Some conventional vegans have integrity and admit that, biologically, humans are natural omnivores. See anatomist John McArdle, Ph.D.'s article "Humans are Omnivores," in The Vegan Handbook, for more info.

M: But mountain gorillas are vegans!

R: Not really; they have been observed deliberately eating driver ants, and they also consume considerable numbers of insects on the leaves that make up the major part of their diet. Given that many vegans reject honey, avoid foods colored with cochineal (a dye made of ground insects), and get all grossed-out by descriptions of humans consuming insects (a common practice in many hunter-gatherer societies), it is reasonable to say that an ape that deliberately eats ants is not a vegan, even if ants are a small part of its diet.

Reference: Watts, D.P. (1989) "Ant-eating behavior of mountain gorillas." Primates, vol. 30, no. 1, pp. 121-126.

M: The apes that eat meat are perverted.

R: Perversion is a human concept; nature simply IS, and wild animal behavior simply IS. We can accept reality, or live in denial of it. So, I would say that apes eating insects or meat are simply following their instincts. What is perverted here, in my opinion, is the obvious denial of reality practiced by certain fruitarians (but the apes are not perverted).

M: The apes that eat meat are acting in error; it is a mistake.

R: Applying human preconceptions to evaluate the actions of animals can be a tricky matter. There are many things in nature that one might consider an error. What criterion makes an action by an animal an error? The only readily apparent criterion is the primal urge of animals: survival. That is, an action that interferes with the survival of the subject animal is an error. In this light, we see that when an animal accidentally dies, say, in mating, it may be an error. However, apes eating insects, birds' eggs, or animal flesh are simply eating food--without which they might not survive. Hence, apes eating animal foods are not acting in error.

M: The only reason that chimps eat flesh is because of habitat loss.

R: Given that much of the primate research is done in remote, undeveloped wilderness areas, there is no evidence to support such a claim. Indeed, the evidence from remote wildernesses directly contradicts the claim (as the claim suggests that animal food consumption would occur only in habitats near humans--which is not the case), and shows that consumption of animal foods by chimps is natural. This is yet another example of denial of reality.

M: Only one "tribe" of chimps in the whole world eats meat. R: False, as even a quick glance at the primate research literature will show. Are those who make this
claim desperate?


M: Flesh-eating by our prehistoric ancestors was the exception rather than the rule.

R: Is there no limit to the denial of reality? Recent posts on the Raw-Food and Paleodiet email lists (complete with full reference sources from a prominent researcher on the latter) have cited evidence from modern hunter-gatherer societies showing that animal food consumption ranged from 20-90% of the diet, with an average of around 50%. So the evidence of modern hunter-gatherers does not support the view that flesh-eating was an exception. (See also Weston Price's book, Nutrition and Physical Degeneration.) The same data on modern hunter-gatherers, when combined with optimal foraging theory, yields a best estimate that the diet of our ancestors in prehistoric times was roughly 50% animal foods (wild game). Clearly, 50% of the diet is hardly an exception.

Side note: My objective here is to present reality, so that you are not misled by those who are in denial of it. I personally do not advocate, or practice, meat-eating. One can accept reality and still be a veggie! I want you to openly, honestly accept reality, and not be misled by phony models of nature or phony versions of "nature's laws." Also, a reminder to vegans: compassion without integrity (honesty) is false and hypocritical; it is not real compassion.

M: Mucus is toxic.
M: All disease is due to accumulation of toxemia/mucus, caused by eating the "wrong" foods.

R: Here mucus is made into a demon, without justification. Mucus is an essential bodily fluid; without mucus, your stomach acid would dissolve your stomach, and without mucus, your eyes would not function--you could not see. Of course, one can have too much mucus. However, what is happening here is that some people think that all mucus is toxic--a delusion.

Mucus is not the only bodily "toxin"; for example, the system of Ayurveda takes a broader view of toxins, which are known as ama. Ama can take different forms, e.g., vata--the toxic gases of flatulence in your colon (said gases can go into solution in your bloodstream and lymph); pitta--excess liver bile which, believe it or not, can escape your intestines and cause problems; and kapha--excess mucus, cholesterol, and similar "sludge." So claiming that all mucus is toxic, or that mucus is the only toxin, is a narrow/inaccurate view.

Two examples that disprove the claim that all diseases are due to toxins are: any/all deficiencies, such as vitamin B12 deficiency in raw-food vegans; and anorexia nervosa, the illness of excessive fasting. Additionally, most genuine holistic systems say that diseases are often caused by mental and spiritual factors. From that view, raw-foods diets that only remove the physical toxins are, like Western medicine, an allopathic system (i.e., the physical toxins are a symptom of the underlying mental/spiritual causes). Raw-food diet = allopathic system. Interesting!

M: All health problems you experience on a raw foods diet are due to detox of stored poisons.

R: This is a dangerous delusion, one that guides some people to real harm. If a disorder is caused by the side-effects of detoxification, then the disorder should get somewhat better as time goes by, as your detox process continues. If that does not happen, then the disorder is not connected with detox, and may be a deficiency or a real disease. Also, I would strongly advise anyone with serious health problems (especially acute problems) to consult (as soon as possible) a qualified health professional. Don't assume it is detox, and don't sacrifice your health/well-being on the altar of rawist dogma!

M: Fasting can cure any/all diseases.


R: Fasting can be a very powerful healing and cleansing tool. That said, it is not a cure-all. Fasting cannot cure anorexia nervosa, the disease of excessive fasting. Though fasting may initially normalize metabolic imbalances or aid in improving nutrient assimilation after a fast, it cannot cure chronic or long-term diseases of deficiency. (Some individuals with long experience in the vegan community--see natural hygiene practitioner Stanley Bass's interview in the 6/94 issue of Health & Beyond--believe deficiencies may constitute up to half the problems some individuals experience.)

Fasting can cause a psychological sense of deprivation, which can lead one to grossly overeat after a fast. This can create a cycle of fast-overeat, or starve-binge. If that becomes habitual, it is a form of bulimia. Fasting can also aggravate other eating disorders. Moderate fasting can increase your digestive fire--a blessing for some and a serious problem for others. Improper fasting can loosen body toxins, then drive them deeper into the tissues--a lose-lose situation. Fasting is a powerful medicine, to be taken in the right dose, as needed. However, please note that most (not all) people can do short fasts (one day) without any problems.

M: Fasting will make you spiritually purer.

R: Many world religions use fasting as a tool for spiritual advancement. The primary motivation for spiritual fasting is that it clears the mind somewhat, and may motivate or inspire people to do other spiritual practices. However, many spiritual leaders warn that the "spiritual" feelings one gets from fasting are temporary and illusory, and should not distract one from the real, serious spiritual practices. Real progress comes from those other practices, not fasting. You cannot fast--or eat--your way into heaven or enlightenment! Further, the religions that use fasting also warn against excessive fasting, which is considered "mortification of the flesh" and regarded as inappropriate behavior.

M: You should only eat those foods which you can gather with your bare hands, while naked.

R: With apologies to author Desmond Morris, I will refer to this as the "naked ape" hypothesis. The above myth is often repeated, with religious fervor, by some raw-fooders. Problem: this myth is a denial of reality, and a denial of our real nature. The very definition of human beings that evolutionary scientists use specifies, among other things, that we are intelligent tool-users. Those who promote this myth are denying the use of tools, and they are denying our intelligence; hence they are literally in denial of their real nature. To deny tool use is to make us lower than the chimpanzees (who have been observed using sticks as tools to collect and eat termites, a common food for them). It lowers modern humans to below the level of Australopithecus, one of our prehistoric ancestors, who was very ape-like.

Note that some may object to the above on the grounds that humans have taken tool usage too far. However, the use of simple tools for obtaining and processing food (such as stone tools) is well within the evolutionary definition. So this myth can be seen as an insult to humanity, or a denial of our humanity, in a sense. An additional problem: one can find rawists promoting this myth on computer networks. How would naked apes (with no tools) make computers?


The myth is often used to justify fruit as your natural diet. However, modern fruitarians don't eat wild fruit--they eat grafted, budded, cloned fruit. I'd like to see a naked ape, without tools, accomplish grafting or budding. (Of course, it cannot be done--hence the hypocrisy.) As one who has extensive experience in picking fruit, both wild and cultivated, I can attest that the idea of naked people picking fruit is both unrealistic and hilarious. I'd like to see naked people, without tools, successfully harvest a large blackberry patch that is full of thorns, wasps, and fire ants, with a healthy intergrowth of poison oak or poison ivy. Picking fruit is hard work, especially wild fruit, and requires protective clothing and tools if one wants to be efficient at it. Those without tools or clothing will pick very little fruit (and those who limited their diets to fruit picked under such conditions would quickly starve themselves into extinction, in evolutionary terms).

P.S. Those who state the above myth as some kind of revealed truth intend for it to suggest that you should eat only fruits and/or leaves (with a few nuts/seeds on occasion). However, stop and think critically about the statement. What can you gather, besides the above, with bare hands? It turns out that you can obtain a wide selection of animal foods that way:

- Birds' eggs and chicks.
- A wide variety of insects and worms (grubs, leafhoppers, ants and termites, etc.).
- Honey and bee brood.
- Small shellfish, e.g., freshwater crayfish (watch out for the claws!).
- Small clams, e.g., the coquina clam of Florida, whose shell is so thin it can easily be crushed by your teeth.
- Snails and slugs, including many marine snails (seashells).
- An occasional frog or reptile (snake/lizard).
- The young of many animal species (chimps catch them without tools--so can humans!).
- Fish in small ponds/mud puddles; fish that swim upstream to shallow water to breed (e.g., salmon); fish that breed at the shoreline (e.g., the Pacific grunion).
- Animals killed by other predators (scavenging; leopards leave their kill in trees--easy to climb up and steal it when the leopard is not around!).

M: Herbs are toxic and don't cure anything.

R: This is promoted by the "self-healing is the ONLY healing" folks; that myth will be discussed later. If herbs are toxic, why then do many animals use them when they are sick? A large variety of animals have been observed using herbs as medicines, including chimpanzees, elephants, and many types of carnivores and omnivores. It is interesting to note that some raw-fooders promote wheatgrass juice while condemning the use of herbs. That is quite surprising, for wheatgrass juice is a potent medicinal herb. Additionally, note the medicinal use of herbs by nearly all of the holistic health systems, and the historical use (thousands of years) by virtually all indigenous medical systems. Are the raw-fooders who condemn herbs really onto something, or do they simply fail to see the evidence in front of them?

M: The ONLY healing is self-healing.

R: This is a theological/philosophical question. One can argue that self-healing has inherent advantages that make it preferable in many cases. However, to claim it is the only method is to be narrow-minded, and to deny reality. For example, I am critical of fruitarian diets, but I acknowledge reality and agree that a fruitarian diet, in the short run (only), may assist healing from some ailments (but it is dangerous for some other ailments). [That such a diet can be healing in the short run is not the question--the question is the long-term problems of such a diet.] The point here is that there are many healing modalities, and they all have merits as well as negative points. For example, many rawists reject supplements, yet the supplement manufacturers have reams of testimonial letters saying how their product assisted healing. So if you favor self-healing, go ahead and promote it. However, don't insult others by claiming that your way is the only kind of healing.

M: Healing is a biological process.

R: This is promoted by the American Natural Hygiene Society (ANHS), and it is not a myth--it is true on the physical level. I would like to point out, though, that true healing means that the whole person--body, mind, and spirit--has healed. So, to be truly healthy, one cannot obsess over the body (as most rawists do), but must take steps to achieve mental and spiritual health as well. (The steps will vary somewhat according to your spiritual inclinations.) An example will illustrate this: I have encountered fruitarian zealots who loudly proclaim themselves healthy while actively promoting hate and fear--negative, UNhealthy emotions (i.e., the zealots are mentally/spiritually ill, in my opinion). What good is it to achieve excellent physical health if it comes at the expense of your mental health? So keep in mind that you need to pay attention to mental and spiritual health as well.

P.S. I respect the ANHS, and admire their efforts to update natural hygiene to reflect new knowledge.

M: Cooked food is poison.

R: It is true that some types of cooked food are not very good for you when consumed over a long period of time--fried foods, heavily salted foods, etc. However, cooked food does not merit the term "poison" in its normal usage. Even if we expand "poison" to a less rigorous definition, such as "those items one cannot digest," we still cannot say that all cooked food is poison.

Another complication: raw rhubarb and raw kidney beans are poisonous by any definition, and are more poisonous than any cooked food. If one argues that "cooked food is poison" simply means that some cooked foods, but not all, are poison, then by citing the example of rhubarb and kidney beans, one can say that "raw food is poison" using the same logic. The facts are that some starch foods, and some other foods, are easier to digest when cooked (discussed later), and some cooked foods, such as steamed vegetables, are not harmful. This may upset some rawists who seek to promote false, idealistic models of nature, but it is reality.

Additionally, one must wonder about the mental effect of such slogans. If one believes them and repeats them often enough, one may develop an (irrational) fear of cooked foods. As eating is a major part of life, this can infuse your eating--and your life--with fear. That is a slow but certain path to mental and emotional problems. I would encourage rawists to ignore such bogus slogans.

P.S. Apply simple common sense: if cooked food really were toxic, we would all have died long ago.

M: Raw is law.

R: A cute, but false and meaningless slogan. The idea that animals never eat cooked food is false: animals are killed and cooked by forest fires (also volcanoes, geysers, lightning strikes), and their cooked remains are quickly eaten by other animals--both carnivores and omnivores. Natural, wild animals will go to human-created landfills and eat their fill of cooked/processed/decaying food. Of course, landfills are not natural, but the animals that feed there are natural. This shows that animals are opportunists, not dogmatists. The animal at the landfill, following instinct, seizes the opportunity and eats the food (cooked or processed) that is available. No "raw is law" dogma for wild animals! If we must be so presumptuous as to claim that we understand nature's laws, then "opportunism is law" is much closer to the truth than the bogus "raw is law."

Additionally, an interesting scientific argument can be made (see Part 2 of the Paleolithic vs. Vegetarianism interviews, Fire and Cooking in Human Evolution, on this site for details) that we and our prehistoric ancestors have been using fire (for cooking foods) long enough that our genes have evolved to allow us to consume some cooked food. In other words, consumption of some cooked food may be natural, according to a powerful definition of natural: those foods you have evolved to eat. This point is controversial; some disagree. However, the serious debate on this point occurs at a scientific level far above the usual rawist dogma.

Note: Some rawists misinterpret the fossil record and claim that it shows we evolved as natural frugivores/vegans. However, as the fossil record does not support their viewpoint, such claims are not part of the serious scientific debate on this topic. Please note also that this is not a reference to, or criticism of, those who hold a traditional Christian view of creation.

P.S. Slogans are usually a poor basis for a diet.

M: A 100% raw vegan diet is the most natural, the best, diet for everyone.

R: Everyone is different, and diet must be individualized. There is no single diet that is "best" for everyone. Some people will do best on raw, others on macrobiotics, and so on. Those who promote the "one true diet" are promoting dogma rather than fact. Also, 100% raw diets are very problematic--100% raw can be a good healing diet, but it has problems as a long-term maintenance diet. (See the Troubleshooting Problems article for more info on this subject.) Quite frankly, I would question any "expert" who tells you that one specific diet is the best diet for everyone on this planet! Also, the claim to being natural is somewhat questionable, per the discussion of the myths preceding this one.

M: Cooking makes organic minerals inorganic.

R: This is, in general, false--simply nonsense. It was promoted by Herbert Shelton and T.C. Fry.

M: You should be a mono-eater (of fruit) because two different types of fruit never grow next to each other in nature.

R: There are reasons why a person might want to consider or experiment with mono-eating: it is easier on the digestion, may help you eat less, and may help acclimate you to raw foods. Instinctive eaters typically practice sequential mono-eating. However, the myth above is bogus; one wonders whether those promoting such nonsense have ever been in the woods. I grew up in Florida, and can attest that one can find multiple fruit trees growing adjacent to each other, all in fruit at once. Example: a wild mulberry, next to a wild guava, next to a naturalized (wild) lime tree--all three in fruit simultaneously. Also growing in the trees were wild grapes, and under the trees nearby, wild black nightshade (Solanum nigrum: edible when fully ripe) and wild bitter melon (Momordica charantia). In my opinion, those who promote such nonsense are simply demonstrating their ignorance of nature.

M: Fruit has a nutritional profile similar to mother's milk.


R: False. The protein content of both is low, but they do not match in other important areas:

- Fats: Milk is high in fat; fruit, except for avocados, is very low in fat.
- Sugar: Milk contains small amounts of lactose, a slowly assimilated sugar; fruit contains large amounts of glucose, fructose, and sucrose, and can cause an insulin spike (and hypoglycemic symptoms).
- Vitamins B-12, D, biotin: Present in milk, largely absent in fruit.
- Calcium: Plentiful in milk, low/scarce in fruit.

(For an extensively documented look at the many differences between fruit and mother's milk, plus the logical problems in equating the two, see the in-depth discussion, Fruit Is Not Like Mother's Milk.)
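To make the mismatch concrete, here is a minimal sketch comparing rough per-100 g figures. The numbers below are approximate, illustrative values of my own choosing (not data taken from that article), so treat the output as a ballpark comparison only:

```python
# Approximate per-100 g composition; illustrative values only, not figures
# from "Fruit Is Not Like Mother's Milk."
human_milk  = {"fat_g": 4.4, "sugar_g": 6.9,  "protein_g": 1.0, "calcium_mg": 32}
sweet_fruit = {"fat_g": 0.3, "sugar_g": 12.2, "protein_g": 1.1, "calcium_mg": 5}  # banana-like

for nutrient in human_milk:
    ratio = sweet_fruit[nutrient] / human_milk[nutrient]
    print(f"{nutrient:>10}: milk {human_milk[nutrient]:>5}, fruit {sweet_fruit[nutrient]:>5} "
          f"(fruit/milk ratio {ratio:.2f})")
```

Even on these rough numbers, fruit diverges sharply from milk everywhere except protein--which is the one similarity the myth rests on.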

M: All protein foods, including raw protein foods, are toxic.

R: According to the above, even sunflower seeds are toxic. Those who actually believe this delusion advocate a diet of fruit, with occasional vegetables. This myth is sometimes promoted with "proof": detailed nutritional theories that are impressive to the layperson, but utterly bogus and logically invalid when examined closely. A key part of the "proof" offered by those promoting such theories is that they went on a 100% fruit diet for months, then ate some seeds or nuts and could not digest them. They blame the protein in the seeds.

As a former long-time fruitarian who experienced the same thing, I can attest that what really happens is that following a fruit diet for a long time can weaken the digestive system. Then you eat protein food, which is relatively harder to digest than other foods. The obvious result: an upset stomach, maybe even the production of a small amount of mucus, which a few fruitarians see as "proof" that protein is an evil demon and toxic. Silly fruitarians! It's just weakened digestion--I went through this very same delusion myself, back in the 1970s.

Note that if one goes on a mono-diet of one type of food, the body becomes habituated to it. Then, when one adds a new type of food, it will take the body time to readjust to the new food--e.g., to secrete additional digestive fluids (stomach acid, liver bile) for the digestion of protein foods. One may also experience this effect after a long fast. Finally, experiencing the "new food" effect only once--because one immediately gives up the "toxic, mucus-producing" food--proves nothing.

P.S. If protein really were toxic, we would all have been dead long ago.

M: All raw foods are easier to digest than cooked foods, as the raw foods contain enzymes which are destroyed by cooking.

M: Starch is toxic.

R: Some cooked foods are easier to digest than raw foods. The starch foods--potatoes, rice--are prime examples. Heat degrades the crystalline structure of starch, making it more accessible to the enzyme action in your digestive system. Raw starch is hard to digest, but probably won't harm you unless you consume such foods in gross excess (difficult to do). Starch, whether cooked or raw, is not toxic. At least 70% of the world's population has a diet based on starch--cooked starch, no less. If it were truly toxic, there would be a lot fewer people on this planet!

Some foods contain antinutrients or toxins, and/or taste awful when raw, but are digestible/edible when cooked: large beans, especially kidney beans. Other raw foods have negative side effects, such as severe flatulence (e.g., raw cabbage, lentil sprouts). Cooking such foods is one way to reduce or avoid the side effects. (Other ways include using spices, and fermentation.)


So while many foods are best eaten raw, there are some that are difficult or impossible to eat raw. (P.S. Some types of rice can be sprouted and eaten raw, but sprouted rice is often very bitter and unpalatable.)

M: Spices are toxic.

R: Is everything toxic to the rawist? It is a shame that many rawists refuse to consider spices because of their ideology. Spices, used properly, can assist and strengthen weak digestion, and can help you digest the heavy, cold, rough, high-water-content foods that we rawists often eat. Spices also have real medicinal properties and uses. Used improperly, spices can cause problems: they can over-stimulate the digestion, and/or overheat the body. (If that happens to you, you will be the first to know.) If used in small amounts, properly (per your body condition), spices may be beneficial and assist healing. The challenge is how to use them properly--for that, you can refer to Ayurveda, traditional Chinese medicine, or other genuine holistic systems for guidance.

M: Don't drink juices--they are not a whole food.

R: True, juices are not a whole food. Also, one must be careful with juices, as it is easy to overconsume them due to their strong taste. For example, 1 kg of carrots will give you a liter or so of carrot juice. It is easy to drink a liter of juice, very hard to eat a full kg of raw carrots. However, juices have important therapeutic properties, and are used extensively in Ayurveda, in the Hippocrates diet, in the Gerson diet, and many other diets. Wheatgrass juice is famous, and juicing is the best, easiest way to consume wheatgrass. Also, wild chimpanzees practice a crude type of juicing, known as wadging. (Wadging consists of crushing foods with the palate, sucking out the juices, then spitting out the used wadge.) So juices are "natural" after all, even if we use an electric juicer (instead of wadging) to extract them. To summarize: juices, in moderation, can be a beneficial part of a good raw diet. One should not be afraid of juices.
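The carrot example is easy to quantify. A minimal sketch, assuming ~1 kg of carrots per liter of juice and a rough sugar content of ~4.7 g per 100 g of raw carrot (both illustrative figures, not measurements):

```python
# Why juice is easy to overconsume: one liter concentrates a kilogram's sugar.
sugar_per_100g = 4.7        # g sugar per 100 g raw carrot (approximate)
carrots_juiced_g = 1000     # ~1 kg of carrots yields ~1 liter of juice

juice_sugar_g = carrots_juiced_g / 100 * sugar_per_100g
print(f"One liter of carrot juice: ~{juice_sugar_g:.0f} g of sugar,")
print("swallowed in minutes, without the fiber that slows down both")
print("the eating and the absorption when the carrots are chewed whole.")
```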

M: Don't drink water. Your food should contain all the water you need.

R: What were the people who dreamed this up drinking? Our close primate relatives, the chimpanzees, drink water. Most land mammals drink water; those that do not are the exception, rather than the rule. Refusing to drink water may be the reason some long-time rawists look so dehydrated.

P.S. Some rawists, due to the water content of their diet, may get by on less water than someone on a conventional diet. Still, some water is advisable.

As this article may be controversial, let me briefly address some likely criticisms:

"This is too negative." Well, if raw-foods diets are so good, then why do you apparently favor
promoting them using inaccurate/false information? Their promotion should be honest, and in step with reality. Those who cling to discredited dogma, in my opinion, are in denial--a common, and serious, problem in rawism. "You're promoting doubt." Not really; I am simply presenting reality. If this article shows you that some of the "wisdom" of rawism is inaccurate, misleading dogma, then it has done you a favor. If you are blinded by dogma, then you may ignore symptoms of deficiency and/or illness, and hurt yourself. I did just that, years ago, and have seen others do the same.


"This seems anti-vegetarian." It is not intended to be; I have been a vegetarian since 1970,
and was a strict vegan for much of that time. The material here does challenge some of the claims that are a "scientific" basis for veganism/vegetarianism. However, the spiritual and ethical factors used as a basis for vegetarianism are not addressed here. Some of these factors include: not wanting to kill for food, not wanting to harm other creatures unnecessarily, wanting your food to be offered to you with love, and so on. The spiritual and ethical factors, alone, provide a sufficient and satisfactory basis to be a vegetarian. So, you can reject some/all of the "scientific" basis, but still be a vegetarian. Also, if your motive for vegetarianism is spirituality or ethics, you will want an honest basis. "This won't help me achieve 100% raw." Why do you want to be 100% raw? Answer: probably because you believe that it will make you very healthy. So if good health is your objective, and you are logical, you will follow rawist dogma ONLY insofar as it supports good health. If 100% raw does not work for you in the long run, then you will change your diet. In short, your health is more important than dogma, more important than being 100% raw, more important than veganism! I was 100% raw for several years. When one goes 100% raw, there is often a noticeable improvement in physical health in the short run. But, in the long run, problems are common (verified by my experience and observation). My Troubleshooting Problems article is an honest effort to educate others about such problems and ways to overcome them. So in the long run, one learns (the hard way, if one is dogmatic) that rawism is not a guarantee of health, and that turning simplistic dietary dogma like rawism into a religion (with 100% raw considered to be some bizarre kind of "holy sacrament") is a very bad idea indeed. Rawism is a tool, to be used to improve your health. Keep it in its place--as a support tool only. Never let rawist dogma rule your life. Your diet must serve you, not the other way around!

In closing, let me make a few comments:

As mentioned in the beginning, some raw-fooders are very attached to certain of these myths. If I have made any small errors in the above, I hope you realize that it was not deliberate, and also that it does not detract from the overall points made. Writing the above was a lot of work, and it may cause zealots to attack me further. Ignore any personal attacks on me, and concentrate on the subject matter. The zealots are in denial, and I want you to accept reality as it is--even when it disagrees with rawist ideology.

Let me quote, from memory (it might not be 100% word-for-word accurate), a beautiful aphorism by Baba Hari Dass: "Life is not a burden, but we make it one when we refuse to accept things as they are." Discard the burden of false dogma, accept reality, and accept life and nature as they are: that is the essence of my message. I hope that you will take this message to heart, and put it into practice in your life.

I hope some of the above was interesting to you. I wish you good health, and good luck with your diet!
--Tom Billings


Troubleshooting: Avoiding and Overcoming Problems in Raw & Living-Foods Diets


by Tom Billings
Copyright 1997 by Thomas E. Billings. All rights reserved. Contact the author for permission to republish.

Notes for a talk held at the SF-LiFE Expo on June 1, 1997, and updated for website release in November 1997. SF-LiFE is the San Francisco Living Foods Enthusiasts, the oldest and largest raw/living-foods support group in the U.S. For information on SF-LiFE, check http://www.living-foods.com/sflife.

Thanks to Dorleen Tong, Ward Nicholson, and a reviewer who wishes to remain anonymous, for valuable comments on this paper; the present version incorporates some of those comments.

Note: The medical disclaimer on the Beyond Veg home page applies to this article, as it does to any other article here discussing problems and potential solutions to health conditions.

Preface
The reality of raw diets is that many try such diets, but very few succeed on them in the long run. Problems are common on raw diets--far more common than some of the idealists promoting such diets care to admit. The purpose of this paper is to list some of the more common problems that can occur on raw diets, and to present a list of possible solutions for you to consider. The approach here is a frank and honest one. Because of this, some readers may find that portions of the material here do not agree with their dietary philosophy.

The material here reflects my personal experience with raw-foods diets (since the early 1970s), my observations of, and discussions with, other rawists during that time, and my personal biases. Additionally, although the list below is lengthy, I do not claim that it is complete or perfect. Further, readers should be aware that the author (that's me) does not claim to be perfect.


The preceding claim should be obvious, so why am I explicit about it? Reason: because some people put the dietary "experts" on pedestals and effectively worship them (e.g., those who worship Herbert Shelton or T.C. Fry, for instance). I don't want anyone to do that with me--I want people to think for themselves, to ask questions, and to actively search for the truth. Glorifying "experts" leads some people to just "ask the expert" when, instead, they would be best served by their own experimentation and independent thinking. (There is nothing wrong, per se, with "asking the expert"; my point here is that you should apply common sense to all "expert opinion" and take personal responsibility for your own actions and health. Common sense also provides an excellent defense against the utter nonsense promoted by many raw-food extremists.)

I hope this paper is interesting, and maybe even helpful, to you!

Physical Problems That Can Occur on Any Raw-Food Diet


Detox symptoms
(Examples: headache, nausea, upset stomach, diarrhea, acne/rashes, strong body odor.)

- These symptoms are usually short-term.
- If the symptoms are due to detox, you can slow down the detox by increasing the percentage of cooked food in your diet, and/or by eating "heavier" raw foods like avocados, coconut, or nuts.
- Don't let dogma override your common sense--some of the above symptoms may have a serious underlying cause if they persist (i.e., they might not be "detox" symptoms). Consult a qualified health professional if serious symptoms persist!

I'm always hungry, even though I overeat! (gluttony)


- Psychological causes may be a factor--try meditation or other spiritual practices.
- Eat mindfully, in a non-stressful environment. Foods eaten in a hurry or under stress will not satisfy your physical or psychological hunger.
- If you have a very restricted diet (like fruitarianism), expand your diet. Sprouts, eaten when short (root shoot the length of the soaked seed), can be very filling. Experiment with fancy recipes, and also raw dairy (if available, and you have no philosophical objections).
- Consider that your body may be telling you something--that it needs more calories, particularly from the foods avoided by many rawists: those high in fat, protein, and other concentrated foods. (For more in-depth information on this point, see The Calorie Paradox of Raw Veganism.)
- See also the section on sugar addiction (below).

I have cravings for "undesirable" foods. How can I resist?


- May be a symptom (result) of detox of those foods.
- The suggestions for the item above on continual hunger apply.
- Substitute raw foods for cravings:
  o Sugar: dried fruit/dates, very sweet fruits (mangos), carrot juice, comb honey (use sparingly). Caution re: sugar addiction.
  o Salty foods: seaweeds, tomatoes, celery juice, or for instinctos: RAF.
  o Fatty foods: avocados, coconut, soaked nuts, raw sesame tahini, raw dairy, or for instinctos: RAF.
- Avoid temptations to the extent possible.


- A psychological tool: consider the consequences (how bad you will feel physically, the aftereffects) if you do eat the "undesirable" food. You may want to make written notes on the bad effects, to reinforce the memory thereof, as it is very easy to forget when faced by temptation.
- If cravings persist in the long term, consider that your body may be telling you that the "undesirable" food contains nutrients that are missing in your diet. Your health is more important than dogma--you may actually need the "undesirable" food!

I'm underweight and emaciated. How can I gain weight? Help!


- Primarily a concern for men, as our society foolishly considers extremely thin/underweight women to be more "beautiful" (?!).
- Regular strenuous exercise, such as working with light weights, or Ashtanga-style yoga, can be very helpful in gaining and maintaining weight (Viktoras Kulvinskas suggested weightlifting in an interview in the "Eat It Raw!" newsletter). Strenuous exercise may stimulate weight gain by promoting muscle growth.
- Expand the scope of your diet--eat heavier raw foods like nuts, avocados, sprouted sesame. Raw goat milk will put weight on quickly, if you are willing to drink it.
- Note to fruitarians: it is extremely difficult to gain weight on a fruitarian diet, even if you overeat avocados! (Also see The Calorie Paradox of Raw Veganism for information about how to get more concentrated-calorie foods into your diet, and the sketch after this list.)
- Some raw-food authors recommend fasting to increase the digestive fire, and indirectly increase food assimilation (i.e., once eating recommences). This is really not a good idea when you are emaciated--for alternatives, see "Increasing Digestive Fire" below.
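A back-of-the-envelope sketch of why weight gain on fruit alone is such an uphill battle. The calorie densities below are rough, illustrative figures (kcal per kg) of my own choosing, not values from the Calorie Paradox article:

```python
# Kilograms of food needed to reach a 2,500 kcal day, by calorie density.
# Densities are approximate, illustrative figures (kcal per kg).
foods_kcal_per_kg = {
    "sweet fruit (apple-like)": 520,
    "avocado": 1600,
    "nuts (almond-like)": 5800,
}
target_kcal = 2500

for food, density in foods_kcal_per_kg.items():
    print(f"{food:<25}: {target_kcal / density:.1f} kg per day")
```

Nearly five kilograms of sweet fruit a day, versus well under half a kilogram of nuts: the sheer bulk required is what makes gaining weight on fruit so difficult.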

I'm always cold!


- This usually passes in time (unless your weight drops below a normal level); indeed, some long-time raw-fooders develop the opposite problem: too hot!
- Indicates low liver bile output and sluggish digestion.
- Fasting is not recommended here--if you are cold simply because you are emaciated, fasting will not solve the problem!
- Vigorous physical exercise (fast walking, rebounding, etc.), for 20 minutes or more, may help warm you up--see the "Increasing Digestive Fire" section below.
- Daily warm oil massage (using unrefined, crude sesame oil), Ayurvedic-style, can be very helpful.
- Gain weight (per above).

The food that I eat passes through me and looks the same when it comes out as it did when it went in! What's going on here?
- May be a result of overeating; try reducing food intake.
- May indicate that you are not properly chewing your food; eat more slowly, and practice meditative/mindful eating.
- If neither of the above, this often indicates sluggish, weak digestion; common in fruitarianism. See the "Increasing Digestive Fire" section below.

I'm frequently very weak/fatigued.


- If short-term, usually a detox symptom.
- Are you getting enough sleep? Eating enough calories? Are you under heavy stress, or suffering emotional trauma? Indulging in excessive sex? Using recreational, natural drugs like marijuana? All of these can be factors in fatigue. Evaluate your life as objectively as possible, and try to remove or reduce any potential causal factors. (Caution: avoidance of sex out of excessive or irrational fears of energy depletion, when one is otherwise normally interested in sex, can be a problem on raw-food diets--do not invite mental problems by making a fetish of avoidance.)
- Can be caused by sugar addiction (common in fruitarianism).


- Carrot juice may be a useful short-term solution, but it can lead to sugar addiction--be careful!
- Raw dairy may be helpful, or for instinctos: RAF. Though to long-time vegetarians it may appear a false stereotype, adding animal foods to the diet often does increase strength, and may eliminate fatigue.
- Long-term, fatigue may be a symptom of B-12 deficiency (or something much more serious). Consult a qualified health professional.

Vitamin B-12 deficiency


- Algae, chlorella, and spirulina are not reliable sources. (They contain "analogue" forms of B-12 which recent research has shown the body cannot use, and which can block/crowd out uptake of the needed form.) They are also expensive, and some are sold by the inefficient, unpopular method of multilevel marketing.
- Supplements are the best source for vegans: reliable, cheap.
- Supplements are not required if you eat raw dairy or are an instincto.
- A blood test for B-12 is available--see your physician.
- Symptoms of B-12 deficiency usually appear very slowly, but do not ignore them if they appear: tingling in the extremities is a cardinal sign, and unusual or persistent fatigue/lassitude is an additional potential sign.
- High sugar consumption (fruitarians take note) can accelerate and aggravate B-vitamin deficiency.
- Some raw-fooders are also at risk for vitamin D, zinc, and, in a few cases, calcium deficiency.

Animals don't brush their teeth, why should I?


- Because if you don't, your dentist will eventually make lots of money repairing the damage you will cause by not brushing your teeth.
- Modern fruits are higher in sugars than their wild counterparts (see the Wild vs. Cultivated Fruit table). With a diet high in such fruit sugars, you are asking for teeth problems if you do not brush.
- Diets high in starches and carbohydrates generally can also induce tooth decay if you do not brush. This includes foods like carrot juice, which is high in sugar--drinking carrots as juice subjects the teeth to more concentrated sugars than otherwise. If you supplement (or perhaps binge-eat) with foods such as grains, potatoes, etc., without brushing, dental caries can also follow. (See Longevity and Health in Ancient Paleolithic vs. Neolithic Peoples for how bone and teeth problems followed soon upon the introduction of an agricultural diet based on grains, as seen in archaeological skeletons.)
- This sentiment is sometimes motivated by a fear or dislike of commercial toothpastes. Natural alternatives are available, ranging from salt/baking soda to astringent herbal toothpowders (some of which are excellent).

I'm hyper sometimes and can't sleep.


- Although many raw-fooders are mellow people, the diet provides substantial energy, which must be burned in some way. Meditation and hatha yoga can be very helpful here. Regular exercise can be quite helpful.
- Slow, deep breathing, and simply concentrating on the breath, is very calming.
- Evaluate your life for stress factors--reduce stress.
- If due to hunger, see also the "I'm always hungry" section (above).
- Herbs with a sedative action are available, including the commonly available chamomile tea. (Many other herbs are available; see a holistic health professional for a recommendation.)
- Long-term 100% raw, restricted diets (e.g., fruitarianism) may place one at risk of nervous system disorders, which can disrupt sleep.
- Finally, if you are willing to consider it, animal foods reportedly have helped those who have trouble sleeping when the above remedies failed, and sometimes improve the quality of sleep.
- If the problem persists, see a qualified health professional.


Little or no sex drive


- Not seen as a problem by many; seen by some as spiritual progress, and/or as allowing you to relate to others in a loving, yet non-sexual way. Others see this as a serious problem indeed!
- To increase sex drive, you can try (no guarantee they will work): increasing the percentage of cooked food in the diet, or eating onions/garlic (though the odor may drive away people who have a good sense of smell!). Some have found that increasing the amount of sea vegetables in the diet can be helpful here.
- May be a sign of zinc deficiency; broaden the diet to get more zinc, or try supplements.
- May be due to EPA/EFA deficiency. EPA/EFA are found primarily in animal foods; some is available in flaxseed and, surprisingly, in the common weed purslane (Portulaca oleracea).
- Certain herbs may increase the sex drive. (Note: the claim is that they may impact the "sexual mind," not that they are "aphrodisiacs.")

Supplement: Increasing Digestive Fire


- Hatha yoga is the best exercise system for this. No conventional exercise can compare to nauli kriya, and many other yoga exercises are superb for increasing digestive fire: peacock pose, agni sara dhauti, uddhiyana bandha, agni sara pranayama, kapalabhati, bhastrika, etc.
- Some raw-fooders report that distance running, cycling, swimming, or other intensely aerobic exercise not only sharply increases their appetite, but also improves their digestion.
- Spices: ginger, cayenne, black pepper, and others. Used carefully and correctly, these may be beneficial. The dogma of some raw-fooders prevents them from even considering these.
- Pungent greens, like mustard, watercress, and arugula, are alternatives to pungent spices.
- Some foods increase digestive fire: onions, garlic. Some raw-fooders have religious objections (and other objections) to these.
- Daily Ayurvedic oil self-massage, with clockwise massage of the stomach area, can increase digestive fire over time (sounds strange, but it really works!).
- Fasting, if not contraindicated by conditions (emaciation, for one), can increase the digestive fire.
- Try reducing the number of meals per day. This allows your digestive system to rest longer each day. Caution: in some people this may work too well, and may aggravate the digestive fire.
- Tonic herbs: the Ayurvedic herbal blend triphala churna (three fruits, dried and ground up) strengthens the entire digestive system, and is extremely good for the colon.

Physical Problems Associated with Specific Dietary Styles


FRUITARIANISM AND NATURAL HYGIENE

Dental problems: severe erosion of tooth enamel

- Caused by consumption of excessive amounts of acidic fruit like citrus, pineapples, kiwi.
- May be caused by acid reflux, due to overconsumption of sweet fruit, especially dried fruit/dates; common in sugar addiction.
- Very similar to the dental damage encountered in bulimia (an eating disorder).
- If the damage is already done, see your dentist for restorative work: bondings, veneers, crowns, etc.


- Prevention is better than repair! Limit acid fruit; always brush with baking soda and floss after eating acid fruit; do something about acid reflux/sugar addiction.
- Have your teeth checked by your dentist for enamel erosion.

Sugar addiction: sugar highs, and sugar blues


- A serious problem in fruitarianism, as modern hybrid fruit contains UNnaturally high levels of sugar, and some fruitarians unknowingly or ignorantly claim that 100% fruit (hybrid sweet fruit, not the sour, fibrous, wild fruit!) is the "ideal" diet.
- A physical addiction, but one with mental effects: highs like a stimulant drug, followed by depression, with accompanying psychological dependency.
- Excess sugar can produce symptoms similar to those found in diabetes and hypoglycemia: excess urination and thirst, mood swings, fatigue, temporarily blurred vision, pains in the extremities, etc. (It is a good idea to see a health professional if these symptoms persist.)
- The traditional antidote for excess sugar is to eat bitter foods: bitter greens (endive, escarole, dandelion, wild greens like the thistle family), bitter melon, turmeric, or bitter herbs. Eat some of these regularly--daily.
- See an Ayurvedic practitioner, an OMD (Oriental medical doctor trained in traditional Chinese medicine), or a Western herbalist, and get an herbal anti-diabetes program.
- Overeating of sugar is clear evidence that your diet is not psychologically satisfying in the long run. Make appropriate dietary changes, and address the underlying psychological factors via mindful eating and meditation/spiritual practice.

I was a fruitarian for year(s), and now my digestive system is extremely weak; I eat food but have real difficulty digesting it. Help!
An alternate but equivalent form of this question: I was a fruitarian for year(s), and now I find that I cannot digest raw protein foods like seeds and nuts. Help!

- A very common complaint among former fruitarians.
- See the "Increasing Digestive Fire" section (above); hatha yoga can be very helpful in this situation.
- Use of tonic herbs is suggested here; the Ayurvedic herbal blend triphala churna is a truly excellent tonic.

Mental problems
- Common, as former fruitarians openly admit. Those who are currently fruitarians are often less candid.
- Natural Hygiene is subject to many of the same problems--see the "Mental Problems on Raw-Food Diets" section below.

LIVING FOODS

Flatulence (gas)--from sprouts or certain vegetables, especially the cabbage family

- See the section on "Increasing Digestive Fire" (above).
- Mono-eating and food-combining are options for experimentation.
- Eat with an oily dressing or food--such as avocados or tahini dressing; this slows down the passage of food through the digestive system.
- For sprouts: turmeric and ginger are a great help in digesting protein foods.
- For vegetables: fermentation (e.g., sauerkraut), and marination in lime/lemon juice (which may include spices also). Dental caution re: lime/lemon.
- If all else fails, eliminate the offending foods from your diet.


Watermelon Juice Fasts Are Not For Everyone.


- Some living-foods books advocate watermelon juice fasts as a cleansing tool.
- Ayurveda and TCM (traditional Chinese medicine) say watermelon juice is contraindicated in: hypoglycemia and diabetes (disorders characterized by excess urination and problems with sugar metabolism), and congestive lung disorders (watermelon is cooling, and may increase mucus). Note: some living-fooders suggest fasting on juice of the watermelon rind (the green part, not the sweet red part) in hypoglycemia/diabetes. As the green rind may be an antidote for the sugar in the flesh, fasting on the rind juice might be okay then. (Warning: try at your own risk.)
- Ayurveda says watermelon increases intraocular pressure--the pressure in your eyes. This suggests that watermelon fasting is not appropriate if one has glaucoma, optic nerve disorders, detached retina, or other serious eye problems (consult your ophthalmologist). Obviously, your eyesight is too valuable to put at risk! If you have eye disorders and want to fast, fast on water or some other type of juice, not watermelon juice.
- Some advocates of watermelon juice "fasting" suggest consuming an entire large watermelon in a day. While this can be cleansing to the kidneys, it may provide more sugar than some people can tolerate. One can argue that eating a whole melon in one day is gluttony rather than fasting. An alternative is to limit the amount of melon used--say, to 3 kg (1 kg per main meal)--drinking good-quality water (instead of watermelon juice) between juice meals. (Rough arithmetic in the sketch below.)
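A minimal sketch of the sugar load behind that last bullet, assuming watermelon flesh carries about 6 g of sugar per 100 g and that a large melon yields roughly 6 kg of flesh (both approximate, illustrative figures):

```python
# Daily sugar load of a watermelon juice "fast" at two consumption levels.
sugar_per_100g = 6.0   # approximate g sugar per 100 g watermelon flesh

for label, flesh_g in [("whole large melon", 6000), ("3 kg limit", 3000)]:
    sugar_g = flesh_g / 100 * sugar_per_100g
    print(f"{label:<18}: ~{sugar_g:.0f} g of sugar in a day")
```

On these rough figures, the 3 kg limit halves the daily sugar load of eating a whole large melon.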

FASTING

- Avoid excess fasting--generally, fasting is NOT necessary to receive the positive results of a raw-food diet.
- Fasting may create a psychological feeling of deprivation--leading to overeating/binge-eating after the fast. The result is a yo-yo process of fasting followed by gluttony. If this is your reaction to fasting, it is not helping you in the long run. (If the yo-yo eating persists, it is bulimia.)
- Fasting can make backsliding or cheating easy: "It's okay to eat this junk; I'll get rid of it by fasting." (Not good for you if it becomes a habit.)
- A certain amount of fasting can reset the body's metabolism, making it very difficult to lose weight and easy to gain weight. Grossly excessive fasting, as in anorexia nervosa, can have the opposite physical effect. (The latter effect can also occur after years on a high-percentage fruit diet.)
- Some instinctos report that a short-term (2-3 days maximum) mono-diet of cassia pods is highly effective at promoting detox, and may be an alternative to (or supplement for) conventional juice/water fasting.
- Moderate fasting can increase your digestive fire. In some people this effect is helpful; in others it may be harmful. Fasting is strong "medicine"--take it in the "right dose," when really needed.
- Excess fasting weakens the mind-body connection, and can lead to serious mental problems, particularly ego problems. (Following a high-percentage fruit diet for a long time may produce similar problems.)

Mental Problems on Raw-Food Diets


Personal experience and observation: Mental problems are more common in fruitarianism and natural hygiene, less common in living foods and instinctive eating.


Behaviors Found in Eating Disorders (Anorexia, Bulimia) and Raw Foods


- Lying about eating (common in fruitarianism).
- Obsession with food (fueled by constant hunger/sugar addiction).
- Backsliding ("cheating"), eating in secret, binges.
- Feeling guilty about eating.
- Perfectionist attitude, and the resultant poor self-image.
- Eating becomes a stressful act. (Anorectic: Will it make me fat? Raw-fooder: Will it produce mucus? Will it constipate me? Did I combine foods correctly?)
- Acceptance of severe underweight because of delusions/dogma. (Anorectic: I'm FAT! Raw-fooder: The lost weight is mucus/toxins; glad it's gone!)
- Fruitarianism can produce a physical and mental feeling of lightness. Some fruitarians mistake this for a "spiritual" feeling--it usually is not; it is comparable to a mild drug high. This mental feeling may really be a symptom of zinc deficiency. Also, the misinterpretation of this feeling may lead a few down the path to narrow-minded zealotry and hostile behavior.

The types of behavior described above reflect a very UNhealthy emotional and mental relationship to food!

The WRONG Motives for a Raw Food Diet

Common but inappropriate motives:

- Fear of mucus--obsession with toxemia.
- Hatred of protein foods, including raw protein foods.
- Hatred/fear of cooked foods and those who eat cooked foods.

Unfortunately, there are people in the raw-foods movement who promote the above as reasons to have a raw-food diet. (Shame on them!) Hatred and fear are not a healthy basis for a diet; they are not healthy emotions. To have hatred and fear as motivations for your diet is to "eat" hatred and fear daily, and to put them at the very center of your life--a very bad idea!

Other inappropriate motives:


- You think it makes you morally superior ("more compassionate") to those who eat meat or cooked/processed foods (ego).
- You think it is 100% guaranteed to bring you perfect health (there are no guarantees in life).
- You think you can eat your way "into heaven," that eating raw foods will make you enlightened, or will somehow make you spiritually superior. Eating must be subordinate to spirituality; it can be a part of, and support, your spirituality, but should not dominate it.
- You read book(s) by raw-food "expert(s)," and you think the expert(s) know everything there is to know about health. (They don't, and some of the so-called experts are frauds.) An example of this would be those people who view the writings of Herbert Shelton or T.C. Fry as being, in effect, "holy scriptures." Side note: I am not suggesting Shelton was a fraud; as for Fry--no comment.
- It is a cover for your eating disorder(s).

The Terrible Mental Poison Called Zealotry


- Fear of mucus; blaming all sickness on mucus.
- The intellectual dishonesty of blaming protein foods for most health problems. Fear/hatred of protein foods, including raw protein foods.
- The intellectual dishonesty of blaming cooked foods for all the world's problems. Fear/hatred of cooked foods and those who eat cooked foods.


- Dietary philosophy becomes a de facto religion: "My diet is the ONLY truth; all other diets are wrong; even other raw-food diets are wrong!" (Alas, such stupidity is far too common in the raw-foods movement.)
- "You MUST adopt a raw-food diet or the world will come to an end!" ("Chicken Little" environmentalist syndrome.)

To Avoid Mental Problems


- Have positive motives for your diet. Avoid the poisons of obsession, hate, fear, ego, and zealotry.
- Raw/living foods: an excellent diet, but a terrible religion. Don't make it your religion! It can be incorporated into your spirituality if you wish, but don't make it more important than your spiritual path.
- It is better to be a cooked-food eater with a healthy emotional/mental relationship to food than to be a raw-fooder who has an UNhealthy emotional/mental relationship to food. However, best of all is to be a raw-fooder who has a healthy emotional/mental relationship with food.

--Tom Billings

Appendix: References of Possible Interest

Hatha yoga resources


Some of the hatha yoga practices listed under "Increasing Digestive Fire" are advanced practices which one must work into over time. It is best to learn these practices from a yoga teacher; books and videos are nice, but are not a substitute for a real teacher.


References that discuss many of the practices mentioned here include:

- The Practices of Yoga for the Digestive System, by Dr. Swami Shankardevananda Saraswati; Bihar School of Yoga, India. (This book is hard to get, but see below for a source.)
- Integral Yoga Hatha, by Sri Swami Satchidananda; Integral Yoga Publications, Yogaville, Virginia.
- Ashtanga Yoga Primer, by Baba Hari Dass; Sri Rama Publishing.
- A reference that deals specifically with agni sara pranayama: "Solar Power," a reprint from Yoga International magazine (NOT Yoga Journal magazine).

Ayurvedic food properties


The Ayurvedic properties of foods are discussed in the book Ayurvedic Cooking for Self-Healing, by Dr. Vasant Lad and Usha Lad. Even though it is a cookbook, and the recipes are of very limited value to a raw-fooder, I recommend the book because it has the most detailed set of food tables and properties (more detailed and much better than the tables in Gabriel Cousens' book), and it has a lot of information on the use of spices, including medicinal uses.

In order to use spices in accord with your Ayurvedic body type, you need to know what your body type is: your prakruti--what you should be; and your vikruti--what you are now (ideally, the same as your prakruti). The best way to get this information is to see a qualified Ayurvedic practitioner and have your pulse read (there are a very few qualified practitioners in the San Francisco area; I can give you a referral). Alternately, you can use the various tables/tests in books to determine your type. The best book in this regard is Prakruti, by Robert Svoboda; Geocom Ltd.

Ayurvedic massage
Ayurvedic massage is discussed in most basic books on Ayurveda, and is easy to do for yourself. Detailed information on self-massage is in the book Ayurvedic Beauty Care, by Melanie Sachs, and the topic is covered in Ayurvedic Massage, by Harish Johari. Many of the books mentioned above can be obtained at the bookstore at the Integral Yoga Institute (770 Dolores St., San Francisco, CA 94110 U.S.A.).

END

PAGE 1 OF REFERENCES
Comparative Anatomy and Physiology Brought Up to Date: Are Humans Natural Frugivores/Vegetarians, or Omnivores/Faunivores?

465

NOTE: References cited solely within quotations in the sources listed here, and/or identified using the phrase "as cited in" or similar terminology, are not included below. Check the reference list of the identified citing paper for details on such references. Agren JJ, Tormala ML, Nenonen MT, Hanninen OO (1995) "Fatty acid composition of erythrocyte, platelet, and serum lipids in strict vegans." Lipids, vol. 30, pp. 365-369. Aiello L (1992) "Body size and energy requirements." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 41-45. Aiello LC, Wheeler P (1995) "The expensive tissue hypothesis: the brain and the digestive system in human and primate evolution." Current Anthropology, vol. 36, pp. 199-221. Albert M, Mathan V, Baker S (1980) "Vitamin B-12 synthesis by human small intestinal bacteria." Nature, vol. 283, pp. 781-782. Allman WF (1994) The Stone Age Present, Simon and Schuster, New York. Altar T (date unknown--may be 1994) "What might be the 'natural' diet of human beings?" Available on Internet at: http://arrs.envirolink.org/ar-voices/natural_diet.html. Anapol F, Lee S (1994) "Morphological adaptation to diet in platyrrhine primates." American Journal of Physical Anthropology, vol. 94, pp. 239-261. Anderson GJ, Connor WE, Corliss JD (1990) "Docosahexaenoic acid is the preferred dietary n-3 fatty acid for the development of the brain and retina." Pediatric Research, vol. 27, pp. 89-97. Anderson G, Connor W (1994) "Accretion of n-3 fatty acids in the brain and retina of chicks fed a low linolenic acid diet supplemented with docosahexaenoic acid." American Journal of Clinical Nutrition, vol. 59, pp. 1338-1346. Ansell GB (1973) "Historical introduction." In: Form and Function of Phospholipids, eds. Ansell GB, Hawthore JN, Dawson RMC; Elsevier Scientific Publishing, New York. (pp. 1-8) Armstrong BK, Brown JB, Clarke HT, Crooke DK, Hahnel R, Masarel JR, Ratajczak T (1981) "Diet and reproductive hormones: a study of vegetarian and nonvegetarian postmenopausal women." Journal of the National Cancer Institute, vol. 67, pp. 761-767. Baker S (1981) "Contribution of the microflora of the small intestine to the vitamin B-12 nutriture of man (letter)." Nutrition Reviews, vol. 39, pp. 147-148. Bannerjee DK, Chatterjea JB (1963) "Vitamin B-12 content of some articles of Indian diets and effect of cooking on it." British Journal of Nutrition, vol. 17, pp. 385-389. Bar-Sella P, Rakover Y, Ratner D (1990) "Vitamin B12 and folate levels in long-term vegans." Israel Journal of Medical Sciences, vol. 26, pp. 309-312. Berglas A (1957) Cancer: Nature, Cause and Cure. Institute Pasteur, Paris. Biesalski H (1997) "Bioavailability of vitamin A." European Journal of Clinical Nutrition, vol. 51 (suppl 1), pp. S71-S75.

466

Blass EM (1987) "Opioids, sweets, and a mechanism for positive affect: broad motivational implications." In: Sweetness, ed. Dobbing J; Springer-Verlag, Berlin, pp. 115-126. Bjorkman L, Mottett K, Nylander M, Vahter M, Lind B, Friberg L (1995) "Selenium concentrations in the brain after exposure to methylmercury: relations between the inorganic mercury fraction and selenium." Archives of Toxicology, vol. 69, pp. 228-234 (abstract). Boesch C, Tomasello M (1998) "Chimpanzee and human cultures." Current Anthropology, vol. 39, pp. 591614. Bonkovsky H, Ponka P, Bacor B, Drysdale J, Tavill A (1996) "An update on iron metabolism: summary of the Fifth International Conference on Disorders of Iron Metabolism." Hepatology, vol. 24, pp. 718-729 (abstract). Brace CL, Smith SL, Hunt KD (1991) "What big teeth you had Grandma! Human tooth size, past and present." In: Advances in Dental Anthropology, eds. Kelley MA, Larsen CS; Wiley-Liss, New York, pp. 3357. Brown E, Micozzi M, Craft N, Bieri J, Beecher G, Edwards B, Rose A, Taylor P, Smith Jr. J (1989) "Plasma carotenoids in normal men after a single ingestion of vegetables or purified beta-carotene." American Journal of Clinical Nutrition, vol. 49, pp. 1258-1265. Butynski TM (1982) "Vertebrate predation by primates: a review of hunting patterns and prey." Journal of Human Evolution, vol. 11, pp. 421-430. Carlson S, Rhodes P, Ferguson M (1986) "Docosahexaenoic acid status of preterm infants at birth and following feeding with human milk or formula." American Journal of Clinical Nutrition, vol. 44, pp. 798804. Campbell TC (1989) "A study on diet, nutrition and disease in the People's Republic of China, Part I." Contemporary Nutrition, vol. 14(5), pp. 1-2. Campbell TC, Junshi C (1994) "Diet and chronic degenerative diseases: perspectives from China." American Journal of Clinical Nutrition, vol. 59(suppl), pp. 1153S-1161S. Chanarin I (1969) The Megaloblastic Anaemias. Blackwell Scientific Publications, Oxford (U.K.), pp. 4063, 708-714. Chapman CA, Chapman LJ (1990) "Dietary variability in primate populations." Primates, vol. 31, pp. 121128. Chivers DJ, Hladik CM (1980) "Morphology of the gastrointestinal tract in primates: comparisons with other mammals in relation to diet." Journal of Morphology, vol. 166, pp. 337-386. Chivers DJ, Hladik CM (1984) "Diet and gut morphology in primates." In: Food Acquisition and Processing in Primates, eds. Chivers DJ, Wood BA, Bilsborough A; Plenum Press, New York, pp. 213-230. Chivers DJ (1992) "Diets and guts." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 60-64. Chivers DJ, Langer P (1994) "Gut form and function: variations and terminology." In: The Digestive System in Mammals: Food, Form, and Function, Cambridge University Press, pp. 3-8.


Chivers DJ (1998) "Measuring food intake in wild animals: primates." Proceedings of the Nutrition Society, vol. 57, pp. 321-332.
Clastres P (1972) "The Guayaki." In: Hunters and Gatherers Today, ed. Bicchieri M; Holt, Rinehart, and Winston, Inc., New York, pp. 138-174.
Cohen BL (1994) "Invited commentary: in defense of ecologic studies for testing a linear-no threshold theory." American Journal of Epidemiology, vol. 139, pp. 765-768.
Collier GR, Greenberg GR, Wolever TMS, Jenkins DA (1988) "The acute effect of fat on insulin secretion." Journal of Clinical Endocrinology and Metabolism, vol. 66, pp. 323-326.
Connor WE, Lowensohn R, Hatcher L (1996) "Increased docosahexaenoic acid levels in human newborn infants by administration of sardines and fish oil during pregnancy." Lipids, vol. 31 (suppl), pp. S-183 to S-187.
Conquer JA, Holub BJ (1996) "Supplementation with an algae source of docosahexaenoic acid increases (n-3) fatty acid status and alters selected risk factors for heart disease in vegetarian subjects." Journal of Nutrition, vol. 126, pp. 3032-3039.
Cooper B, Rosenblatt D (1987) "Inherited defects of vitamin B-12 metabolism." Annual Review of Nutrition, vol. 7, pp. 291-320.
Conrad M, Benjamin B, Williams H, Foy A (1967) "Human absorption of hemoglobin-iron." Gastroenterology, vol. 53, pp. 5-10.
Cordain L, Miller J, Mann N (1998) "(letter) Scant evidence of periodic starvation among hunter-gatherers." Diabetologia, vol. 43, pp. 383-384.
Corder CA (1998) "Communications: longevity and plant foods--attributions compared to associations." Vegetarian Nutrition: An International Journal, vol. 2, issue 3, pp. 129-132.
Craig W (1994) "Iron status of vegetarians." American Journal of Clinical Nutrition, vol. 59(suppl), pp. 1233S-1237S.
Crane MG, Register UD, Lukens RH, Gregory R (1998) "Cobalamin (CBL) studies on two total vegetarian (vegan) families." Vegetarian Nutrition: An International Journal, vol. 2, issue 3, pp. 87-92.
Cziko G (1995) Without Miracles: Universal Selection Theory and the Second Darwinian Revolution, MIT Press (Bradford Books), Cambridge, Massachusetts.
Davidson L, Lonnerdahl B (1989) "Fe-saturation and proteolysis of human lactoferrin: effect on brush-border receptor mediated uptake of Fe and Mn." American Journal of Physiology, vol. 257, pp. G931-G934.
de Pee S, West C, Muhlilal D, Hautvast J (1995) "Lack of improvement in vitamin A status with increased consumption of dark-green leafy vegetables." Lancet, vol. 346, pp. 75-81.
de Pee S, West C (1996) "Dietary carotenoids and their role in combating vitamin A deficiency: a review of the literature." European Journal of Clinical Nutrition, vol. 50(suppl 3), pp. S38-S53.
Deacon T (1992) "Biological aspects of language." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 128-133.


Dinshah HJ (1976-1977) "Fruit for thought: should man live by fruit alone?" (a 4-part article series). Ahimsa, Part 1 in vol. 17, no. 2 (Mar./Apr. 1976), pp. 1, 4, 7-8; Part 2 in vol. 17, no. 3 (May/Jun. 1976), pp. 1, 7-8; Part 3 in vol. 17, no. 4 (Jul./Aug., Sept./Oct., Nov./Dec. 1976 combined issue), pp. 1-9, 11-12; Part 4 in vol. 18, no. 1 (Jan./Mar. 1977), pp. 1-3, 5-8.
Dong A, Scott SC (1982) "Serum vitamin B-12 and blood cell values in vegetarians." Annals of Nutrition and Metabolism, vol. 26, pp. 209-216.
Dunn FL (1968) "Epidemiological factors: health and disease in hunter-gatherers." In: Man the Hunter, eds. Lee RB, DeVore I; Aldine Publishing, Chicago, pp. 221-228.
Eaton SB, Konner M (1985) "Paleolithic nutrition: a consideration of its nature and current implications." New England Journal of Medicine, vol. 312, pp. 283-289.
Eaton SB, Konner M, Shostak M (1988) "Stone agers in the fast lane: chronic degenerative diseases in evolutionary perspective." American Journal of Medicine, vol. 84, pp. 739-749.
Eaton SB, Pike M, et al. (1994) "Women's reproductive cancers in evolutionary context." The Quarterly Review of Biology, vol. 69, pp. 353-367.
Eaton SB, Eaton SB III, Konner MJ, Shostak M (1996) "An evolutionary perspective enhances understanding of human nutritional requirements." Journal of Nutrition, vol. 126, pp. 1732-1740.
Eaton SB, Eaton SB III (1998) "Evolution, diet and health." 14th International Congress of Anthropological and Ethnological Sciences. Available on Internet at: http://www.cast.uark.edu/local/icaes/conferences/wburg/posters/sboydeaton/eaton.htm.
Eaton SB, Eaton SB III, Sinclair AJ, Cordain L, Mann NJ (1998) "Dietary intake of long-chain polyunsaturated fatty acids during the Paleolithic." World Review of Nutrition and Dietetics, vol. 83, pp. 12-23.
Ember CR (1978) "Myths about hunter-gatherers." Ethnology, vol. 17, pp. 439-448.
Estes JA (1989) "Adaptations for aquatic living by carnivores." In: Carnivore Behavior, Ecology, and Evolution, ed. Gittleman JL; Cornell University Press, pp. 242-282.
FAO (1995) "Nutrition science policy. WHO and FAO joint consultation: fats and oils in human nutrition." Nutrition Reviews, vol. 53, pp. 202-205.
Fallon S, Enig MG (1997) "The Cave Man Diet." Paleodiet e-mail list, May 23, 1997, posting no. 160.
Farquharson J, Cockburn F, Patrick W, Jamieson E, Logan R (1992) "Infant cerebral cortex phospholipid fatty-acid composition and diet." Lancet, vol. 340, pp. 810-813.
Feder J, et al. (1996) "A novel MHC class I-like gene is mutated in patients with hereditary haemochromatosis." Nature Genetics, vol. 13, pp. 399-408.
Felsenstein J (1987) "Estimation of hominoid phylogeny from a DNA hybridization data set." Journal of Molecular Evolution, vol. 26, pp. 123-131.


Foley R (1982) "A reconsideration of the role of predation on large mammals in tropical hunter-gatherer adaptation." Man (The Journal of the Royal Anthropological Institute), vol. 17, pp. 393-402.
Foley RA, Lee PC (1991) "Ecology and energetics of encephalization in hominid evolution." Philosophical Transactions of the Royal Society of London, Series B, vol. 334, pp. 223-232.
Food and Agriculture Organization (FAO) (1970) Amino-Acid Content of Foods and Biological Data on Proteins.
Fossey D, Harcourt AH (1977) "Feeding ecology of free-ranging mountain gorilla (Gorilla gorilla beringei)." In: Primate Ecology: Studies of Feeding and Ranging Behaviour in Lemurs, Monkeys, and Apes, ed. Clutton-Brock TH; Academic Press, New York, pp. 415-447.
Freedman D (1988) "From mouse to man: the quantitative assessment of cancer risks." Statistical Science, vol. 3, pp. 3-56.
Freedman D (1999) Ecological Inference and the Ecological Fallacy. Technical Report No. 549, Department of Statistics, University of California, Berkeley. Available on Internet at: http://www.stat.berkeley.edu/tech-reports/549.abstract (abstract only); or http://www.stat.berkeley.edu/~census/549.pdf (full report, PDF, requires Adobe Acrobat Reader); or http://www.stat.berkeley.edu/tech-reports/549.ps.Z (full report, PostScript compressed using Unix compress command; note: this is not a .zip file).
Freeland-Graves J (1988) "Mineral adequacy of vegetarian diets." American Journal of Clinical Nutrition, vol. 48, pp. 859-862.
Friberg L, Mottett N (1989) "Accumulation of methylmercury and inorganic mercury in the brain." Biological Trace Element Research, vol. 21, pp. 210-206 (abstract).
Friedman M, Rosenman RH, Byers S (1964) "Serum lipids and conjunctival circulation after fat ingestion in men exhibiting type-A behavior pattern." Circulation, vol. 29, pp. 874-886.
Galdikas BMF, Teleki G (1981) "Variations in subsistence activities of female and male pongids: new perspectives on the origins of hominid labor division." Current Anthropology, vol. 22, pp. 241-256.
Garn SM, Leonard WR (1989) "What did our ancestors eat?" Nutrition Reviews, vol. 47, pp. 337-345.
Garner GW, Knick ST, Douglas DC (1990) "Seasonal movements of adult female polar bears in the Bering and Chukchi Seas." International Conference on Bear Research and Management, vol. 8, pp. 219-226.
Gaull G (1982) "Taurine nutrition in man." In: Taurine in Nutrition and Neurology (Advances in Experimental Medicine and Biology, vol. 139), eds. Huxtable R, Pasantes-Morales H; Plenum Press, New York, pp. 89-95.
Gaull G (1986) "Taurine as a conditionally essential nutrient in man." Journal of the American College of Nutrition, vol. 5, pp. 121-125.
Gerster H (1998) "Can adults adequately convert alpha-linolenic acid (18:3n-3) to eicosapentaenoic acid (20:5n-3) and docosahexaenoic acid (22:6n-3)?" International Journal for Vitamin and Nutrition Research, vol. 68, pp. 159-173.


Gibson RA, Neumann MA, Makrides M (1996) "Effect of docosahexaenoic acid on brain composition and neural function in term infants." Lipids, vol. 31 (suppl), pp. S-177 to S-182.
Gittleman JL (1994) "Are the pandas successful specialists or evolutionary failures?" BioScience, vol. 44, pp. 456-464.
Glendinning J (1994) "Is the bitter rejection response always adaptive?" Physiology and Behavior, vol. 56, pp. 1217-1227.
Goodall J (1986) The Chimpanzees of Gombe, Harvard University Press, Cambridge, Massachusetts.
Goodman G (1975) "Protein sequence and immunological specificity: their role in phylogenetic studies of primates." In: Phylogeny of the Primates, eds. Luckett WP, Szalay FS; Plenum Press, New York, pp. 219-248.
Gould S, Lewontin R (1994) "The spandrels of San Marco and the Panglossian paradigm: a critique of the adaptationist programme." In: Conceptual Issues in Evolutionary Biology, ed. Sober E; MIT Press, Cambridge, Massachusetts. Note: this is a reprint of an article that was originally published in 1979.
Grasbeck R, Kouvonen I, Lundberg M, Tenhunen R (1979) "An intestinal receptor for heme." Scandinavian Journal of Haematology, vol. 23, pp. 5-9.
Grasbeck R, Majuri R, Kouvonen I, Tenhunen R (1979) "Spectral and other studies on the intestinal haem receptor of the pig." Biochimica et Biophysica Acta, vol. 700, pp. 137-142.
Greenland S, Robins J (1994a) "Invited commentary: ecologic studies--biases, misconceptions, and counterexamples." American Journal of Epidemiology, vol. 139, pp. 747-760.
Greenland S, Robins J (1994b) "Accepting the limits of ecologic studies: Drs. Greenland and Robins reply to Drs. Piantadosi and Cohen." American Journal of Epidemiology, vol. 139, pp. 769-771.
Groves CP (1986) "Systematics of the great apes." In: Comparative Primate Biology, Volume 1: Systematics, Evolution, and Anatomy, eds. Swindler DR, Erwin J; Alan R. Liss, Inc., New York, pp. 187-217.
Hamilton WJ III, Busse CD (1978) "Primate carnivory and its significance to human diets." Bioscience, vol. 28, pp. 761-766.
Harcourt AH, Harcourt SA (1984) "Insectivory by gorillas." Folia Primatologica, vol. 43, pp. 229-233.
Harding RSO (1981) "An order of omnivores: nonhuman primate diets in the wild." In: Omnivorous Primates, eds. Harding RSO, Teleki G; Columbia University Press, New York, pp. 191-214.
Hathcock JN, Troendle GJ (1991) "Oral cobalamin for treatment of pernicious anemia? (editorial)." Journal of the American Medical Association, vol. 265, pp. 96-97.
Hennigar G, Greene W, Walker E, de Saussure C (1979) "Hemochromatosis caused by excessive vitamin iron intake." American Journal of Pathology, vol. 96, pp. 611-623 (abstract).
Herbert V (1988) "Vitamin B-12: plant sources, requirements, and assay." American Journal of Clinical Nutrition, vol. 48, pp. 852-858.


Herbert V (1994) "Staging vitamin B-12 (cobalamin) status in vegetarians." American Journal of Clinical Nutrition, vol. 59(suppl), pp. 1213S-1222S.
Herbert V, Das K (1994) "Folic acid and vitamin B-12." In: Modern Nutrition in Health and Disease, vol. 1, eds. Shils M, Olson J, Shike M; Lea and Febiger, Philadelphia, pp. 402-425.
Herbert V, Drivas G, Manusselis C, Mackler B, Eng J, Schwartz E (1984) "Are colon bacteria a major source of cobalamin analogues in human tissues?" Transactions of the Association of American Physicians, vol. 97, pp. 161-171.
Heyssel RM, Bozian RC, Darby WJ, Bell MC (1966) "Vitamin B-12 turnover in man." American Journal of Clinical Nutrition, vol. 18, pp. 176-184.
Hiiemae K (1984) "Functional aspects of jaw morphology." In: Food Acquisition and Processing in Primates, eds. Chivers DJ, Wood BA, Bilsborough A; Plenum Press, New York, pp. 257-282.
Hill K, Hawkes K, Hurtado M, Kaplan H (1984) "Seasonal variance in the diet of Ache hunter-gatherers in eastern Paraguay." Human Ecology, vol. 12, pp. 101-135.
Hill WCO (1958) "Pharynx, oesophagus, stomach, small and large intestine. Form and position." In: Primatologia III, eds. Hofer H, Schultz AH, Starck D; S. Karger, Basel (Switzerland), pp. 139-207.
Hladik CM, Chivers DJ, Pasquet P (1999) "On diet and gut size in non-human primates and humans: is there a relationship to brain size?" Current Anthropology, vol. 40, pp. 695-697.
Hoffman DR, Birch EE, Birch DG, Uauy RD (1993) "Effects of supplementation with ω3 long-chain polyunsaturated fatty acids on retinal and cortical development in premature infants." American Journal of Clinical Nutrition, vol. 57 (suppl), pp. 807S-812S.
Huffman MH (1997) "Current evidence for self-medication in primates: a multidisciplinary perspective." Yearbook of Physical Anthropology, vol. 40, pp. 171-200.
Hunt KD (1994) "The evolution of human bipedality: ecology and functional morphology." Journal of Human Evolution, vol. 26, pp. 183-202.
Hurrell R (1997) "Bioavailability of iron." European Journal of Clinical Nutrition, vol. 51(suppl 1), pp. S4-S8.
Huxtable R (1992) "Physiological actions of taurine." Physiological Reviews, vol. 72, pp. 101-163.
Immerman AM (1981) "Vitamin B12 status on a vegetarian diet. A critical review." World Review of Nutrition and Dietetics, vol. 37, pp. 38-54.
Itzkoff SW (1985) Triumph of the Intelligent, Paideia, Ashfield, Massachusetts.
Jones S (1992) "Natural selection in humans." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 284-287.
Junshi C, Campbell TC, Junyao L, Peto R (1990) Diet, Life-style, and Mortality in China, Oxford University Press.


Jumpsen J, Clandinin M (1995) Brain Development: Relationship to Dietary Lipid and Lipid Metabolism, AOCS Press, Champaign, Illinois.
Kanazawa S, Herbert V (1982) "Vitamin B-12 analog content in human red cells, liver and brain." Clinical Research, vol. 30, p. 504A.
Kelley DS, Nelson GJ, Love JE, et al. (1993) "Dietary alpha-linolenic acid alters tissue fatty acid composition, but not blood lipids, lipoproteins or coagulation status in humans." Lipids, vol. 28, pp. 533-537.
Key T, Fraser G, Thorogood M, Appleby P, et al. (1998) "Mortality in vegetarians and non-vegetarians: a collaborative analysis of 8,300 deaths among 76,000 men and women in five prospective studies." Public Health Nutrition, vol. 1, pp. 33-41.
Knott C (1998) "Orangutans in the wild." National Geographic, vol. 194(2), pp. 30-57.
Kortlandt A (1984) "Habitat richness, foraging range and diet in chimpanzees and some other primates." In: Food Acquisition and Processing in Primates, eds. Chivers DJ, Wood BA, Bilsborough A; Plenum Press, New York, pp. 119-160.
Koshimizu K, Ohigashi H, Huffman M (1994) "Use of Vernonia amygdalina by wild chimpanzee: possible roles of its bitter and related constituents." Physiology & Behavior, vol. 56, pp. 1209-1216.
Krajcovicova-Kudlackova M, Simoncic R, Bederov A, Klvanova J (1997) "Plasma fatty acid profile and alternative nutrition." Annals of Nutrition and Metabolism, vol. 41, pp. 365-370.
Kuzminski AM, Del Giacco EJ, Allen RH, Stabler SP, Lindenbaum J (1998) "Effective treatment of cobalamin deficiency with oral cobalamin." Blood, vol. 92, pp. 1191-1198.
Laidlaw S, Shultz T, Cecchino J, Kopple J (1988) "Plasma and urine taurine levels in vegans." American Journal of Clinical Nutrition, vol. 47, pp. 660-663.
Langley G (1995) Vegan Nutrition, Vegan Society (U.K.).
Le Gros Clark WE (1964) The Fossil Evidence for Human Evolution, University of Chicago Press.
Lederle FA (1991) "Oral cobalamin for pernicious anemia: medicine's best kept secret? (commentary)." Journal of the American Medical Association, vol. 265, pp. 94-95.
Lee A (1996) "The transition of Australian Aboriginal diet and nutritional health." World Review of Nutrition and Dietetics, vol. 79, pp. 1-52.
Lee RB (1968) "What hunters do for a living, or, how to make out on scarce resources." In: Man the Hunter, eds. Lee RB, DeVore I; Aldine Atherton, Chicago, pp. 30-48.
Lee RB (1979) The !Kung San: Men, Women, and Work in a Foraging Society, Cambridge University Press.
Lee-Thorp JA, van der Merwe NJ, Brain CK (1994) "Diet of Australopithecus robustus at Swartkrans from stable carbon isotopic analysis." Journal of Human Evolution, vol. 27, pp. 361-372.
Leonard WR, Robertson ML (1992) "Nutritional requirements and human evolution: a bioenergetics model." American Journal of Human Biology, vol. 4, pp. 179-195.


Leonard WR, Robertson ML (1994) "Evolutionary perspectives on human nutrition: the influence of brain and body size on diet and metabolism." American Journal of Human Biology, vol. 6, pp. 77-88.
Lewin R (1988) "New views emerge on hunters and gatherers." Science, vol. 240, pp. 1146-1148.
Lieberman P (1992) "Human speech and language." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 134-137.
Lin DS, Anderson GJ, Connor WE, Neuringer M (1994) "Effect of dietary n-3 fatty acids upon the phospholipid molecular species of the monkey retina." Investigative Ophthalmology and Visual Science, vol. 35, pp. 794-803.
MacLarnon AM, Martin RD, Chivers DJ, Hladik CM (1986) "Some aspects of gastro-intestinal allometry in primates and other mammals." In: Définition et Origines de l'Homme, ed. Sakka M; Editions du CNRS, Paris, pp. 293-302.
Mann AE (1981) "Diet and human evolution." In: Omnivorous Primates, eds. Harding RSO, Teleki G; Columbia University Press, New York, pp. 10-36.
Mann GV, Oswald OA, Price DL, Merrill JM (1962) "Cardiovascular disease in African Pygmies--a survey of the health status, serum lipids and diet of Pygmies in Congo." Journal of Chronic Diseases, vol. 15, pp. 341-371.
Martin PS (1990) "40,000 years of extinctions on the planet of doom." Palaeogeography, Palaeoclimatology, Palaeoecology, vol. 82, pp. 187-201.
Martin R (1992) "Scaling." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, p. 42.
Martin RD, Chivers DJ, MacLarnon AM, Hladik CM (1985) "Gastrointestinal allometry in primates and other mammals." In: Size and Scaling in Primate Biology, ed. Jungers WL; Plenum Press, New York, pp. 61-89.
McArdle J (1996) "Humans are omnivores." In: The Vegan Handbook, eds. Wasserman D, Mangels R; Vegetarian Resource Group, Baltimore, Maryland, pp. 173-174. Also available on Internet at: http://www.vrg.org/nutshell/omni.htm.
McCullagh KG (1972) "Arteriosclerosis in the African elephant." Atherosclerosis, vol. 16, pp. 307-335.
McGrew WC (1998) "Culture in nonhuman primates?" Annual Review of Anthropology, vol. 27, pp. 301-328.
Miller J, Colagiuri S (1994) "The carnivore connection: dietary carbohydrate in the evolution of NIDDM." Diabetologia, vol. 37, pp. 1280-1286.
Mills MR (date unknown) "The comparative anatomy of eating." Available on Internet at: http://www.geocities.com/RainForest/2062/ana.HTML.
Milton K (1987) "Primate diets and gut morphology: implications for hominid evolution." In: Food and Evolution: Toward a Theory of Food Habits, eds. Harris M, Ross EB; Temple University Press, Philadelphia, pp. 93-115.


Milton K (1993) "Diet and primate evolution." Scientific American, August issue, vol. 269(2), pp. 86-93.
Milton K, Demment MW (1988) "Digestion and passage kinetics of chimpanzees fed high and low fiber diets and comparison with human data." Journal of Nutrition, vol. 118, pp. 1082-1088.
Moir R (1994) "The 'carnivorous' herbivore." In: The Digestive System in Mammals, eds. Chivers DJ, Langer P; Cambridge University Press, pp. 87-102.
Moodie P (1981) "Australian aborigines." In: Western Diseases: Their Emergence and Prevention, eds. Trowell H, Burkitt D; Harvard University Press, pp. 154-167.
Mozafar A, Oertli JJ (1992) "Uptake of a microbially-produced vitamin (B12) by soybean roots." Plant and Soil, vol. 139, pp. 23-30.
Mozafar A (1994) "Enrichment of some B-vitamins in plants with application of organic fertilizers." Plant and Soil, vol. 167, pp. 305-311.
Mozafar A (1997) "Is there vitamin B-12 in plants or not? A plant nutritionist's view." Vegetarian Nutrition: An International Journal, vol. 1, issue 2, pp. 50-52.
Muir J, Walker K, et al. (1998) "Modulation of fecal markers relevant to colon cancer risk: a high-starch Chinese diet did not generate expected beneficial changes relative to a Western-type diet." American Journal of Clinical Nutrition, vol. 68, pp. 372-379.
Murdock GP (1967) "Ethnographic atlas: a summary." Ethnology, vol. 6, pp. 109-236.
National Research Council (NRC) (1989) Recommended Dietary Allowances, 10th edition, National Academy Press, Washington.
Natural Hygiene Press (1982) Fit Food for Humanity. No author listed; believed to be a rewrite of an earlier booklet by Arthur Andrews (Fit Food for Man, 1970).
Naughton JM, O'Dea K, Sinclair AJ (1986) "Animal foods in traditional Australian Aborigine diets: polyunsaturated and low in fat." Lipids, vol. 21, pp. 684-690.
Neel JV (1962) "Diabetes mellitus: a 'thrifty' genotype rendered detrimental by 'progress'?" American Journal of Human Genetics, vol. 14, pp. 353-362.
Neel JV (1982) "The thrifty genotype revisited." In: The Genetics of Diabetes Mellitus, Proceedings of the Serono Symposia, vol. 47, pp. 283-293.
Nettleton J (1995) Omega-3 Fatty Acids for Health, Chapman and Hall, New York.
Niederau C, Strohmeyer G, Stremmle W (1994) "Epidemiology, clinical spectrum and prognosis of hemochromatosis." In: Progress in Iron Research, eds. Hershko C, Konjin A, Aisen P; Plenum Press, New York, pp. 293-302.
Nishida T (1991) "Primate gastronomy: cultural food preferences in nonhuman primates and origins of cuisine." In: Chemical Senses, vol. 4, Appetite and Nutrition, eds. Friedman M, Tordoff M, Kare M; Marcel Dekker, New York, pp. 195-209.


Nutrition Reviews (1980) "Contribution of the microflora of the small intestine to the vitamin B-12 nutriture of man." Editorial, no author cited; vol. 38, pp. 274-275.
O'Dea K, Spargo RM (1982) "Metabolic adaptation to a low carbohydrate-high protein ('traditional') diet in Australian Aborigines." Diabetologia, vol. 23, pp. 494-498.
O'Dea K (1984) "Marked improvement in carbohydrate and lipid metabolism in diabetic Australian Aborigines after temporary reversion to traditional lifestyle." Diabetes, vol. 33, pp. 596-603.
O'Dea K (1991) "Traditional diet and food preferences of Australian Aboriginal hunter-gatherers." Philosophical Transactions of the Royal Society of London, Series B, vol. 334, pp. 233-241.
Okuyama H (1992) "Effects of dietary essential fatty acid balance on behavior and chronic diseases." In: Polyunsaturated Fatty Acids in Human Nutrition, Vevey/Raven Press, New York, pp. 169-178.
Oxtoby DW, Nachtrieb NH (1986) Principles of Modern Chemistry. Saunders College Publishing, Philadelphia, Pennsylvania.
Ozanne SE, Hales CN (1998) "Thrifty yes, genetic no." Diabetologia, vol. 41, pp. 485-487.
Pagel M, Harvey P (1988) "Recent developments in the analysis of comparative data." The Quarterly Review of Biology, vol. 63, pp. 413-440.
Pardridge W (1976) "Inorganic mercury: selective effects on blood-brain barrier transport systems." Journal of Neurochemistry, vol. 27, pp. 333-335.
Parker RS (1996) "Absorption, metabolism, and transport of carotenoids." The FASEB Journal, vol. 10, pp. 542-551.
Parker ST (1990) "Why big brains are so rare: energy costs of intelligence and brain size in anthropoid primates." In: Language and Intelligence in Monkeys and Apes, eds. Parker ST, Gibson KR; Cambridge University Press, pp. 129-156.
Peters CR, Maguire B (1981) "Wild plant foods of the Makapansgat area: a modern ecosystems analogue for Australopithecus africanus adaptations." Journal of Human Evolution, vol. 10, pp. 565-583.
Piantadosi S (1994) "Invited commentary: ecologic studies." American Journal of Epidemiology, vol. 139, pp. 761-764.
Provonsha S (1998) "A hypothesis regarding meat and the insulin-resistant state known as Syndrome-X." Vegetarian Nutrition: An International Journal, vol. 2, issue 3, pp. 119-126.
Rauma AL, Torronen R, Hanninen O, Mykkanen H (1995) "Vitamin B12 status of long-term adherents of a strict uncooked vegan diet ('living food diet') is compromised." Journal of Nutrition, vol. 125, pp. 2511-2515.
Reaven GM (1994) "Syndrome X: 6 years later." Journal of Internal Medicine, vol. 236 (supplement 736), pp. 13-22.
Reaven GM (1998a) "Hypothesis: muscle insulin resistance is the ('not-so') thrifty genotype." Diabetologia, vol. 41, pp. 482-484.


Reaven GM (1998b) "(letter) Insulin resistance, the key to survival: a rose by any other name." Diabetologia, vol. 42, pp. 384-385.
Reddy S, Sanders TAB, Obeid O (1994) "The influence of maternal vegetarian diet on essential fatty acid status of the newborn." European Journal of Clinical Nutrition, vol. 48, pp. 358-368.
Richard AF (1985) Primates in Nature, W.H. Freeman and Co., New York.
Ridley M (1983) The Explanation of Organic Diversity, Clarendon Press, Oxford, U.K.
Ruff CB, Trinkaus E, Holliday TW (1997) "Body mass and encephalization in Pleistocene Homo." Nature, vol. 387, pp. 173-176.
Salem N, Wegher B, Mena P, Uauy R (1996) "Arachidonic and docosahexaenoic acids are biosynthesized from their 18-carbon precursors in human infants." Proceedings of the National Academy of Sciences (U.S.), vol. 93, pp. 49-54 (abstract).
Sanders TAB, Reddy S (1992) "The influence of a vegetarian diet on the fatty acid composition of human milk and the essential fatty acid status of the infant." Journal of Pediatrics, vol. 120, pp. S71-S77.
Sandstrom B (1997) "Bioavailability of zinc." European Journal of Clinical Nutrition, vol. 51(suppl 1), pp. S17-S19.
Schaeffer O (1981) "Eskimos (Inuit)." In: Western Diseases: Their Emergence and Prevention, eds. Trowell H, Burkitt D; Harvard University Press, pp. 113-128.
Schaller GB (1963) The Mountain Gorilla, University of Chicago Press.
Schaller GB, Qitao T, Johnson KG, Xiaoming W, Heming S, Jinchu H (1988) "The feeding ecology of giant pandas and Asiatic black bears in the Tangjiahe reserve, China." In: Carnivore Behavior, Ecology, and Evolution, ed. Gittleman JL; Cornell University Press, pp. 212-241.
Schneider Z, Stroinski A (1987) Comprehensive B-12: Chemistry, Biochemistry, Nutrition, Ecology, Medicine, Walter de Gruyter, New York.
Shea B (1992) "Neoteny." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, p. 102.
Sheldon WG (1975) The Wilderness Home of the Giant Panda, University of Massachusetts Press, Amherst.
Sibley CG, Ahlquist JE (1984) "The phylogeny of the hominoid primates, as indicated by DNA-DNA hybridization." Journal of Molecular Evolution, vol. 20, pp. 2-15.
Sibley CG, Ahlquist JE (1987) "DNA hybridization evidence of hominoid phylogeny: results from an expanded data set." Journal of Molecular Evolution, vol. 26, pp. 99-121.
Sicher H (1944) "Masticatory apparatus in the giant panda and the bears." In: Field Museum of Natural History (Zoological Series), Chicago.
Sillen A (1992) "Strontium-calcium ratios (Sr/Ca) of Australopithecus robustus and associated fauna from Swartkrans." Journal of Human Evolution, vol. 23, pp. 495-516.


Sillen A, Hall G, Armstrong R (1995) "Strontium calcium (Sr/Ca) ratios and strontium isotopic ratios (87Sr/86Sr) of Australopithecus robustus and Homo sp. from Swartkrans." Journal of Human Evolution, vol. 28, pp. 277-285.
Solberg EE, Magnus E, Sander J, Loeb M (1998) "Vitamin B12 and folate in lactovegetarians--a controlled study of 63 Norwegian subjects." Vegetarian Nutrition: An International Journal, vol. 2, issue 2, pp. 73-75.
Solomons N, Bulux J (1997) "Identification and production of local carotene-rich foods to combat vitamin A malnutrition." European Journal of Clinical Nutrition, vol. 51, pp. S39-S45.
Speth JD, Spielmann KA (1983) "Energy source, protein metabolism, and hunter-gatherer subsistence strategies." Journal of Anthropological Archaeology, vol. 2, pp. 1-31.
Speth JD (1991) "Protein selection and avoidance strategies of contemporary and ancestral foragers: unresolved issues." Philosophical Transactions of the Royal Society of London, Series B, vol. 334, pp. 265-270.
Spuhler JN (1959) "Somatic paths to culture." In: The Evolution of Man's Capacity for Culture, Wayne State University Press, Detroit, pp. 1-13.
Stefansson V (1960) Cancer: Disease of Civilization? Hill and Wang, New York.
Stephan H (1972) "Evolution of primate brains: a comparative anatomical investigation." In: The Functional and Evolutionary Biology of Primates, ed. Tuttle R; Aldine Atherton, Chicago, pp. 155-174.
Stevens CE, Hume ID (1995) Comparative Physiology of the Vertebrate Digestive System, Cambridge University Press.
Stirling I (1988) Polar Bears, University of Michigan Press.
Story M, Brown JE (1987) "Do young children instinctively know what to eat?" New England Journal of Medicine, vol. 316, pp. 103-106.
Strickland KP (1973) "The chemistry of phospholipids." In: Form and Function of Phospholipids, eds. Ansell GB, Hawthorne JN, Dawson RMC; Elsevier Scientific Publishing, New York, pp. 9-42.
Sussman RW (1987) "Species-specific dietary patterns in primates and human dietary adaptations." In: The Evolution of Human Behavior: Primate Models, ed. Spuhler JN; State University of New York Press, pp. 151-179.
Talbot SL, Shields GF (1996) "A phylogeny of the bears (Ursidae) inferred from complete sequence of three mitochondrial genes." Molecular Phylogenetics and Evolution, vol. 5, pp. 567-575 (abstract).
Tanner NM (1981) On Becoming Human, Cambridge University Press.
Tanner JM (1992) "Human growth and development." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, pp. 98-105.
Tanner JM (1992b) "Evolution of the human growth curve." In: The Cambridge Encyclopedia of Human Evolution, Cambridge University Press, p. 100.
Taylor R (1975) Butterflies in my Stomach, Woodbridge Press, Santa Barbara, California.


Tefferi A, Pruthi R (1994) "The biochemical basis of cobalamin deficiency." Mayo Clinic Proceedings, vol. 69, pp. 181-186.
Teleki G (1981) "The omnivorous diet and feeding habits of chimpanzees in Gombe National Park, Tanzania." In: Omnivorous Primates, eds. Harding RSO, Teleki G; Columbia University Press, New York, pp. 303-343.
Temerin L, Wheatley B, Rodman P (1984) "Body size and foraging in primates." In: Adaptations for Foraging in Nonhuman Primates, eds. Rodman P, Cant J; Columbia University Press.
Tenhunen R, Grasbeck R, Kouvonen I, Lundberg M (1980) "An intestinal receptor for heme: its partial characterization." International Journal of Biochemistry, vol. 12, pp. 713-716.
Thornhill R (1996) "The study of adaptation." In: Readings in Animal Cognition, eds. Bekoff M, Jamieson D; MIT Press, Cambridge, Massachusetts.
Tortora GJ, Anagnostakos NP (1981) Principles of Anatomy and Physiology, Harper and Row, New York.
Tutin CEG, Fernandez M (1992) "Insect-eating by sympatric lowland gorillas (Gorilla g. gorilla) and chimpanzees (Pan t. troglodytes) in the Lopé Reserve, Gabon." American Journal of Primatology, vol. 28, pp. 29-40.
Uauy R, Peirano P, Hoffman D, Mena P, Birch D, Birch E (1996) "Role of essential fatty acids in the function of the developing nervous system." Lipids, vol. 31 (suppl), pp. S-167 to S-176.
U.S. Dept. of Agriculture (USDA) Composition of Foods handbook series: (1984) handbook 8-11, Vegetables and Vegetable Products; (1986) handbook 8-16, Legumes and Legume Products; (1989) handbook 8-20, Cereal Grains and Pasta; U.S. Government Printing Office.
van Lawick-Goodall J (1971) In the Shadow of Man, Houghton Mifflin Company, Boston.
Wakayama EJ, Dillwith JW, Howard RW, Blomquist GJ (1984) "Vitamin B-12 levels in selected insects." Insect Biochemistry, vol. 14, pp. 175-179.
Wapner R (1990) Protein Nutrition and Mineral Absorption, CRC Press, Boca Raton, Florida.
Wardlaw G (1999) Perspectives in Nutrition (4th edition). McGraw-Hill, New York.
Washburn SL (1959) "Speculations on the interrelations of the history of tools and biological evolution." In: The Evolution of Man's Capacity for Culture, Wayne State University Press, Detroit, pp. 21-31.
Wang X, Krinsky N, Tang G, Russell R (1992) "Retinoic acid can be produced from excentric cleavage of beta-carotene in human intestinal mucosa." Archives of Biochemistry and Biophysics, vol. 293, pp. 298-304.
Wang X (1994) "Review: absorption and metabolism of beta-carotene." Journal of the American College of Nutrition, vol. 13, pp. 314-325.
Watts DP (1989) "Ant eating behavior of mountain gorillas." Primates, vol. 30, pp. 121-125.
Wenxun F, Parker R, Parpia B, Yinsheng Q, Cassano P, Crawford M, Leyton J, Tian J, Junyao L, Junshi C, Campbell TC (1990) "Erythrocyte fatty acids, plasma lipids, and cardiovascular disease in rural China." American Journal of Clinical Nutrition, vol. 52, pp. 1077-1036.


Westergaard GC, Suomi SJ (1994) "A simple stone-tool technology in monkeys." Journal of Human Evolution, vol. 27, pp. 399-404.
Williams WR (1908) The Natural History of Cancer. William Wood and Company, New York.
Wood BA (1995) "Evolution of the early hominid masticatory system: mechanisms, events, and triggers." In: Paleoclimate and Evolution, with Emphasis on Human Origins, eds. Vrba ES, Denton GH, Partridge TC, Burckle LH; Yale University Press, New Haven, pp. 438-450. (Courtesy link to Yale University Press's web page for the book Paleoclimate and Evolution graciously provided in exchange for permission to reproduce Fig. 29.1 from the above article.)
Wood C (1994) "The correspondence between diet and masticatory morphology in a range of extant primates." Zeitschrift fuer Morphologie und Anthropologie, vol. 80, pp. 19-50 (abstract).
Woods J, Ward G, Salem J (1996) "Is docosahexaenoic acid necessary in infant formula? Evaluation of high linoleate diets in the neonatal rat." Pediatric Research, vol. 40, pp. 687-694.
Young V, Pellett P (1994) "Plant proteins in relation to human protein and amino acid nutrition." American Journal of Clinical Nutrition, vol. 59(suppl), pp. 1203S-1212S.
Zephyr (1996) Instinctive Eating: The Lost Knowledge of Optimum Nutrition, Pan Piper Press, Pahoa, Hawaii.
Zhang Y-P, Ryder OA (1994) "Phylogenetic relationships of bears (the Ursidae) inferred from mitochondrial DNA sequences." Molecular Phylogenetics and Evolution, vol. 3, pp. 351-359.
Zuckerman S (1933) Functional Affinities of Man, Monkeys, and Apes. Kegan Paul, Trench, Trubner & Co., London.

BIBLIOGRAPHY FOR Paleolithic Diet vs. Vegetarianism: What Was Humanity's Original Natural Diet?
Abrams, H. Leon Jr. (1979) "The relevance of Paleolithic diet in determining contemporary nutritional needs." Journal of Applied Nutrition, vol. 31, nos. 1 & 2, pp. 43-59.
Abrams, H. Leon Jr. (1980) "Vegetarianism: an anthropological/nutritional evaluation." Journal of Applied Nutrition, vol. 32, no. 2, pp. 53-87.
Abrams, H. Leon Jr. (1982) "Anthropological research reveals human dietary requirements for optimal health." Journal of Applied Nutrition, vol. 34, no. 1, pp. 38-45.
Abrams, H. Leon Jr. (1983) "Salt and sodium: an anthropological cross-cultural perspective in health and disease." Journal of Applied Nutrition, vol. 35, no. 2, pp. 127-158.


Aiello, Leslie C.; Wheeler, Peter (1995) "The expensive-tissue hypothesis: the brain and the digestive system in human and primate evolution." Current Anthropology, vol. 36, no. 2 (April 1995), pp. 199-221.
American Dietetic Association (1993) "Position of the American Dietetic Association: vegetarian diets." Journal of the American Dietetic Association, vol. 93 (Nov. 1993), pp. 1317-1319.
Ames, Bruce (1983) "Dietary carcinogens and anticarcinogens. Oxygen radicals and degenerative diseases." Science, vol. 221 (Sept. 23, 1983), pp. 1256-1264.
Ames, Bruce N.; Magaw, Renae; Gold, Lois Swirsky (1987) "Ranking possible carcinogenic hazards." Science, vol. 236 (Apr. 17, 1987), pp. 271-280. (Discussions follow in vol. 237 (Jul. 17, 1987), p. 235; vol. 237 (Sep. 18, 1987), pp. 1399-1400; vol. 238 (Dec. 18, 1987), pp. 1633-1635; vol. 240 (May 20, 1988), pp. 1043-1047.)
Andrews, Peter; Martin, Lawrence (1992) "Hominoid dietary evolution." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 39-49)
Angel, Lawrence J. (1984) "Health as a crucial factor in the changes from hunting to developed farming in the Eastern Mediterranean." In: Cohen, Mark; Armelagos, George J. (eds.) Paleopathology at the Origins of Agriculture. Orlando: Academic Press. (pp. 51-73)
Aoki K. (1991) "Time required for gene frequency change in a deterministic model of gene-culture coevolution, with special reference to the lactose absorption problem." Theoretical Population Biology, vol. 40, pp. 354-368.
Audette, Raymond V.; with Gilchrist, Troy (1995) Neander-Thin: A Cave Man's Guide to Nutrition. Dallas, Texas: Paleolithic Press.
Barzel, Uriel S.; Massey, Linda K. (1998) "Excess dietary protein can adversely affect bone." Journal of Nutrition, vol. 128, pp. 1051-1053.
Benditt, John (1989) "Cold water on the fire: a recent survey casts doubt on evidence for early use of fire." Scientific American, May 1989, pp. 21-22.
Benesh, Gerald (1971) "Excessive fruit eating." Ahimsa, vol. 17, no. 3 (May/Jun. 1976), pp. 1-2. (Reprinted from Dr. Shelton's Hygienic Review, Apr. 1971)
Berglund and Björck (1993) "Ice through the ages." In: Burenhult, Goran (ed.) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 82-83)
Bidwell, Victoria (1990) The Health Seekers' Yearbook: A Revolutionist's Handbook for Getting Well and Staying Well without The Medicine Men. Fremont, California: GetWell StayWell America.
Blumenschine, Robert J. (1992) "Hominid carnivory and foraging strategies, and the socio-economic function of early archaeological sites." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 51-61)
Bodmer, W.F.; Cavalli-Sforza, L.L. (1976) Genetics, Evolution, and Man. San Francisco, California: W.H. Freeman & Co.


Bower, Bruce (1989) "A world that never existed: researchers debate the pervasive view of modern hunter-gatherers as a window to humanity's past." Science News, vol. 135 (Apr. 29, 1989), pp. 264-266.
Breslau, N.A.; Brinkley, L.; Hill, K.; Pak, C.Y.C. (1988) "Relationship of animal-protein rich diet to kidney stone formation and calcium metabolism." J Clin Endocrinol Metab, vol. 66, pp. 140-146.
Brody, Jane E. (1979) "Studies suggest a harmful shift in today's menu." New York Times (Science Section), May 15, 1979, p. C1 and following.
Burch, Ernest S. Jr.; Ellanna, Linda J. (eds.) (1994) Key Issues in Hunter-Gatherer Research. Oxford, England; Providence, Rhode Island: Berg.
Burenhult, Goran (ed.) (1993a) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers.
Burenhult, Goran (ed.) (1993b) People of the Stone Age: Hunter-Gatherers and Early Farmers. New York: Harper-Collins Publishers.
Burenhult, Goran (1993c) "Towards homo sapiens: habilines, erectines, and neanderthals." In: Burenhult, Goran (ed.), The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 55-59, 62-64, 66-67)
Burenhult, Goran (1993d) "Modern people in Africa and Europe." In: Burenhult, Goran (ed.), The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 77-81, 84-85, 88-93, 95)
Campbell, T. Colin; Chen, J. (1994) "Diet and chronic degenerative diseases: perspectives from China." American Journal of Clinical Nutrition, vol. 59, pp. 1153S-1161S.
Catassi, C.; Ratsch, I.-M.; Fabiani, E. (1994) "Celiac disease in the year 2000: exploring the iceberg." The Lancet (North American edition), Jan. 22, 1994, vol. 343, pp. 200-203.
Cavalli-Sforza, L. Luca; Menozzi, Paolo; Piazza, Alberto (1994) The History and Geography of Human Genes. Princeton, New Jersey: Princeton University Press.
Cinque, Ralph C. (1976) "Man's natural diet." Ahimsa, vol. 17, no. 3 (May/Jun. 1976), pp. 1-2, 4. (Reprinted from Dr. Shelton's Hygienic Review, Jun. 1976)
Coursey, D.G. (1975) "The origins and domestication of yams in Africa." In: Arnott, Margaret L. (ed.), Gastronomy: The Anthropology of Food and Food Habits. The Hague: Mouton Publishers. Distributed in U.S. by Aldine Publishing Co., Chicago, Illinois. (pp. 187-212)
Crawford, Michael A. (1992) "The role of dietary fatty acids in biology: their place in the evolution of the human brain." Nutrition Reviews, vol. 50, no. 4 (April 1992, part 2), pp. 3-11.
Cridland, Ronald (1988) "Dr. Ronald Cridland replies," in "Truth, Not Tradition, Is the Basis of Natural Hygiene." Health Science, March/April 1988, pp. 7-9. (On food combining, antibiotics, vaccines, and infectious disease, including an ANHS conference where many Natural Hygiene attendees came down with flu-like symptoms.)
Cridland, Ronald (1994a) "Ultimate in vegetables." Health Science, May/Jun. 1994, vol. 17, no. 3, p. 21. (On gas due to raw broccoli and cauliflower.)


Cridland, Ronald (1994b) "Unwanted natural gas." Health Science, July/Aug. 1994, vol. 17, no. 3, pp. 22-23. (On gas even while following food combining, etc.)
Davidson, Iain; Noble, William (1993) "When did language begin?" In: Burenhult, Goran (ed.) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (p. 46)
Davis, Devra Lee (1987) "Paleolithic Diet, Evolution, and Carcinogens." Science, vol. 238, pp. 1633-1634 (includes response by Bruce Ames).
DeFoliart, Gene R. (compiled by) (1995a) "Food insects hold prominent place in international symposium on biodiversity in agriculture for a sustainable future, Beijing, China, September 19-21, 1995." The Food Insects Newsletter, November 1995, vol. 8, no. 3, pp. 1-4.
DeFoliart, Gene R. (compiled by) (1995b) "The human use of insects as food in Uganda." The Food Insects Newsletter, March 1995, vol. 8, no. 1, pp. 1, 10.
Diamond, Jared (1996) "Why women change." Discover, vol. 17 (July 1996), no. 7, pp. 130-137. (On the evolution of menopause, and selection for longevity in humans.)
Dinshah, Freya (1976) "Menstruation." Ahimsa, vol. 17, no. 2 (Mar./Apr. 1976), p. 6.
Dinshah, H. Jay (1976-77) "Fruit for thought: should man live by fruit alone?" (a 4-part article series). Ahimsa; Part 1 in vol. 17, no. 2 (Mar./Apr. 1976), pp. 1, 4, 7-8; Part 2 in vol. 17, no. 3 (May/Jun. 1976), pp. 1, 7-8; Part 3 in vol. 17, no. 4 (Jul./Aug., Sept./Oct., Nov./Dec. 1976 combined issue), pp. 1-9, 11-12; Part 4 in vol. 18, no. 1 (Jan./Mar. 1977), pp. 1-3, 5-8.
Eaton, S. Boyd; Konner, Melvin (1985) "Paleolithic nutrition: a consideration of its nature and current implications." The New England Journal of Medicine, vol. 312, no. 5 (Jan. 31, 1985), pp. 283-289.
Eaton, S.B.; Konner, Melvin; Shostak, Marjorie (1988a) "Stone-agers in the fast lane: chronic degenerative diseases in evolutionary perspective." American Journal of Medicine, vol. 84, pp. 739-749.
Eaton, S. Boyd; Shostak, Marjorie; Konner, Melvin (1988b) The Paleolithic Prescription: A Program of Diet and Exercise and a Design for Living. New York: Harper & Row.
Eaton, S.B.; Nelson, D.A. (1991) "Calcium in evolutionary perspective." American Journal of Clinical Nutrition, vol. 54, pp. 281S-287S.
Eaton, S. Boyd (1992) "Humans, Lipids, and Evolution." Lipids, vol. 27, no. 10 (1992), pp. 814-820.
Eaton, S. Boyd; Eaton, Stanley B. III; Konner, Melvin J.; Shostak, Marjorie (1996) "An evolutionary perspective enhances understanding of human nutritional requirements." Journal of Nutrition, vol. 126 (1996), pp. 1732-1740.
Eaton, S. Boyd; Eaton, Stanley B. III (1998) "Evolution, diet and health." Presented in association with the scientific session "Origins and Evolution of Human Diet" at the 14th International Congress of Anthropological and Ethnological Sciences in Williamsburg, Virginia.
Eldredge, Niles (1995) Reinventing Darwin: The Great Debate at the High Table of Evolutionary Theory. New York: John Wiley & Sons, Inc.
Ember, C.R. (1978) "Myths about hunter-gatherers." Ethnology, vol. 17, pp. 439-448.


Fischman, Joshua (1996a) "A fireplace in France." Discover, Jan. 1996, vol. 17, no. 1, p. 69.
Fischman, Joshua (1996b) "Oldest Europeans reign in Spain." Discover, Jan. 1996, vol. 17, no. 1, pp. 66-67.
Foley, Robert (1995) Humans Before Humanity. Cambridge, Massachusetts: Blackwell Publishers, Inc.
Forni, Gaetano (1975) "The origin of grape wine: a problem of historical ecological anthropology." In: Arnott, Margaret L. (ed.), Gastronomy: The Anthropology of Food and Food Habits. The Hague: Mouton Publishers. Distributed in U.S. by Aldine Publishing Co., Chicago, Illinois. (pp. 67-78)
Gittleman, Ann Louise (with James Templeton and Candelore Versace) (1996) Your Body Knows Best: The Revolutionary Eating Plan That Helps You Achieve Your Optimal Weight and Energy Level for Life. New York: Pocket Books, a division of Simon & Schuster Inc.
Goldfinger, Stephen E. (1992) "Celiac disease: a cereal mystery." Harvard Health Letter, vol. 17, no. 4 (Feb. 1992), pp. 6-8.
Goodall, Jane (1986) The Chimpanzees of Gombe: Patterns of Behavior. Cambridge, Mass.; London, England: The Belknap Press of Harvard Univ. Press.
Goudsblom, Johan (1992) Fire and Civilization. London, England; New York, New York: Penguin Books.
Gould, Stephen Jay (1996) "Creating the Creators." Discover, vol. 17, no. 10 (Oct. 1996), pp. 43-54.
Gray, Hope; Taylor, Susan; Lennon, James Michael (1992) "Diet and disease: the China study." Health Science, vol. 15, no. 5 (Sept./Oct. 1992), pp. 6-9.
Greco, Luigi (1995) "From the Neolithic Revolution to Gluten Intolerance." This article, dated 6/30/1995, was published on a website with this address: http://www.hooked.net/users/sadams/history.html, and contains footnotes and bibliographic references. (Update: Unfortunately, a recent check showed this URL to no longer be working. Greco is a Ph.D./M.D. in pediatrics at the University of Naples, Italy, and a European authority on gluten intolerance. He can be reached via email at this address.)
Gross, Jane D.; Rich, Thomas H.; Vickers-Rich, Patricia (1993) "Dinosaur bone infection." Research and Exploration, vol. 9 (Summer 1993), pp. 286-293.
Groves, Colin (1993) "Our earliest ancestors." In: Burenhult, Goran (ed.), The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 33-40, 42-45, 47-52)
Health magazine (1996) "Are mushrooms really carcinogenic?" Health, vol. 10, no. 6 (Oct. 1996), p. 116.
Hitchcox, Lee (1996) Long Life Now: Strategies for Staying Alive. Berkeley, California: Celestial Arts.
Ho, Kang-Jey et al. (1972) "Alaskan Arctic Eskimo: responses to a customary high-fat diet." American Journal of Clinical Nutrition, vol. 25, no. 8 (Aug. 1972), pp. 737+.
Hoggan, Ron B. (in press, late 1996 or 1997) "Considering Wheat, Rye, and Barley Proteins as Aids to Carcinogens." Medical Hypotheses.


Hunt, J.R.; Gallagher, S.K.; Johnson, L.K.; Lykken, G.I. (1995) "High- vs. low-meat diets: effects on zinc absorption, iron status, and calcium, copper, iron, magnesium, manganese, nitrogen, phosphorus, and zinc balance in postmenopausal women." American Journal of Clinical Nutrition, vol. 62, pp. 621-632.
James, Steven R. (1989) "Hominid use of fire in the lower and middle Pleistocene: a review of the evidence." Current Anthropology, vol. 30, pp. 1-26.
Jerome, Norge W.; Kandel, Randy F.; Pelto, Gretel H. (eds.) (1980) Nutritional Anthropology: Contemporary Approaches to Diet and Culture. Pleasantville, New York: Redgrave Publishing Co.
Johnston, Francis E. (ed.) (1987) Nutritional Anthropology. New York: Alan R. Liss, Inc.
Jones, Nicholas B.; Hawkes, Kristen; Draper, Patricia (1994) "Differences between Hadza and !Kung children's work: original affluence or practical reason?" In: Burch, Ernest S. Jr.; Ellanna, Linda J., Key Issues in Hunter-Gatherer Research. Oxford, England; Providence, Rhode Island: Berg. (pp. 189-215)
Kopytoff, Verne G. (1995) "Meat viewed as staple of chimp diet and mores." New York Times, Jun. 7, 1995.
Kulvinskas, Viktoras (1975) Survival into the 21st Century. Wethersfield, Conn.: Omangod Press.
Lewin, Roger (1993a) The Origin of Modern Humans. New York: Scientific American Library, a division of HPHLP.
Lewin, Roger (1993b) "Shock of the past for modern medicine." New Scientist, vol. 140 (Oct. 23, 1993), pp. 28-32.
Liljegren, Ronnie (1993) "Animals of ice age Europe." In: Burenhult, Goran (ed.) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 86-87)
Lutz, W.J. (1995) "The Colonisation of Europe and Our Western Diseases." Medical Hypotheses, vol. 45, pp. 115-120.
Mann, George V.; Spoerry, Anne; Gray, Margarete; Jarashow, Debra (1972) "Atherosclerosis in the Masai." American Journal of Epidemiology, vol. 95, no. 1 (Jan. 1972), pp. 26-37.
Mapes, Lynda V. (1993) "Breatharian proves you are what you eat." The Spokesman-Examiner, unknown issue date, but apparently late 1993 sometime. This article, datelined "Seattle," was forwarded to me in Nov. or Dec. 1993 by a participant in The Natural Hygiene M2M who included only partial source data. Given the M2M participant was living in Washington state at the time the article was forwarded, we are assuming the piece was clipped from a local or regional paper at the time it was published. The article has clear earmarks of having been published in a Seattle metropolitan-area newspaper, but unfortunately I cannot trace it more specifically than this.
Martin, Glen (1996) "Keepers of the oaks." Discover, Aug. 1996, vol. 17, no. 8, pp. 44-50.
Mazess, Richard B.; Mather, Warren (1974) "Bone mineral content of North Alaskan Eskimos." American Journal of Clinical Nutrition, vol. 27, no. 9 (Sept. 1974), pp. 916-925.
McArdle, John (1996) "Humans are omnivores." In: Vegetarian Resource Group (ed.) (1996) The Vegan Handbook. Vegetarian Resource Group, P.O. Box 1463, Baltimore, MD 21203.


McGrew, W.C. (1992) Chimpanzee Material Culture: Implications for Human Evolution. Cambridge, England: Cambridge University Press.
Megarry, Tim (1995) Society in Prehistory: The Origins of Human Culture. New York: New York University Press.
Menon, Shanti (1996) "Oldest Asians in China." Discover, Jan. 1996, vol. 17, no. 1, p. 66.
Milton, Katharine (1993) "Diet and Primate Evolution." Scientific American, Aug. 1993, pp. 86-93.
Mogelonsky, Marcia (1995) "Milk Doesn't Always Do a Body Good." American Demographics (Jan. 1995). Online version available at: http://www.demographics.com/publications/ad/95_ad/9501_ad/9501ab07.htm.
Montgomery, Sy (1991) Walking with the Great Apes: Jane Goodall, Dian Fossey, Biruté Galdikas. Boston; New York; London: Houghton Mifflin Co.
National Research Council (U.S.), Committee on Diet and Health (1989) Diet and Health: Implications for Reducing Chronic Disease Risk. Washington, D.C.: National Academy Press.
Natural Hygiene Press (1982) Fit Food for Humanity. Bridgeport, Connecticut: Natural Hygiene Press.
O'Dea, K. (1992) "Traditional diet and food preferences of Australian Aboriginal hunter-gatherers." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 73-81)
Palmqvist, Lennart (1993) "First Farmers of the Western World." In: Burenhult, Goran (ed.) People of the Stone Age: Hunter-Gatherers and Early Farmers. New York: Harper-Collins Publishers. (pp. 17-21, 24-26, 28-29, 32-35)
Patel, Tara (1995) "Burnt stones and rhino bones hint at earliest fire." New Scientist, June 17, 1995, p. 5.
Peters, Charles R.; O'Brien, Eileen M. (1984 or 85?) "On hominid diet before fire." Current Anthropology, vol. ___, no. ___, pp. 358-360. [I am missing the issue date on this one. The article was a response to Ann Brower Stahl's "Hominid dietary selection before fire," which had earlier appeared in Current Anthropology, vol. 25, no. 2 (April 1984), pp. 151-168.]
Plowright, Walter (1988) "Viruses transmissible between wild and domestic animals." In: Smith, G.R.; Hearn, J.P. (eds.) Reproduction and Disease in Captive and Wild Animals. Oxford, Oxfordshire; New York: Oxford Univ. Press. (pp. 175-199)
Rensberger, Boyce (1979) "Teeth show fruit was the staple." New York Times, May 15, 1979, p. C1 and following.
Robson, John R.K. (ed.) (1988) Food, Ecology and Culture: Readings in the Anthropology of Dietary Practices. New York: Gordon and Breach.
Rowley-Conwy, Peter (1993a) "Mighty Hunter or Marginal Scavenger?" In: Burenhult, Goran (ed.) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (pp. 60-61)


Rowley-Conwy, Peter (1993b) "What do the Zhoukoudian finds tell us?" In: Burenhult, Goran (ed.) The First Humans: Human Origins and History to 10,000 B.C. New York: Harper-Collins Publishers. (p. 65)
Ruff, C.B.; Trinkaus, E.; Holliday, T.W. (1997) "Body mass and encephalization in Pleistocene Homo." Nature, vol. 387, pp. 173-176. (Figures for brain mass across human evolutionary time can be found in Table 1 on p. 174.)
Scarre, Chris (ed.) (1993) Smithsonian Timelines of the Ancient World: A Visual Chronology from the Origins of Life to A.D. 1500. New York: Dorling Kindersley.
Seamens, Dan (1992) "Eating for optimum health: is there a place for animal food in a healthy diet?" Natural Health, Nov./Dec. 1992, pp. 62-69.
Sears, Cathy (1990) "The chimpanzee's medicine chest." New Scientist, vol. 127 (Aug. 4, 1990), pp. 42-44.
Shnirelman, Victor A. (1994) "Cherchez le Chien: perspectives on the economy of the traditional fishing-oriented people of Kamchatka." In: Burch, Ernest S. Jr.; Ellanna, Linda J., Key Issues in Hunter-Gatherer Research. Oxford, England; Providence, Rhode Island: Berg. (pp. 169-188)
Sillen, A. (1992) "Strontium-calcium (Sr/Ca) ratios of Australopithecus robustus and associated fauna from Swartkrans." Journal of Human Evolution, vol. 23, pp. 495-516.
Simoons, Frederick J. (1988) "The determinants of dairying and milk use in the old world: ecological, physiological, and cultural." In: Robson, John R.K. (ed.) (1988) Food, Ecology and Culture: Readings in the Anthropology of Dietary Practices. New York: Gordon and Breach. (pp. 83-91)
Simoons, Frederick J. (1994) Eat Not This Flesh: Food Avoidances from Prehistory to the Present (2nd ed.). Madison, Wisconsin: Univ. of Wisconsin Press.
Smith, G.R.; Hearn, J.P. (eds.) (1988a) Reproduction and Disease in Captive and Wild Animals. Oxford, Oxfordshire; New York: Oxford Univ. Press.
Smith, G.R. (1988b) "Anaerobic bacteria as pathogens in wild and captive animals." In: Smith, G.R.; Hearn, J.P. (eds.) Reproduction and Disease in Captive and Wild Animals. Oxford, Oxfordshire; New York: Oxford Univ. Press. (pp. 159-173)
Southgate, D.A.T. (1992) "Nature and variability of human food consumption." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 121-128)
Spencer, H.; Kramer, L.; DeBartolo, M.; Norris, C.; Osis, D. (1983) "Further studies on the effect of a high-protein diet as meat on calcium metabolism." American Journal of Clinical Nutrition, vol. 37, no. 6 (Jun. 1983), pp. 924-929.
Stahl, Ann Brower (1984) "Hominid dietary selection before fire." Current Anthropology, April 1984, vol. 25, no. 2, pp. 151-168.
Stanford, Craig B. (1995) "To catch a colobus." Natural History, January 1995, pp. 48-54.
Thiessen, Del (1996) Bittersweet Destiny: The Stormy Evolution of Human Behavior. New Brunswick, New Jersey: Transaction Publishers.


Tutin, Caroline E.G.; Fernandez, Michel; Rogers, M. Elizabeth; Williamson, Elizabeth A.; McGrew, William C. (1992) "Foraging profiles of sympatric lowland gorillas and chimpanzees in the Lopé Reserve, Gabon." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 19-26)
Ulijaszek, Stanley J. (1992) "Human dietary change." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 111-119)
Walker, Alan; Shipman, Pat (1996) The Wisdom of the Bones: In Search of Human Origins. New York: Alfred A. Knopf.
Whiten A. and Widdowson E.M. (editors/organizers) (1992) Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press.
Wilford, John N. (1996) "Three Human Species Coexisted On Earth, New Data Suggest." New York Times, 12/13/96. [The original reference paper (title unknown to me) was published in the 12/13/96 issue of the journal Science. Wilford's article gives a summary of the major points.]
Williams, George C.; Nesse, Randolph M. (1991) "The dawn of Darwinian medicine." The Quarterly Review of Biology, Mar. 1991, vol. 66, pp. 1-22.
Wrangham, R.W.; Conklin, N.L.; Chapman, C.A.; Hunt, K.D. (1992) "The significance of fibrous foods for Kibale Forest chimpanzees." In: Whiten A. and Widdowson E.M. (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 11-18)
Wrangham, R.W.; Holland Jones, J.; Laden, G.; Pilbeam, D.; Conklin-Brittain, N.L. (1999) "The raw and the stolen: cooking and the ecology of human origins." Current Anthropology, Dec. 1999, vol. 40, no. 5, pp. 567-594.
Wu Rukang and Lin Shenglong (1983) "Peking man." Scientific American, June 1983, vol. 248, no. 6, pp. 86-94.
Yesner, David R. (1994) "Seasonality and resource 'stress' among hunter-gatherers: archaeological signatures." In: Burch, Ernest S. Jr.; Ellanna, Linda J., Key Issues in Hunter-Gatherer Research. Oxford, England; Providence, Rhode Island: Berg. (pp. 151-167)

The Late Role of Grains and Legumes in the Human Diet, and Biochemical Evidence of their Evolutionary Discordance

REFERENCES
Aoki K (1991) "Time required for gene frequency change in a deterministic model of gene-culture coevolution, with special reference to the lactose absorption problem." Theoretical Population Biology, vol. 40, pp. 354-368.


Auger I et al. (1997) "A function for the QKRAA amino acid motif: mediating binding of DnaJ to DnaK." J Clin Invest, vol. 99, pp. 1818-1822.
Batchelor AJ et al. (1983) "Reduced plasma half-life of radio-labelled 25-hydroxy vitamin D3 in subjects receiving a high-fiber diet." Brit J Nutr, vol. 49, pp. 213-216.
Berger SA et al. (1989) "Relationship between infectious diseases and human blood type." Eur J Clin Microbiol Infect Dis, vol. 8, pp. 681-689.
Blair R et al. (1989) "Biotin bioavailability from protein supplements and cereal grains for growing broiler chickens." Int J Vit Nutr Res, vol. 59, pp. 55-58.
Bradbury JH et al. (1984) "Digestibility of proteins of the histological components of cooked and raw rice." Brit J Nutr, vol. 52, pp. 507-513.
Brand-Miller JC, Colagiuri S (1994) "The carnivore connection: dietary carbohydrate in the evolution of NIDDM." Diabetologia, vol. 37, pp. 1280-1286.
Budiansky S (1992) The Covenant of the Wild: Why Animals Chose Domestication. New York: William Morrow & Co.
Buonocore V et al. (1977) "Wheat protein inhibitors of alpha amylase." Phytochemistry, vol. 16, pp. 811-820.
Calloway DH et al. (1971) "Reduction of intestinal gas-forming properties of legumes by traditional and experimental processing methods." J Food Sci, vol. 36, pp. 251-255.
Cavalli-Sforza LL et al. (1993) "Demic expansions and human evolution." Science, vol. 259, pp. 639-646.
Clement MR et al. (1987) "A new mechanism for induced vitamin D deficiency in calcium deprivation." Nature, vol. 325, pp. 62-65.
Cordain L (1999) "Cereal grains: humanity's double-edged sword." World Review of Nutrition and Dietetics, vol. 84, pp. 19-73.
Dagnelie PC et al. (1990) "High prevalence of rickets in infants on macrobiotic diets." Am J Clin Nutr, vol. 51, pp. 202-208.
De Castro JMB et al. (1997) "A hominid from the Lower Pleistocene of Atapuerca, Spain: possible ancestor to Neandertals and modern humans." Science, vol. 276, pp. 1392-1395.
Dickey W et al. (1994) "Lewis phenotype, secretor status, and coeliac disease." Gut, vol. 35, pp. 769-770.
Dybwad A et al. (1996) "Increases in serum and synovial fluid antibodies to immunoselected peptides in patients with rheumatoid arthritis." Ann Rheum Dis, vol. 55, pp. 437-441.
Eaton SB et al. (1985) "Paleolithic nutrition: A consideration of its nature and current implications." N Engl J Med, vol. 312, pp. 283-289.
Eaton SB (1992) "Humans, lipids and evolution." Lipids, vol. 27, pp. 814-820.


Elliott RB et al. (1984) "Dietary protein: a trigger of insulin-dependent diabetes in the BB rat?" Diabetologia, vol. 26, pp. 297-299.
Ewer TK (1950) "Rachitogenicity of green oats." Nature, vol. 166, pp. 732-733.
Ford JA et al. (1972) "Biochemical response of late rickets and osteomalacia to a chupatty-free diet." Brit Med J, 1972;ii, pp. 446-447.
Ford JA et al. (1977) "A possible relationship between high-extraction cereal and rickets and osteomalacia." Advances in Exp Med & Biol, vol. 81, pp. 353-362.
Golenser J et al. (1983) "Inhibitory effect of a fava bean component on the in vitro development of Plasmodium falciparum in normal and glucose-6-phosphate dehydrogenase-deficient erythrocytes." Blood, vol. 61, pp. 507-510.
Golub MS et al. (1996) "Adolescent growth and maturation in zinc-deprived rhesus monkeys." Am J Clin Nutr, vol. 64, pp. 274-282.
Grant G et al. (1982) "The effect of heating on the haemagglutinating activity and nutritional properties of bean (Phaseolus vulgaris) seeds." J Sci Food Agric, vol. 33, pp. 1324-1326.
Gupta YP (1987) "Antinutritional and toxic factors in food legumes: a review." Plant Foods for Human Nutrition, vol. 37, pp. 201-228.
Halsted JA et al. (1972) "Zinc deficiency in man, The Shiraz Experiment." Am J Med, vol. 53, pp. 277-284.
Harlan JR (1992) Crops and Man. Madison, Wisconsin: American Society of Agronomy, Inc.
Hawkes K, O'Connell JF (1985) "Optimal foraging models and the case of the !Kung." Am Anthropologist, vol. 87, pp. 401-405.
Hein HO et al. (1992) "The Lewis blood group--a new genetic marker of ischaemic heart disease." J Intern Med, vol. 232, pp. 481-487.
Hengtrakul P et al. (1991) "Effects of cereal alkylresorcinols on human platelet thromboxane production." J Nutr Biochem, vol. 2, pp. 20-24.
Hidiroglou M et al. (1980) "Effect of a single intramuscular dose of vitamin D on concentrations of liposoluble vitamins in the plasma of heifers winter-fed oat silage, grass silage or hay." Can J Anim Sci, vol. 60, pp. 311-318.
Hochman LG et al. (1993) "Brittle nails: response to daily biotin supplementation." Cutis, vol. 51, pp. 303-305.
Hoorfar J et al. (1993) "Prophylactic nutritional modification of the incidence of diabetes in autoimmune non-obese diabetic (NOD) mice." Brit J Nutr, vol. 69, pp. 597-607.
Kimbel WH et al. (1996) "Late Pliocene Homo and Oldowan tools from the Hadar formation (Kada Hadar Member), Ethiopia." J Hum Evol, vol. 31, pp. 549-561.
Koch AE et al. (1994) "Soluble intercellular adhesion molecule-1 in arthritis." Clin Immunol Immunopathol, vol. 71, pp. 208-215.


Kopinski JS et al. (1989) "Biotin studies in pigs: Biotin availability in feedstuffs for pigs and chickens." Brit J Nutr, vol. 62, pp. 773-780.
Larick R, Ciochon RL (1996) "The African emergence and early dispersals of the genus Homo." Am Scientist, vol. 84, pp. 538-551.
Liener IE (1986) "Nutritional significance of lectins in the diet." In: Liener IE (ed.), The Lectins: Properties, Functions, and Applications in Biology and Medicine. Orlando: Academic Press. (pp. 527-552)
Liener IE (1994) "Implications of antinutritional components in soybean foods." Crit Rev Food Sci Nutr, vol. 34, pp. 31-67.
Livingstone JN, Purvis BJ (1980) "Effects of wheat germ agglutinin on insulin binding and insulin sensitivity of fat cells." American Journal of Physiology, vol. 238, pp. E267-E275.
MacAuliffe T et al. (1976) "Variable rachitogenic effects of grain and alleviation by extraction or supplementation with vitamin D, fat and antibiotics." Poultry Science, vol. 55, pp. 2142-2147.
Macri A et al. (1977) "Adaptation of the domestic chicken, Gallus domesticus, to continuous feeding of albumin amylase inhibitors from wheat flour as gastro-resistant microgranules." Poultry Science, vol. 56, pp. 434-441.
Neel JV (1962) "Diabetes mellitus: a 'thrifty' genotype rendered detrimental by 'progress'?" Am J Hum Genetics, vol. 14, pp. 353-362.
Noah ND et al. (1980) "Food poisoning from raw red kidney beans." Brit Med J, vol. 2, pp. 236-237.
O'Donnell MD et al. (1976) "Purification and properties of an alpha-amylase inhibitor from wheat." Biochimica et Biophysica Acta, vol. 422, pp. 159-169.
Oldstone MBA (1987) "Molecular mimicry and autoimmune disease." Cell, vol. 50, pp. 819-820.
Perez-Maceda B et al. (1991) "Antibodies to dietary antigens in rheumatoid arthritis--possible molecular mimicry mechanism." Clin Chim Acta, vol. 203, pp. 153-165.
Pusztai A et al. (1981) "The toxicity of Phaseolus vulgaris lectins: nitrogen balance and immunochemical studies." J Sci Food Agric, vol. 32, pp. 1037-1046.
Pusztai A et al. (1993a) "Antinutritive effects of wheat-germ agglutinin and other N-acetylglucosamine-specific lectins." Brit J Nutr, vol. 70, pp. 313-321.
Pusztai A (1993b) "Dietary lectins are metabolic signals for the gut and modulate immune and hormone functions." Eur J Clin Nutr, vol. 47, pp. 691-699.
Reaven GM (1994) "Syndrome X: Six years later." J Int Med, vol. 236 (suppl. 736), pp. 13-22.
Reinhold JG (1971) "High phytate content of rural Iranian bread: a possible cause of human zinc deficiency." Am J Clin Nutr, vol. 24, pp. 1204-1206.
Robertson I et al. (1981) "The role of cereals in the aetiology of nutritional rickets: the lesson of the Irish National Nutrition Survey 1943-1948." Brit J Nutr, vol. 45, pp. 17-22.


Sandstrom B et al. (1987) "Zinc absorption in humans from meals based on rye, barley, oatmeal, triticale and whole wheat." J Nutr, vol. 117, pp. 1898-1902.
Schalin-Jantti C et al. (1996) "Polymorphism of the glycogen synthase gene in hypertensive and normotensive subjects." Hypertension, vol. 27, pp. 67-71.
Schechter Y (1983) "Bound lectins that mimic insulin produce persistent insulin-like activities." Endocrinology, vol. 113, pp. 1921-1926.
Scott FW, Sarwar G, Cloutier HE (1988a) "Diabetogenicity of various protein sources in the diet of the diabetes-prone BB rat." Advances in Experimental Medicine and Biology, vol. 246, pp. 277-285.
Scott FW et al. (1988b) "Evidence for a critical role of diet in the development of insulin-dependent diabetes mellitus." Diabetes Research, vol. 7, pp. 153-157.
Scott FW et al. (1991) "Conference summary: Diet as an environmental factor in development of insulin-dependent diabetes mellitus." Can J Physiol Pharmacol, vol. 69, pp. 311-319.
Sedlet K et al. (1984) "Growth-depressing effects of 5-n-pentadecylresorcinol: a model for cereal alkylresorcinols." Cereal Chem, vol. 61, pp. 239-241.
Simoons FJ (1978) "The geographic hypothesis and lactose malabsorption: A weighing of the evidence." Dig Dis, vol. 11, pp. 963-980.
Simoons FJ (1981) "Celiac disease as a geographic problem." In: Walcher DN, Kretchmer N (eds.) Food, Nutrition and Evolution. New York: Masson Publishing. (pp. 179-199)
Singh VK et al. (1993) "Antibodies to myelin basic protein in children with autistic behavior." Brain, Behavior and Immunity, vol. 7, pp. 97-103.
Sly MR et al. (1984) "Exacerbation of rickets and osteomalacia by maize: a study of bone histomorphometry and composition in young baboons." Calcif Tissue Int, vol. 36, pp. 370-379.
Sobiech KA et al. (1983) "Determination of amylase by measurement of enzymatic activity and by enzyme immunoassay and radioimmunoassay." Arch Immunol Therap Exp, vol. 31, pp. 845-848.
Stead IM, Bourke JB, Brothwell D (1986) Lindow Man: The Body in the Bog. Ithaca, New York: Cornell University Press.
Stephen A (1994) "Whole grains--impact of consuming whole grains on physiological effects of dietary fiber and starch." Crit Rev Food Sci Nutr, vol. 34, pp. 499-511.
Storlien LH et al. (1996) "Laboratory chow-induced insulin resistance: a possible contributor to autoimmune type I diabetes in rodents." Diabetologia, vol. 39, pp. 618-620.
Warren RP et al. (1996) "Strong association of the third hypervariable region of HLA-DR beta 1 with autism." J Neuroimmunol, vol. 67, pp. 97-102.
Watkins BA (1990) "Dietary biotin effects on desaturation and elongation of 14C linoleic acid in the chicken." Nutr Res, vol. 10, pp. 325-334.


Zohary D (1969) "The progenitors of wheat and barley in relation to domestication and agricultural dispersal in the old world." In: Ucko PJ, Dimbleby GW (eds.) The Domestication and Exploitation of Plants and Animals. Chicago: Aldine Publishing Co. (pp. 46-66)

Dietary Macronutrient Ratios and their Effect on Biochemical Indicators of Risk for Heart Disease

REFERENCES

Artaud-Wild SM et al. (1993) "Differences in coronary mortality can be explained by differences in cholesterol and saturated fat intakes in 40 countries but not in France and Finland. A paradox." Circulation, vol. 88, pp. 2771-2779.
Ascherio A et al. (1997) "Health effects of trans fatty acids." American Journal of Clinical Nutrition, vol. 66, pp. 1006s-1010s.
ASCN/AIN Task Force on Trans Fatty Acids (1996) "Position paper on trans fatty acids." American Journal of Clinical Nutrition, vol. 63, pp. 663-670.
Austin MA et al. (1988) "Inheritance of low-density lipoprotein subclass patterns: results of complex segregation analysis." American Journal of Human Genetics, vol. 43, pp. 838-846.
Bang HO, Dyerberg J (1980) "Lipid metabolism and ischemic heart disease in Greenland Eskimos." In: Draper HH (ed.) Advances in Nutrition Research, vol. 3, New York: Plenum Press, pp. 1-22.
Begom R et al. (1995) "Prevalence of coronary artery disease and its risk factors in the urban population of south and north India." Acta Cardiologica, vol. 50, pp. 227-240.
Brown MS, Goldstein JL (1976) "Receptor-mediated control of cholesterol metabolism." Science, vol. 191, pp. 150-154.
Chen YDI et al. (1992) "Effect of acute variations in dietary fat and carbohydrate intake on retinyl ester content of intestinally derived lipoproteins." J Clin Endocrin Metabolism, vol. 74, pp. 28-32.
Collier GR et al. (1988) "The acute effect of fat on insulin secretion." J Clin Endocrin Metabolism, vol. 66, pp. 323-326.
Cordain L, Gotshall RW, Eaton SB (1997) "Evolutionary aspects of exercise." World Review of Nutrition and Dietetics, vol. 81, pp. 49-60.
Cordain L, Martin C, Florant G, Watkins BA (1998) "The fatty acid composition of muscle, brain, marrow and adipose tissue in elk: evolutionary implications for human dietary lipid requirements." World Review of Nutrition and Dietetics, vol. 83, p. 225.
Coulston AM et al. (1983) "Plasma glucose, insulin and lipid responses to high-carbohydrate, low-fat diets in normal humans." Metabolism, vol. 32, pp. 52-56.
Crawford MA et al. (1969) "Linoleic acid and linolenic acid elongation products in muscle tissue of Syncerus caffer and other ruminant species." Biochem J, vol. 115, pp. 25-27.


Denke MA, Grundy SM (1991) "Effects of fats high in stearic acid on lipid and lipoprotein concentrations in men." American Journal of Clinical Nutrition, vol. 54, pp. 1036-1040.
Denton D et al. (1995) "The effect of increased salt intake on blood pressure of chimpanzees." Nature Medicine, vol. 1, pp. 1009-1016.
Dietschy JM (1997) "Theoretical considerations of what regulates low-density lipoprotein and high-density lipoprotein cholesterol." American Journal of Clinical Nutrition, vol. 65, pp. 1581S-1589S.
Dreon DM et al. (1992) "Gene-diet interactions in lipoprotein metabolism." Monographs in Human Genetics, vol. 14, pp. 325-349.
Dreon DM, Krauss RM (1997) "Diet-gene interactions in human lipoprotein metabolism." J Am Coll Nutr, vol. 16, pp. 313-324.
Eaton SB, Konner M, Shostak M (1988) "Stone-agers in the fast lane: chronic degenerative diseases in evolutionary perspective." American Journal of Medicine, vol. 84, pp. 739-749.
Gardner CD et al. (1995) "Monounsaturated versus polyunsaturated dietary fat and serum lipids. A meta-analysis." Arterioscler Thromb Vasc Biol, vol. 15, pp. 1917-1927.
Ghafoorunissa (1984) "Essential fatty-acid nutritional status of apparently normal Indian men." Human Nutrition: Clinical Nutrition, vol. 38C, pp. 269-278.
Gray CG et al. (year unavailable; likely early 1980s) "The effect of a three-week adaptation to a low-carbohydrate/high-fat diet on metabolism and cognitive performance." Naval Research Center Report No. 90-20. Naval Health Research Center, San Diego, California.
Grundy SM (1986) "Comparison of monounsaturated fatty acids and carbohydrates for lowering plasma cholesterol." N Engl J Med, vol. 314, pp. 745-748.
Harris WS (1997) "n-3 fatty acids and serum lipoproteins: human studies." American Journal of Clinical Nutrition, vol. 65 (suppl.), pp. 1645s-1654s.
Hollenbeck CB et al. (1989) "Effects of sucrose on carbohydrate and lipid metabolism in NIDDM patients." Diabetes Care, vol. 12, pp. 62-66.
Howell WH et al. (1997) "Plasma lipid and lipoprotein responses to dietary fat and cholesterol: a meta-analysis." American Journal of Clinical Nutrition, vol. 65, pp. 1747-1764.
Jacobs DR et al. (1983) "Variability in individual serum cholesterol response to change in diet." Arteriosclerosis, vol. 3.
Jeppesen J et al. (1997) "Effects of low-fat, high-carbohydrate diets on risk factors for ischemic heart disease in postmenopausal women." American Journal of Clinical Nutrition, vol. 65, pp. 1027-1033.
Kalopissis AD et al. (1995) "Inhibition of hepatic very-low-density lipoprotein secretion in obese Zucker rats adapted to a high-protein diet." Metabolism, vol. 44, pp. 19-29.
Katan MB et al. (1988) "Congruence of individual responsiveness to dietary cholesterol and to saturated fat in humans." Journal of Lipid Research, vol. 29, pp. 883-892.


Keys A et al. (1965) "Serum cholesterol response to changes in the diet. IV. Particular saturated fatty acids in the diet." Metabolism, vol. 14, pp. 776-787.
Krauss RM, Dreon DM (1995) "Low-density lipoprotein subclasses and response to a low-fat diet in healthy men." American Journal of Clinical Nutrition, vol. 62, pp. 478s-487s.
Leaf A et al. (1988) "Cardiovascular effects of n-3 fatty acids." New England Journal of Medicine, vol. 318, pp. 549-557.
Leonard WR et al. (1994) "Correlates of low serum lipid levels among Evenki herders of Siberia." American Journal of Human Biology, vol. 6, pp. 329-338.
Louheranta AM et al. (1996) "Linoleic acid intake and susceptibility of very-low-density and low-density lipoproteins to oxidation in men." American Journal of Clinical Nutrition, vol. 63, pp. 698-703.
McKeigue PM et al. (1985) "Diet and risk factors for coronary heart disease in Asians in northwest London." Lancet, 1985;ii, pp. 1086-1090.
Mensink RP et al. (1987) "Effects of monounsaturated fatty acids versus complex carbohydrates on high-density lipoproteins in healthy men and women." Lancet, 1987;1, pp. 122-125.
Mensink RP, Katan MB (1992) "Effect of dietary fatty acids on serum lipids and lipoproteins. A meta-analysis of 27 trials." Arterioscler Thromb, Aug. 1992, vol. 12, no. 8, pp. 911-919.
Miller GJ et al. (1988) "Dietary and other characteristics relevant for coronary heart disease in men of Indian, West Indian and European descent in London." Atherosclerosis, vol. 70, pp. 63-72.
Mistry F et al. (1981) "Individual variation in the effects of dietary cholesterol on plasma lipoproteins and cellular homeostasis in man." J Clin Invest, vol. 67, pp. 493-502.
Nelson GJ et al. (1995) "Low-fat diets do not lower plasma cholesterol levels in healthy men compared to high-fat diets with similar fatty acid composition at constant caloric intake." Lipids, vol. 30, pp. 969-976.
Nishina PM et al. (1992) "Linkage of atherogenic lipoprotein phenotype to the low-density lipoprotein receptor locus on the short arm of chromosome 19." Proc Natl Acad Sci, vol. 89, pp. 708-712.
Phinney SD et al. (1983) "The human metabolic response to chronic ketosis without caloric restriction: physical and biochemical adaptation." Metabolism, vol. 32, pp. 757-768.
Reaven GM (1995) "Pathophysiology of insulin resistance in human disease." Physiol Rev, vol. 75, pp. 473-486.
Reddy S et al. (1994) "The influence of maternal vegetarian diet on essential fatty acid status of the newborn." European Journal of Clinical Nutrition, vol. 48, pp. 358-368.
Speth JD (1989) "Early hominid hunting and scavenging: the role of meat as an energy source." Journal of Human Evolution, vol. 18, pp. 329-343.
Stamler J et al. (1986) "Is relationship between serum cholesterol and risk of premature death from coronary heart disease continuous and graded?" JAMA, vol. 256, pp. 2823-2828.


Steinberg D et al. (1989) "Beyond cholesterol: modifications of low-density lipoprotein that increase its atherogenicity." New England Journal of Medicine, vol. 320, pp. 915-924.
Varo P (1974) "Mineral element balance and coronary heart disease." Internat J Vit Nutr Res, vol. 44, pp. 267-273.
Vega GL et al. (1996) "Hypoalphalipoproteinemia (low high-density lipoprotein) as a risk factor for coronary heart disease." Curr Opin Lipidology, vol. 7, pp. 209-216.
Westphal SA et al. (1990) "Metabolic response to glucose ingested with various amounts of protein." American Journal of Clinical Nutrition, vol. 52, pp. 267-272.
Willett WC et al. (1994) "Trans fatty acids: are the effects only marginal?" American Journal of Public Health, vol. 84, pp. 722-724.
Wolfe BM et al. (1991) "Short-term effects of substituting protein for carbohydrate in the diets of moderately hypercholesterolemic human subjects." Metabolism, vol. 40, pp. 338-343.
Wolfe BM et al. (1992) "High-protein diet complements resin therapy of familial hypercholesterolemia." Clin Invest Med, vol. 15, pp. 349-359.
Wolfe BM (1995) "Potential role of raising dietary protein intake for reducing risk of atherosclerosis." Can J Cardiol, vol. 11 (suppl. G), pp. 127G-131G.
Yudkin J (1972) "Sucrose and cardiovascular disease." Proc Nutr Soc, vol. 31, pp. 331-337.

Are Higher Protein Intakes Responsible for Excessive Calcium Excretion?


REFERENCES

Appel LJ et al. (1997) "A clinical trial of the effects of dietary patterns on blood pressure." New England Journal of Medicine, vol. 336, pp. 1117-1124.
Barzel US, Massey LK (1998) "Excess dietary protein can adversely affect bone." Journal of Nutrition, vol. 128, pp. 1051-1053.
Cappuccio FP (1996) "Dietary prevention of osteoporosis: are we ignoring the evidence?" American Journal of Clinical Nutrition, vol. 63, pp. 787-788.
Cordain L, Gotshall RW, Eaton SB (1997) "Evolutionary aspects of exercise." World Review of Nutrition and Dietetics, vol. 81, pp. 49-60.
Devine A et al. (1995) "A longitudinal study of the effect of sodium and calcium intakes on regional bone density in postmenopausal women." American Journal of Clinical Nutrition, vol. 62, pp. 740-745.


Eaton SB et al. (1985) "Paleolithic nutrition: A consideration of its nature and current implications." New England Journal of Medicine, vol. 312, pp. 283-289.
Estep H et al. (1969) "Hypocalcemia due to hypomagnesemia and reversible parathyroid hormone unresponsiveness." J Clin Endocrinol, vol. 29, pp. 842-848.
Evans GH, Weaver CM et al. (1990) "Association of magnesium deficiency with the blood pressure-lowering effects of calcium." J Hypertension, vol. 8, pp. 327-337.
Heaney RP (1998) "Excess dietary protein may not adversely affect bone." Journal of Nutrition, vol. 128, pp. 1054-1057.
Luft FC et al. (1988) "Effect of high calcium diet on magnesium, catecholamine, and blood pressure of stroke-prone spontaneously hypertensive rats." Proc Soc Exp Biol Med, vol. 187, pp. 474-481.
Massey LK (1998) "Does excess dietary protein adversely affect bone? Symposium overview." Journal of Nutrition, vol. 128, pp. 1048-1050.
Matkovic V et al. (1995) "Urinary calcium, sodium and bone mass of young females." American Journal of Clinical Nutrition, vol. 62, pp. 417-425.
Nordin BEC et al. (1993) "The nature and significance of the relationship between urinary sodium and urinary calcium in women." Journal of Nutrition, vol. 123, pp. 1615-1622.
Rude et al. (1976) "Functional hypoparathyroidism and parathyroid hormone end organ resistance in human magnesium deficiency." Clin Endocrinol, vol. 5, pp. 209-224.
Rude et al. (1978) "Parathyroid hormone secretion in magnesium deficiency." J Clin Endocrinol Metab, vol. 47, pp. 800-806.
Ruff CB et al. (1993) "Post-cranial robusticity in Homo. I: Temporal trends and mechanical interpretation." American Journal of Physical Anthropology, vol. 91, pp. 21-53.
Ryzen E et al. (1990) "Low intracellular magnesium in patients with acute pancreatitis and hypocalcemia." West J Med, vol. 152, pp. 145-148.
Seelig MS et al. (1974) "Magnesium interrelationships in ischemic heart disease: a review." Am J Clin Nutr, vol. 27, pp. 59-79.
Sojka JE, Weaver CM (1995) "Magnesium supplementation and osteoporosis." Nutr Rev, vol. 53, pp. 71-74.
Varo P (1974) "Mineral element balance and coronary heart disease." Int J Vit Nutr Res, vol. 44, pp. 267-273.

Metabolic Evidence of Human Adaptation to Increased Carnivory


REFERENCES


Aiello LC, Wheeler P (1995) "The expensive tissue hypothesis." Current Anthropology, vol. 36, pp. 199-221.
Chanarin I, Malkowska V, O'Hea A-M, Rinsler MG, Price AB (1985) "Megaloblastic anaemia in a vegetarian Hindu community." The Lancet, 1985(2), pp. 1168-1172.
de Pee S, West CE et al. (1995) "Lack of improvement in vitamin A status with increased consumption of dark leafy green vegetables." Lancet, vol. 346, pp. 75-81.
Herbert V (1994) "Staging vitamin B-12 (cobalamin) status in vegetarians." American Journal of Clinical Nutrition, vol. 59 (suppl.), pp. 1213S-1222S.
Herbert V (1988) "Vitamin B-12: plant sources, requirements, and assay." American Journal of Clinical Nutrition, vol. 48, pp. 852-858.
Knopf K et al. (1978) "Taurine: an essential nutrient for the cat." Journal of Nutrition, vol. 108, pp. 773-778.
Laidlaw SA et al. (1990) "The taurine content of common foodstuffs." Journal of Parenteral Enteral Nutrition, vol. 14, pp. 183-188.
Laidlaw SA (1988) "Plasma and urine taurine levels in vegans." American Journal of Clinical Nutrition, vol. 47, pp. 660-663.
Leonard WR et al. (1994) "Evolutionary perspectives on human nutrition: the influence of brain and body size on diet and metabolism." American Journal of Human Biology, vol. 6, pp. 77-88.
MacDonald ML et al. (1984) "Nutrition of the domestic cat, a mammalian carnivore." Annual Review of Nutrition, vol. 4, pp. 521-562.
Murdock GP (1967) "Ethnographic atlas: a summary." Ethnology, vol. 6, pp. 109-236.
Salem N et al. (1994) "Arachidonate and docosahexaenoate biosynthesis in various species and compartments in vivo." World Review of Nutrition and Dietetics, vol. 75, pp. 114-119.

Is Cooked Food Poison? Looking at the Science on Raw vs. Cooked Foods

REFERENCES
Abrams SA, O'Brien KO, Wen J, Liang LK, Stuff JE (1996) "Absorption by 1-year-old children of an iron supplement given with cow's milk or juice." Pediatric Research, vol. 39, pp. 171-175.
Aeschbacher HU (1990) "Anticarcinogenic effect of browning reaction products." In: Finot PA, et al. (eds.) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag. (pp. 335-348)
Alderton G, et al. (1946) "Identification of the bacteria-inhibiting, iron-binding protein of egg white as conalbumin." Arch Biochem, vol. 11, pp. 9-13.


Alldrick AJ, et al. (1986) "Effects of plant-derived flavonoids and polyphenolic acids on the activity of mutagens from cooked food." Mutat Res, vol. 163, no. 3 (Dec. 1986), pp. 225-232.
Alldrick AJ, et al. (1987a) "High levels of dietary fat: alteration of hepatic promutagen activation in the rat." J Natl Cancer Inst, vol. 69, p. 269.
Alldrick AJ, et al. (1987b) "The hepatic conversion of some heterocyclic amines to bacterial mutagens is modified by dietary fat and cholesterol." Mutagenesis, vol. 2, p. 221.
Alldrick AJ, et al. (1989) "Modification of in vivo heterocyclic amine genotoxicity by dietary flavonoids." Mutagenesis, vol. 4, no. 5 (Sept.), pp. 365-370.
Ames BN (1983) "Dietary carcinogens and anticarcinogens." Science, vol. 221, pp. 1256-1264.
Ames BN (1990) "Nature's chemicals and synthetic chemicals: comparative toxicology." Proc Natl Acad Sci U.S.A., vol. 87, pp. 7782-7786.
Annand JC (1971) "The case against milk protein." Atherosclerosis, vol. 13, p. 137.
Annand JC (1972) "Further evidence in the case against heated milk protein." Atherosclerosis, vol. 15, no. 1 (Jan.), pp. 129-133.
Annand JC (1986) "Denatured bovine immunoglobulin pathogenic in atherosclerosis." Atherosclerosis, vol. 59, no. 3 (Mar.), pp. 347-351.
Augustsson K, et al. (1997) "Assessment of the human exposure to heterocyclic amines." Carcinogenesis, vol. 18, no. 10 (Oct. 1), pp. 1931-1935.
Augustsson K, et al. (1999) "Dietary heterocyclic amines and cancer of the colon, rectum, bladder and kidney: a population-based study." Lancet, vol. 353, pp. 703-707.
Bach Knudsen KE, et al. (1988) "Effect of cooking, pH and polyphenol level on carbohydrate composition and nutritional quality of a sorghum (Sorghum bicolor (L.) Moench) food, ugali." Br J Nutr, vol. 59, no. 1 (Jan.), pp. 31-47.
Backus RC, et al. (1995) "Diets causing taurine depletion in cats substantially elevate postprandial plasma cholecystokinin concentration." J Nutr, vol. 125, no. 10 (Oct.), pp. 2650-2657.
Baynes JW, Monnier VM, eds. (1989) The Maillard Reaction in Aging, Diabetes and Nutrition. New York: A.R. Liss.
Beier RC, et al. (1994) "Toxicology of naturally occurring chemicals in foods." In: Hui YH, et al., Foodborne Disease Handbook, Vol. 3, Diseases Caused by Hazardous Substances. New York: Marcel Dekker. (pp. 1-186)
Benet S (1974) Abkhasians: The Long-Living People of the Caucasus. New York: Holt, Rinehart and Winston.
Bicchieri MG (ed.) (1972) Hunters and Gatherers Today: A Socioeconomic Study of Eleven Such Cultures in the Twentieth Century. New York: Holt, Rinehart and Winston.


Bleumink E and Berrens L (1966) "Synthetic approaches to the biological activity of beta-lactoglobulin in human allergy to cow's milk." Nature, vol. 212, pp. 541-543.
Bleumink E and Young E (1968) "Identification of the atopic allergen in cow's milk." Int Arch Allergy Appl Immunol, vol. 24, p. 521.
Bleumink E (1970) "Food allergy: the chemical nature of the substances eliciting symptoms." World Rev Nutr Diet, vol. 12, p. 305.
Bohe M, Borgstrom A, Genell S, Ohlsson K (1984) "Fate of intravenously injected trypsin in dog with special reference to the existence of an enteropancreatic circulation." Digestion, vol. 29, pp. 158-163.
Bornet FRJ, Fontvieille AM, Rizkalla S, Colonna P, Blayo A, Mercier C, Slama G (1989) "Insulin and glycemic responses in healthy humans to native starches processed in different ways: correlation with in vitro alpha-amylase hydrolysis." American Journal of Clinical Nutrition, vol. 50, pp. 315-323.
Bradbury JH, et al. (1984) "Digestibility of proteins of the histological components of cooked and raw rice." Br J Nutr, vol. 52, no. 3 (Nov.), pp. 507-513.
Branch DW (1992) "Physiologic adaptations of pregnancy." Am J Reprod Immunol, vol. 28, no. 3-4 (Oct.), pp. 120-122. Review.
Brown JP (1980) "A review of the genetic effects of naturally occurring flavonoids, anthraquinones and related compounds." Mutat Res, vol. 75, no. 3 (May), pp. 243-277. Review.
Bruder G, et al. (1984) "High concentrations of antibodies to xanthine oxidase in human and animal sera." J Clin Invest, vol. 74, pp. 783-794.
Bulkley JL (1927) "Cancer among primitive tribes." Cancer, vol. 4, pp. 289-295.
Buonocore V, Petrucci T, Silano V (1977) "Wheat protein inhibitors of alpha-amylase." Phytochemistry, vol. 16, pp. 811-820.
Chau C-F, Cheung P C-K, Wong Y-S (1997) "Effects of cooking on content of amino acids and antinutrients in three Chinese indigenous legume seeds." Journal of the Science of Food and Agriculture, vol. 75, pp. 447-452.
Christian E, Mr. and Mrs. (1904) Uncooked Foods and How to Use Them. The Health Culture Company, New York.
Chung KT, Wong TY, Wei CI, Huang YW, Lin Y (1998) "Tannins and human health: a review." Critical Reviews in Food Science and Nutrition, vol. 38, pp. 421-464.
Chuyen NV, et al. (1990) "Antioxidative effect of Maillard reaction products in vivo." In: Finot PA, et al. (eds.) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag.
Clark J (1956) Hunza, Lost Kingdom of the Himalayas. New York: Funk and Wagnalls.
Clastres P (1972) "The Guayaki." In: Hunters and Gatherers Today; ed. Bicchieri MG; Holt, Rinehart and Winston, New York, pp. 138-174.


Clemente A, Sanchez-Vioque R, Vioque J, Bautista J, Millan F (1998) "Effect of cooking on protein quality of chickpea (Cicer arietinum) seeds." Food Chemistry, vol. 62, pp. 1-6.
Clifford AJ, et al. (1990) "Bioavailability of folates in selected foods incorporated into amino acid-based diets fed to rats." J Nutr, vol. 120, no. 12 (Dec.), pp. 1640-1647.
Cook JD, Reddy MB (1995) "Efficacy of weekly compared with daily iron supplementation." Am J Clin Nutr, vol. 62, pp. 117-120.
Cordain L (1997) "Loren Cordain responds to Enig & Fallon (Part 1)." Internet PALEODIET listgroup archives, 10/9/1997. [Note that this post was forwarded to the list via moderator Dean Esmay, in case you are doing a search on author's name in the list archives.]
Cridland R (1998) "Tired of Singing the Cooked-Food Blues." Health Science, vol. 21, no. 3 (May/June), p. 26.
Davies D (1975) The Centenarians of the Andes. Garden City, NY: Anchor Press.
Davis DF, et al. (1969) "Antibodies to reconstituted dried cow's milk protein in coronary heart disease." J Atheroscler Res, vol. 9, pp. 103-107.
Davis CM (1928) "Self-selection of diet by newly weaned infants: an experimental study." Am J Dis Child, vol. 28, pp. 651-679.
Davis CM (1934) "Studies in the self-selection of diet by young children." J Am Dent Assoc, vol. 21, pp. 636-640.
Davis CM (1938) "The self-selection of diet experiment: its significance for feeding in the home." Ohio State Med J, vol. 34, pp. 862-868.
Davis CM (1939) "Results in the self-selection of diets by young children." Can Med Assoc J, vol. 41, pp. 257-261.
Donovan UM, et al. (1995) "Iron and zinc status of young women aged 14 to 19 years consuming vegetarian and omnivorous diets." J Am Coll Nutr, vol. 14, no. 5 (Oct.), pp. 463-472.
Dunn FL (1968) "Epidemiological factors: health and disease in hunter-gatherers." In: Man the Hunter, eds. Lee RB, DeVore I; Aldine Publishing, Chicago, pp. 221-228.
Eaton SB, Konner M (1985) "Paleolithic nutrition: a consideration of its nature and current implications." The New England Journal of Medicine, vol. 312, no. 5 (Jan. 31), pp. 283-289. Review.
Eaton SB, Konner M, Shostak M (1988) "Stone-agers in the fast lane: chronic degenerative diseases in evolutionary perspective." American Journal of Medicine, vol. 84, pp. 739-749.
Eaton SB (1992) "Humans, lipids and evolution." Lipids, vol. 27, no. 10, pp. 814-820. Review.
Eaton SB, Pike MC, Short RV, et al. (1994) "Women's reproductive cancers in evolutionary context." Quart Rev Biol, vol. 69, pp. 353-367.


Eaton SB, et al. (1996) "An evolutionary perspective enhances understanding of human nutritional requirements." J Nutr, vol. 126, no. 6 (June), pp. 1732-1740. Review.
Ejeta G, et al. (1987) "In vitro digestibility and amino acid composition of pearl millet (Pennisetum typhoides) and other cereals." Proc Natl Acad Sci U.S.A., vol. 84, no. 17 (Sept.), pp. 6016-6019.
Englyst HN, et al. (1985) "Digestion of the polysaccharides of some cereal foods in the human small intestine." Am J Clin Nutr, vol. 42, no. 5 (Nov.), pp. 778-787.
Erbersdobler H, et al. (1984) "Histopathological studies of rat kidneys following the feeding of heat-damaged proteins." Z Ernahrungswiss, vol. 23, no. 3 (Sept.), pp. 219-229.
Erdman, et al. (1993) "Absorption and transport of carotenoids." Ann NY Acad Sci, vol. 691, pp. 76-85.
Etzel K (1993) "Role of salivary glands in nutrition." In: Biology of the Salivary Glands, ed. Dobrosielski-Vergona K; CRC Press, Boca Raton, Florida, pp. 129-152.
Faldet MA, et al. (1992) "Determining optimal heat treatment of soybeans by measuring available lysine chemically and biologically with rats to maximize protein utilization by ruminants." J Nutr, vol. 122, no. 1 (Jan.), pp. 151-160.
Finot PA, et al. (eds.) (1990) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag.
Fomon SJ, Ziegler EE, Nelson SE, Serfass RE, Frantz JA (1995) "Erythrocyte incorporation of iron by 56-day-old infants fed a 58Fe-labeled supplement." Pediatric Research, vol. 38, pp. 373-378.
Ford JE, Hurrell RF, Finot PA (1983) "Storage of milk powders under adverse conditions. 2. Influence on the content of water-soluble vitamins." British Journal of Nutrition, vol. 49, pp. 355-364.
Furniss DE, Vuichoud J, Finot PA, Hurrell RF (1989) "The effect of Maillard reaction products on zinc metabolism in the rat." British Journal of Nutrition, vol. 62, pp. 739-749.
Gabriel H, et al. (1997) "The acute immune response to exercise: what does it mean?" Int J Sports Med, vol. 18, suppl. 1 (Mar.), pp. S28-S45. Review.
Giovanelli J (1987) "Sulfur amino acids of plants: an overview." In: "Sulfur and Sulfur Amino Acids," Methods in Enzymology, vol. 143; eds. Jakoby W, Griffith O; Academic Press, Orlando, Florida; pp. 419-426.
Golenser J et al. (1983) "Inhibitory effect of a fava bean component on the in vitro development of Plasmodium falciparum in normal and glucose-6-phosphate dehydrogenase-deficient erythrocytes." Blood, vol. 61, pp. 507-510.
Gomaa EA, et al. (1993) "Polycyclic aromatic hydrocarbons in smoked food products and commercial liquid smoke flavourings." Food Addit Contam, vol. 10, no. 5 (Sept. 1), pp. 503-521.
Gotze H, Rothman SS (1975) "Enteropancreatic circulation of digestive enzyme as a conservation mechanism." Nature, vol. 257, pp. 607-609.
Goudsblom J (1992) Fire and Civilization. London; New York: Penguin Books.


Grant G, et al. (1982) "The effect of heating on the haemagglutinating activity and nutritional properties of bean (Phaseolus vulgaris) seeds." J Sci Food Agric, vol. 33, no. 12 (Dec.), pp. 1324-1326.
Ha YL, et al. (1987) "Anticarcinogens from fried ground beef: heat-altered derivatives of linoleic acid." Carcinogenesis, vol. 8, no. 12 (Dec.), pp. 1881-1887.
Hawkes K, Hill K, O'Connell JF (1982) "Why hunters gather: optimal foraging and the Ache of eastern Paraguay." American Ethnologist, vol. 9, pp. 379-398.
Heinrich HC, Gabbe EE, Bruggemann J, Icagic F, Classen M (1979) "Enteropancreatic circulation of trypsin in man." Klinische Wochenschrift, vol. 53, pp. 1295-1297.
Heinz HJ, Maguire B (197X; released in 1970s but exact date unknown) The Ethno-Biology of the !ko Bushmen, Occasional Paper No. 1, Botswana Society, Gaborone, Botswana.
Herbert V, Drivas G, Manusselis C, Mackler B, Eng J, Schwartz E (1984) "Are colon bacteria a major source of cobalamin analogues in human tissues?..." Trans Assoc of Amer Physicians, vol. 97, pp. 161-171.
Hertog MGL, Katan MB (1998) "Quercetin in foods, cardiovascular disease, and cancer." In: Rice-Evans CA, Packer L (eds.), Flavonoids in Health and Disease, Marcel Dekker, Inc., New York. (pp. 447-463)
Hickman MA, et al. (1990) "Effect of processing on fate of dietary [14C]taurine in cats." J Nutr, vol. 120, no. 9 (Sept.), pp. 995-1000.
Hickman MA, et al. (1992) "Intestinal taurine and the enterohepatic circulation of taurocholic acid in the cat." In: Lombardini JB, et al. (eds.), Taurine: Nutritional Values and Mechanisms of Action. New York: Plenum.
Hill K, Hawkes K, Hurtado M, Kaplan H (1984) "Seasonal variance in the diet of Ache hunter-gatherers in Eastern Paraguay." Human Ecology, vol. 12, pp. 101-135.
Holder CL, et al. (1997) "Quantification of heterocyclic amine carcinogens in cooked meats using isotope dilution liquid chromatography/atmospheric pressure chemical ionization tandem mass spectrometry." Rapid Commun Mass Spectrom, vol. 11, no. 15, pp. 1667-1672.
Hollman PCH (1997) "Bioavailability of flavonoids." European Journal of Clinical Nutrition, vol. 51 (suppl. 1), pp. S66-S69.
Holm J, et al. (1988) "Degree of starch gelatinization, digestion rate of starch in vitro, and metabolic response in rats." Am J Clin Nutr, vol. 47, no. 6 (Jun.), pp. 1010-1016.
Howell E (1946) The Status of Food Enzymes in Digestion and Metabolism, originally published in 1946, and republished as Food Enzymes for Health and Longevity, Lotus Press, 2nd edition, 1994.
Howell E (1985) Enzyme Nutrition, Avery Publishing, Wayne, New Jersey. (Note: This version of Enzyme Nutrition is claimed by some to be an abridgement of an earlier book with the same title, said to be 700+ pages long. However, no one we have talked to who has mentioned it has actually been able to locate the book; nor have we, so far.)
Howes AL, et al. (1989) "Effect of dietary fibre on the mutagenicity and distribution of 2-amino-3,4-dimethylimidazo[4,5-f]quinoline (MeIQ)." Mutat Res, vol. 210, p. 227.


Huang MT, et al. (1983) "Inhibition of the mutagenicity of bay-region diol-epoxides of polycyclic aromatic hydrocarbons by phenolic plant flavonoids." Carcinogenesis, vol. 4, no. 12 (Dec.), pp. 1631-1637.
Hurrell R (1997) "Bioavailability of iron." European Journal of Clinical Nutrition, vol. 51 (suppl. 1), pp. S4-S8.
Hurrell RF (1990) "Influence of the Maillard reaction on the nutritional value of foods." In: Finot PA, et al. (eds.) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag. (pp. 245-258)
Huxtable RJ (1992) "Physiological actions of taurine." Physiological Reviews, vol. 72, pp. 101-163.
Ivie GW, et al. (1981) "Natural toxicants in human foods: psoralens in raw and cooked parsnip root." Science, vol. 213, no. 4510 (Aug. 21), pp. 909-910.
Iwasaki K, et al. (1988) "The influence of dietary protein source on longevity and age-related disease processes of Fischer rats." J Gerontol, vol. 43, p. B42.
Jacobsen JG, Smith LH (1968) "Biochemistry and physiology of taurine and taurine derivatives." Physiological Reviews, vol. 48, pp. 424-511.
Jagerstad M and Skog K (1991) "Formation of meat mutagens." Nutritional and Toxicological Consequences of Food Processing, vol. 289 in Advances in Experimental Medicine and Biology, pp. 83-105. Review.
Johansson MA, et al. (1994) "Occurrence of mutagenic/carcinogenic heterocyclic amines in meat and fish products, including pan residues, prepared under domestic conditions." Carcinogenesis, vol. 15, no. 8 (Aug. 1), pp. 1511-1518.
Johnson L (1991) "Thermal degradation of carotenes and influence on their physiological functions." Adv Exp Med Biol, vol. 289, pp. 75-82.
Johnson PE (1991) "Effect of food processing and preparation on mineral utilization." Nutritional and Toxicological Consequences of Food Processing, Advances in Experimental Medicine and Biology, vol. 289, pp. 483-498.
Kada T, et al. (1984) "Adsorption of pyrolysate mutagens by vegetable fibres." Mutat Res, vol. 141, p. 149.
Kataria A, Chauhan BM (1988) "Contents and digestibility of carbohydrates of mung beans (Vigna radiata L.) as affected by domestic processing and cooking." Plant Foods for Human Nutrition, vol. 38, pp. 51-59.
Kim SW, et al. (1996a) "Dietary antibiotics decrease taurine loss in cats fed a canned heat-processed diet." J Nutr, vol. 126, no. 2 (Feb.), pp. 509-515.
Kim SW, et al. (1996b) "Maillard reaction products in purified diets induce taurine depletion in cats which is reversed by antibiotics." J Nutr, vol. 126, no. 1 (Jan.), pp. 195-201.
Klohs WD, Steinkampf RW (1988) "Possible link between the intrinsic drug resistance of colon tumors and a detoxification mechanism of intestinal cells." Cancer Research, vol. 48, pp. 3025-3030.
Knasmuller S, et al. (1992) "Organ-specific distribution of genotoxic effects in mice exposed to cooked food mutagens." Mutagenesis, vol. 7, no. 4 (Jul. 1), pp. 235-241.


Knize MG, et al. (1994) "Effect of cooking time and temperature on the heterocyclic amine content of fried beef patties." Food Chem Toxicol, vol. 32, no. 7 (Jul.), pp. 595-603.
Knize MG, et al. (1997) "Analysis of cooked muscle meats for heterocyclic aromatic amine carcinogens." Mutat Res, vol. 376, nos. 1-2 (May 12), pp. 129-134.
Koschinsky T, et al. (1997) "Orally absorbed reactive glycation products (glycotoxins): an environmental risk factor in diabetic nephropathy." Proc Natl Acad Sci U.S.A., vol. 94, no. 12 (Jun. 10), pp. 6474-6479.
Kouchakoff P (1930) "The influence of cooking food on the blood formula of man." First International Congress of Microbiology, Paris.
Kouchakoff P (1937) "Nouvelles lois de l'alimentation humaine basées sur la leucocytose digestive." Mémoires de la Société Vaudoise des Sciences Naturelles, No. 39, vol. 5, no. 8, pp. 319-348.
Kreutler PA and Czajka-Narins DM (1987) Nutrition in Perspective, 2nd edition. Englewood Cliffs, New Jersey: Prentice Hall.
Kubow A (1993) "Lipid oxidation products in food and atherogenesis." Nutr Rev, vol. 51, pp. 33-40.
Kulvinskas V, Lahiri R (1997) The Lover's Diet. Ihopea Enterprises, Hot Springs, Arkansas.
Kummerow FA (1979) "Nutrition imbalance and angiotoxins as dietary risk factors in coronary heart disease." Am J Clin Nutr, vol. 32, pp. 58-83.
Laidlaw SA (1988) "Plasma and urine taurine levels in vegans." American Journal of Clinical Nutrition, vol. 47, pp. 660-663.
Lee PC, et al. (1985) "Digestibility of native and modified starches: in vitro studies with human and rabbit pancreatic amylases and in vivo studies in rabbits." J Nutr, vol. 115, no. 1 (Jan.), pp. 93-103.
Lee RB (1979) The !Kung San: Men, Women, and Work in a Foraging Society. Cambridge University Press.
Leibow C, Rothman SS (1975) "Enteropancreatic circulation of digestive enzymes." Science, vol. 189, pp. 472-474.
Lennon JM (1998) "Much Ado about Raw Food." Health Science, vol. 21, no. 2 (Mar./Apr.), p. 12.
Levitt MD, Ellis CJ, Murphy SM, Schwartz ML (1981) "Study of the possible enteropancreatic circulation of pancreatic amylase in the dog." American Journal of Physiology, vol. 241, pp. G54-G58.
Lewin R (1988) "New views emerge on hunters and gatherers." Science, vol. 240, pp. 1146-1148.
Liener IE (1994) "Implications of antinutritional components in soybean foods." Critical Reviews in Food Science and Nutrition, vol. 34, no. 1, pp. 31-67. Review.
Lijinsky W (1991) "The formation and occurrence of polynuclear aromatic hydrocarbons associated with food." Mutat Res, vol. 259, no. 3-4 (Mar.), pp. 251-261. Review.
Lindeskog, et al. (1988) "Influence of fried meat and fiber on cytochrome P-450 mediated activity of mutagens in rats." Mutat Res, vol. 204, p. 553.


Lodovici M, et al. (1995) "Polycyclic aromatic hydrocarbon contamination in the Italian diet." Food Addit Contam, vol. 12, no. 5 (Sept. 1), pp. 703-713.
Lönnerdal B (1984) "Iron and Breast Milk." In: Iron Nutrition in Infancy and Childhood, ed. Stekel A; Raven Press, pp. 95-117.
Lykken GI, et al. "Effect of browned and unbrowned corn products intrinsically labeled with 65Zn on absorption of 65Zn in humans." J Nutr, vol. 116, pp. 795-801.
Makkar HPS, Becker K, Schmook B (1998) "Edible provenances of Jatropha curcas from Quintana Roo state of Mexico and effect of roasting on antinutrient and toxic factors in seeds." Plant Foods for Human Nutrition, vol. 52, pp. 31-36.
Malanin K, et al. (1995) "Anaphylactic reaction caused by neoallergens in heated pecan nut." Allergy, vol. 50, pp. 988-991.
Malinow MR, et al. (1982) "Systemic lupus erythematosus-like syndrome in monkeys fed alfalfa sprouts: role of a nonprotein amino acid." Science, vol. 216, no. 4544 (Apr. 23), pp. 415-417.
Malinow MR, et al. (1984) "Elimination of toxicity from diets containing alfalfa seeds." Food Chem Toxicol, vol. 22, no. 7 (Jul.), pp. 583-587.
Matsuda T, et al. (1990) "Immunodominancy and antigenic structure of lactose-protein Maillard products." In: Finot PA, et al. (eds.) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag. (pp. 297-302)
Maugh TH (1979) "Cancer and environment: Higginson speaks out." Science, vol. 205, pp. 1363-1366.
McClure JW (1975) "Physiology and functions of flavonoids." In: The Flavonoids, eds. Harborne JB, Mabry TJ, Mabry H; Academic Press, New York, pp. 970-1055.
McCay CM, Crowell MF, Maynard LA (1935) "The effect of retarded growth upon the length of life span and upon the ultimate body size." Journal of Nutrition, vol. 10, pp. 63-79.
McPherson LL (1990) "The effect of the consumption of red kidney beans (Phaseolus vulgaris) on the growth of rats and the implications for human populations." J R Soc Health, vol. 110, no. 6 (Dec.), pp. 222-226.
Milton K (1987) "Primate diets and gut morphology: implications for hominid evolution." In: Food and Evolution: Toward a Theory of Food Habits, eds. Harris M, Ross EB; Temple University Press, Philadelphia, pp. 93-115.
Mogelonsky M (1995) "Milk doesn't always do a body good." American Demographics, Jan. 1995. Available online at: http://www.demographics.com/publications/ad/95_ad/9501_ad/9501ab07.htm.
Mukhtar H, et al. (1984) "Epidermal benzo[a]pyrene metabolism and DNA-binding in Balb/C mice: inhibition by ellagic acid." Xenobiotica, vol. 14, no. 7 (Jul.), pp. 527-531.
Nagao M, et al. (1981) "Mutagenicities of 61 flavonoids and 11 related compounds." Environ Mutagen, vol. 3, no. 4, pp. 401-419.


Nagao M, et al. (1993) "Carcinogenic factors in food with relevance to colon cancer development." Mutat Res, vol. 290, no. 1 (Nov. 1), pp. 43-51. Review.
Neumann D, Nieden U, Lichtenberger O, Leopold I (1995) "How does Armeria maritima tolerate high heavy metal concentrations?" J Plant Physiol, vol. 146, pp. 704-717.
National Research Council (NRC) (1986) Nutrient Requirements of Cats, Revised Edition, 1986. National Academy Press, Washington, D.C.
Nicholson PH (1981) "Fire and the Australian Aborigine: an enigma." In: Gill AM, Groves RH, Noble IR (eds.) Fire and the Australian Biota. Canberra: Australian Academy of Science. (pp. 55-76)
Nicoli MC, et al. (1997) "Loss and/or formation of antioxidants during food processing and storage." Cancer Lett, vol. 114, no. 1-2 (Mar. 19), pp. 71-74.
Novotny MJ, Hogan PM, Flannigan G (1994) "Echocardiographic evidence of myocardial failure induced by taurine deficiency in domestic cats." Canadian Journal of Veterinary Research, vol. 58, pp. 6-12.
O'Brien J, Walker R (1988) "Toxicological effects of dietary Maillard reaction products in the rat." Food Chem Toxicol, vol. 26, no. 9 (Sept.), pp. 775-783.
O'Dea K (1984) "Marked improvement in carbohydrate and lipid metabolism in diabetic Australian aborigines after temporary reversion to traditional lifestyle." Diabetes, vol. 33, no. 6 (Jun.), pp. 596-603.
O'Dea K (1992) "Traditional diet and food preferences of Australian Aboriginal hunter-gatherers." In: Whiten A and Widdowson EM (editors/organizers), Foraging Strategies and Natural Diet of Monkeys, Apes, and Humans: Proceedings of a Royal Society Discussion Meeting held on 30 and 31 May, 1991. Oxford, England: Clarendon Press. (pp. 73-81)
O'Keefe JH, Lavie CJ, McAllister BD (1995) "Insights into the pathogenesis and prevention of coronary artery disease." Mayo Clinic Proceedings, vol. 70, pp. 69-79.
Oste RE (1996) "Digestibility of processed food protein." Adv Exp Med Biol, vol. 289, pp. 371-388. Review.
Pagel M, Harvey P (1988) "Recent developments in the analysis of comparative data." The Quarterly Review of Biology, vol. 63, pp. 413-440.
Panigrahi A, et al. (1996) "The nutritive value of stackburned yellow maize for livestock: tests in vitro and in broiler chicks." Br J Nutr, vol. 76, no. 1 (Jul.), pp. 97-108.
Pao EM, Mickle SJ (1981) "Problem nutrients in the United States." Food Technology, vol. 35, no. 9 (September), pp. 58-69.
Pollack OJ (1958) "Serum cholesterol levels resulting from various egg diets: experimental studies with clinical implications." J Am Ger Soc, vol. 6, pp. 614-618.
Pottenger FM (1946) "The effect of heat-processed foods and metabolized vitamin D milk on the dentofacial structures of experimental animals." American Journal of Orthodontics and Oral Surgery, vol. 32, pp. 467-485.
Pottenger FM (1995) Pottenger's Cats. Price-Pottenger Nutrition Foundation, La Mesa, California.


Powrie WD, et al. (1986) "Browning reaction systems as sources of mutagens and antimutagens." Environ Health Perspect, vol. 67, p. 47.
Probst-Hensch NM, et al. (1997) "Meat preparation and colorectal adenomas in a large sigmoidoscopy-based case-control study in California (United States)." Cancer Causes Control, vol. 8, no. 2 (Mar.), pp. 175-183.
Prochaska LJ, Piekutowski WV (1994) "On the synergistic effects of enzymes in food with enzymes in the human body. A literature survey and analytical report." Medical Hypotheses, vol. 42, pp. 355-362.
Proctor RN (1995) Cancer Wars: How Politics Shapes What We Know and Don't Know about Cancer. New York: Basic Books.
Pusztai A, et al. (1995) "Inhibition of starch digestion by alpha-amylase inhibitor reduces the efficiency of utilization of dietary proteins and lipids and retards the growth of rats." Journal of Nutrition, vol. 125, pp. 1554-1562.
Pyne SJ (1991) Burning Bush: A Fire History of Australia. New York: Holt.
Rodale JI (1948) The Healthy Hunzas. Emmaus, PA: Rodale Press.
Rohr G, Kern H, Scheele G (1981) "Enteropancreatic circulation of digestive enzymes does not exist in the rat." Nature, vol. 292, pp. 470-472.
Rohr G, Scheele G (1983) "Fate of radioactive exocrine pancreatic proteins injected into the blood circulation of the rat." Gastroenterology, vol. 85, pp. 991-1002.
Rosen E (date unknown, received 1998) "Contributions to the Evolution of Life Food Trend." (part of promotional mailing from Viktoras Kulvinskas)
Rossie K (1993) "Influence of diseases on salivary glands." In: Biology of the Salivary Glands, ed. Dobrosielski-Vergona K; CRC Press, Boca Raton, Florida, pp. 201-227.
Sandberg AS, Svanberg U (1991) "Phytate hydrolysis by phytase in cereals; effects on in vitro estimation of iron availability." Journal of Food Science, vol. 56, pp. 1330-1333.
Schaeffer MC, Rogers QR, Morris JG (1985) "Protein in the nutrition of dogs and cats." In: Nutrition of the Dog and Cat, Waltham Symposium Number 7, eds. Burger IH, Rivers JPW; Cambridge University Press.
Schaeffer O (1981) "Eskimos (Inuit)." In: Western Diseases: Their Emergence and Prevention, eds. Trowell H, Burkitt D; Harvard University Press, pp. 113-128.
Scheele G, Rohr G (1984) "Enteropancreatic circulation of digestive enzymes [letter]." Gastroenterology, vol. 86, no. 4, pp. 778-779.
Schmid RF (1997) Traditional Foods Are Your Best Medicine. Rochester, Vermont: Healing Arts Press.
Schweitzer A (1957) Preface to: Berglas A, Cancer. Paris.
Seidler T (1987) "Effects of additives and thermal treatment on the content of nitrogen compounds and the nutritive value of hake meat." Die Nahrung, vol. 31, no. 10, pp. 959-970.


Seo YJ, et al. (1990) "The effect of lectin from Taro tuber (Colocasia antiquorum) given by force-feeding on the growth of mice." J Nutr Sci Vitaminol (Tokyo), vol. 36, no. 3 (Jun.), pp. 277-285.
Shahid HS (1979) Karakuram Hunza: The Land of Just Enough. Karachi: Ma'aref.
Shibamoto T (1989) "Genotoxicity testing of Maillard reaction products." In: JW Baynes and VM Monnier (eds.), The Maillard Reaction in Aging, Diabetes and Nutrition. (pp. 359-376)
Sidky H (1995) Hunza, An Ethnographic Outline. Jaipur: Illustrated Book Publishers.
Simoons FJ (1988) "The determinants of dairying and milk use in the old world: ecological, physiological, and cultural." In: Robson JRK (ed.), Food, Ecology and Culture: Readings in the Anthropology of Dietary Practices. New York: Gordon and Breach. (pp. 83-91)
Sinha R, et al. (1997) "Exposure assessment of heterocyclic amines (HCAs) in epidemiologic studies." Mutat Res, vol. 376, no. 1-2 (May 12), pp. 195-202.
Sinha R, et al. (1998) "Heterocyclic amine content of pork products cooked by different methods and to varying degrees of doneness." Food Chem Toxicol, vol. 36, no. 4 (Apr.), pp. 289-297.
Skog K (1993) "Cooking procedures and food mutagens: a literature review." Food Chem Toxicol, vol. 31, no. 9 (Sept. 1), pp. 655-675. Review.
Skog K, et al. (1995) "Factors affecting the formation and yield of heterocyclic amines." Princess Takamatsu Symp, vol. 23 (Jan. 1), pp. 9-19. Review.
Sotelo A, Flores F, Hernandez M (1987) "Chemical composition and nutritional value of Mexican varieties of chickpea (Cicer arietinum L.)." Plant Foods for Human Nutrition, vol. 37, pp. 299-306.
Stefansson V (1913) My Life with the Eskimo. MacMillan, New York.
Stefansson V (1960) Cancer: Disease of Civilization? Hill and Wang, New York.
Steinberg D, Parthasarathy S, Carew TE, Khoo JC, Witzum JL (1989) "Modifications of low-density lipoprotein that increase its atherogenicity." New England Journal of Medicine, vol. 320, pp. 915-924.
Stekel A (1984) "Iron Requirements in Infancy and Childhood." In: "Iron Nutrition in Infancy and Childhood," ed. Stekel A; Raven Press, pp. 1-10.
Story M, Brown JE (1987) "Do young children instinctively know what to eat?" New England Journal of Medicine, vol. 316, pp. 103-106.
Sturman JA, Gargano AD, Messing J, Imaki H (1986) "Feline maternal taurine deficiency: effect on mother and offspring." Journal of Nutrition, vol. 116, pp. 655-667.
Sturman JA (1991) "Dietary taurine and feline reproduction and development." J Nutr, vol. 121 (11 suppl.) (Nov.), pp. S166-S170.
Sugimura T, et al. (1990) "Mutagens and carcinogens formed by cooking meat and fish: heterocyclic amines." In: Finot PA, et al. (eds.) The Maillard Reaction in Food Processing, Human Nutrition and Physiology. Basel; Boston: Birkhäuser Verlag. (pp. 323-334)


Tanaka J (1980) The San: Hunter-Gatherers of the Kalahari, A Study in Ecological Anthropology. University of Tokyo Press.
Taylor CB, et al. (1979) "Spontaneously occurring angiotoxic derivatives of cholesterol." Am J Clin Nutr, vol. 32, pp. 40-57.
Teel RW, et al. (1985) "The effect of ellagic acid on the uptake, persistence, metabolism and DNA-binding of benzo[a]pyrene in cultured explants of strain A/J mouse lung." Carcinogenesis, vol. 6, no. 3 (Mar.), pp. 391-395.
Tortora GJ, Anagnostakos NP (1981) Principles of Anatomy and Physiology, Harper and Row, New York.
Toth B, et al. (1986) "Cancer induction in mice by feeding of the uncooked cultivated mushroom of commerce Agaricus bisporus." Cancer Res, vol. 46, no. 8 (Aug.), pp. 4007-4011.
Turesky RJ (1990) "Metabolism and biodisposition of heterocyclic amines." Prog Clin Biol Res, vol. 347, pp. 39-53. Review.
Urban E, Zingery AA (1982) "In vivo absorption of pancreatic ribonuclease from the small intestine of young and mature rats." Journal of Pediatric Gastroenterology and Nutrition, vol. 1, pp. 597-601.
Urban E, Zingery AA, Bundrant T, Weser E, Ziegler DM (1982) "Permeability of adolescent rat intestine to pancreatic ribonuclease." Journal of Pediatric Gastroenterology and Nutrition, vol. 1, pp. 267-272.
U.S. Department of Agriculture, Agricultural Research Service (1998) USDA Nutrient Database for Standard Reference, Release 12. Nutrient Data Laboratory home page: http://www.nal.usda.gov/fnic/foodcomp.
Vanderstoep J (1981) "Effect of germination on the nutritive value of legumes." Food Technology, vol. 35, no. 3, pp. 83-85.
Vlassara H, et al. (1989) "Macrophage receptor-mediated processing and regulation of advanced glycosylation endproduct (AGE)-modified proteins: role in diabetes and aging." In: Baynes JW and Monnier VM (eds.), The Maillard Reaction in Aging, Diabetes and Nutrition.
Vlassara H, et al. (1995) "Advanced glycation endproducts promote adhesion molecule (VCAM-1, ICAM-1) expression and atheroma formation in normal rabbits." Mol Med, vol. 1, no. 4 (May), pp. 447-456.
Vogel R, Trautschold I, Werle E (1968) Natural Proteinase Inhibitors. Academic Press, New York.
The Waltham Book of Companion Animal Nutrition (1993) Burger I (ed.), Pergamon Press.
The Waltham Book of Clinical Nutrition of the Dog and Cat (1994) Pergamon Press.
Ward MH, et al. (1997) "Risk of adenocarcinoma of the stomach and esophagus with meat cooking method and doneness preference." Int J Cancer, vol. 71, no. 1, pp. 14-19.
Weindruch R, et al. (1986) "Dietary restriction and aging in mice." J Nutr, vol. 116, pp. 641-654.
Wickstrom K, et al. (1986) "Polycyclic aromatic compounds (PAC) in leaf lettuce." Z Lebensm Unters Forsch, vol. 183, no. 3 (Sept. 1), pp. 182-185.


Williams WR (1908) The Natural History of Cancer, with Special Reference to its Causation and Prevention. New York: William Wood.
Wood AW, et al. (1982) "Inhibition of the mutagenicity of bay-region diol epoxides of polycyclic aromatic hydrocarbons by naturally occurring plant phenols: exceptional activity of ellagic acid." Proc Natl Acad Sci U.S.A., vol. 79, no. 18 (Sept.), pp. 5513-5517.
World Cancer Research Fund in association with the American Institute for Cancer Research (1997) Food, nutrition and the prevention of cancer: a global perspective. World Cancer Research Fund. (p. 35)
Wostmann BS, et al. (1971) "Dietary stimulation of immune mechanisms." Fed Proc, vol. 20, p. 1779.
Wostmann BS (1996) Germfree and Gnotobiotic Animal Models. Boca Raton: CRC Press.
Yunginger JW (1991) "Food antigens." In: Metcalfe DD, Sampson HA, Simon RA (eds.), Food Allergy: Adverse Reactions to Foods and Food Additives. Boston: Blackwell Scientific. (pp. 49-63)
Zhang XM, et al. (1988) "Mutagenic and carcinogenic heterocyclic amines in Chinese cooked foods." Mutat Res, vol. 201, no. 1 (Sept. 1), pp. 181-188.
