Stone Age Nutrition: The Original Human Diet
Aside from casual interest, there is a reason to appreciate the nutrition that fueled nearly all of human evolution. An increasing number of investigators believe the dietary patterns of our ancestors may constitute a guide to proper nutrition in the present. Early twenty-first-century dietary recommendations run a broad gamut, from the ultra-low-fat Pritikin program, most recently championed by Dean Ornish, to the 30:30:40 (protein:fat:carbohydrate) Zone diet of Barry Sears, to the low-carb, high-fat-and-protein Atkins diet. These popular authors are not the only ones whose recommendations vary widely, however. Academic nutritionists writing in prestigious medical journals advocate a similarly broad range of nutritional regimens, from the low-fat East Asian eating pattern to the much more fat-liberal Mediterranean approach. These conflicting recommendations, especially when they originate in respected professional publications, tend to confuse and dismay health-conscious readers who frequently learn of dietary findings through simplistic and often sensationalized media accounts. Sometimes completely contradictory nutritional findings are announced just a few years apart. For example, beta-carotene appeared to reduce cancer risk in initial studies; then it seemed to increase risk in a later investigation. Dietary fiber was first thought to reduce colon cancer susceptibility, and then it was found to have no such effect. Sodium consumption has been linked to high blood pressure in many studies but not in numerous others. High-fat diets cause coronary heart disease, but consider the "French paradox": the French consume at least as much saturated fat as do Americans, but they have considerably fewer heart attacks and other manifestations of coronary artery disease.
In light of such inconsistencies, it is not surprising that dietary recommendations vary. A logical, straightforward, and understandable starting point from which to develop research protocols and upon which generally accepted recommendations may ultimately be based is highly desirable. The ancestral human diet might provide such a foundation. Even though the Stone Age occurred in the very distant past, eminent paleoanthropologists, geneticists, biologists, and evolutionary theorists believe that human genes have changed hardly at all in the interim. Although many refer to modern times as the "Space Age," genetically speaking, human beings are still Stone Agers. One can argue that the genetic determinants of our current biology were selected not for contemporary circumstances, but for the conditions of life as experienced in the remote past. This argument has two potential corollaries: first, that the afflictions of affluence (chronic degenerative diseases such as diabetes, many cancers, atherosclerosis [including coronary heart disease], hypertension, osteoporosis, and obesity) are prompted by dissonance between human genes and the lives of certain groups of people; and second, that the impact of these diseases, which are the major causes of illness and mortality in affluent nations, might be greatly reduced or, in some cases, eliminated altogether by reinstating essential features of the ancestral lifestyle, including relevant nutritional practices, into current existence.
Behaviorally modern humans, who appeared about 50,000 years ago, were hunter-gatherers (or foragers) and were similar in most respects to hunter-gatherer groups studied during the twentieth century. There were, however, important differences between these groups. Modern hunter-gatherers have been increasingly restricted to infertile areas that are poorly suited to farming and where the availability of animals for hunting, especially large game, has been much reduced. Also, modern foragers generally have some contact with nearby agriculturists, which affects their culture to some extent. Hunter-gatherer groups that came under observation in the twentieth century were commonly used as models for the prehistoric, pre-agricultural peoples of 25,000 years ago. They were the best available surrogates, but researchers also needed to consider the altered circumstances of otherwise similar people living many thousands of years apart.
Ancestral Foods: Plants
The vegetable foods available to prehistoric foragers grew naturally, without cultivation, and included nuts, leafy vegetables, beans, fruits, flowers, gums, fungi, stems, and other similar items. These had been primate staples for tens of millions of years, but at some point along the hominid (human-like) evolutionary track, the digging stick came into use. This simple implement widened dietary breadth by providing access to roots, bulbs, and tubers, which were plentiful but previously inaccessible sources of food energy. The nutrient values of such foods vary naturally, but if one pools the several hundred representative vegetable foods that hunter-gatherers utilized during the twentieth century and then compares their averaged nutrient content with the mean values of vegetable foods commonly consumed in Western nations, several noteworthy differences emerge. For example, wild-plant foods provide less energy per unit weight. A 3.5-ounce (100-gram) portion of the fruits and vegetables that our ancestors ate would yield, on average, only about one-third the calories that 3.5 ounces of contemporary vegetable food provide. This is primarily because so much of our current plant-food intake is derived from high-energy cereal grains: rice, corn, wheat, and the like. Stone Age humans knew that grains were a potential food source. However, given the technology available to them, the work required to process wild cereals into digestible form was generally excessive compared with the work needed to gather and process other types of wild plants. Foragers generally viewed grains as emergency foods to be used during times of shortage. It was only "late" in the human career, perhaps thirty thousand years ago in Australia and between ten and fifteen thousand years ago elsewhere (for example, the Near East), that evidence of routine cereal-grain use became common.
Another difference between the vegetable foods of hunter-gatherers and those of Western nations concerns nutrient density. The nutrient content of wild-plant foods is high, especially when one considers the ratio of nutrients to calories. While there is, of course, considerable individual variation among these foods, a mixed grocery bag of the fruits and vegetables available to ancestral humans would provide substantially more vitamins, minerals, and fiber than would a comparably representative collection of contemporary plant foods. In many cases, vitamins and some minerals are artificially added to current foods, making them "enriched." This enrichment process is less successful for adding fiber and is not yet feasible for phytochemicals, the plant constituents that influence the body's metabolic reactions. Phytochemicals can be considered semi-vitamins, but their total number (at least dozens, perhaps hundreds) is unknown and their mode of action is poorly understood. However, the importance of phytochemicals for optimal health is becoming increasingly well established. Ancestral human biology became genetically adapted to the phytochemicals provided by fruits and vegetables over hundreds to thousands of millennia. The phytochemicals of modern-day cereal grains, in contrast, are relative newcomers to human metabolism. It is perhaps for this reason that fruit and vegetable intake appears to reduce cancer susceptibility, whereas consumption of cereal-grain products has little or no such effect.
Lastly, the plant foods available to ancestral humans afforded a fairly balanced ratio of essential polyunsaturated fatty acids. Like essential amino acids, these fatty acids cannot be synthesized by the body; humans must obtain them from their diet. Polyunsaturated fatty acids are necessary for cell-membrane fabrication, especially in the brain, and they are also the basic molecules from which eicosanoids, a large class of important locally acting hormones, are made. Essential fatty acids are divided into two families: omega 6's and omega 3's. Both types are required in mammalian physiology, but they produce opposing biochemical effects, so roughly equal amounts in the diet are desirable. Their effects on blood clotting provide a good example. Too much omega 6 in a person's system makes the blood clot too easily, which increases the likelihood of coronary thrombosis (heart attack). An overabundance of omega 3 reduces blood clotting excessively and increases the risk of cerebral hemorrhage (one kind of stroke). Roughly equal dietary intake of each type of these polyunsaturated fatty acids avoids both undesirable consequences. Unfortunately, in recent decades the use of safflower, corn, sunflower, and cottonseed oils for spreads and cooking has distorted the ratio. These oils contain fifty to one hundred times more omega 6 than omega 3, and, overall, Americans now consume ten to fifteen times more omega 6's than omega 3's.
Ancestral Foods: Animals
The wild game that human ancestors ate differed in important ways from the commercial meats available in the twenty-first century. In the first place, modern commercial meat is fatter. Whether one compares the whole carcass or the most popular cuts (for example, flank, loin, shank, etc.), commercial meat has up to four times more fat than game. For example, 3.5 ounces of regular hamburger provides 268 kilocalories, whereas the same amount of venison yields 126 kilocalories. Even when all visible fat is removed from a T-bone steak, the resulting separable lean portion contains 30 percent more energy than game. These energy differences reflect the greater fat content of commercial meat.
Not only is there more total fat in commercial meat, but the chemical composition of the fat in this meat also varies from that in game animals. In general, fat from commercial meat has a higher proportion of saturated fatty acids (the kind that tend to raise serum cholesterol levels) than does the fat from game. Saturated fatty acids containing either fourteen or sixteen carbon atoms have a special propensity for raising serum cholesterol. Game fat typically has less than one-fifth the content of these substances when compared to an equal amount of fat from commercial meat. Another chemical difference between these two types of meat involves the essential polyunsaturated fatty acids discussed earlier. These fats are present in nearly equal amounts in wild-animal adipose tissue, as compared to the uneven ratios in most commercial meat. Grain feeding appears to be responsible for this difference: the essential fatty acid composition of animals whose feed is based on corn becomes skewed, as their systems contain a far greater amount of omega 6 than omega 3 fatty acids.
Other Considerations
Several categories of foods that are regularly consumed at present were rarely used by, or wholly unavailable to, ancestral humans. These "new" foods confer some advantages, but, in several cases, there are important negatives as well.
Grains. Today cereal grains are "superfoods." This term is not a characterization of their nutrient properties; rather, it is recognition that in many parts of the world, members of the grain family may provide from one-third to two-thirds or more of the population's daily caloric intake. Rice in the Far East, corn in Mesoamerica, and sorghum in parts of central Africa are examples. Such dependence on one or a few plant foods contrasts with the more broad-spectrum subsistence pattern of hunter-gatherers, who commonly utilize one hundred or more types of food plants during the year. With limited exceptions, which probably did not apply in the remote past, none of these plant foods approaches the "superfood" status accorded cereals today.
It was mentioned earlier that ancestral humans used grains infrequently. Hand milling grains to render them digestible was such hard work that grains were not worth using unless other foods were in short supply. The situation changed when pre-agricultural populations reached a point where a nomadic life was no longer feasible, as growing numbers of people converged on the same areas. When people were required to settle more or less permanently in a given area, grain consumption became a viable option because other types of plant and animal food became increasingly difficult to obtain. It soon became apparent that raising grains like wheat or barley could increase the total food energy available from a given geographical area. When people began to farm regularly, population growth accelerated to rates greatly exceeding those before the advent of agriculture. On the other hand, individual health seems to have deteriorated. People became shorter in stature, and skeletal evidence of nutritional stress and infection became more frequent. Average life expectancy also appears to have declined, so the adoption of agriculture may not have been the societal boon it is often considered. In fact, Pulitzer Prize winner Jared Diamond has called it "the worst mistake in the history of the human race" (Discover [May 1987]: 64–66).
Dairy foods. For most humans, dairy foods are important constituents of each day's diet, but for free-ranging non-human mammals, a mother's milk is the only "dairy product" ever consumed. After weaning, milk was not available to any primates, including humans, until the domestication of cows, goats, camels, and the like. Dairy foods have nevertheless been an important component of official nutritional recommendations, at least in Western nations, since such recommendations were first formulated. Yet human ancestors, including behaviorally modern humans during four-fifths of their existence, thrived and evolved without any dairy foods whatsoever after they ceased breast-feeding.
Alcohol. In the United States, alcohol provides from 3 to 5 percent of the average adult's daily caloric intake. It is not clear when the production of alcoholic beverages first developed, but most anthropologists doubt that wine, beer, mead, and especially distilled spirits were manufactured before agriculture. No hunter-gatherer groups studied in the twentieth century made such drinks.
Separated fats. The fats that ancestral humans consumed were generally obtained as integral components of whole foods; both animal and vegetable fats came part and parcel with the other nutrients intrinsic to the original source. In contrast, separated fats are staples for contemporary humans. Olive oil, butter, margarine, vegetable oils, lard, and the like are all vital ingredients for today's cooks. Such separated fats enhance our cuisine, but because fat provides about nine calories per gram (versus about four calories per gram for protein and carbohydrate), the availability of fat in this form makes it possible to increase the energy density of our food in ways our ancestors could not.
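Because fat is more than twice as energy-dense as the other macronutrients, even a small amount of separated fat changes a dish considerably. A minimal worked illustration, using only the caloric values just cited (the 14-gram tablespoon is an assumed typical measure, not a figure from the text):

\[
14\ \text{g oil} \times 9\ \text{kcal/g} \approx 126\ \text{kcal}
\qquad \text{versus} \qquad
14\ \text{g protein or carbohydrate} \times 4\ \text{kcal/g} = 56\ \text{kcal}
\]

Gram for gram, separated fat thus contributes well over twice the energy, which is why its free availability so readily raises the energy density of modern cuisine.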
Refined flour and sugar. Like separated fats, refined flours and sugars allow us to create foods with unnaturally high energy density. Essentially, they are nearly pure energy: empty calories with few or no associated vitamins, minerals, or fiber. Although there are essential amino acids and essential fatty acids (required building blocks our bodies need to make necessary structural elements and required hormones), there are no essential simple carbohydrates like those available from refined flour and sugar. Such carbohydrates are a convenient and efficient source of energy, but they provide little if any nutritional benefit over and above their caloric content. Fortified flours have additional nutrients that food manufacturers consider desirable; our ancestors, in contrast, obtained their carbohydrate together with the nutrients that nature provided.
Processed and prepared foods. Humans are the only free-living creatures that consume foods whose natural origins are obscure. Individuals unfamiliar with our culture would be unable to identify the ultimate sources of bread, pasta, sausage, cheese, and similar items that have been staples for millennia. Less traditional artificially fabricated foods became immensely popular during the twentieth century, to the point that for some people these foods, often laced with gratuitous sodium, fat, and sugar, made up most of their daily intake. The list of ingredients on the wrapper of almost any prepared-food package provides one of the most telling commentaries on the differences between contemporary nutrition and that of pre-agricultural human ancestors.
Artificial constituents. Organic food proponents would quickly point out that there are still other important differences between the naturally occurring plants and animals of twenty thousand years ago and most of those available to today's grocery shopper. Pesticides, hormones, fertilizers, antibiotics, dyes, and other additives are widely used in contemporary food production but were not, of course, considerations in the remote past when humans ate exclusively "organic" food. The pros and cons of these modern innovations are debatable, but there is no question that such innovations are "unnatural," and that humans evolved for millions of years before encountering the adulterated foods that most of us eat at present.
Overall Dietary Patterns
There was no one universal pre-agricultural diet. Our ancestors ate foods that were available locally and focused on those that returned the most food energy for the least expenditure of physical energy, a general rule for all biological organisms. Two important factors affecting diet choices were latitude and rainfall. In the savanna-like environment of northeast Africa, which according to the "out of Africa" theory is thought to have been the epicenter of human evolution, both game and vegetable foods were plentiful. Gathering plant foods in such an environment was an integral aspect of the food quest for both males and females before human ancestors and those of chimpanzees diverged, and for an uncertain length of time thereafter. At some point, most likely during the later stages of australopithecine evolution, scavenging is thought to have become a significant component of hominid subsistence. It is not known whether this was an exclusively male function or whether females participated as well. Because potential competitors for animal remains included hyenas and similarly dangerous beasts, as well as the original predators, scavenging was only a little less hazardous than hunting, the main difference being the degree of technological expertise required. Later, most likely for the past 500,000 years and almost certainly since the appearance of behaviorally modern humans about fifty thousand years ago, obtaining food probably resembled the pattern observed among modern foragers: a division of labor according to gender, with men hunting and women gathering.
Where large animals such as mammoths, red deer (similar to elk), horses, megamarsupials (some as large as rhinoceroses), and eland were relatively abundant, hunting them made sense in terms of energy expended. More food energy could be obtained from one such carcass than from many smaller animals, and the physical energy the hunters expended in the process was substantially less than that required to hunt an equivalent number of small animals. Where large animals had become scarce, a variety of sophisticated techniques, including trapping and net hunting, were used to increase the efficiency of obtaining small game. Weirs and nets were used along rivers where fish migrated seasonally (for example, salmon runs). Stone Agers sometimes lived year-round in such locations, abandoning a nomadic life, establishing relatively large communities, and developing an early form of social stratification with elites, in contrast to nomadic hunter-gatherers, who were almost always egalitarian.
Gathering was not confined to plant foods: women often brought home shellfish, eggs, small mammals, frogs, turtles, and the like. This process could be physically demanding. Women occasionally walked several miles, dug through hard ground (with a digging stick) to obtain roots or tubers, then walked back to camp carrying twenty to thirty pounds of foodstuff.
The relative contributions of hunting and gathering to a forager economy have been the subject of debate. The respective importance of these tasks almost certainly varied according to season and was surely affected by latitude. In the mammoth steppe of central Siberia, which was surprisingly well populated during the late Paleolithic era of ten to thirty thousand years ago, abundant wild grasses supported great herds of large game, especially mammoths, so hunting flourished. However, edible plant food for humans was scarce. In this region, hunting must have greatly exceeded gathering as a means of acquiring subsistence.
On the other hand, in northeast Africa both game animals and wild plant foods were plentiful, and in such areas hunting and gathering were of almost equal importance. Early studies of foraging groups inhabiting regions of this sort suggested that about two-thirds of food was obtained by gathering. However, later analyses suggested that hunting actually made a somewhat greater contribution than gathering. The newer interpretation fit well with "optimal foraging theory," an anthropological law that formalizes the common-sense observation that humans, like all other biological organisms, arrange their subsistence activities to maximize return relative to effort expended. When animals are plentiful and hunting techniques are well developed, as seems to have been the case for the past 100,000 years (and probably longer), the average returns from hunting exceed those from gathering. Nevertheless, gathering remained very important because even skillful hunters can experience unsuccessful periods, sometimes of uncomfortable duration. The practical botanical knowledge of foragers was so great that the women's success rate in finding plant food in fruitful regions approached 100 percent; many times tsi-tsi beans, baobab fruit, water lily roots, and the like would have been our ancestors' only menu choices for dinner.
Macronutrient Ratios
Overall subsistence patterns in East Africa are of particular interest. If the "out of Africa" theory is correct, which seems increasingly likely, what was eaten routinely in this region affected genetic adaptation in the direct ancestors of all living humans, while what was consumed elsewhere, even as late as fifty or sixty thousand years ago, had little or no direct bearing on the contemporary human gene pool. The reconstructed nutritional patterns in this area, beginning perhaps 200,000 years ago, are quite useful to those interested in the original "natural" human diet.
With behavioral modernity came increasingly rapid cultural change, which has, to an ever-greater extent, outpaced genetic evolutionary adaptation. This cultural change gave rise to subsequent dietary innovations, including the routine use of grains by everyone and of dairy foods by adults, as well as the Mediterranean, East Asian, and vegetarian approaches to healthy eating. However, these trends have appeared too recently to have had a marked effect on our genetic makeup. If there is a basic nutritional pattern to which humans are genetically adapted, the constituents provided by foods consumed in East Africa 100,000 years ago arguably define its nature.
During that time, energy intake would have been higher than at present: probably about 3,000 kilocalories per day for males and perhaps 2,750 kilocalories for females. Because humans at that time lacked motorized equipment, draft animals, and even the simplest machines, caloric expenditure at this level was obligatory. In fact, it is likely that up until the early twentieth century, energy expenditure and intake requirements remained substantially above those typical at present.
About 55 percent of food energy would have come from animal and fish sources, while about 45 percent, on average, would have been of vegetable origin. Total caloric intake was likely partitioned about 25–30 percent from protein, 30–35 percent from carbohydrate, and 40–45 percent from fat. These estimates differ from the contemporary American pattern and also from current orthodox recommendations:
Prehistoric societies

                  East African          Contemporary    Current
                  hunter-gatherers*     U.S.A.          recommendations
Protein           25–30%                15%             15%
Carbohydrate      30–35%                48%             55%
Fat               40–45%                34%             30%
Alcohol           –                     3%              –

*Surrogates for our earliest truly human ancestors
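To translate these percentages into absolute quantities, a rough sketch can be made using the approximately 3,000-kilocalorie daily intake estimated above for males and the standard conversion values of about four kilocalories per gram for protein and carbohydrate and nine for fat:

\[
\begin{aligned}
\text{Protein:} \quad & 0.25\text{--}0.30 \times 3000\ \text{kcal} \div 4\ \text{kcal/g} \approx 190\text{--}225\ \text{g/day} \\
\text{Carbohydrate:} \quad & 0.30\text{--}0.35 \times 3000\ \text{kcal} \div 4\ \text{kcal/g} \approx 225\text{--}260\ \text{g/day} \\
\text{Fat:} \quad & 0.40\text{--}0.45 \times 3000\ \text{kcal} \div 9\ \text{kcal/g} \approx 135\text{--}150\ \text{g/day}
\end{aligned}
\]

These gram figures are illustrative only; actual intakes would have varied with season and locale.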
The differences are striking and, at first glance, suggest that the Paleolithic diet was unhealthy. A little further analysis, however, is comforting for health-conscious paleoenthusiasts.
Fats
Our ancestors ate more fat than modern humans do. Muscle meat from game animals is very lean, but Stone Agers ate everything edible, including marrow, brain, organ meat, and fat deposits from the thoracic and abdominal cavities, not just the muscle we tend to consume today. Optimal foraging means using the whole carcass. Many different parts of farm animals, such as tripe, chitlins, tongue, sweetbreads, brain, and gizzard, were considered standard fare only a few generations ago, and in a few places they still are. However, in contrast to fat from today's cattle, sheep, and pigs, the carcass fat of wild animals has relatively little serum cholesterol-raising effect. Most game fat is of the cholesterol-neutral monounsaturated variety, a substantial proportion is polyunsaturated, and much less is the saturated, cholesterol-raising type. Also, ancestral foods contained little or none of the cholesterol-raising trans fatty acids that commercial hydrogenation adds to current diets. That hunter-gatherer diets are heart-healthy is corroborated by the finding that the serum cholesterol levels of such people from around the world average below 130 mg/dl, as opposed to a bit over 200 mg/dl for Americans. Although the available evidence is not ideal (for example, no coronary angiograms and few autopsies), coronary heart disease is virtually unknown among hunter-gatherers, as far as clinical data can show. An additional factor that enhanced the heart-healthy nature of Paleolithic diets was the nearly equal proportions of omega 6 and omega 3 essential polyunsaturated fatty acids in those diets. The great preponderance of omega 6's in contemporary Western diets is believed to be a factor contributing to the cardiovascular disease epidemic in countries with such eating patterns.
Carbohydrates
Ancestral humans consumed less carbohydrate than is typical for contemporary humans, the major difference being the near-total absence of cereal grains from pre-agricultural diets. However, the amount of fruits and vegetables consumed in areas resembling East Africa substantially exceeded that consumed in any part of the world today and was more than double the typical fruit-and-vegetable consumption in western and northern Europe. Contemporary carbohydrate comes largely from refined flours and simple sugars, which are quickly absorbed and capable of inducing rapid rises in pancreatic insulin secretion. Stone Agers loved honey, but its availability was usually limited and seasonal (as indicated by their relatively cavity-free dental remains). A large proportion of ancestral carbohydrate was instead in complex form, which has a less adverse effect on insulin secretion.
Protein
As far as nutritionists and exercise physiologists can ascertain, the levels of protein our ancestors consumed are not necessary for health, even for weight trainers and other high-performance athletes. On the other hand, earlier studies that attributed negative health effects to excessive dietary protein now seem suspect. Initial reports suggested that high protein intake might cause renal failure, colon cancer, and/or elevated blood cholesterol levels. However, more recent investigations have reversed or at least significantly modified scientific opinion about these relationships. High-protein, low-carbohydrate diets have, in some cases, been shown to be beneficial. High-protein diets do aggravate kidney failure once it is established, but they do not appear to initiate the process. Autopsy studies of traditional Inuit (Eskimos), whose protein intake was extremely high, did not reveal any excess incidence of kidney disease. It was once thought that high-protein diets were associated with colon cancer. There is a connection here, but primarily because Western diets that contain a lot of meat provide excess saturated fat along with protein, and it is the saturated fat, not the protein, that seems to foster the development of colonic neoplasms. Diets rich in meat were once thought to raise serum cholesterol levels, but here again, associated saturated fat is the culprit. High-protein diets that contain little saturated fat actually lower serum cholesterol levels, an investigative result that might have been predicted from findings among hunter-gatherers studied during the last century.
Micronutrients
Americans and many others in affluent nations spend enormous amounts of money on vitamins (and, to a lesser extent, minerals) presumably in the hope that consuming such micronutrients will minimize the adverse effects of an otherwise unhealthy diet and lifestyle. Nutritionists usually decry this practice, arguing that micronutrient intake above and beyond recommended daily allowance (RDA) levels is unnecessary, and that a balanced diet provides all the vitamins and minerals one needs.
From a Paleolithic perspective, there is some virtue to both these views. Nutritionists follow Stone Age practice when they argue that it is better to obtain micronutrients from real foods than from capsules. However, ancestral micronutrient intake exceeded RDA levels in nearly every case (sodium and, in some areas, iodine were the exceptions). The greater total caloric intake necessitated by a physically vigorous lifestyle, together with a micronutrient-to-energy ratio much higher for ancestral foods than for those commonly consumed at present, means that Stone Agers typically obtained 1.5 to 5 times the RDA levels of vitamins and minerals each day. However, they did not obtain anything near the recommendations of megavitamin enthusiasts, which can be 10 to 100 times the RDA in some instances.
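The 1.5-to-5-fold estimate follows from multiplying two ratios: how many more calories Stone Agers consumed, and how many more micronutrients each of those calories carried. As an illustrative sketch (the 2,000-kilocalorie contemporary figure and the density range shown are assumptions chosen only to make the arithmetic concrete, not data from the sources cited here):

\[
\underbrace{\frac{3000\ \text{kcal/day}}{2000\ \text{kcal/day}}}_{\approx\,1.5\times\ \text{energy}}
\times
\underbrace{(1\text{--}3.3)}_{\text{relative micronutrient density}}
\approx 1.5\text{--}5 \times \text{RDA}
\]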
Words like lycopene, anthocyanin, lutein, sulforaphane, isothiocyanate, and indole have begun to appear regularly in popular articles on nutrition. These substances and many others with equally unfamiliar names are phytochemicals, which, as noted earlier, are vitamin-like molecules that affect our metabolism and biochemistry. Phytochemicals in fruits and vegetables seem much more vital to human health than those from cereal grains, presumably because our metabolism became adapted to the former over many millions of years as opposed to the few thousand years during which human biochemistry has routinely interacted with phytochemicals from cereals. There has been little research on the phytochemical content of uncultivated fruits and vegetables, but it is likely that the phytochemical load in such foods would have paralleled their high content of known vitamins and minerals. Based on this supposition, in addition to the fact that Stone Agers in most areas consumed abundant quantities of fresh fruits and vegetables, it is probable that ancestral phytochemical intake exceeded that of the present.
Only 10 percent of the sodium consumed in Western nations is intrinsic to the foods people eat. The remainder is added during processing and preparation or at the table. For human ancestors, as for all other free-living terrestrial mammals, potassium intake exceeded sodium intake, a circumstance almost certainly relevant to blood-pressure regulation and to maintenance of cell-membrane electrical potential. Since salt became commercially available, and especially since it became inexpensive, human diets have inverted the potassium-sodium relationship that characterized human and pre-human evolution, a relationship that may date from the appearance of multicellular organisms more than 500 million years ago.
Fiber
Since Denis P. Burkitt's research first drew public attention to the value of fiber in human diets, official recommendations for fiber intake have centered on about 0.7 ounces (20 grams) per day. However, our nearest non-human primate relatives, chimpanzees, consume about 7 ounces (200 grams) of fiber per day. The fiber intake of ancestral humans would have been strongly influenced by the proportion of fruits and vegetables in their subsistence base, because dietary fiber comes exclusively from plant foods. Stone Agers living at high latitudes, where edible vegetation was scarce, would have consumed even less fiber than modern humans. However, in East Africa, where modern human metabolism evolved, Paleolithic fiber intake is estimated to have been between 1.77 and 3.53 ounces (between 50 and 100 grams) per day.
There are two main fiber types, both of which are necessary for optimal human physiological function. Most plant foods provide some of each, but the proportions vary. Whole wheat and brown (unpolished) rice contain predominantly insoluble fiber, which is good for intestinal tract function. Oats, corn, and most fruits and vegetables provide a high proportion of soluble fiber, which is valuable for regulating cholesterol absorption after meals. Modern, refined grain-centered diets generally have too little fiber, but, in addition, they have a disproportionate amount of insoluble fiber. Pre-agricultural diets featuring more fruits and vegetables than at present provided a better balanced insoluble-to-soluble fiber ratio.
Conclusion
The uncultivated plant foods and wild game that nourished ancestral humans and their pre-human predecessors were those to which our genetic makeup became adapted. Increasingly rapid cultural innovations during the past few thousand years have transformed our nutrition such that Cro-Magnons might not recognize many constituents of a typical meal. However, genetic evolution during the same period has been glacially slow; thus human beings' genetically determined biology remains adapted for the literally natural and organic foods of the remote past. This dissonance between human genes and human lives has critical implications for human health.
See also Agriculture, Origins of; Anthropology and Food; Food Archaeology; Nutritional Anthropology; Paleonutrition, Methods of.
BIBLIOGRAPHY
ATBC Cancer Prevention Study Group. "The Effects of Vitamin E and Beta Carotene on the Incidence of Lung Cancer and Other Cancers in Male Smokers." New England Journal of Medicine 330 (1994): 1029–1035.
Brand Miller, Janette C., and Susanne H. A. Holt. "Australian Aboriginal Plant Foods: A Consideration of Their Nutritional Composition and Health Implications." Nutrition Research Reviews 11 (1998): 5–23.
Cohen, Mark Nathan. Health and the Rise of Civilization. New Haven: Yale University Press, 1989.
Cohen, Mark Nathan, and George J. Armelagos, eds. Paleopathology at the Origins of Agriculture. New York: Academic Press, 1984.
Cordain, Loren, et al. "Plant-Animal Subsistence Ratios and Macronutrient Energy Estimations in Worldwide Hunter-Gatherer Diets." American Journal of Clinical Nutrition 71 (2000): 682–692.
Cordain, Loren, S. Boyd Eaton, Janette Brand Miller, and Kim Hill. "The Paradoxical Nature of Hunter-Gatherer Diets: Meat-Based Yet Non-Atherogenic." European Journal of Clinical Nutrition 56, suppl. 1 (2002): S1–S11.
Eaton, S. Boyd, and Melvin Konner. "Paleolithic Nutrition: A Consideration of Its Nature and Current Implication." New England Journal of Medicine 312 (1985): 283–289.
Eaton, S. Boyd, et al. "An Evolutionary Perspective Enhances Understanding of Human Nutritional Requirements." Journal of Nutrition 126 (1996): 1732–1740.
Eaton, S. Boyd, and Loren Cordain. "Evolutionary Aspects of Diet: Old Genes, New Fuels." World Review of Nutrition and Dietetics 81 (1997): 26–37.
Eaton, S. Boyd, and Stanley B. Eaton III. "Hunter-Gatherers and Human Health." In The Cambridge Encyclopedia of Hunters and Gatherers, edited by Richard B. Lee and Richard Daly. Cambridge, U.K.: Cambridge University Press, 1999.
Eaton, S. Boyd, and Stanley B. Eaton III. "The Evolutionary Context of Chronic Degenerative Diseases." In Evolution in Health and Disease, edited by Stephen C. Stearns. Oxford and New York: Oxford University Press, 1999.
Eaton, S. Boyd, Stanley B. Eaton III, and Melvin J. Konner. "Paleolithic Nutrition Revisited." In Evolutionary Medicine, edited by Wenda R. Trevathan, E. O. (Neal) Smith, and James J. McKenna. New York and Oxford: Oxford University Press, 1999.
Fuchs, C. S., et al. "Dietary Fiber and the Risk of Colorectal Cancer and Adenoma in Women." New England Journal of Medicine 340 (1999): 169–176.
Howe, G. R., et al. "Dietary Intake of Fiber and Decreased Rate of Cancers of the Colon and Rectum: Evidence from the Combined Analyses of 13 Case-Control Studies." Journal of the National Cancer Institute 84 (1992): 1887–1896.
Klein, Richard G. The Human Career: Human Biological and Cultural Origins. 2nd ed. Chicago: University of Chicago Press, 1999.
Larsen, Clark Spencer. "Dietary Reconstruction and Nutritional Assessment of Past Peoples: The Bioanthropological Record." In The Cambridge World History of Food, edited by Kenneth F. Kiple and Kriemhild Coneè Ornelas, vol. 1. Cambridge, U.K.: Cambridge University Press, 2000.
Lee, Richard B. "What Hunters Do for a Living, or How to Make Out on Scarce Resources." In Man the Hunter, edited by Richard B. Lee and Irven DeVore. Chicago: Aldine, 1968.
Menkes, M. S., et al. "Serum Beta-Carotene, Vitamins A and E, Selenium, and the Risk of Lung Cancer." New England Journal of Medicine 315 (1986): 1250–1254.
Milton, Katherine. "Diet and Primate Evolution." Scientific American 269 (August 1993): 86–93.
Sinclair, Andrew. "Was the Hunter-Gatherer Diet Prothrombotic?" In Essential Fatty Acids and Eicosanoids, edited by Andrew Sinclair and R. Gibson. Champaign, Ill.: American Oil Chemists' Society, 1992.
Stringer, Christopher B., and Robin McKie. African Exodus: The Origins of Modern Humanity. New York: Holt, 1996.
Taubes, Gary. "The (Political) Science of Salt." Science 281 (1998): 898–907.
S. Boyd Eaton