The firestorm continues to spiral out of control following publication of the article titled “The Importance of Dietary Carbohydrate in Human Evolution” in The Quarterly Review of Biology.1 Now, a new paper titled “Multistep food plant processing at Grotta Paglicci (Southern Italy) around 32,600 cal B.P.,” published in this week’s Proceedings of the National Academy of Sciences (PNAS),2 fuels the misconception that our hunter-gatherer ancestors may have consumed oats in large quantities. I was approached by National Geographic to set the record straight.
I read the PNAS paper. Nowhere in it do the authors indicate that people at the archaeological site were utilizing large quantities of oats. Indeed, their primary evidence comes from a single stone, assumed to be a “grinding stone,” dated to 32,614 +/- 429 BP, which retained starch granule residues on its surface. The authors provide no evidence that this stone was regularly used as a “grinding stone,” and no control stones found next to it were examined for similar residues. Could it be that the starch concentration on this stone was similar to the (unreported) starch concentration on other, non-grinding stones in the vicinity?
My point here is that this evidence is indirect and in no way indicates that humans 32,614 years ago were regularly consuming “large quantities of oats,” as the media has publicized. In fact, there is absolutely no way that starch granules on a stone assumed to be a grinding tool can quantify the amount of any dietary element (starch or otherwise) consumed by people living 32,614 years ago. Indeed, the authors admit that the attribution of the starch granules to a specific plant species is not certain.
If you were to analyze the residues from a modern cook’s knife, cutting board, or mortar and pestle, or from a farmer’s or woodcutter’s tools (knife, axe, grinding stone), could you make any accurate quantitative inferences about their diet? Or, for that matter, any quantitative inferences about consumption of the food or non-food residues found on these tools?
A huge theoretical obstacle that the authors of this paper did not consider is that isotopic analyses of fossilized bones of hunter-gatherers living in Southern Italy during the Upper Palaeolithic and Mesolithic show no trace of cereal grain consumption, despite these people residing (more than 20,000 years later) in the same general geographic location where these foods have always been available.3
The importance of this PLoS paper is that it provides direct dietary evidence from the fossilized bones of humans living in Southern Italy, rather than inferences based upon indirect evidence of inadequately identified starch granules found upon a single stone assumed to be a grinding tool. Further, to my knowledge, no scientific paper over the past 30 years has reported isotopic evidence from any fossilized Palaeolithic skeleton in Europe (or, for that matter, from anywhere in the world) demonstrating cereal grain consumption, as evidenced by direct carbon delta 13C data.
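For readers unfamiliar with the delta 13C measure mentioned above, it is computed from a sample’s carbon isotope ratio relative to an international standard. A minimal sketch follows; the bone-collagen ratio used here is an assumed illustrative number, not data from any of the cited studies:

```python
# delta-13C expresses a sample's 13C/12C ratio relative to the VPDB
# standard, in parts per thousand (per mil). Heavy consumption of C4
# cereals (e.g., millet) shifts bone collagen toward less negative values,
# while a diet of C3 plants and the animals that eat them sits near -20.
VPDB_RATIO = 0.0112372  # 13C/12C of the Vienna Pee Dee Belemnite standard

def delta_13c(sample_ratio, standard_ratio=VPDB_RATIO):
    """Return delta-13C in per mil relative to the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# Hypothetical bone-collagen ratio, chosen to land near the roughly
# -20 per mil typical of a pure C3 (non-cereal-enriched) diet:
print(round(delta_13c(0.0110124), 1))  # -20.0
```

The standard ratio and the interpretation of C3 versus C4 signals are well established in the isotope literature; only the sample value above is invented for illustration.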
If we buy into the indirect starch-granule evidence of the PNAS paper and contrast it with the direct isotopic evidence, why would early humans in Italy have discovered grain consumption 32,000 years ago and then abandoned the practice almost 20,000 years later? Further, as the authors of the PNAS paper readily admit, oat starch is largely inedible unless cooked. Accordingly, to eat oats or any cereal grain, the seeds must first be ground, wood must be gathered, fires must be lit, and the grain must be slowly cooked so that its starch is broken down (hydrolyzed) and made digestible in the human GI tract. Pottery had not yet been invented 32,000 years ago, so boiling oats would have been difficult or impossible, and placing oats into an open fire, or even upon hot coals, would have quickly incinerated them.
Further, optimal foraging theory holds that all animals must derive more energy from the foods they consume than they expend acquiring them. By this standard, gathering tiny grass seeds (cereal grains) of low caloric density that require grinding, wood collection, and fire production to become edible is energetically inefficient: a great deal of energy must be spent to make a low-energy food edible. The ethnographic hunter-gatherer data our group has compiled indicate that cereal grains (grasses) were rarely or never consumed, except as starvation foods.
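The foraging logic above can be sketched as a simple return-rate comparison. Every number below is hypothetical and chosen purely for illustration; none comes from the ethnographic data our group compiled:

```python
# Optimal foraging (diet-breadth) logic: a resource is only worth handling
# when the energy it yields per hour of processing exceeds the energy the
# forager burns doing that processing. All numbers here are invented.

def return_rate(energy_kcal, handling_hours):
    """Energy gained per hour spent processing a resource, in kcal/h."""
    return energy_kcal / handling_hours

# Hypothetical resources: tiny grass seeds need gathering, grinding, wood
# collection, and slow cooking; large game needs butchery and roasting.
grass_seeds = return_rate(energy_kcal=300.0, handling_hours=6.0)     # 50.0 kcal/h
large_game  = return_rate(energy_kcal=60000.0, handling_hours=40.0)  # 1500.0 kcal/h

# Assumed cost of light processing work, in kcal burned per hour:
handling_cost = 250.0

print(grass_seeds > handling_cost)  # False: seeds cost more than they yield
print(large_game > handling_cost)   # True
```

Under these assumed figures the seeds fail the basic energy-balance test outright, which is the sense in which the text calls them energetically inefficient.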
In summary, the isotopic data, the ethnographic data, and the human physiological data do not support the notion that cereal grains were a major component of human diets until after the advent of the Agricultural Revolution.
So, what do I make of this “growing evidence” from starch granules, dental calculus, and the like, which the media continues to sensationalize, that Paleolithic people relied upon tubers, starchy plant stems, and similar foods?4, 5, 6
The experts in archaeology don’t read the nutrition literature, and the experts in nutrition don’t read the archaeology literature. Hence we have a huge disconnect when interpreting prehistoric dietary data.
Starch (a polymer of glucose) in plant foods is generally indigestible in the human GI tract unless the plant cell walls are first broken down (via grinding or other mechanical means) and the starch is then hydrolyzed via cooking. Modern humans cannot digest raw cereal grains, raw potatoes, or raw legumes/beans, and we experience severe gastrointestinal upset and toxic symptoms if we try to eat these foods in their raw state. Hence these foods would not have been part of our ancestral dietary menu until we were able to control fire.
Controlling fire generally occurred late in our species’ evolution. Here is a quote from the most recent comprehensive review of ancestral fire control:7
“However, surprisingly, evidence for use of fire in the Early and early Middle Pleistocene of Europe is extremely weak. Or, more exactly, it is nonexistent, until ∼300–400 ka.”
An enormous caveat here is that the ability to control fire is far different from the ability to make fire at will. The best available evidence indicates that Neanderthals living in Europe never had the ability to make fire at will.8, 9
Accordingly, plant food starch from cereal grains, tubers and legumes would not have been a usable caloric source until fires could be lit at will and cooking became a normal part of the human technological repertoire. The best available evidence suggests that the ability to make fire at will did not arise until modern humans (Homo sapiens) developed this technology via fire drills about 75,000 to 100,000 years ago.
Hence starch in dental calculus does not represent digestible, hydrolyzed starch available as a source of calories in the body; it represents only remnants of plant consumption in which little of the apparent starch was available for digestion and metabolism. Moreover, recent analyses of Neanderthal and Denisovan DNA indicate that they had not yet evolved the extra copies of the genes coding for amylase, the enzyme necessary to hydrolyze starch in saliva and in pancreatic secretions.10 This empirical evidence in no way supports the notion that cereal grains, tubers, or legumes could have been part of the ancestral human diet until after fire could be produced at will.
In our modern world, cereal grains represent a ubiquitous and inexpensive source of calories, whereas in our ancestors’ Paleolithic world these grains were inedible for most of our species’ sojourn on planet Earth. Only after the innovation of starting fire at will could cereal grains ever have been consumed as staple foods, and this technological advance occurred very recently on an evolutionary timescale. Hence humans are poorly adapted to a food group that now represents more than 50% of the food energy consumed by all peoples on Earth.
We are adapted, however, to the eating patterns of our hunter-gatherer ancestors. The question then becomes how closely we can mimic those eating patterns with modern foods, and whether we can truly adopt an authentic Paleo diet.
Our studies and those of my colleagues indicate that the nutrient composition of wild plant foods is identical or nearly identical to that of their domesticated counterparts for vitamins, and slightly lower (5-7%) for minerals. Our laboratory analyses of the nutrient content of animal foods (meats and organs) show that wild animal meats contain less fat, more protein, more omega-3 fatty acids, and fewer omega-6 fatty acids than grain-produced domestic meats. Grass-produced meats have nutritional characteristics that more closely resemble wild meats than do feedlot-produced meats.
Given this information, it is entirely possible to mimic the nutritional characteristics of our ancestral hunter-gatherer diets with common modern foods available at your local supermarket by consuming fresh fruits, vegetables, seeds, nuts, grass-produced meats, fresh fish, fresh seafood, and free-range eggs. There is absolutely no nutritional requirement in our species for cereal grains, dairy foods, or processed foods.
1. Hardy K, Brand-Miller J, Brown KD, Thomas MG, Copeland L. The importance of dietary carbohydrate in human evolution. The Quarterly Review of Biology. 2015;90(3):251.
2. Mariotti Lippi M, Foggi B, Aranguren B, Ronchitelli A, Revedin A. Multistep food plant processing at Grotta Paglicci (Southern Italy) around 32,600 cal B.P. PNAS 2015; published ahead of print September 8, 2015.
3. Mannino MA, Catalano G, Talamo S, Mannino G, Di Salvo R, Schimmenti V, Lalueza-Fox C, Messina A, Petruso D, Caramelli D, Richards MP, Sineo L. Origin and diet of the prehistoric hunter-gatherers on the Mediterranean island of Favignana (Ègadi Islands, Sicily). PLoS One. 2012;7(11):e49802.
4. Revedin A, et al. Thirty thousand-year-old evidence of plant food processing. Proc Natl Acad Sci USA. 2010;107:18815–18819.
5. Henry A, Brooks A, Piperno D. Plant foods and the dietary ecology of Neanderthals and early modern humans. J Hum Evol. 2014;69:44–54.
6. Buckley S, et al. Dental calculus reveals unique insights into food items, cooking and plant processing in prehistoric central Sudan. PLoS One. 2014;9(7):e100808.
7. Roebroeks W, Villa P. On the earliest evidence for habitual use of fire in Europe. Proc Natl Acad Sci USA. 2011;108(13):5209–5214.
8. Sandgathe DM, Dibble HL, Goldberg P, McPherron SP, Turq A, Niven L, Hodgkins J. Timing of the appearance of habitual fire use. Proc Natl Acad Sci USA. 2011;108(29):E298.
9. Sandgathe DM, Dibble HL, Goldberg P, McPherron SP, Turq A, Niven L, Hodgkins J. On the role of fire in Neandertal adaptations in western Europe: evidence from Pech de l’Azé IV and Roc de Marsal, France. PaleoAnthropology 2011:216–242.
10. Perry GH, Kistler L, Kelaita MA, Sams AJ. Insights into hominin phenotypic and dietary evolution from ancient DNA sequence data. J Hum Evol. 2015;79:55–63.