Animal source foods in (pre)historical diets


Claims that the per capita consumption of animal source foods in high-income countries has reached unprecedented and abnormal levels are baseless. There is nothing unnatural or biologically deviant about current intake levels per se. Red meat and animal fat have been a vital part of human diets throughout evolutionary prehistory, in much higher quantities than is the case today, even in Western economies. With the introduction of agriculture, however, the dietary proportion of these foods decreased as cereals became the staple. This transition to less nutrient-dense diets coincided with periods of malnutrition, particularly among the peasantry. The industrial revolution granted greater access to animal source foods, resulting in improved nutrition in Western societies. The current diet in these countries, now typified by an excessive intake of ultra-processed foods, has sparked debate about its health and sustainability implications.




This subsection addresses the following four questions:
  • Is there a natural human diet?
  • How animal-based were prehistoric diets?
  • How did agriculture affect diets?
  • The industrial age - what changed for diets?
_________________________________

 Is there a natural human diet? 
 
There is no single true human diet; Homo has evolved on a continuum of hunter-gatherer diets with regional and seasonal variations. These ancestral diets have been shaped by the availability of outside-bone (meat, liver, fat) and inside-bone (marrow, brains) nutrients derived from animals, as well as eggs, fish, honey, and a vast multitude of plant materials, depending on the climate and ecological diversity. Meat and animal fat were modest sources of nutrition for early hominins, but their importance grew as access to these resources improved through aggressive scavenging and hunting, around 2 million years ago. The hunting of large animals is well-documented, and it has been argued that early humans may have been hypercarnivores, especially during glacial winters. As mega-herbivores disappeared, humans adapted by hunting smaller and faster animals, leading to the selection of more agile and cognitively developed hominins.

Further reading (summary of the literature): 

Evolutionary diets

Dietary incorporation of animal source foods (ASFs) from terrestrial and aquatic origin has been central to human evolution [Stanford & Bunn 2001; Cunnane et al. 2007; Mann 2007, 2018; Braun et al. 2010; Pobiner 2013; Leroy et al. 2023]. While the consumption of plants has been pervasive during all of Homo's past [Yates et al. 2021], reliance on ASFs through scavenging, hunting, and fishing was vital for nutrient security during the Pleistocene [Stiner 2002; Richards & Trinkaus 2009; Balter et al. 2012; Sahnouni et al. 2013; Starkovitch & Conard 2015; Soulier & Morin 2016; Blasco et al. 2019; Wißing et al. 2019]. Most likely, even older infants and toddlers developed a dependency on (premasticated) meat [Hambidge et al. 2011]. As a result, ASFs have shaped human anatomy and metabolism, and their consumption in sufficient quantities can meet a large part of the physiological and nutritional needs of the human genus [see elsewhere]. It is important, however, to underline that there is not one true, natural human diet. Rather, evolutionary adaptation relates to a large spectrum of hunter-gatherer diets, which may display strong regional and seasonal variation [Pontzer et al. 2018; Pontzer & Wood 2021].

Timeline: the rise of a top predator

With respect to animal source foods derived from the evolutionary grassland niche, both 'outside-bone' (in meat, organs, and animal fat) and 'inside-bone' (in marrow and brains) nutrients were of key relevance [Thompson et al. 2021]. They were modest sources of nutrition for hominins at first [Speth 1989; Teaford & Ungar 2000], but gained in importance as access improved through hunting and/or aggressive scavenging, possibly around 2 million years ago (Mya) [Domínguez-Rodrigo 2002; Rhodin et al. 2015; Pobiner 2015; Willems & van Schaik 2017; Bunn et al. 2017; Parkinson 2018; Nakamura et al. 2019]. Eventually, this became sufficiently efficient to threaten competing predators [Lewis & Werdelin 2007; Werdelin & Lewis 2013; Faurby et al. 2020]. Predation was probably far more important than kleptoparasitic meat sourcing for early Pleistocene humans, with the latter only being potentially more meaningful prior to 2 Mya [Domínguez-Rodrigo et al. 2021], although the exact timing in the early Pleistocene remains controversial [Barr et al. 2022]. Meat sourcing was coupled with a percussion-based search for inside-bone nutrients [Pante et al. 2018; Thompson et al. 2019; Blasco et al. 2019], the oldest-known stone tool being 3.3 million years old [Harmand et al. 2015]. The appearance of tools and carnivory coincided with the emergence of the Homo clade from Australopithecus >2 Mya [de Heinzelin et al. 1999; Pante et al. 2018].

The hunting of megafauna to extinction

The predatory sourcing of meat became more firmly established over time, up to the emergence of modern humans [Lewis & Werdelin 2007; Werdelin & Lewis 2013; Domínguez-Rodrigo & Pickering 2017; Wißing et al. 2019]. Optimal foraging theory [Hawkes et al. 1982; Mann 2007] and age-at-death profiles [Klein 1982; Stiner 1990; Cruz-Uribe 1991; Germonpré et al. 2008; Bunn & Pickering 2010; Shipman 2014; Bunn & Gurtov 2014; Bunn et al. 2017; Wojtal et al. 2019; Wilczynski et al. 2019; but see also Discamps & Costamagno 2015 and Haynes 2016 for methodological considerations] suggest hunting of large animals [Fisher 1984; Broughton et al. 2011; Speth 2012; Ben-Dor & Barkai 2020], but not necessarily the largest ones [Lupo & Schmidt 2016]. Yet, hunting of Proboscidea is well-documented [Wei et al. 2010; Rabinovitch et al. 2012; Yravedra et al. 2012; Brugère 2014; Shipman 2014; Boschian & Saccà 2015; Wojtal & Wilczynski 2015; Bocherens et al. 2015; Agam & Barkai 2015; Reshef & Barkai 2015; Pitulko et al. 2016; Drucker et al. 2017; Agam & Barkai 2018; Wißing et al. 2019; Wojtal et al. 2019; Wilczynski et al. 2019; Venditti et al. 2019], perhaps to extinction for mammoths [Cherney 2016; Klapman & Capaldi 2017]. In the Levant, hominins probably hunted very large animals >1 Mya [Dembitzer et al. 2021], again perhaps to extinction. As such, humans evolved into the only living primates that regularly kill and consume animals of similar or larger body size [Butynski 1982]. Some authors argue that Homo members may even have been, at least for a part of their existence, hypercarnivores [Ben-Dor et al. 2016, 2021].

Adaptations to ecological challenges  

Be that as it may, the disappearance of mega-herbivores created a need to hunt smaller and faster animals, thus selecting for more agile and cognitively developed hominins [Ben-Dor et al. 2011; Ben-Dor & Barkai 2021, 2023]. To increase diet breadth and flexibility of prey choice, long-range weapons were developed (e.g., use of spearthrowers) [Coppe et al. 2023]. In addition to the sourcing of meat, organs, and marrow, the gathering of plants, eggs, fish, and seafood played a role in food security, at least in regions where it was appropriate to do so [Hu et al. 2009; Kuipers et al. 2010]. Food processing was likely also very prominent in early human diets, especially the use of fire and food fermentation technologies [Bryant et al. 2023].

 

 How animal-based were prehistoric diets? 



Hunter-gatherer communities obtain about 2/3 of their caloric intake from hunting and fishing on average, and around 1/2 of their calories in warm climates. At least 20-30%, and up to 70% or more, of the total energy in Palaeolithic diets must have been derived from animal sources. Within hunter-gatherer communities, yearly per capita intakes of meat and/or fish range from 40 kg to >500 kg, depending on the region. In addition to animal source foods (incl. honey and insects), plants contribute substantially to the nutrient intake of hunter-gatherers. However, some plants like tubers and bulbs are seen as fall-back and supplementary foods due to their relatively low energetic rewards. Cooking and food preparation have nonetheless made them meaningful foods. Estimates of macronutrient ratios in hunter-gatherer diets vary depending on ecological zones and plant/animal ratios. Such estimates are but educated guesses for what may have happened in the Pleistocene and may not fully capture what must have been an overwhelming variability.

Further reading (summary of the literature): 

Share of animal source foods 

The share of animal source foods (ASFs) in ancestral pre-agricultural diets was substantial, although fully reliable estimates are not available. Some authors have nonetheless tried to reconstruct the composition of Palaeolithic diets to the best of their abilities. Contemporary hunter-gatherers serve to some degree as models, albeit imperfectly so due to differences in large prey availability compared to the Pleistocene [Faith et al. 2019; Ben-Dor & Barkai 2020; Ben-Dor & Barkai 2021; Dembitzer et al. 2022a; but see Orbach et al. 2022 and Dembitzer 2022b for commentary]. Although the correctness of estimates from ethnographic data is debatable, hunter-gatherer communities may obtain about 65% of their energetic intake from hunting and fishing on average, and about 50% in warm climates [Lee 1968; Cordain et al. 2000; Cordain et al. 2002; Marlowe 2005; Marlowe 2007; Pontzer & Wood 2021]. For about 3/4 of the world-wide hunter-gatherer societies, ASFs would provide more than half of the calories [Cordain et al. 2000]. According to another estimate, at least 30% and up to 70% of the total caloric intake of Palaeolithic diets must have been of animal origin [Kuipers et al. 2010], although levels may have been higher than 70% in some cases (e.g., in glacial areas or when mega-herbivores were still available) [Ben-Dor et al. 2016].

Meat consumption levels

Within hunter-gatherer communities, meat consumption varies considerably, between 5% and 90% of the caloric intake [Pontzer & Wood 2021]. Even within a single community, intake levels differ during the year. For the Hiwi, for instance, caloric contributions vary between 40 and 70% [Hurtado & Hill 1990]. For Hadza adults, average meat intake varies between 70 and 1,200 g/d, depending on the month, but is mostly around 300-400 g/d (100-150 kg/y) [Pontzer & Wood 2021]. Yearly per capita intake levels of meat have been estimated in the range of 40-220 kg for the !Kung, Nukak, Onge, Yanomamo, and Anbarra people (and others) and up to 350-650 kg for the Australian Arnhem and the South-American Hiwi and Aché [Hawkes et al. 1982; Kaplan et al. 2000; Nishida 2012; Pontzer & Wood 2021]. Theoretical estimates of Palaeolithic diets reflect a similar order of magnitude (300-400 kg of meat and/or fish) [Kuipers et al. 2010]. Even higher quantities of ASFs have been described for (sub)Arctic communities; very rough estimates of traditional ASF intake levels by the Inuit in eastern Greenland are 300-500 kg/p/y of meat combined with 150-300 kg of blubber [Robert-Lamblin 2004].
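
As a rough consistency check (not part of the cited sources), the daily figures above can be converted into yearly per capita amounts; a minimal Python sketch, assuming only a 365-day year and the intake values quoted in this paragraph:

```python
# Convert daily meat intake (g/d) into yearly per capita intake (kg/y),
# to compare the Hadza figures above with the yearly estimates.
def grams_per_day_to_kg_per_year(grams_per_day: float) -> float:
    return grams_per_day * 365 / 1000

for g_per_day in (70, 300, 400, 1200):
    print(f"{g_per_day:>5} g/d ≈ {grams_per_day_to_kg_per_year(g_per_day):6.1f} kg/y")
# 300-400 g/d works out to roughly 110-146 kg/y, in line with the
# 100-150 kg/y range cited for Hadza adults.
```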

The role of plants (and honey)

In addition to ASFs [including honey and insects; MacEvilly 2000], a variety of plants contribute to the nutrient intake of hunter-gatherers and did so too for early humans [Kaplan et al. 2000; Marlowe & Berbesque 2009; Henry et al. 2014]. When available, honey and plant carbohydrates are among the preferred foods [Marlowe & Berbesque 2009]. Honey, which is also foraged by other primates, may account for 10-20% of the caloric intake in some hunter-gatherer populations [Pontzer & Wood 2021]. These carbohydrate-rich foods lead to protein-sparing effects, especially when meat is lean [Speth & Spielmann 1983]. Some plants, e.g., tubers and bulbs, should nonetheless be seen as fall-back and supplementary foods rather than as staples, due to their relatively low energetic rewards [Hawkes et al. 1982; Marlowe & Berbesque 2009]. Yet, carbohydrates were scarce during Pleistocene glacial winters. For Neanderthals, >75% of calories would then have had to come from animal fat [Ben-Dor et al. 2016].
 
With the exception of readily edible foods, such as fruits and seeds, several plants only became meaningful foods after the introduction of cooking and food preparation technologies, allowing for improved digestibility and sufficient detoxification [Southgate 1991]. Human-controlled fire goes back to 0.2-1.0 Mya at least [Pontzer & Wood 2021; Stancampiano et al. 2023]. The consumption of substantial amounts of tubers, bulbs, and grains emerged during the Palaeolithic, with findings tracing back to the period 0.04-0.8 Mya [Kuhn & Stiner 2001; Melamed et al. 2016; Power et al. 2018]. Because of cooking and a high intake of ASFs, the energy density of human diets is high; the ingested food mass is >60% lower than for other primates [Simmen et al. 2017].

Macronutrient composition 

Depending on the ecological zone and plant/animal ratios, variable estimates are obtained for the share of carbohydrates (20-50% of kcal), fat (20-70%), and protein (10-35%) in most hunter-gatherer diets [Kuipers et al. 2010]. For contemporary Hadza, those levels have been estimated at 20-70%, 13-36%, and 11-43%, respectively [Pontzer & Wood 2021]. Ancestral intake of saturated fat may have been situated within a 7-19% (of kcal) range, or 20-60 g/d [Kuipers et al. 2010]. These numbers are but educated guesses, mostly ignorant of contextual factors and what must have been an overwhelming variability. Low-carbohydrate diets, heavy in meat, may have been the norm for some populations but not for others that were more dependent on starches or sugars (honey) [Pontzer et al. 2018; Pontzer & Wood 2021].
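
To see how an energy share maps onto grams per day, the 7-19% saturated-fat range can be converted using the standard 9 kcal/g energy density of fat; a minimal Python sketch, where the total daily energy intakes are illustrative assumptions rather than values from the cited studies:

```python
# Convert a share of dietary energy from saturated fat (% of kcal)
# into grams per day, using 9 kcal per gram of fat.
KCAL_PER_GRAM_FAT = 9.0

def fat_grams_per_day(total_kcal: float, pct_energy: float) -> float:
    return total_kcal * (pct_energy / 100) / KCAL_PER_GRAM_FAT

for total_kcal in (2500, 3000):  # assumed daily energy intakes
    for pct in (7, 19):
        print(f"{total_kcal} kcal/d at {pct:>2}% energy "
              f"→ {fat_grams_per_day(total_kcal, pct):4.0f} g/d")
# 7% of 2,500 kcal ≈ 19 g/d and 19% of 3,000 kcal ≈ 63 g/d,
# consistent with the 20-60 g/d range quoted above.
```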

Explanatory video 💬  Mann 2022


 How did agriculture affect diets? 

The shift from a rich and diverse hunter-gatherer diet to a Neolithic diet dominated by cereal staples often led to widespread malnutrition. While livestock was valued for various purposes, the production of animal source foods was limited. Over time, the supply of meat stabilized, whereas dairy products were incorporated in some regions, improving nutrient security. However, poor nourishment continued to affect poorer segments of agricultural populations. In contrast, pastoralist communities could often secure higher levels of animal-derived foods. Despite dietary improvements in some areas, unbalanced reliance on cereal-heavy diets still persists in many regions of the Global South. The availability and consumption of animal source foods varied (and still varies) significantly depending on socio-economic factors, cultural practices, and local resources.

Further reading (summary of the literature): 

The Neolithic origin of farming 

During the Neolithic era, food supply became increasingly dependent on the domestication of plants and animals [Richards 2003; Vigne 2011; Navarrete et al. 2023]. The spread of farming from Anatolia, first into south-eastern Europe around 7000 BC, was followed by the Copper Age. During the latter, intercultural contact between farmers and nomadic herders from the Pontic Steppes took place around 4500 BC in the Western Black Sea region, which led to innovations in herd management, food processing, and dairying practices [Penske et al. 2023; Rohrlach & Penske 2023]. This gave rise to the expansion of fully developed pastoralist groups during the third millennium BC and, eventually, to the complex European cultural and genetic mosaic of ancestries coming from hunter-gatherers, farmers, and pastoralists.

Livestock in early farming communities

Although there are debates on the original motives for animal husbandry [cf. Leroy et al. 2020], livestock was highly valued for a number of reasons (food, traction, manure, cultural and religious significance, etc.). However, only small amounts of animal source foods (ASFs) were produced, while crops lacked the variety needed to compensate for the nutritional shortfall. This resulted in a low meat/high cereal intake, which correlated (for various reasons) with a shift from good health and moderate fertility to a state of disease and high reproduction rates, and thus from population stability to a sequence of population booms and busts [Larsen 2006; Scott 2017; Williams & Hill 2017].

Effects of early agricultural diets on health

This legacy of diets dominated by domesticated cereal staples and reduced dietary quality, together with the emergence of epidemics, resulted in some of the major challenges the world is still facing today [Larsen 2023]. The fragile Neolithic food system resulted in malnutrition and infectious disease, reflected in reduced stature, poor bone health, nutritional deficiencies, and dental caries [Ulijaszek 1991; Larsen 1995; Mann 2007; Latham 2013; Marciniak et al. 2022]. Reduced stature serves as a marker for deprivation and low dietary quality [Perkins et al. 2016], originating from a combination of low protein intake [Grasgruber et al. 2016] and a high glycaemic load [Mitchell 2006]. The change to an agrarian lifestyle has likely been a contributory factor to iron deficiency in young children ever since [Hambidge et al. 2011]. There was, however, an increasing stabilization of the supply of meat and secondary ASFs (e.g., dairy products) throughout the Bronze Age [Münster et al. 2018], and thereafter. Consumption in some regions and eras may even have been higher than commonly assumed, sometimes above the levels of post-industrial Western diets [now at 50-70 kg/p/y of red meat and 20-50 kg of poultry; see elsewhere].

Agricultural diets in the Middle Ages 

In the Middle Ages, some areas in Western Europe had access to a high supply of mutton, beef, and/or pork, as well as cheese and butter [Banegas López 2010; Dunne et al. 2019; Camisa 2020]. In late Medieval Barcelona, average meat intake was around 80-120 kg/p/y without counting poultry (depending on status, but not limited to the elites). Although aristocrats would have consumed the highest levels (e.g., 1 kg for a typical meal in a 15th-century English aristocratic household, with two meat meals a day, five days a week), even monks (1 kg of meat per day, four days a week, plus plenty of eggs and fish) and common people were often well provisioned.
 
However, while some parts of the population were clearly well fed, malnutrition remained rampant among the poorer segments of agricultural societies, even into the present age. Although industrialized countries have moved into foodscapes characterized by an abundance of ASFs [see below and elsewhere], and even if low- and middle-income countries have optimized their diets in ways that are contingent on local resources, unbalanced reliance on cereal-heavy diets still typifies many regions in the Global South [see elsewhere].

Pastoralist communities

In contrast to impoverished and fragile rural communities depending on crop agriculture, pastoralist communities in low- and middle-income countries were typically able to secure higher levels of ASFs. This holds for both ancestral [Wilkin et al. 2020] and contemporary variants [see elsewhere].
 

 The industrial age - what changed for diets? 

During the 18th and 19th centuries, animal source foods became widely accessible in industrializing Western countries, boosting dietary adequacy. Technological developments, improved logistics, and rising urban demand led to increased production and availability. This period also saw an increase in human stature, indicating improved nutrition with higher energy, protein, and micronutrient intake. Eventually, however, the diets of industrialized countries evolved into what is now known as the Western diet, which includes large amounts of ultra-processed foods. Controversies surrounding the health effects of animal source foods often stem from their integration into the Western diet rather than their intrinsic properties.

Further reading (summary of the literature): 

Improved access to animal source foods

During the 18th and, especially, 19th century, animal source foods (ASFs) became widely accessible, albeit predominantly in the industrializing countries of the West. This was particularly the case for red meat, due to a series of technological developments (e.g., cooling), improved logistics and transportation means (e.g., railways), a rising urban demand for meat, and a rapid transformation of the food chain [Leroy & Degreef 2015; Leroy et al. 2020].

Nutritional adequacy and health

A sharp increase in stature was seen [OWiD 2019], suggesting improved nutrition due to a higher intake of energy, protein, and micronutrients (although other factors, such as a reduced prevalence of infections, must also have been of importance). The transition did not, of course, result in a return to Palaeolithic-style diets [despite the initial gains in nutritional adequacy, suggestive of a diet more adapted to the biological needs of humans; see elsewhere], either with respect to the quality or the quantity of ASF consumption (for better or worse). Moreover, the diets of industrialized countries eventually (d)evolved into the 'Western diet', incorporating large amounts of ultra-processed foods [Cordain et al. 2005; see elsewhere]. Much of the controversy related to ASF consumption relates to how ASFs have been inserted into the Western diet, rather than being a result of their intrinsic properties.
