Claims that humans have ‘never eaten so much’ animal source foods (ASFs) are unfounded. Such claims may hold when compared to many pre-industrial agricultural communities, but not to pre-agricultural ones, for whom ASFs were an essential part of evolutionary diets. During the 19th century, access to ASFs improved and led to higher nutritional adequacy in industrializing countries. Although often portrayed as excessive, there is nothing ‘unnatural’ or biologically aberrant about such levels of consumption. Whether such consumption is unethical, unsustainable, or unhealthy is another debate, addressed elsewhere.
Dietary incorporation of ASFs, of both terrestrial and aquatic origin [Cunnane et al. 2007; Braun et al. 2010; Pobiner 2013], has been central to human evolution [Stanford & Bunn 2001; Mann 2007, 2018]. While plant and starch consumption may have been pervasive throughout Homo's history [Yates et al. 2021], reliance on ASFs through scavenging, hunting, and fishing must have been vital throughout the Pleistocene [Stiner 2002; Richards & Trinkaus 2009; Balter et al. 2012; Sahnouni et al. 2013; Starkovitch & Conard 2015; Soulier & Morin 2016; Blasco et al. 2019; Wißing et al. 2019]. As a result, ASFs have shaped human anatomy and metabolism, and their consumption in sufficient quantities can meet a large part of the physiological and nutritional needs of the human genus [see elsewhere]. It is important, however, to underline that there is no single true, natural human diet. Rather, evolutionary adaptation relates to a very large heterogeneity of hunter-gatherer diets, which may display strong regional and seasonal variation [Pontzer et al. 2018; Pontzer & Wood 2021].
Although meat and fat were only modest sources of nutrition for hominins at first [Speth 1989; Teaford & Ungar 2000], they gained in importance as access improved through hunting and/or aggressive scavenging, possibly around 2 million years ago (Mya) [Domínguez-Rodrigo 2002; Rhodin et al. 2015; Pobiner 2015; Willems & van Schaik 2017; Bunn et al. 2017; Parkinson 2018; Nakamura et al. 2019]. Eventually, this became sufficiently efficient to threaten competing predators [Lewis & Werdelin 2007; Werdelin & Lewis 2013; Faurby et al. 2020]. Predation was probably far more important than kleptoparasitic meat sourcing for early Pleistocene humans, with the latter potentially being more meaningful only prior to 2 Mya [Domínguez-Rodrigo et al. 2021], although the exact timing within the early Pleistocene remains controversial [Barr et al. 2022]. Meat sourcing was coupled with a percussion-based search for inside-bone nutrients [Pante et al. 2018; Thompson et al. 2019; Blasco et al. 2019], the oldest known stone tools being 3.3 million years old [Harmand et al. 2015]. The appearance of tools and carnivory coincided with the emergence of the Homo clade from Australopithecus >2 Mya [de Heinzelin et al. 1999; Pante et al. 2018].
The predatory sourcing of meat became more pronounced over time, up to the emergence of modern humans [Lewis & Werdelin 2007; Werdelin & Lewis 2013; Domínguez-Rodrigo & Pickering 2017; Wißing et al. 2019]. Optimal foraging theory [Hawkes et al. 1982; Mann 2007] and age-at-death profiles [Klein 1982; Stiner 1990; Cruz-Uribe 1991; Germonpré et al. 2008; Bunn & Pickering 2010; Shipman 2014; Bunn & Gurtov 2014; Bunn et al. 2017; Wojtal et al. 2019; Wilczynski et al. 2019] (but see Discamps & Costamagno 2015 and Haynes 2016 for methodological considerations) suggest hunting of large animals [Fisher 1984; Broughton et al. 2011; Speth 2012; Ben-Dor & Barkai 2020], but not necessarily the largest ones [Lupo & Schmidt 2016]. Yet, hunting of Proboscidea is well documented [Wei et al. 2010; Rabinovitch et al. 2012; Yravedra et al. 2012; Brugère 2014; Shipman 2014; Boschian & Saccà 2015; Wojtal & Wilczynski 2015; Bocherens et al. 2015; Agam & Barkai 2015; Reshef & Barkai 2015; Pitulko et al. 2016; Drucker et al. 2017; Agam & Barkai 2018; Wißing et al. 2019; Wojtal et al. 2019; Wilczynski et al. 2019; Venditti et al. 2019], perhaps to the point of extinction for mammoths [Cherney 2016; Klapman & Capaldi 2017]. In the Levant, hominins probably hunted very large animals >1 Mya [Dembitzer et al. 2021], again perhaps to extinction. As a result, some authors argue that members of Homo may have been, at least for a part of their history, hypercarnivores (defined as deriving >70% of the diet from meat) [Ben-Dor et al. 2021].
Be that as it may, the disappearance of mega-herbivores created a need to hunt smaller and faster animals, thus selecting for more agile and cognitively developed hominins [Ben-Dor et al. 2011; Ben-Dor & Barkai 2021]. In addition to the sourcing of meat, organs, and marrow, and the gathering of plants, the consumption of eggs and fish must have played a role in food security, at least in regions where it was feasible to do so [Hu et al. 2009; Kuipers et al. 2010].
Pre-agricultural diets were rich in animal source foods (but how rich?)
The share of ASFs in ancestral pre-agricultural diets was substantial, although fully reliable estimates are not available.
Some authors have nonetheless tried to reconstruct the composition of Paleolithic diets as well as the evidence allows. Contemporary hunter-gatherers serve to some degree as models, albeit imperfect ones due to differences in large-prey availability compared to the Pleistocene [Faith et al. 2019; Ben-Dor & Barkai 2020; Ben-Dor & Barkai 2021; Dembitzer et al. 2022a (but see Orbach et al. 2022 and Dembitzer 2022b for comment and reply)]. Although the accuracy of estimates from ethnographic data is debatable, hunter-gatherer communities may obtain about 65% of their energy intake from hunting and fishing on average, and about 50% in warm climates [Lee 1968; Cordain et al. 2000; Cordain et al. 2002; Marlowe 2005; Marlowe 2007; Pontzer & Wood 2021]. For about three quarters of hunter-gatherer societies worldwide, ASFs would provide more than half of the calories [Cordain et al. 2000]. According to another estimate, at least 30% and up to 70% of the total caloric intake of Paleolithic diets must have been of animal origin [Kuipers et al. 2010], although levels may have been higher in some cases (e.g., in glacial areas or when mega-herbivores were still available) [Ben-Dor et al. 2016].
Within hunter-gatherer communities, meat consumption varies considerably, between 5% and 90% of the caloric intake [Pontzer & Wood 2021]. Even within a single community, intake levels differ during the year. For the Hiwi, for instance, the caloric contribution of meat varies between 40% and 70% [Hurtado & Hill 1990]. For Hadza adults, average meat intake varies between 70 and 1,200 g/d, depending on the month, but is mostly around 300-400 g/d (100-150 kg/y) [Pontzer & Wood 2021]. Yearly per capita intake levels of meat have been estimated in the range of 40-220 kg for the !Kung, Nukak, Onge, Yanomamo, and Anbarra people (and others), and up to 350-650 kg for the Australian Arnhem and the South American Hiwi and Aché [Hawkes et al. 1982; Kaplan et al. 2000; Nishida 2012; Pontzer & Wood 2021]. Theoretical estimates of Paleolithic diets reflect a similar order of magnitude (300-400 kg of meat and/or fish per year) [Kuipers et al. 2010]. Even higher quantities of ASFs have been described for (sub)Arctic communities; very rough estimates of traditional ASF intake levels by the Inuit in eastern Greenland are 300-500 kg/p/y of meat combined with 150-300 kg of blubber [Robert-Lamblin 2004].
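For orientation, the daily and yearly figures above can be interconverted by simple arithmetic: 300-400 g/d × 365 d/y ≈ 110-146 kg/y, consistent with the 100-150 kg/y cited for Hadza adults.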
In addition to ASFs [including honey and insects; MacEvilly 2000], a variety of plants contribute to the nutrient intake of hunter-gatherers, as they did for early humans [Kaplan et al. 2000; Marlowe & Berbesque 2009; Henry et al. 2014]. When available, honey and plant carbohydrates are among the preferred foods [Marlowe & Berbesque 2009]. Honey, which is also foraged by other primates, may account for 10-20% of the caloric intake in some hunter-gatherer populations [Pontzer & Wood 2021]. These carbohydrate-rich foods have protein-sparing effects, especially when meat is lean [Speth & Spielmann 1983]. Some plants, e.g., tubers and bulbs, should nonetheless be seen as fall-back and supplementary foods rather than as staples, due to their relatively low energetic rewards [Hawkes et al. 1982; Marlowe & Berbesque 2009]. Yet, carbohydrates were scarce during Pleistocene glacial winters. For Neanderthals, >75% of calories would then have had to come from animal fat [Ben-Dor et al. 2016].
With the exception of readily edible foods, such as fruits and seeds, several plants only became meaningful foods after the introduction of cooking and food preparation technologies, which allowed for improved digestibility and sufficient detoxification [Southgate 1991]. Human-controlled fire goes back to at least 0.2 Mya, but the use of fire for cooking could be as early as 0.5-1.0 Mya [Pontzer & Wood 2021; Stancampiano et al. 2023]. The consumption of substantial amounts of tubers, bulbs, and grains emerged during the Paleolithic, with findings tracing back to the period 0.04-0.8 Mya [Kuhn & Stiner 2001; Melamed et al. 2016; Power et al. 2018]. Because of cooking and the high intake of ASFs, the energy density of human diets is high; the ingested food mass is >60% lower than that of other primates [Simmen et al. 2017].
Depending on the ecological zone and plant/animal ratios, variable estimates are obtained for the share of carbohydrates (20-50% of kcal), fat (20-70%), and protein (10-35%) in most hunter-gatherer diets [Kuipers et al. 2010]. For contemporary Hadza, those levels have been estimated at 20-70%, 13-36%, and 11-43%, respectively [Pontzer & Wood 2021]. Ancestral intake of saturated fat may have been situated within a range of 7-19% of kcal, or 20-60 g/d [Kuipers et al. 2010]. These numbers are educated guesses at best, largely blind to contextual factors and to what must have been an overwhelming variability. Low-carbohydrate diets, heavy in meat, may have been the norm for some populations but not for others that were more dependent on starches or sugars (honey) [Pontzer et al. 2018; Pontzer & Wood 2021].
The agricultural transition decreased the intake of animal source foods (but not always)
During the Neolithic era, food supply became increasingly dependent on the domestication of plants and animals [Richards 2003; Vigne 2011]. Although there are debates on the original motives for animal husbandry [cf. Leroy et al. 2020], livestock was highly valued for a number of reasons (food, traction, manure, cultural and religious significance, etc.). However, only small amounts of ASFs were produced, while crops lacked the variety needed to compensate for the resulting nutritional gaps. This resulted in a low-meat/high-cereal intake, which correlated (for various reasons) with a shift from good health and moderate fertility to a state of disease and high fertility, and thus from population stability to a sequence of population booms and busts [Scott 2017; Williams & Hill 2017]. This legacy of diets dominated by domesticated plant staples and reduced dietary quality, together with the emergence of novel pathogens, epidemics, and pandemics, resulted in some of the major challenges the world is currently facing [Larsen 2023].
Indeed, this novel, fragile food system was characterized by malnutrition and infectious disease, paralleled by reduced stature, poor bone health, nutritional deficiencies, and dental caries [Ulijaszek 1991; Larsen 1995; Mann 2007; Latham 2013; Marciniak et al. 2022]. Reduced stature serves as a marker for deprivation and low dietary quality [Perkins et al. 2016], originating from a combination of low protein intake [Grasgruber et al. 2016] and a high glycemic load [Mitchell 2006]. There was, however, an increasing stabilization of the supply of meat and secondary ASFs throughout the Neolithic [Münster et al. 2018] and thereafter. Consumption in some regions and eras may have been higher than commonly assumed, sometimes above the levels of post-industrial Western diets [now at 50-70 kg/p/y of red meat and 20-50 kg of poultry; see elsewhere].
In the Middle Ages, for instance, some areas in Western Europe had access to a high supply of mutton, beef, and/or pork, as well as cheese and butter [Banegas López 2010; Dunne et al. 2019; Camisa 2020]. In late Medieval Barcelona, average meat intake was around 80-120 kg/p/y, not counting poultry (depending on status, but not limited to the elites). Although aristocrats would have consumed the highest levels (e.g., 1 kg for a typical meal in a 15th-century English aristocratic household, with two meat meals a day, five days a week), even monks (1 kg of meat per day, four days a week, plus plenty of eggs and fish) and common people were often well provisioned.
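As a rough back-of-envelope illustration (assuming the cited aristocratic meal pattern held year-round): 1 kg/meal × 2 meals/d × 5 d/week × 52 weeks ≈ 520 kg/y, roughly an order of magnitude above the present-day Western red meat intake of 50-70 kg/p/y.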
However, while some parts of the population were clearly well fed, malnutrition remained rampant in the poorer segments of many agricultural populations, even into the present age. Although industrialized countries have moved into foodscapes characterized by an abundance of ASFs [see below and elsewhere], and even though low- and middle-income countries have optimized their diets in ways that are contingent on local resources, an unbalanced reliance on cereal-heavy diets still typifies many regions in the Global South [see elsewhere].
In contrast to the often impoverished and fragile rural communities depending on crop agriculture, pastoralist communities in low- and middle-income countries have typically been able to secure much higher levels of ASFs. This is the case for both ancestral [Wilkin et al. 2020] and contemporary variants (the Turkana still obtain 62% of their calories from dairy and 12% from meat, fat, and blood) [Lea et al. 2020]. In India, 6-8% of the population is directly or indirectly involved in pastoralism, providing three quarters of the national meat supply and half of the country's milk [Kukreti 2020].
The industrial transition: meat is back on the menu
During the 18th and, especially, the 19th century, ASFs became widely accessible, albeit predominantly in the industrializing countries of the West. This was particularly the case for red meat, due to a series of technological developments (e.g., cooling technologies), improved logistics and transportation (e.g., railways), a rising urban demand for meat, and a rapid transformation of the food chain [Leroy & Degreef 2015; Leroy et al. 2020].
A sharp increase in stature was seen [Our World in Data 2019], suggesting improved nutrition due to a higher intake of energy, protein, and micronutrients (although other factors, such as a reduced prevalence of infections, must also have been of importance). The transition did not, of course, result in a return to Paleolithic-style diets [despite the initial gains in nutritional adequacy, suggestive of a diet more adapted to the biological needs of humans; see elsewhere], neither with respect to the quality nor the quantity of ASF consumption (for better or worse). Moreover, the diets of industrialized countries eventually (d)evolved into what is now known as 'the Western diet', incorporating large amounts of ultra-processed foods [Cordain et al. 2005; see elsewhere]. Much of the controversy related to ASF consumption concerns how ASFs have been inserted into the Western diet, rather than their intrinsic properties.