Animal source foods in historical diets

Claims that humans have ‘never eaten so much’ animal source foods (ASFs) are unfounded. Such claims may hold when current intakes are compared to those of many pre-industrial agricultural communities, but not when compared to pre-agricultural ones, in which ASFs were consumed as an essential part of evolutionary diets. During the 19th century, access to ASFs improved and led to higher nutritional adequacy in industrializing countries. Although current levels of consumption are often portrayed as excessive, there is nothing ‘unnatural’ or biologically aberrant about them. Whether such consumption is unethical, unsustainable, or unhealthy is another debate.
 
Dietary incorporation of ASFs, of both terrestrial and aquatic origin [Cunnane et al. 2007; Braun et al. 2010; Pobiner 2013], has been central to human evolution [Stanford & Bunn 2001; Mann 2007, 2018]. Reliance on ASFs is seen throughout the Pleistocene [Stiner 2002; Richards & Trinkaus 2009; Balter et al. 2012; Sahnouni et al. 2013; Starkovitch & Conard 2015; Soulier & Morin 2016; Blasco et al. 2019; Wißing et al. 2019]. As a result, ASFs have shaped human anatomy and metabolism and are able to meet a large part of the physiological and nutritional needs of the human genus [see elsewhere].
 
Although meat and fat were only modest sources of nutrition for hominins at first [Speth 1989; Teaford & Ungar 2000], they gained in importance as access improved through hunting and/or aggressive scavenging [Domínguez-Rodrigo 2002; Rhodin et al. 2015; Pobiner 2015; Willems 2017; Bunn 2017; Parkinson 2018; Nakamura et al. 2019], with sufficient efficiency to eventually threaten competing predators [Werdelin & Lewis 2013; Faurby et al. 2020]. Predation was probably far more important than kleptoparasitic meat sourcing for early Pleistocene humans, with the latter potentially being more meaningful only prior to 2 million years ago (Mya) [Domínguez-Rodrigo et al. 2021]. Be that as it may, meat sourcing was also coupled with a percussion-based search for inside-bone nutrients [Pante et al. 2018; Thompson et al. 2019; Blasco et al. 2019], the oldest known stone tools being 3.3 million years old [Harmand et al. 2015]. The appearance of tools and carnivory coincided with the emergence of the Homo clade from Australopithecus >2 Mya [de Heinzelin et al. 1999; Pante et al. 2018].
 
The predatory sourcing of meat became increasingly well established over time, up to the emergence of modern humans [Lewis & Werdelin 2007; Werdelin & Lewis 2013; Domínguez-Rodrigo & Pickering 2017; Wißing et al. 2019]. Optimal foraging theory [Hawkes et al. 1982; Mann 2007] and age-at-death profiles [Bunn et al. 2017] suggest hunting of large animals [Fisher 1984; Broughton et al. 2011; Speth 2012], but not necessarily the largest ones [Lupo & Schmidt 2016]. Yet, hunting of Proboscidea is well documented [Rabinovitch et al. 2012; Brugere 2014; Shipman 2014; Boschian & Saccà 2015; Wojtal & Wilczynsky 2015; Bocherens et al. 2015; Agam & Barkai 2015; Reschef & Barkai 2015; Pitulko et al. 2016; Drucker et al. 2017; Agam & Barkai 2018; Wißing et al. 2019; Venditti et al. 2019], perhaps even to the point of extinction for mammoths [Cherney 2016; Klapman & Capaldi 2017].
 
In any case, the disappearance of mega-herbivores created a need to hunt smaller and faster animals, at the same time selecting for more agile and cognitively developed hominins [Ben-Dor et al. 2011]. In addition to the sourcing of meat, organs, and marrow, and besides the extensive gathering of plants, the consumption of eggs and fish must also have played a role in food security, at least in regions where it was possible to do so [Hu et al. 2009; Kuipers et al. 2010].
 
Pre-agricultural diets
 
The share of ASFs in ancestral pre-agricultural diets was substantial, although fully reliable estimates are not available. Some authors have nonetheless tried to reconstruct the composition of Paleolithic diets as best as possible. Contemporary hunter-gatherers serve to some degree as models, albeit imperfect ones, given differences in the availability of large prey compared to the Pleistocene [Faith et al. 2019; Ben-Dor & Barkai 2020]. Although the accuracy of estimates based on ethnographic data is debatable, it has been reported that hunter-gatherer communities obtain, on average, 65% of their energy intake from ASFs [Cordain et al. 2002]. For about three quarters of the world's hunter-gatherer societies, ASFs would provide more than half of all calories [Cordain et al. 2000]. According to another estimate, at least 30% and up to 70% of the total caloric intake of Paleolithic diets must have been of animal origin [Kuipers et al. 2010], although levels may have been higher in some cases (e.g., in glacial areas or when mega-herbivores were still available) [Ben-Dor et al. 2016].
 
Within hunter-gatherer communities, yearly per capita intake levels of meat have been estimated in the range of 80-220 kg for the !Kung, Nukak, Onge, and Anbarra people, and up to 350-500 kg for the Australian Arnhem and the South American Hiwi and Aché [Hawkes et al. 1982; Kaplan et al. 2000; Nishida 2012]. Theoretical estimates of Paleolithic diets reflect a similar order of magnitude (300-400 kg of meat and/or fish) [Kuipers et al. 2010]. Even higher quantities of ASFs have been described for (sub)Arctic communities; very rough estimates of traditional ASF intake levels by the Inuit in eastern Greenland are 300-500 kg/p/y of meat, combined with 150-300 kg of blubber [Robert-Lamblin 2004].
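
To put these yearly figures into perspective, they can be converted into rough daily amounts. The sketch below is a minimal illustration, not part of the cited estimates; the only added assumption is that intake is spread evenly over 365 days per year.

```python
# Back-of-the-envelope conversion of the yearly per capita meat estimates
# quoted above into approximate daily amounts.
# Assumption (not from the source): intake spread over 365 days per year.

def kg_per_year_to_g_per_day(kg_per_year: float) -> float:
    """Convert an annual intake in kg/person/year to g/person/day."""
    return kg_per_year * 1000 / 365

estimates = [
    ("!Kung, Nukak, Onge, Anbarra", 80, 220),
    ("Arnhem, Hiwi, Ache", 350, 500),
    ("Paleolithic reconstruction (meat and/or fish)", 300, 400),
    ("Inuit, eastern Greenland (meat, excl. blubber)", 300, 500),
]

for label, low, high in estimates:
    print(f"{label}: ~{kg_per_year_to_g_per_day(low):.0f}-"
          f"{kg_per_year_to_g_per_day(high):.0f} g/day")
```

Under that assumption, even the lower bounds correspond to several hundred grams of meat per person per day.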
 
In addition to ASFs [including honey and insects; MacEvilly 2000], a variety of plants contribute to the nutrient intake of hunter-gatherers and did so for early humans as well [Kaplan et al. 2000; Marlowe & Berbesque 2009; Henry et al. 2014]. When available, honey and plant carbohydrates are among the preferred foods [Marlowe & Berbesque 2009]. They lead to protein-sparing effects, especially when meat is lean [Speth & Spielmann 1983]. Some plants, e.g., tubers and bulbs, should nonetheless be seen as fall-back and supplementary foods rather than as staples, due to their relatively low energetic rewards [Hawkes et al. 1982; Marlowe & Berbesque 2009]. Yet, carbohydrates were scarce during Pleistocene glacial winters. For Neanderthals, more than 75% of calories would then have had to come from animal fat [Ben-Dor et al. 2016].
 
With the exception of readily edible foods, such as fruits and seeds, several plants only became meaningful foods after the introduction of cooking and food preparation technologies, which allowed for improved digestibility and sufficient detoxification [Southgate 1991]. As a result, substantial consumption of tubers, bulbs, and grains may only have appeared in the human diet some 0.04 Mya [Kuhn & Stiner 2001; Power et al. 2018].
 
Depending on the ecological zone and the plant/animal ratio, variable estimates are obtained for the share of carbohydrates (20-50% of kcal), fat (20-70%), and protein (10-35%) in most hunter-gatherer diets [Kuipers et al. 2010]. Intake of saturated fat may have been situated within a range of 7-19% of kcal, or 20-60 g/d [Kuipers et al. 2010]. These numbers are educated guesses at best, largely blind to contextual factors and to what must have been an overwhelming variability.
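
The relation between the percentage and gram figures for saturated fat is simple arithmetic. The sketch below illustrates the conversion, assuming (for illustration only; not stated in the source) a total energy intake of roughly 2,500-3,000 kcal/day and the conventional 9 kcal per gram of fat.

```python
# Illustrative conversion of saturated-fat intake expressed as % of energy
# into grams per day.
# Assumptions (not from the source): total energy intake of 2,500-3,000 kcal/day;
# fat provides ~9 kcal per gram.

KCAL_PER_G_FAT = 9

def sat_fat_grams(pct_of_energy: float, total_kcal: float) -> float:
    """Grams of saturated fat per day for a given % of energy and total intake."""
    return total_kcal * (pct_of_energy / 100) / KCAL_PER_G_FAT

for pct in (7, 19):
    for kcal in (2500, 3000):
        print(f"{pct}% of {kcal} kcal ≈ {sat_fat_grams(pct, kcal):.0f} g/day")
```

Under these assumptions, 7% of energy corresponds to roughly 19-23 g/d and 19% to roughly 53-63 g/d, consistent with the 20-60 g/d range cited above.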

The agricultural transition

During the Neolithic era, food supply became increasingly dependent on the domestication of plants and animals [Richards 2003; Vigne 2011]. Although there are debates on the original motives for animal husbandry [cf. Leroy et al. 2020], livestock was highly valued for a number of reasons (food, traction, manure, cultural and religious significance, etc.). However, only small amounts of ASFs were produced, while crops lacked the variety needed to compensate nutritionally. This resulted in a low-meat/high-cereal intake, which correlated (for various reasons) with a shift from good health and moderate fertility to a state of disease and high fertility, and thus from population stability to a sequence of population booms and busts [Scott 2017; Williams & Hill 2017].
 
The outcome of this novel, fragile food system was characterized by malnutrition and infectious disease, accompanied by reduced stature, poor bone health, nutritional deficiencies, and dental caries [Ulijaszek 1991; Larsen 1995; Mann 2007; Latham 2013]. Reduced stature serves as a marker for deprivation and low dietary quality [Perkins et al. 2016], originating from a combination of low protein intake [Grasgruber et al. 2016] and a high glycemic load [Mitchell 2006]. There was, however, an increasing stabilization of the supply of meat and secondary ASFs throughout the Neolithic [Münster et al. 2018] and thereafter. Consumption in some regions and eras may have been higher than commonly assumed, sometimes exceeding the levels of post-industrial Western diets [now at 50-70 kg/p/y of red meat and 20-50 kg of poultry; see elsewhere].
 
In the Middle Ages, for instance, some areas in Western Europe had access to a high supply of mutton, beef, and/or pork, as well as cheese and butter [Banegaz López 2010; Dunne et al. 2019; Camisa 2020]. In late Medieval Barcelona, average meat intake was around 80-120 kg/p/y without counting poultry (depending on status, but not limited to the elites). Although aristocrats would have consumed the highest levels (e.g., 1 kg for a typical meal in 15th-century England, with two meat meals a day, five days a week), even monks (1 kg of meat per day, four days a week, plus plenty of eggs and fish) and common people were often well provisioned.
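
As a rough plausibility check, the per-meal figures cited above can be annualized and set against the modern Western intake of 50-70 kg/p/y of red meat mentioned earlier. The sketch below uses only the portion sizes and meal frequencies quoted above, plus the added assumption of 52 weeks of such consumption per year.

```python
# Rough annualization of the per-meal figures cited above.
# Assumption (not from the source): 52 weeks of such consumption per year.

WEEKS_PER_YEAR = 52

# 15th-century English aristocrats: ~1 kg per meat meal, two meals/day, five days/week
aristocrat_kg_per_year = 1.0 * 2 * 5 * WEEKS_PER_YEAR  # 520 kg/p/y

# Monks: ~1 kg of meat per day, four days/week
monk_kg_per_year = 1.0 * 4 * WEEKS_PER_YEAR  # 208 kg/p/y

print(aristocrat_kg_per_year, monk_kg_per_year)
# Both figures sit far above the modern Western 50-70 kg/p/y of red meat.
```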
 
However, while some parts of the population were clearly well fed, malnutrition remained rampant among the poorer segments of many agricultural societies, even up to the present age. Although industrialized countries have moved into foodscapes characterized by an abundance of ASFs [see below and elsewhere], and even if low- and middle-income countries have optimized their diets in ways that are contingent on local resources, an unbalanced reliance on cereal-heavy diets still typifies many regions in the Global South [see elsewhere].
 
In contrast to the often impoverished and fragile rural communities that depend on crop agriculture, pastoralist communities in low- and middle-income countries have typically been able to secure much higher levels of ASFs. This is the case for both ancestral [Wilkin et al. 2020] and contemporary variants (the Turkana, for instance, still obtain 62% of their calories from dairy and 12% from meat, fat, and blood) [Lea et al. 2020]. In India, 6-8% of the population is directly or indirectly involved in pastoralism, providing three quarters of the national meat supply and half of the country's milk [Kukreti 2020].
 
The industrial transition
 
During the 18th and, especially, the 19th century, ASFs became widely accessible, albeit predominantly in the industrializing countries of the West. This was particularly the case for red meat, due to a series of technological developments (e.g., cooling technologies), improved logistics and transportation (e.g., railways), a rising urban demand for meat, and a rapid transformation of the food chain [Leroy & Degreef 2015; Leroy et al. 2020].
 
A sharp increase in stature was seen [Our World in Data 2019], suggesting improved nutrition due to a higher intake of energy, protein, and micronutrients (although other factors, such as a reduced prevalence of infections, must also have been of importance). The transition did not, of course, result in a return to Paleolithic-style diets [despite the initial gains in nutritional adequacy, suggestive of a diet more adapted to the biological needs of humans; see elsewhere], in terms of either the quality or the quantity of ASF consumption (for better or for worse). Moreover, the diets of industrialized countries eventually (d)evolved into what is now known as 'the Western diet', incorporating large amounts of ultra-processed foods [Cordain et al. 2005; see elsewhere]. Much of the controversy surrounding ASF consumption relates to how ASFs have been incorporated into the Western diet, rather than to their intrinsic properties.
