The cultural relevance of foraging in the western world (part one: the domestication of humans)

This material was originally published as chapter 1 of my book Edible Plants: a forager’s guide to the plants and seaweeds of Britain, Ireland and temperate Europe. Signed copies available here, also available from Amazon and other online retailers.


“As the whole world is self, killing a plant or animal is not murder but transformation. Finding food is taken for granted, reinforced by myths telling the hunter to be the animal before presuming to kill and eat it.”  (Graeme Barker, on animistic hunter-gatherers. From The Agricultural Revolution in Prehistory: Why did Foragers become Farmers? (2006)).

Losing Paradise

Humanity is at the end of a transition from tribal hunter-gatherers to techno-industrial dominators of the planet – a process that began with the emergence of the first agricultural technologies, and their promise of new ways to control the wild world. At the time of publication, fewer than 200,000 humans are living as pure hunter-gatherers – 0.0025% of the global population, and still falling. Yet human society today is less secure than it was at the beginning of the transition, and after a very long decline, wild food is making a comeback.
This foraging renaissance has nothing to do with people whose own recent ancestors were hunter-gatherers. It is a feature of western societies, especially those furthest removed from that way of life – those which had, until recently, stopped foraging almost entirely. It is dismissed in some quarters as a “craze”. Nobody, it has been said, needs to forage in the modern world; humans should stick to cultivated food and leave the wild stuff for wildlife. According to this view, today’s European foragers are romantic dreamers, pining for a lost way of life that in reality must have been nasty, brutish and short.

But perhaps the notion of a lost paradise is not so crazy. The technological wonders of the 21st century are so familiar that most of us have a hard time imagining life just decades ago, let alone millennia – and at the same time our civilisation faces existential threats that would have seemed completely alien to our hunter-gatherer ancestors. They were animists, as are “primitive people” everywhere. They believed the world had always been the way it was and always would be, that the natural and spiritual realms were one and the same, and that the world would always provide what was needed. They respected the natural world as if it were their own kin. Their way of life was absolutely sustainable. Ours, despite science and technology that would look like magic to them, is absolutely not.

The Neolithic Revolution

The Fertile Crescent, where agriculture first arose and where the first great civilisations flourished a few millennia later. Image by Creative Commons user Semhur (licence CC BY-SA 3.0).

Along with these civilisations came a new sort of spirituality, very different to animism. Animistic belief systems are holistic and inherently mystical-panentheistic – anathema to mainstream Abrahamic monotheism. The Old Testament instructs humans to “Be fruitful and increase in number; fill the earth and subdue it. Rule over the fish in the sea and the birds in the sky and over every living creature that moves on the ground.” (Genesis 1:28). The unstratified and relatively egalitarian world of animistic tribalism was steadily replaced with a rigid hierarchical system: God at the top, humans in the middle (now in a massive hierarchy of their own) and the natural world at the bottom. And instead of a world with no beginning and no end, the cosmology of Abrahamic religion starts with humanity being kicked out of Paradise, on a path that leads to apocalypse.

The human animal evolved to forage and hunt for a living, as its ancestral species had done for hundreds of millions of years, but in one respect this animal was different to anything that had gone before: it had a brand new survival strategy. Other creatures depend for their survival on things like being able to melt into the background or run very fast, or being very large, or having very sharp teeth, powerful venom or an impenetrable shell. Humans were the first creatures to depend entirely on their wits. For the 2.6 million years of the “Ice Age” (the Pleistocene epoch, in scientific terms), evolution worked on an animal whose bipedal locomotion, dextrous hands and anatomical capability for complex speech, though all game-changers in their own right, were mere auxiliary features compared to the revolutionary evolutionary technology of overwhelming brain power. That brain, in control of those hands and that speech, was applied to creating weapons for hunting creatures so large and dangerous that an unarmed human would have been lucky to escape from them unharmed. Lithic technology – the working of stone – advanced at a pace that would make “glacial” seem generous, but it eventually led to the invention of tools for harvesting and processing wild grains and pulses, and even made the felling of large trees possible.

There is a lively debate about the impact anatomically modern humans had on populations of large animals outside their African evolutionary home. Fossils show they first left Africa at least 180,000 years ago, but genetic studies indicate that these early migrants died out [1]. The successful colonisation of the rest of the world started about 60,000 years ago. Megafauna have been disappearing ever since, but we do not know to what extent this was the result of human activity. Even if these humans were responsible for an elevated extinction rate in the late Pleistocene, however, their impact on the ecosystems they colonised must have been limited by their relatively small numbers.

Then, about 12,000 years ago, things started to change more quickly. The ice retreated, and large areas of the northern hemisphere became more attractive for human habitation. Within a few centuries of the start of the next epoch (the Holocene), humans started farming, beginning in south-west Asia. Why here first, rather than somewhere else? The “Fertile Crescent” had high-quality soil, a reliable water supply, and was home to a disproportionate number of wild species that were particularly suitable (“pre-adapted”) for domestication [2]. But that doesn’t explain why humans didn’t just remain hunter-gatherers.

This first agricultural (or “Neolithic” – new stone age) revolution has been described by the American geographer Jared Diamond as “the worst mistake in the history of the human race” and by the Israeli historian Yuval Noah Harari as “history’s biggest fraud”. Studies [3,4] of 20th-century foraging societies suggest they averaged no more than four hours a day hunting and gathering food, leaving plenty of time for leisure – and this was typically on relatively poor land, rejected by farmers. It is reasonable to assume that Mesolithic (middle stone age) foragers living in a more plentiful environment had at least as much free time. Neolithic farming was a much harder life, requiring back-breaking work from dawn till dusk, and there is no evidence to suggest those early farmers were any healthier or better fed than their Mesolithic predecessors – the best evidence we have suggests the opposite was true [5]. And even if farming could provide more net calories, it couldn’t (and still can’t) touch foraging for variety: Mesolithic foragers must have enjoyed a more interesting and balanced diet than Neolithic farmers. This lack of variety persists: globally, today, about 25 cultivated crops make up 90% of our diet [6]. Farming also impacted the ecosystem in a fundamental way that foraging does not, causing far bigger changes to the balance of species in the local environment and to the genetics of the domesticated species themselves. It also opened the door to wealth and social inequalities on a scale that had previously been impossible, some of which still plague us today. There should be no myths about noble savages [7], and analysis of bones suggests that few Mesolithic hunter-gatherers made it to the age of 50, but in many respects foraging was an easier, healthier and more pleasant way of life than the agricultural one that replaced it. This view of the Neolithic Revolution and its consequences forms the foundation of the anarchist critique of civilisation known as anarcho-primitivism.

Exactly how and why foragers became farmers are important questions that archaeologists have been trying to answer for over a century. Many theories have been proposed, mostly based on speculative reasoning and supported by insufficient or disputed evidence. All the simple answers are inadequate – this was a complex and lengthy process. Initially, the most obvious answer was that farming was a giant technological and cultural leap forwards: “progress”. Foragers, according to this hypothesis, would obviously have concluded that cultivating plants and keeping livestock is far more reliable and efficient than the lottery of foraging and hunting, right? This sort of thinking projects backwards from our modern perspective instead of trying to understand what really happened. It relies on a concept of progress that is itself a product of modernity, one that would have been incomprehensible to the first farmers and almost as meaningless to most medieval Europeans. On a geological-evolutionary timescale, the entire transition from foraging to farming happened in the blink of an eye, but on the timescale of human lives it took hundreds of generations. It was not a conscious choice – no forager woke up one day and thought “why don’t we clear the forest and deliberately grow the plants we need, instead of foraging for them!” It happened as the result of countless individual human decisions, each taken because it made sense in the immediate situation and in the foreseeable, rather than the distant, future.

Foraging as a way of life naturally limits fertility and population density. In most of the territory available before the Neolithic Revolution, the only way to find enough food was to keep moving: Mesolithic hunter-gatherers were almost entirely nomadic (the only permanent settlements to significantly predate the Neolithic Revolution were a few fishing villages). The key to successful foraging, then as now, was to be in the right place at the right time. And if you are continually on the move, there is a limit to how fast you can reproduce – carrying one infant is manageable, but carrying two is not. This practical limitation was reinforced by the fact that children in foraging societies are weaned much later than in farming societies, which reduces female fertility by keeping levels of the hormone prolactin high. Even with a low population density and a slow birthrate, populations in those foraging societies would have steadily risen during years of abundance, but sooner or later there would be leaner years when food was scarce even for foragers, with their flexible diet. The young, the old and the weak would struggle to survive these hard times, bringing the human population back into balance with the rest of the ecosystem.

Fear of hunger must have been a factor in the development of agriculture. If your foraging territory is limited by the presence of competitors, threatening your ability to feed yourself, you might be tempted to create a clearing to provide habitat for the kinds of plants that are good for foraging, and to attract herbivores to hunt. Perhaps you’ll selectively hunt the males, leaving the females to reproduce. In these ways humans started to intentionally modify the ecosystem and the genetics of the plant and animal species they depended on (they had already been doing so unintentionally for millennia).

Towards the end of the Mesolithic period there was an increase in sedentism: abandoning the nomadic way of life is a necessary prerequisite for agriculture as we understand it. The first evidence of humans spending extended periods in fixed settlements dates from about 2,000 years before the start of the Neolithic Revolution, but these people were still foragers. What made them different to the nomadic majority was that they had claimed the most productive territories as exclusively theirs, usually at boundaries between ecosystems, such as where a river meets the coast or passes through the foothills of a mountain range. Such locations would have supplied abundant resources at different times of the year, and permanent or semi-permanent settlement would also have allowed people to start preserving and storing some of the glut of food that is available to hunter-gatherers in spring and autumn. Sedentism also led to a step change in the rate at which humans were modifying the genomes of the species destined to become the first domesticated crops and livestock. The very act of harvesting seeds for storage, for example, puts selective pressure on the species involved: it favours larger seeds in more easily harvested configurations on the plant, because those are the seeds most likely to end up in a basket and be accidentally dropped near a settlement, even if they are never deliberately planted.

Unfortunately, establishing long-term settlements in productive foraging locations and increasing the amount of edible plants in your immediate vicinity does not solve the problem of food insecurity for very long. It actually makes things worse, because the resulting population increase leaves a greater number of people vulnerable to future shortages. But there was no way back, either for the earliest farmers or for the majority who still foraged. For a while – in some cases for a long while – foragers and farmers co-existed in relative peace, and even traded with each other [8]. Foraging societies sometimes adopted elements of the “Neolithic package” (sedentism, animal husbandry, agriculture and pottery) while foraging remained their primary means of subsistence, and the products of farming would have been seen as exotic, high-status goods. This wasn’t enough to immediately convert them to a way of life that must have looked as hard as it was, but in almost all cases the farmers eventually either displaced the foragers or the foragers became farmers. Farming then spread geographically in fits and starts, north-west from south-west Asia into Europe and eastwards across Asia, while other domesticated species spread from other points of origin slightly later.

The vicious circle was complete. Gradual replacement of nomadic foraging with sedentary farming, and incremental improvements in farming technology, drove further increases in human population levels, threatening food security again. Overshoot was followed by famine, but the long-term trend was an inexorable growth of the human operation on Planet Earth, always at the expense of the wider ecosystem.

On a global scale this process continues, for now. At the time of writing, the human population is increasing by over 250,000 every day – more than the total number of pure hunter-gatherers still in existence. Food production still needs to meet an ever-increasing human demand, and the wild world is being destroyed at a more frightening pace than ever. Both are made possible only by the use of vast amounts of fossil fuels to artificially fertilise farmland and power the industrialised world. Unfortunately for the human race, those fossil fuels are both non-renewable and irreplaceable. This chapter of our story will not have a happy ending. Overshoot, resource depletion and ecological collapse on a grand scale threaten the biggest food crisis of them all, and a brutal restoration of the long-term balance between humans and the rest of the global ecosystem that was interrupted twelve millennia ago.

The Anthropocene

Some have suggested that the Holocene epoch, which started around the end of the ice age and the beginning of the Neolithic Revolution, has itself recently ended. Depending on whose version you prefer, the end point lay somewhere between the start of the Industrial Revolution in the 18th century and the first man-made nuclear explosions in 1945. “Anthropocene” refers to a new epoch, the defining character of which is humans having become a key factor determining the content of the Earth’s biosphere, and consequently the rocks being formed at the surface. In those rocks, geologists of the distant future (should there be any) will find a massive increase in fossilised humans, the structures we build and our waste products, and a corresponding decrease in the quantity and diversity of fossils of almost everything else.

“Anthropos” is Greek for “man” or “human being”, and “-cene” means “recent”. The word has been in use since the mid-1970s, but since the turn of the century the Anthropocene has become an increasingly useful concept for people trying to understand and explain the changes taking place on our planet, and to provide proper context in terms of geological history and the long-term future. Anthropogenic phenomena such as climate change and mass extinction may well have become unstoppable, at least without intentional intervention on a global scale (“geoengineering”), which would itself leave an indelible mark on the geology currently being formed.

A minority of the popularisers of the term still hope for a “Good Anthropocene”, in which further technological advances allow humans to solve some of these global systemic problems, save the day for advanced civilisation and maybe even pave the way for a glorious transhuman future. The majority expect a “Bad Anthropocene”, in which the centre does not hold and the question is not so much whether things will fall apart as how. In the words of Graeme Barker, we differ from our wild foraging ancestors not just in our ways of doing, but in our ways of thinking and being. The Anthropocene is not just a geological concept about the selection of species being fossilised; it frames key questions about how 21st-century humans think about ourselves and our place in the world, and how that differs from the way people thought about those things before and after the emergence of farming, science and industrialisation.
This is the context of the foraging renaissance. Two paradises have been lost – that of the human condition before farming and that of the Holocene ecosystem that techno-industrial civilisation has systematically destroyed. We might say that our society has become dangerously disconnected from the natural world, and that foraging helps to re-establish that lost connection, but maybe the view from the Anthropocene should be that we are as connected to the natural world as we ever were, and that human civilisation is part of a natural process, however unnatural it looks to us.

Perhaps we are not murdering the wild world, but transforming it.

The Anthropozoic

Black banded ironstone, formed during the Great Oxygenation Event. Photo by Andre Karwath (licence CC BY-SA 2.5).

Every now and then in the Earth’s long history, something happens that changes everything and changes it forever. An early example was the Great Oxygenation Event, approximately 2.5 billion years ago, caused by the first appearance of a revolutionary evolutionary technology. Cyanobacteria found a new way to acquire food – a new kind of photosynthesis that released free oxygen as a by-product. Oxygen is a highly reactive element that had never previously existed in this form in the Earth’s biosphere. For a few hundred million years, the consequences were limited by vast amounts of iron dissolved in the oceans, which slowly precipitated out as iron oxide, forming rich ore deposits that we’re still mining. When the iron in the oceans was exhausted, free oxygen started building up, first in the water, then in the atmosphere. Nearly all of the organisms in existence at the time had evolved in an oxygen-free world, and for them, the accumulating oxygen was a death sentence. The entire biosphere had to adapt to a new reality: out with the old ecological order, and in with the new.

It happened again 540 million years ago at the start of the “Cambrian Explosion”, when all the major branches of animal life that exist today suddenly appeared, along with a mind-boggling array of others destined to be failed evolutionary experiments, displacing almost everything that existed before. We don’t know for sure what triggered this process, but we do know that it was world-transforming. Then, about 252 million years ago, massive volcanic activity changed the climate and caused the Permian Extinction, or “Great Dying”. About 95% of marine species were wiped out, along with perhaps 70% of vertebrate species on land. From the ruins of the old ecosystem, a new kind of reptile emerged – the dinosaurs. They ruled Earth unchallenged until their own fate was sealed 66 million years ago, when an asteroid slammed into what is now the Yucatán Peninsula in Mexico, setting the planet on fire and poisoning the atmosphere again. Once more life recovered from the shock, and this time it was the turn of the mammals to take over.

These three great revolutions in the course of geological and evolutionary history mark the boundaries between one great geological era and the next – the Palaeozoic (“old life”), Mesozoic (“middle life”) and Cenozoic (“recent life”). These eras are divided into periods (we’re in the “Quaternary”), and the periods are divided into epochs like the Holocene and Anthropocene.

There is an older term than “Anthropocene”, coined by the Italian geologist Antonio Stoppani (1824–91). Way ahead of his time, Stoppani sensed the enormity of the impact humans were having on the rest of life on Earth, and he tried to warn people. For him, merely declaring a new epoch would have been an understatement. When Earth’s ecosystem eventually arrives at a new equilibrium, following the emergence of the revolutionary evolutionary technology of overwhelming brainpower, the long-term changes may well prove comparable to those of the Great Oxygenation Event and the three era-defining upheavals that followed it. If so, then the Holocene was the final epoch of the Cenozoic, and we have just entered not only a new epoch called the Anthropocene but a whole new era: the Anthropozoic.

Continue to part two: foraging traditions old and new

1. When did modern humans leave Africa?, Stringer & Galway-Witham, Science, Vol. 359, Issue 6374 (26 Jan 2018), pp. 389–390.
2. Guns, Germs and Steel: A short history of everybody for the last 13,000 years, Jared Diamond (1998).
3. Stone Age Economics, Marshall Sahlins (1972).
4. Primitive Affluence, Bob Black (1992).
5. Early agriculture’s toll on human health, George R. Milner, Proceedings of the National Academy of Sciences, 116 (28), pp. 13721–13723 (2019).
6. How many plants feed the world?, Prescott-Allen & Prescott-Allen, Conservation Biology (1990).
7. The Truth About Primitive Life: A Critique of Anarchoprimitivism, Ted Kaczynski (2008).
8. Parallel palaeogenomic transects reveal complex genetic history of early European farmers, Lipson et al., Nature (2017), DOI: 10.1038/nature24476.