Posts Tagged ‘agriculture’

‘Ineffective and inhumane’ – or in denial?

March 3, 2014

***Updated, March 16th (see below)***

badger

Back in media-land the insanity on the badger cull continues (see previous post). I was unfortunate enough last Friday to witness this Channel 4 news report on a ‘scientific assessment’ which called the recent pilot culls in Gloucestershire and Somerset ‘ineffective’ and ‘inhumane’. It struck me as a classic example of media framing – laying down (and subsequently policing) the boundaries of public discussion within extremely narrow parameters in a way that benefits the powerful. So, for example, you hear respectable commentators talk about the 2003 Iraq war and polarise the debate between those who view it as ‘justified’ and those who think it was a ‘miscalculation’. You hardly ever hear the conclusion that the evidence supports – namely that it was a deliberate act of criminal aggression. Likewise, with the negative effects of ‘austerity’ in the UK (dismantling of the NHS, removal of benefits, pay freezes, public sector job losses), at the liberal extreme these are most often presented as a failure or a mistake on the part of politicians, but practically never as intentional, cynical policies to further reconstruct the economy as a channel of wealth from the poor (and middle-classes) to the super-rich.

So how does this apply to the badger cull, as discussed in this particular Channel 4 bulletin? Well first off presenter Cathy Newman passes on the government’s stated justification for the cull without criticism – we are to believe from the start that the cull was ‘aimed at tackling the spread of TB in cattle’ and everything that follows rests on this premise. Other possible motivations such as irrational hatred of wildlife, scapegoating and displacement of responsibility on the part of the farming lobby and a willingness of the political establishment to ‘offer [them] a carrot’* don’t merit consideration.

Science Editor Tom Clarke then comes on to make his presentation. He fleshes out the Official Explanation for our benefit:

The purpose of these [pilot culls] was to show that you could effectively and humanely kill badgers to control TB and kill enough of them quickly enough to prevent spreading the disease

and presents the ‘very strict rules’ by which success or failure is apparently to be judged – namely a minimum 70% kill rate, a six-week culling period and ‘humane’ shooting by trained marksmen. The news, then, was that

What we now know is that they managed to fail on several of these counts.

It turns out that they were able to kill less than 50% of the target populations. However, this is not, for Clarke, a cause for concern or grief over a very real failure to stop badgers getting killed for highly dubious reasons – rather:

50% [is] an important number because that’s actually getting down to the point at which this policy could in fact cause more TB to spread around than not.

(To his credit he does refer to the ‘humane’ goal of less than 5% of badgers taking longer than five minutes to die as an ‘arbitrary and not particularly laudable target’)

We then hear from Dominic Dyer of ‘Badger Trust & Care For The Wild’ with a priceless soundbite:

This cull has been an absolute disaster. They’ve only killed a fraction of the badgers they thought they would be able to kill.

What? Does that mean he would be happy if they had killed 70% of the populations?? This pathetic opposition, which implicitly accepts – indeed, which appears to cheerlead – the government’s insane policies (so long as they are carried out ‘efficiently’ and ‘scientifically’), was repeated by the scientist chosen by the BBC’s news team, Prof. Rosie Woodroffe of the Zoological Society of London [0:45]:

These culls have not killed enough badgers, haven’t done it fast enough. The benefits will not be, therefore, as great, we expect [as in a former trial]

Benefits??? Whose benefits would that refer to, I wonder? Not much benefit in a fucking bullet, is there! At least, not if you’re on the receiving end…

The discussion back in the Channel 4 studio takes a surreal twist after Newman poses this corker: ‘It sounds like an odd question but how hard is it to kill a badger humanely?’ – as if she was thinking about getting involved herself and wondering how to go about it. Clarke responds with a sympathetic portrayal of the marksman’s plight:

[T]he one thing that struck me is how impossible it must be – it’s dark, it’s raining most of the time, it’s thick forest. I’m actually surprised these marksmen managed to kill any badgers at all, let alone cleanly. So I don’t think we should be too surprised that there were some problems. […] This is a complicated, difficult, rather messy business, killing animals. But we’re a society that prides ourselves on our humaneness, especially our farming industry [ha!], so it makes it hard to justify from a humane point of view.

This reminded me of the way the media guides us to empathise with the ‘difficult task’ of the soldiers in the UK’s overseas wars, presenting it as a dry, technical challenge and not speaking of moral culpability when they go about their business of killing defenceless creatures – of their own species (and again for highly dubious reasons which we’re not supposed to scrutinise too closely). Yes, it’s a complicated, difficult, rather messy business, killing Afghans and Iraqis. But we’re a society that prides ourselves on our deep commitment to democracy and human rights, especially our military, so it makes it hard to justify – a tough decision, a difficult job but someone’s got to do it etc etc ad nauseam.

I think lots of people think the cull has been highly successful, even if it doesn’t get rolled out across the country for the next three years as originally planned. For them the point was never to do something about Bovine TB – if they were serious about that then they would look to the farming practices that create the perfect breeding ground for this disease among cattle in the first place. As with the focus on dredging as a supposed cure to lowland flooding (it isn’t), I think the intention was to be seen to be doing something about the problem, regardless of how effective this might prove to be. There’s also the possibility that the issue has served as a distraction while bigger things were going on behind the scenes, as raised in this Think Left article:

However, there is another rather concerning thought. The public outrage and likely direct action against the cull, may distract the media away from something that the government wants to slip through unnoticed. There is little doubt, that it was just this sort of distraction tactic, that lay behind the proposal to sell off the forest which was announced just as the Health and Social Care bill took its first steps through the Commons.

I suppose if that’s true I’ve fallen for it, hook, line & sinker (although at least I’ve taken it in a direction of my own choosing).

But really, I think the major success of this policy has been an emotional one – to lash out in frustration at something that can’t fight back, and to act out destructive urges on something which has no real or immediate value, according to the metrics of the current dominant culture.

***Update, March 16th***

Other opposition figures captured by government/media rhetoric about ‘effectiveness’ include Caroline Lucas:

Now that its own research has demonstrated that badger culling is cruel as well as pointless, it’s time for the Government to heed the evidence and end this failed policy once and for all. (link)

and Chris Packham:

Let me be clear from the outset, if the scientific evidence pointed to culling badgers being an effective, humane, sustainable and economically viable solution to the increasing occurrence of TB in cattle then I’d be agreeing to it. (link)

Now I’m struggling to find a parallel that doesn’t violate Godwin’s Law, but… imagine that you live as a minority group in a racist totalitarian society. There’s an outbreak of epidemic disease which affects a large portion of the population, due mainly to overcrowded living conditions, poor sanitation, malnutrition etc. – basically the incompatibility of the human organism with prolonged city life. Your group is relatively lucky in that marginalisation and prejudice mean you live on the outskirts in a somewhat more resilient, rural form of subsistence economy, although you also suffer losses from the city-born epidemic. But when disaster strikes, the city founders and leaders (including those who most benefit from its continued operations) quite understandably don’t want to undermine their Great Accomplishments or any future ambitions by drawing attention to the real causes of the epidemic. Instead they divert attention away from themselves by casting an eye for the briefest of moments outside of their sphere of influence until they find someone else they can blame who won’t cause them too many problems. So your group becomes the scapegoat and wild accusations about your inherent uncleanliness or genetic impurity start flying around. You are judged guilty of causing a ‘health risk’ to the broader population either passively (through your supposedly lax standards) or even through active conspiracies (eg: poisoned wells, subversion of the social fabric etc.). So they send the shock troops in to carry out a ‘controlled cull’ of your population. Terror ensues, your way of life is shattered, you can no longer trust anybody any more, you suffer all the symptoms of post-traumatic stress as a community. Then one day you hear that some intellectual luminaries in the city opposed this atrocity on the grounds that it was a ‘failed policy’, ‘ineffective’, ‘inhumane’, ‘unsustainable’, ‘economically unviable’. They say that the cull and any future culls ought to be discontinued because it is difficult, if not impossible, to ensure that targets of killing 50% or more of the selected populations – your friends and relations – are met. How would you react?

To be fair to Lucas, Packham and others, these are the comments of theirs that have been deemed acceptable for inclusion and propagation by the media system. It’s highly likely they have a whole host of unacceptable opinions on this issue which they pretty much have to keep to themselves. Packham is a case in point. I much prefer his earlier tweeted comments about the pilot culls:

It is both sad and shameful that when night falls and the setts of southern England stir their gentle folk will be needlessly slaughtered. That in spite of science and public will the wrath of ignorance will further bloody and bleed our countryside of its riches of life. That brutalist thugs, liars and frauds will destroy our wildlife and dishonour our nations reputation as conservationists and animal lovers. So I fear that tonight could be the darkest for British wildlife that we have witnessed in our lives. I feel sick, sad, disempowered, betrayed, angry and crushed by the corruption of all that I know as right. I feel rage. (link)

But a Tory MP complained that this breached BBC impartiality (even though it was from a personal twitter account) and as a result Packham, we can assume, got a slap on the wrist from his managers and was forced to promise not to speak about the issue any more, at least not on the BBC’s time:

On his website, Mr Packham said: “My views on the badger cull are well known and have been widely voiced and published.

“They are opinions based on a pragmatic and objective consideration of the current science concerning its efficacy as well as concerns about animal welfare. Because of the prominence of my comments it is obviously impossible for me to be considered impartial as a BBC presenter on this topic.

“Impartiality is a cornerstone of the BBC’s practice as a public service broadcaster and I am determined to protect this important aspect of its integrity. Thus I will not be taking part in any discussions about the cull during the forthcoming series of Autumnwatch and Winterwatch.

“Nor will I be presenting any items in the series about the badger cull because, as natural history programmes, they are not the right place for discussions about matters of national public policy. I will however continue to make my views known when I feel it is appropriate to do so. (link)

Another example of how thought is controlled in nominally democratic societies†.

————————

* – Professor John Bourne, chair of the Independent Scientific Group on Bovine TB: “I think the most interesting observation was made to me by a senior politician who said, ‘Fine John, we accept your science, but we have to offer the farmers a carrot. And the only carrot we can possibly give them is culling badgers’.” (link)

† – I’m indebted to David Edwards, especially his book Burning All Illusions (aka Free To Be Human) and his work with David Cromwell on the UK website Media Lens for this insight and others.

More Rewilding

August 10, 2013

I’ve been following the continuing debate on rewilding with interest. Some links:

An acrimonious exchange in the Guardian between Steven Poole and George Monbiot. Poole basically trolls Monbiot and other nature writers for their supposed ‘bourgeois escapism’ but accidentally points to an interesting line of discussion which I’ve touched on before – the strange emotional charge underlying designations of ‘native’ vs. ‘invasive’ species and what happens when we turn this logic back on ourselves. Monbiot unfortunately, but perhaps understandably, closed off any fruitful engagement by invoking Godwin’s law and beating Poole over the head with his superior scientific knowledge.

Mark Fisher, a long-term writer and advocate for rewilding in the UK, has written a few responses to Feral in this piece, which details some specific examples of rewilding landscapes which he has visited in the US and Ireland. This part made me think of the similar way in which the wildwood must have been cleared over here in order to impose the same conditions of open land for livestock pasture and field agriculture:

There should not be some mystique about mountain folk, that they sought refuge to live in sympathy with the land. Many settlers were tenants of a few large landowners, but they and homesteaders all embarked on a common pursuit of exploiting the land, by ringing trees with their axes – a process called “deadening” – to clear fields for pastures and orchards; killing all the large carnivores so they weren’t a threat to their cows; and hunting out the white-tail deer, so that they had to be restored to the park when it was set up.

A long-delayed subscription to The Land Magazine earlier in the year rewarded me with a whole issue devoted to rewilding, with articles on wolves, ponies, sheep, fescue, Chillingham cattle and a generous review of Monbiot’s book by Bill Grayson. I have mixed feelings about Simon Fairlie’s response, ‘Rewilding and Food Security’, which is unusual as I mostly find his writing to be spot on, revealing and highly informative. On the one hand, comments about the unfair competition between the unsustainable industrial food system and upland sheep farmers are unarguable, and the concluding point is a strong and important one:

The more we rewild in Britain, the more food we will need to import and the more we are likely to dewild land in countries that provide us with substitute food. Conserving our natural environment at the expense of other people’s is a neo-colonialist agenda. There is an environmental price to pay for having so foolishly allowed England to become one of the most overpopulated countries in the world, but that price should not be paid by people and environments in other countries.

(Although this is again blinkered by not considering rewilded landscapes as habitat for feral humans on the way to a wild nativeness of their own.)

However the contention in the editorial piece, ‘Zone Five’, that ‘What this particular island produces most abundantly is, of course, grass’ seems flatly wrong, or at least resting on a dubious conception of the meaning of abundance. Surely the most abundant spontaneous expression of this land comes in the form of trees and dense, extensive woodlands. Anything else requires a massive, devastating initial effort and continuing vigorous management every year from then on to prevent reversion to what the land actually wants to do (as we saw before). And this comment is a strong contender for the Agrarian-Fundamentalist-Asshole-Remark-Of-The-Year award:

Sheep also play a role in bringing us the sunlight which would otherwise be hogged by a blanket of forest. If you have no grazing animals to keep trees down, then to admit sunlight on any scale you have to use either fossil fuels or fire, both of which are less sustainable than the “woolly mowers”. Wind turbines and solar farms are dependent upon keeping land open to wind and sunlight and so probably is the health of the human psyche. Of course trees are a “good thing”, but you can have too much of a good thing, whether that be trees or sheep.

You heard it right – our mental health depends upon mass deforestation and the maintenance of an ‘open’ landscape where we can do as we please. Well, I guess it’s still revealing… Likewise the discussion of the former practice of folding sheep sheds light on the totalitarian control that civilised man insists upon everywhere in his domain:

But the most crucial role for sheep in many traditional agricultural economies has been to harness surplus nutrients from the saltus — the outlying wasteland too poor or distant to cultivate — and transfer them to the ager, the arable fields.1 This is still the case in parts of France and other European countries where flocks are shepherded by day and brought back to the bergerie at night to deposit their manure. It used to be the case through much of Southern England where sheep were grazed on downland by day and folded at night on fallow arable land. In South Wiltshire in 1794 “the first and principal purpose for which sheep are kept … is undoubtedly the dung of the sheep fold.” In Dorset in 1812 “the Sheep-Fold is held in as high estimation in this country as in any part of the world. It is considered by most of the farmers … as an indispensable requisite in the cultivation of the arable land.” In Bedfordshire “the manure of sheep is worth a farthing each per sheep per night”.2

Hear that? It’s all for us. As much as we can take. As far as we can reach. We are justified in taking it all, and any other creatures who might depend upon those nutrients for survival can go fuck themselves. Duh, it’s the food chain:

The Food Chain

Oh dear, I seem to have contracted some of the Guardianista penchant for sneering reductio ad absurdum… I recognise that the above talk carries less weight than it would if I had many years’ firsthand experience of working the land and had the meaning of all those relationships built into my being, rather than speaking from the alienated position of dilettante prehistorian who gets most of his food from the supermarket*. Still… it’s true, isn’t it?

Anyway, still missing from the debate is any discussion of domestication and the role of civilised man in ‘de-wilding’ the world (and himself) in the first place. To reiterate: What about rewilding humans? I am therefore delighted to see my friend Steve announce the formation of a ‘Rewilding Academy’ at this year’s (possibly final) Dark Mountain ‘Uncivilisation‘ festival in the woods in Hampshire from August 15-19, to which I’ve just bought tickets (still available via that link). He writes:

For the last two Uncivilisation festivals, I have run sessions that sought to provide a different kind of rewilding: one that acknowledges that is not enough to turn domesticated humans out into the wild and expect them to immediately recover their buried instincts and feelings; one that recognises that we have all been conditioned by civilisation into certain persistent patterns of thought, behaviour and physical restraint; one that makes use of our remaining capacity for play, curiosity and learning to open a small crack in the armour, to give a brief glimpse of the path that can slowly lead us back to experiencing the fullness of our human nature.

I’m also excited to attend the ‘Arcadia: a flawed objective?’ discussion:

[…] can Arcadia ever be the bastion of peace and tranquillity that it is projected to be when it depends upon agriculture: arguably the foundation of all gigantist and destructive civilisations? In this open discussion, Marmaduke Dando places our traditional pastoral utopias under the magnifying glass in an attempt to find out whether simply getting ‘back to the land’ goes back far enough; and what the implications of these questions might be for all of us.

I’ve never really written about it explicitly but my personal perspective on this has been shaped by reading the writings and exploits of the ‘primitivists’ and ‘green anarchists’ in America and elsewhere that some are all too keen to dismiss. I’ve taken up some of the projects they’ve enthused about – fox-walking, nonviolent communication, wide-angle vision, E-Prime / E-Primitive etc. – with varying degrees of success, and my focus on learning everything I could about the edible & medicinal plants that grow all around me over the past however-many-years-it’s-been was largely sparked by their efforts.

Broadly I subscribe to the philosophy many of them have articulated, namely that the domestication of plants and animals is a relationship of domination and subjugation that has wrecked the planet since it was born in the Agricultural Revolution some 10,000 years ago, and that rewilding is a process that every creature undertakes spontaneously, if given half a chance (kids are born as basically wild humans and must be subjected to a massive, traumatic programme of indoctrination at the hands of their parents and the schooling system in order to be made to fit to the dominant culture). The civilised culture has acted as a bulwark against this process, compelling its members to resist their own innermost tendencies and remain essentially an invasive species rather than ‘going native’ or becoming indigenous to their locality. It has been like a military occupation since the beginning, with the farmers staying safe within an expanding ‘green zone’ of acceptable domestic species and raining destruction on anything outside that circle of influence until it comes to conform to the grand design of domestication – that of total human control.

Thus it is the human civilised culture that most desperately needs rewilding. Some have called for a mass resistance movement against it, but really it is Civilisation that is the only resistance movement, and the major task is to break up and dissolve that resistance and allow the masses of people to return to a sane and healthy relating to the rest of the beings on this planet, as well as to their own selves. The dandelion does not consciously attack or attempt to destroy the concrete. Rather, it is the concrete that resists the growth of the dandelion, and its eventual yielding and crumbling away is practically inconsequential to the desire of the plant. It just wants to grow, live and give birth to more of its kind. The conditions are either right for that or they aren’t. Yet.

Further reading:

Anthropik Jason’s ‘Rewilding Humans’
Peter Bauer’s ‘Rewild or Die’
Willem Larson’s ‘College of Mythic Cartography’
Miles Olson’s ‘Unlearn, Rewild’
The (now largely inactive) rewild forums

Finally, I’ll republish an excerpt from the now defunct rewild.info wiki because I think it’s a good piece of (E-Prime) writing and it looks like it’s in danger of dropping off the edge of the internet:

What does rewild mean?

As a verb

The term “rewild” acts as a verb which implies an action, a motion. It does not symbolize point A (Civilized) or point B (Wild) but the space between. As a verb, it symbolizes a process of undoing domestication, not the endpoint. It may look like a woman breast-feeding her child. It may look like a group of people collecting wild edibles. It may look like someone turning off their TV for an hour a day. It may look like hanging out with your friends. It may look like refusing to pay rent or buy food. It may look like killing a deer for the first time, using a rifle. And it may look like using a bow & arrow. It may look like reading a book and changing the way you see Civilization. It may look like refusing to send your children to school. It may look like stealing from the cash register at your wage slave job. It may look like tearing up the streets with a sledge-hammer to plant crops. It may look like investing in “green” technology. It may look like taking down civilization. It may look like frustration at the current state of the world. Everyone has various comfort zones, social networks or friends who can show them things. Rewilding does not exist just for the small elite class of purists who band together and head for the woods to live a 100% primitive life. It serves as an umbrella term for all those who strive to undomesticate themselves, even if only in the smallest way they can.

As a life project

For most green/anti-civilization/primitivist anarchists, rewilding and reconnecting with the earth is a life project. It is not limited to intellectual comprehension or the practice of primitive skills, but instead, it is a deep understanding of the pervasive ways in which we are domesticated, fractured, and dislocated from our selves, each other, and the world, and the enormous and daily undertaking to be whole again. Rewilding has a physical component which involves reclaiming skills and developing methods for a sustainable co-existence, including how to feed, shelter, and heal ourselves with the plants, animals, and materials occurring naturally in our bioregion. It also includes the dismantling of the physical manifestations, apparatus, and infrastructure of civilization. Rewilding has an emotional component, which involves healing ourselves and each other from the 10,000 year-old wounds which run deep, learning how to live together in non-hierarchical and non-oppressive communities, and deconstructing the domesticating mindset in our social patterns. Rewilding involves prioritizing direct experience and passion over mediation and alienation, re-thinking every dynamic and aspect of our reality, connecting with our feral fury to defend our lives and to fight for a liberated existence, developing more trust in our intuition and being more connected to our instincts, and regaining the balance that has been virtually destroyed after thousands of years of patriarchal control and domestication. Rewilding is the process of becoming uncivilized.[2] (source, for now)

————————-

* – 2nd thoughts after sleeping on it: Actually I do make my living – and thus am starting to know about this through a deeper lived experience – from the not-entirely-dissimilar practice of creating and maintaining open spaces in people’s lawns and flower borders. This too requires constant vigilance and regular high-energy intervention to discourage the ‘weeds’ (sometimes including tree seedlings) and basically ensure that the spontaneous process of succession towards forest is continually frustrated and reset to zero. Perhaps this provides a more ‘abundant’ or productive vegetative growth (although I’m noticing that at this time of year the grass is doing better when protected by the shade of trees) as the land struggles to recover from the emergency we’ve brought to it, but I’ve got the strong sense that things can’t continue this way for long. Lawns, beds and borders soon need fertility brought in from external sources to make up for the nutrients taken up by hungry annual plants and/or regular cropping. I for one can tell you that it’s exhausting! I’m sure the soil finds it similarly so.

70%, 60%

June 22, 2013

***Updated July 6th***

A highly distressing new report from Friends of the Earth Europe: ‘Weed killer found in human urine across Europe’. If you live in the UK there’s a 70% chance that you have glyphosate, the active ingredient in Monsanto’s herbicide, Roundup, in your body. What’s it doing to you while it’s in there? How long does it stay? How can you get rid of it or at least build up a personal resistance as the superweeds have done? Answers to these questions are not available because of the usual industry-sponsored silence.

I definitely have it in me because we carry it around in the back of our work van all week (garden maintenance). I’ve refused to use it personally but my coworkers aren’t so scrupulous. I’ve worked on a Roundup-sprayed driveway at least once, suffering mild headaches, dulled awareness and difficulty engaging with the outside world for a number of hours afterward. (I figure I’m basically a plant person now so it’s bound to affect me more than the average post-industrial human being…) One of my colleagues has developed the recent worrying tendency of suggesting we reach for the weed-killer when this proves more economical for our time than weeding by hand, although the cost of the chemical – in more ways than one – gets passed on to the client. They responded to news of this recent report with tangential comments about the safety of drinking water, ignoring the threat sitting right there, a few feet away. I really don’t want to be around when they commit these atrocities, if I can’t first persuade them to not do it. My boss, who has previously worked with Monsanto and accepts their safety claims at face value, is broadly sympathetic to my decision (he doesn’t spray it on his own garden, possibly in part because of the concerns I’ve expressed) but insists that the herbicide has a place in the service we provide, again for economic reasons when it’s cheaper to do the requested work that way, eg: clearing weeds [sic] off driveways, patios etc.

Anyway I recommend reading through some of the different pdf sections via the above link to educate yourself a little about this chemical and the corporations pushing it on you. It’s not just direct contact you have to worry about. As they say, ‘All volunteers who gave samples live in cities, and none had handled or used glyphosate products in the run up to the tests’ and:

Once applied, glyphosate and its break down products are transported throughout the plant into the leaves, grains or fruit [5]. They cannot be removed by washing, and they are not broken down by cooking [6]. Glyphosate residues can remain stable in foods for a year or more, even if the foods are frozen, dried or processed [7]. (‘Human contamination by glyphosate‘ – pdf)

Even if you’ve found a way to avoid ingesting GM foods you’re probably not safe thanks to an insane practice used by farmers called ‘desiccation’:

glyphosate-containing herbicides may be sprayed just before harvest onto non-GM cereals, pulses, sunflowers and oilseed crops. This is done to remove weeds and dry out the grains (ibid.)

ie: to kill the plant and pump it full of poison just before it gets isolated from the environment and passed on for consumption by humans. Genius.

But it’s not all about us of course. I found the ‘environmental impacts of glyphosate‘ (pdf) to be the most harrowing read. Turns out that, contrary to Monsanto’s lies*, glyphosate does not biodegrade, stay where you put it, cause no harm to mammals, birds, fish, pets, children, gardeners… In fact it fucks up the lives, lifecycles, hormones, body development and ecological feeder relationships of birds, butterflies, frogs, fish, mussels, invertebrate insects, ocean- and river-dwelling microfauna, and, of course, plants – ‘undesirable’ or otherwise. Anything it touches, basically. Read this and weep, made especially compelling after the recent news that 60% of species in the UK are in decline:

Common weeds can be important food sources for insect, bird and animal species in agricultural areas. Weeds provide food and nectar sources for insects, which in turn feed birds. Weed seeds can also be vital winter foods for many declining bird species, such as corn bunting and skylark [xxxi]. Farm Scale Evaluations (FSE) of GM crops in the UK between 1999 and 2003, examined the number of weeds and their seed production in non-GM intensively-managed sugar beet fields, compared with those in GM glyphosate resistant sugar beet crops [xxxii]. The results showed a significant loss of weeds and weed seeds in the GM glyphosate resistant sugar beet, compared to the conventional crop. The UK government’s scientific advisory committee spelled out the significance of the results, stating that ‘if [GM glyphosate resistant] beet were to be grown and managed as in the FSEs this would result in adverse effects on arable weed populations [which] would be likely to result in adverse effects on organisms at higher trophic levels (e.g. farmland birds), compared with conventionally managed beet.’ [xxxiii]

A follow-up modelling project concluded that the effects of GM glyphosate resistant crops could affect different species, depending on their feeding and life cycle requirements. The authors noted that, in the results of their model, “Skylarks showed very little response to the introduction of GMHT rape. By contrast, the consequences of introducing GMHT sugar beet were extremely severe, with a rapid decline, and extinction of the skylark within 20 years. This contrasts with the cirl [sic] bunting, which showed little response to the introduction of GMHT beet, but severe consequences arose as a result of the use of GMHT rape” [xxxiv].

Join the dots, people.

I think I’m going to start wearing a black armband with the extinction symbol on it:

Extinction Symbol

Otherwise, I believe the roots of dock, dandelion and burdock are the place to go to get support for an overloaded liver and kidneys. But I consider it insufficient to merely adapt to the new toxic status quo in this way. What I’d like to see is the toxic behaviour of Monsanto et al cut off at the source so the planet no longer has to deal with the cascading negative effects of their appalling chemical weapons in the first place. Here’s a petition for starters, but I don’t think it’ll be enough on its own.

Oh, and this is what happens after long-term exposure to Roundup and/or Roundup-Ready GM crops (industry regulations only required a 90-day trial):

‘One of the rats fed GM maize NK603 for two years. The animal has developed an abdominal cancer tumour. Photograph: Tous des cobayes/J+B Sequences’ – source

In a peer-reviewed US journal, Food and Chemical Toxicology, [Professor Gilles-Eric Séralini, professor of molecular biology at Caen university in France] reported the results of a €3.2m study. Fed a diet of Monsanto’s Roundup-tolerant GM maize NK603 for two years, or exposed to Roundup over the same period, rats developed higher levels of cancers and died earlier than controls. Séralini suggested that the results could be explained by the endocrine-disrupting effects of Roundup, and overexpression of the transgene in the GMO.

Less toxic than table salt my arse.

—————————

* – A brief reminder of the claims made in adverts which a New York attorney forced Monsanto to pull back in 1996 – exhibits A through J:

a) Remember that environmentally friendly Roundup herbicide is biodegradable. It won’t build up in the soil so you can use Roundup with confidence along customers’ driveways, sidewalks and fences …

b) And remember that Roundup is biodegradable and won’t build up in the soil. That will give you the environmental confidence you need to use Roundup everywhere you’ve got a weed, brush, edging or trimming problem.

c) Roundup — biodegrades into naturally occurring elements.

d) Remember that versatile Roundup herbicide stays where you put it. That means there’s no washing or leaching to harm customers’ shrubs or other desirable vegetation.

e) This non-residual herbicide will not wash or leach in the soil. It … stays where you apply it.

f) You can apply Accord with … confidence because it will stay where you put it … it bonds tightly to soil particles, preventing leaching. Then, soon after application, soil microorganisms biodegrade Accord into natural products.

g) Glyphosate is less toxic to rats than table salt following acute oral ingestion.

h) Glyphosate’s safety margin is much greater than required. It has over a 1,000-fold safety margin in food and over a 700-fold safety margin for workers who manufacture it or use it.

i) You can feel good about using herbicides by Monsanto. They carry a toxicity category rating of ‘practically non-toxic’ as it pertains to mammals, birds and fish.

j) “Roundup can be used where kids and pets’ll play and breaks down into natural material.” This ad depicts a person with his head in the ground and a pet dog standing in an area which has been treated with Roundup. (link)

—————————

UPDATE:

I portrayed my boss too generously. Weedkiller came up in conversation between us during a lunch break and I mentioned this report and its main findings. At first he wanted to know, reasonably enough, what concentration of glyphosate the research found in people’s urine. I didn’t know at the time but went away and looked into it (results below) and may pass on my findings at some point. But after a short spell of silence I was treated to a barrage of denial, justification and misdirection. Highlights included ignorant smears against FoE (a leftist conspiracy against Monsanto: “They’re like a dog with a bone”, “They’re anti-business”, “They hate success”), evidence-free assertions that glyphosate isn’t as bad as some of the other chemicals out there (“I’m sure there are much worse things on my driveway”, “What about all the petrol fumes and machine oils?”), strong implications that there’s nothing you can do about it and you just have to accept & cope with it as best you can, blaming consumers for demanding cheap food with disregard for the consequences (an old disagreement – I think the manufacturing processes call the tune and people adjust their habits accordingly, largely because they have no choice. If it’s all demand-driven, why the need for so much advertising?) and reiterating the supposed economic imperative of the company needing to use Roundup because “If we don’t someone else will – they will get the work and we will lose out”.

I couldn’t think of any way to respond productively to all this, so I did my usual bit of listening while The Man With Experience lays out The Story of How Things Are, while making a conscious effort to keep it at arm’s length and not internalise it all automatically, reserving my own conclusions for a later date. For now, apart from having the usual Upton Sinclair quote ringing in my ears (‘It is difficult to get a man to understand something, when his salary depends upon his not understanding it’) I’m thinking this ‘If not me someone else – but worse’ is a bullshit excuse that has probably been used by every tyrant and holocaust-facilitator in history. But what’s the truly responsible course of action? Personal boycotts might be morally satisfying but they don’t really have an effect on the system as a whole unless coordinated and specifically targeted (so why not conspire against Monsanto :D ). Otherwise I think it’s broadly true that you just take yourself out of the competition, leaving another to take what would have been your share. You may not consider it to be worth taking in the first place, but that’s irrelevant if your concern lies with how things play out in the bigger picture. My unscrupulous colleague has more earning potential than me by not ‘turning down work’ in this way. One day this may be the crucial difference between us if the boss decides to lay one of us off. Whatever happens those driveways will continue to get sprayed in the meantime…

Maybe the answer lies in talking to the clients and wider public, ensuring this information gets out to them and perhaps persuading them to change their habits. Comparing the garden sheds of older and younger generations offers some hope – you often find a massive cocktail of lethal, long-expired chemicals in older sheds and much less in the younger ones, indicating a growing distrust of these industrial poisons and a greater inclination towards organic principles. But then, if this process of change is in reality driven by manufacturing practices and mass PR indoctrination rather than consumer demand, appeals to reason and emotion might not cut it. Answers on a postcard as usual!

Here’s the stuff on urine concentration:

***

Having checked out the original paper, I see that, of the ten samples from the UK, seven had a level of glyphosate higher than 0.15μg per litre of urine (the ‘Limit of Quantitation (LOQ)’ below which the chemical is apparently considered to not be present) – hence the 70% detection rate, which could actually be 100% as far as I can make out. The UK mean is 0.47μg/L, second only to Malta at 0.82μg/L, with the lowest averages coming from Switzerland, Macedonia and Hungary at 0.09μg/L. There were two UK results over 1μg/L with the highest coming in at 1.64μg/L, second only to the unfortunate individual from Latvia with 1.82μg/L (see table 4 on p.12). The paper gives a ‘reference value’ of 0.8μg/L but I don’t understand what this is meant to indicate and can’t make head or tail of their explanation:

The reference values for Glyphosate and AMPA are only tentative. They were derived from an urban collective (n=90) and are defined as the 95. percentile of the measured values. They were established by Medical Laboratory Bremen in 2012 during the process of the method validation. Strictly speaking they are only valid to the region of Bremen.
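For what it’s worth, here’s a minimal sketch of the arithmetic behind the 70% figure mentioned above – my own back-of-envelope working, not anything taken from the paper itself:

```python
# Back-of-envelope sketch of the UK detection figures quoted above
# (my own working, not the FoE paper's analysis).
LOQ = 0.15        # limit of quantitation, micrograms of glyphosate per litre of urine
uk_samples = 10   # number of UK volunteers sampled
above_loq = 7     # UK samples at or above the LOQ

detection_rate = above_loq / uk_samples
print(f"UK detection rate: {detection_rate:.0%}")  # -> 70%

# Samples below the LOQ are reported as 'not quantifiable' rather than zero,
# which is why the true proportion of people carrying some glyphosate could
# be anything up to 100%.
```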

Any enlightening comments on that ‘reference value’ business from someone with a more scientific background much appreciated! It doesn’t seem like regulators have decided on a ‘safe’ level of glyphosate in human urine. The main focus (and controversy) revolves around something Orwellian called ‘Acceptable Daily Intake’ relative to the total body weight rather than the fluid content of urine. In the EU this has been set at 0.3 mg per kg of body weight (mg = 1000x greater than μg) but there is a stink about the way in which they arrived at this figure – from the FoE report, ‘Concerns about glyphosate’s approval’ (pdf):

One of the core purposes of pesticide safety assessment is to set the ‘acceptable daily intake’ (ADI) for people’s everyday exposure to the chemical, for example through residues in food. In its 1999 evaluation of glyphosate, the German authorities proposed a high ADI for glyphosate of 0.3 mg per kilogram of body weight. They calculated this figure by reviewing the industry feeding trials using glyphosate and choosing the one they felt to be most sensitive to the effects of the chemical. In this case, the German authorities considered the most sensitive test to be a rat feeding trial. From this they calculated the ‘no observed adverse effect level’ (NOAEL). The ADI was then set at 100 times lower than this [10]. This ADI of 0.3 mg/kg was agreed by the European Commission, and is now law. But even four of the companies applying for approval of glyphosate differed in their interpretations of the industry feeding trials – based on the same studies; they suggested the ADI should be lower, ranging from 0.05mg/kg to 0.15 mg/kg [11].

In 2012, the ADI for glyphosate was re-examined by a group of scientists (including four professors) from universities in the UK and Brazil [12]. When they looked at the industry-funded feeding trials assessed by the German authorities, they noted some studies showed adverse effects at lower doses than in the rat feeding trial, but these findings had been ruled out for various reasons. They claim this led to “significant bias” in the data used. They commented that, if all the industry-funded studies had been included, a “more objectively accurate” ADI would be 0.1 mg/kg bodyweight per day. The group then examined the findings of independent trials of glyphosate published in scientific journals since 2002. Based on these, they concluded the ADI should correctly be 0.025mg/kg bodyweight per day, or “12 times lower than the ADI… currently in force in the EU”.

The ADI for glyphosate is not monitored.
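To get a feel for the scale of those numbers, here’s a rough arithmetic sketch comparing the different ADI proposals – my own illustration only, and the 70 kg body weight is an assumption of mine, not a figure from the report:

```python
# Rough comparison of the ADI figures quoted above (my own illustration).
safety_factor = 100
noael_rat = 30.0                     # mg/kg bw/day implied by 0.3 mg/kg x 100
adi_eu = noael_rat / safety_factor   # 0.3 mg/kg bw/day, the value now in EU law

proposed_adis = {
    "EU (in force)": adi_eu,
    "lowest industry suggestion": 0.05,
    "2012 independent re-examination": 0.025,
}

body_weight_kg = 70.0  # hypothetical adult; not a figure from the report
for label, adi in proposed_adis.items():
    daily_mg = adi * body_weight_kg
    print(f"{label}: {adi} mg/kg/day -> {daily_mg:.2f} mg/day for a 70 kg adult")
```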

I don’t know how the concentration of glyphosate in urine would relate to the concentration coming in the other end. What seems obvious is that the approach of finding an ‘acceptable’ level of any poisonous substance favours the industry manufacturing that substance at the expense of those humans and nonhumans who get lumbered with the job of storing it in their bodies. ADI? Try UDI!

Rewilding the British Isles

June 10, 2013

‘The Soča river valley in Western Slovenia. Photo by Padraic Giardina/Getty’ – source

George Monbiot can be an ass but there’s loads of useful stuff in his latest subject material concerning the rewilding of landscapes and (to a lesser extent) people. The book is called Feral: Searching for Enchantment on the Frontiers of Rewilding and it looks like it’ll be worth a read. There’s an interesting review and discussion here, with Monbiot pitching in quite constructively in the comments. Otherwise there’s a short video on youtube, a Radio 4 walking interview with a well-known sports commentator (who seems quite blindsided by the whole affair), and an excerpt from the book, ‘Accidental Rewilding’, published by Aeon magazine and putting forward the observation that disasters for human civilisations often leave room for the rest of the ecosystem to flourish on its own self-willed terms (compare Derrick Jensen’s comment that the recovery of wildlife in Chernobyl proves that ‘The day-to-day workings of civilization are worse than a nuclear catastrophe’). But this RSA talk: ‘A New Future For Nature’ and Q&A seems like the best place to get a feel for where he’s coming from and take a hit of his infectious enthusiasm and obvious passion for the topic (apparently the video will only be available for two weeks):

As usual I don’t buy the line about human hunters alone causing the extinction of all the European megafauna, although I’d like to see his evidence. Obviously I see limitations in his conception of what it might mean for humans to rewild, which looks more along the lines of hands-off ecotourism for ‘ecologically bored’ city-dwellers rather than any real embedding of feral human cultures in these ecosystems as a species in their own right. This comment in the Grauniad thread says it all, really:

I’m not advocating rewilding as an alternative to civilisation. Here’s what I say in the book:

“While some primitivists see a conflict between the civilised and the wild, the rewilding I envisage has nothing to do with shedding civilisation. We can, I believe, enjoy the benefits of advanced technology while also enjoying, if we choose, a life richer in adventure and surprise. Rewilding is not about abandoning civilisation but about enhancing it. It is to “love not man the less, but Nature more”.”

…so he doesn’t know what he’s talking about on that front… [/charitable]

Also naturally I’m not happy with this only happening in the highlands with the agricultural monopoly continuing on the best lowland soils, but I guess you can’t have everything right away… Don’t know what to make of his elephant theory either, but I suppose it’s just crazy enough to be true. Fantastic stuff about the turtles, sea grass, whales and phytoplankton relationships and the ‘trophic cascades’ by which the removal (or reintroduction) of even just one particular keystone species can cause huge transformations throughout the ecology. But again, he could have mentioned the importance of having human beings in a beneficial keystone role. Possibly he mentions it in the book, but I’ve heard dark murmurings that the next step after reintroducing wild wolves to Yellowstone Park might be to reintroduce wild people, ie: the indigenous Indians who were excluded when the national park was created. Now where are we going to find some of those over here, I wonder?

Some positive steps overall though, in my humble opinion. Good if this stirs a wider debate.

More striking visuals

January 16, 2013

via Shaun – Speaking of grass as an invasive species (see previous post), check out this video animation of changes in ‘global land cover’ over the last 8,000 years, detailing the loss of ‘natural vegetation’ during that period:

The problem remains of how to define ‘natural’. If it simply means the presence of human beings then practically nowhere on the map should be coloured dark green even at the start because a) all the continents except Antarctica were populated by humans by at least 14,000 years ago, b) there’s no way to inhabit a landscape and not affect it and c) hunter-gatherer peoples are known to have shaped plant and animal communities, sometimes drastically, even before the onset of full-scale cultivation. If ‘non-natural’ vegetation means that native species have been gradually replaced by non-natives then this gets us a little closer to the above depiction but you then have to define what you mean by native, a task that runs into difficulties as soon as you observe that 1) no species has been around since the dawn of time and 2) they have all come to the space they currently occupy through, if not physical migration, then a journey into existence through evolutionary design space. Also, wouldn’t you have to admit that the various crops and weeds responsible for changing these ecologies had their own native ranges? Therefore, strictly speaking, China should stay green because of its subsistence on native rice, as should the Middle East (the home of wheat and barley) and the various regions in Africa and South & Central America which developed their own crops. Maybe the best description for what is being measured here is the spread of plant & animal domestication. Again, this runs into problems of definition, given that i) low-key forms of cultivation have been around in one form or another since the dawn of humanity, ii) (again) there’s no way to inhabit a landscape and not affect it, and iii) where exactly are you supposed to draw the line anyway? I suppose it would correlate pretty well with deforestation too. But, dammit, where do you draw the line between ‘pristine’ forest and planted fruit & nut orchards? It would help to know what data this was based on…

Anyway, what I meant to say originally was that it was interesting to watch this while reading Marvin Harris’ classic, Cannibals and Kings, which talks about the origins of ‘hydraulic societies’ (a term coined by the historian Karl Wittfogel) in ancient Egypt, Mesopotamia, India and China, each of which developed

[…] amid arid or semi-arid plains and valleys fed by great rivers. Through dams, canals, flood control and drainage projects, officials diverted water from these rivers and delivered it to the peasants’ fields. Water constituted the most important factor in production. When it was applied in regular and copious amounts, high yields per acre and per calorie of effort resulted. (p.174)

These massive public works, which were necessary if the settled populations were to be fed (an important factor was the lack of opportunities for subsistence in the wilderness surrounding the floodplains – beyond a certain level of population density the people were trapped), led to the emergence of totalitarian hierarchies, enforced by bureaucracies acting out of self-interest for their share of the spoils of the wealth which was produced by the masses, most often living in a state of abject poverty a few steps removed from starvation.

Interestingly, Harris thinks that these states were initially quite self-contained and that the sickness took quite a while to reach the same ferocity in the Northern regions of Europe and Russia – a contention which the above animation seems to confirm. While he describes iron age societies in Britain, France and Germany as ‘secondary states called into existence to cope with the military threat of the Mediterranean empires and to exploit the possibilities of trade and plunder provided by the great wealth of Greece and Rome’ (p.183), the fact that meltwater and rain provided all a peasant farmer needed meant there was no need for a huge state superstructure:

Despite the rigidities introduced by serfdom into the feudal system, the post-Roman political organisation of Europe continued to contrast with that of the hydraulic empires. Central bureaux of internal and external plunder and of public works were conspicuously absent. There was no national system for collecting taxes, fighting wars, building roads and canals or administering justice. The basic unit of production were the independent, self-contained rainfall-farming manorial estates. There was no way for the more powerful princes and kings to interrupt or facilitate the production activities that took place in each separate little manorial world.

Unlike the hydraulic despots, Europe’s medieval kings could not furnish or withhold water from the fields. The rains fell regardless of what the king in his castle decreed, and there was nothing in the productive process to necessitate the organization of vast armies of workers. (pp.185-6)

Indeed, he even goes as far as to say that ‘Long after the great river valleys were packed from horizon to horizon with human settlements, northern Europe stood to the Mediterranean and the Orient as America was later to stand to Europe: a frontier still covered by virgin forests’ (p.183) – forests into which they could escape if the going got too rough. At least until iron axes, saws and ploughs became cheaply & widely available enough to allow mass felling and the instatement of the open field system…

Okay, next: a cool little animation by Steve Cutts, simply titled ‘MAN’*:

And, one I’ve been saving – You know you’re making progress when a video about the chemical extermination of unwanted plants and the whole culture built around this act upsets you more than a documentary about the Nazi holocaust. Witness Dow Chemical’s 1947 advertisement / propaganda piece for 2,4-D herbicide (later used in Agent Orange as previously discussed), ‘Death to Weeds':

OMFG I nearly crapped my pants when I saw this footage in a BBC/Discovery documentary series, ‘Human Planet‘. If you think I’m exaggerating when I describe agriculture as an all-out war against the rest of the living world, just … wait for it:

(There’s some context missing from this clip. You can watch the whole Grasslands episode here, with the relevant passage starting from 24:30. Count how many military metaphors the narrator uses.) This is what I mean by my talk about ‘wealth redistribution’. Brief wikipedia research tells me that the Red-billed Quelea ‘is the world’s most abundant wild bird species’ with a total population of up to 10 billion individuals all living in sub-Saharan Africa. They feed mainly on ‘annual grasses, seeds and grain’, although they apparently feed their chicks with caterpillars & insects for a few days before switching to the seed diet. Here’s the telling passage:

Being such a considerable part of the savanna biomass, Red-billed Quelea flocks and colonies attract huge numbers and diverse types of predators and scavengers. Birds known to live extensively off queleas include herons, storks, raptors, owls, hornbills, rollers, kingfishers, shrikes and corvids. Additionally, snakes, lizards and several types of mammals, especially rodents and small carnivores, are regular predators.

And why do they form ‘such a considerable part’ of the biomass? Because human farmers have made available highly concentrated stores of food that support their population at numbers massively higher than they would otherwise be! I think there’s a message to be read in the huge swarms of these ‘locust birds': If you grain farmers keep on hoarding all of the land’s productivity for yourselves, we will be forced to descend upon you in great numbers, ruining your efforts and returning the biological wealth to those you stole it from; those who will now feed on us.

I could be wrong…

Finally, a hero:

pole-sitter (source – please ask me to take it down if it’s not okay for me to republish)

Later in the day a quick-thinking defender scaled this time not a tree but a telegraph pole on the other side of the road to where the chainsaws were felling. Work had to stop because of the potential danger and this time security climbers found it impossible to evict the defender, unable as they were to find a higher point to secure on to. Instead, a bunch of coppers closed off the road (which was unecessary, and no doubt intended to hack off the locals) and stood about ready to nick the pole-sitter when he came down. Holding out until the contractors had beaten a retreat a valiant attempt was made by supporters to “de-arrest” the defender upon his descent, but were met with the full force of sussex police, who piled out of a nearby riot van screaming “pepper spray them, pepper spray them all”, and duly dispensed their canisters. In the ruckus the pole-sitter cut open his leg and, after being nicked, was taken to hospital for 8 stiches. He was released in the early hours and, just as in the previous arrests, bailed off site. He was charged with obstruction of the public highway (that is, the same public highway that the police themselves closed…?!). (link)

Protestors are resisting the construction of a new road between Hastings and Bexhill (near the south coast of England) which will carve through a valley containing a peaceful water meadow and pockets of ancient woodland. Go to: Combe Haven Defenders for more information and to see how you can help.

————————

* – Obligatory nit-pick: these actions do not represent all of humanity. As Daniel Quinn wrote:

Man was born MILLIONS of years ago, and he was no more a scourge than hawks or lions or squids. He lived AT PEACE with the world … for MILLIONS of years.

This doesn’t mean he was a saint. This doesn’t mean he walked the earth like a Buddha. It means he lived as harmlessly as a hyena or a shark or a rattlesnake.

It’s not MAN who is the scourge of the world, it’s a single culture. One culture out of hundreds of thousands of cultures. OUR culture.

War on badgers; war on wildness

October 15, 2012

Badger and cow
(source)

For the record: I oppose DEFRA’s proposed badger cull, which I recently read ‘could wipe out 100,000 badgers, a third of the national population’. I’ve signed the petition calling for it to be stopped, and apparently this now has enough signatures (over 100,000) to force a parliamentary debate on the subject. However, I don’t accept the unspoken premise underlying even much of the criticism that has been voiced: namely that if it can be proved that the continued, relatively undisturbed existence of wild badger populations poses any kind of threat to the vast population of domesticated cattle in this country then a cull is justified. This agrarian fundamentalist* logic is the main driver behind the current Holocene Extinction, in which between 150 and 200 species are now being driven extinct every day through the actions of farming cultures destroying diverse wild communities in order to impose a chosen few domesticated plant and animal species upon the land – with the purpose of channelling as much of the planet’s biological wealth into the growth of the human population as possible and/or enslaving it to the economic machinations of the vampiric global mega-civilisation. Farmers and capitalists see economic value in cows. They see none in badgers, just like they saw none in wolves, bears, wild boar or aurochs (each driven extinct in Britain over recent centuries and millennia as a consequence of active policies of extermination and secondary effects of other activities such as destruction of habitat, most often related to agriculture) – therefore, on the slightest pretext and with the flimsiest of justifications, they have to go. Witness the insanity with which this topic is debated on national TV, hosted by a household-name naturalist:

Can you hear the sublimated hatred of all things wild – all things living according to an independent will; all things damaging to our religion of total control; all things reminding us of that which we fought (and continue to fight) so hard to put down in ourselves – the coldhearted militaristic language (‘take them out’), the tight grip of irrational fear (those ‘reservoirs’ of disease), the refusal to countenance reality and plough on regardless (‘No, I’m afraid culling will have to take place.’)? Do you see these things as clearly as I do? Do you find them as disturbing?

A while ago I read this article on the badgerland website, talking about the supposed threat posed by badgers to domesticated cattle. This passage in particular made sense to me, supporting Brian May’s contention in the above footage:

Some respectable scientists [citation needed] believe that cattle must meet several conditions before they can catch TB. The argument goes that rather than getting TB immediately when they are first exposed to the TB bacteria, the cattle must have most of the following conditions: climate history, certain vitamin deficiencies, compromised immune system, intensive living conditions, high-stress lifestyle, lack of natural immunity to infection and disease, and multiple-exposure to the TB bacteria in a short space of time. In other words, cattle which are raised in natural field-based conditions, with minimum use of anti-biotics and other drugs and a low-stress organic lifestyle, are much less likely to succumb to TB infection. In organic terms, the higher incidence of TB in cattle in the south-west of England is more likely to be due to more intensive cattle-rearing and animal husbandry, than the presence or otherwise of TB-infected badgers.

Another aspect is that TB can be passed from one individual to another by contact with infected breaths, coughs or sneezes, or infected urine or faeces. A very good place for badgers to catch earthworms and dung beetles, is in cow-pats. Perhaps, the argument goes, it is the cows who have TB, who pass it to badgers when the badgers snuffle through cow-pats looking for worms and beetles.

I bet this is the way it works in most, if not all, instances where wild creatures get the blame for the problems plaguing domesticates. I think that, despite what we hear all the time about ‘weeds’, ‘vermin’ and other undesirable interlopers in the grand schemes of human cultivation†, diseases, parasites and other pathological conditions are actually far less prevalent among robust & resilient wild individuals than among the sheltered, dependent, inbred and highly concentrated populations of domesticated plants and animals. As appears to be the case with endemic Bovine TB, the trouble only comes when the conditions have been created for it through the aforementioned hoarding of biological wealth. The disease manifests as ever more forceful attempts at wealth redistribution.

I’ve only seen badgers on a couple of occasions, but that was enough to utterly endear me to their character. I think going after them in this crass, viciously stupid manner (or allowing others to do so when we might have prevented them) can only serve to alienate ourselves further from the wild world at a time when we desperately need to start learning the lessons it has to offer. If we wish to someday beg a home in the spontaneous ecology of this country – ie: woodland – then we will need to apprentice ourselves to those who know how, having done so for many thousands, if not millions of years through an unbroken ancestral lineage. How likely are we to find willing teachers among those whose last contact with somebody who looked like us was through the sight of a gun?

Oh, I forgot to say: I support those engaging in direct action against any attempted badger culls.

———————-

* – hat-tip: Urban Scout

† – you could even apply this to the cultivation of human cultures: as we touched on before, think of all the diseases attributed to ‘inferior’, ‘mongrel’ groups of people such as Jews, gypsies, homosexuals and any strange immigrant culture. How often has this prejudice been used as a justification for campaigns of persecution, even genocide?

The Revolution comes to Britain

April 24, 2012

Forgive me for posting another video (I’ve got quite a bit of original stuff waiting on the production line but am having some trouble engaging the machinery needed to crank it out) but last night I watched the second episode of ubiquitous Scot Neil Oliver’s BBC series, ‘A History of Ancient Britain’, and thought it provided a pretty decent exploration of the arrival of intensive agriculture in the British Isles some 6,000 years ago – an important subject to me for obvious reasons. Anyway, some kind soul put the whole thing up on YouTube, so when you’ve got an hour to spare…

I wasn’t aware of the theory about multiple ‘first contact’ with farmers in Kent, Ireland and even the Orkneys (voles in grain sacks, you say? – well okay, unless they arrived on driftwood or hitched a ride with a friendly eagle), or that the Carnac stones in Brittany were put in place by hunter-gatherers in the Mesolithic (‘We will be remembered’, eh? – reminds me more of the civilised preoccupation with stamping a mark on the landscape in the form of dead monuments rather than preserving a living legacy in thriving ecosystems, but I could be wrong…)

I spotted the old trope of hunter-gatherers ‘struggling for survival’, even alongside evidence of the backbreaking nature of the farming lifestyle – cutting down all the trees, killing all the wild animals & plants, building walls to protect livestock, yearly ploughing, the ‘daily grind’ of an hour or more of processing wheat for a family’s daily bread, the insecurity of next year’s crop being dependent on this year’s harvest…etc. He also says they stuck to the coasts and waterways and perceived the forested interior as a ‘dangerous, forbidding world’ [8:06] after making it clear that they derived a large proportion of their subsistence from hunting woodland animals and saying himself that ‘these people didn’t just live close to nature – they were part of nature’ [2:36]. I would’ve thought it was the farmers who were far more likely to see the forests in that way. As Luther Standing Bear put it:

We did not think of the great open plains, the beautiful rolling hills, the winding streams with tangled growth, as ‘wild’. Only to the white man was nature a ‘wilderness’ and only to him was it ‘infested’ with ‘wild’ animals and ‘savage’ people. To us it was tame. Earth was bountiful and we were surrounded with the blessings of the Great Mystery.

Although he does his best among the Carnac stones and with the meditation at the end on how ‘sad’ it was that the farmers were trying to ‘separate’ themselves from the wild, undomesticated world (or rather, I would say, trying to impose their way of doing things and thus destroying that world), I thought Oliver’s account was rather ‘embedded’ in the experience of those oh-so courageous pioneer farmers. He could have looked at examples throughout the historical record of clashes between hunter-gatherer and farming cultures to convey the likely attitudes of prehistoric British tribes towards the people clearing the land of all the species necessary for their subsistence. I even saw an exploration of this on the BBC in the form of Marco Bechis’s film, ‘Birdwatchers’, about the struggle of the Guarani Indians in Brazil who are in the process of being displaced from their land by cattle ranchers and sugar cane farmers:

I was struck by the stark contrast in the visuals throughout the film of lush, green rainforest on the one hand next to bleak, brown farmland on the other. There must have been a similar disparity between the early wheatfields and stone-walled livestock enclosures of Neolithic Britain and Ireland and the vast, peopled Wildwood they too were setting out to conquer. At one point in the film a Guarani shaman instructs his pupil to not eat the meat from a domestic cow the tribe has just poached, because such a beast does not belong to that landscape in the way that the rainforest species – considered brothers and sisters by the Indians – do. Oliver shows us [55:33] the difference between the ankle bones of domesticated and wild cows in prehistoric Britain; I wish he had followed in the footsteps of Jared Diamond and Weston A. Price and shown us the difference between domesticated and wild humans too. Is the evidence here consistent with evidence around the world indicating that hunter-gatherers lived longer, were taller, healthier, stronger, less stressed, more … human than their genetically identical farming counterparts? Who most truly belongs to the British landscape; to any landscape – Homo sapiens domestico-fragilis or Homo sapiens neo-aboriginalis?

(hat-tip to C)

Altogether, though, I want to applaud Oliver’s effort here in shedding light on this important transition, putting modernity into its ancient context and going some considerable distance towards rescuing what was surely an epic, richly meaningful drama from the precious few scraps of evidence that survive.

Toby Hemenway: ‘How Permaculture Can Save Humanity and the Planet – But Not Civilization’

November 23, 2011

A good piece of counterrevolutionary propaganda to watch while shelling your acorns:

I can’t believe it took me so long to sit down and watch this talk. It encapsulates so many of my reasons for doing what I do in such a remarkably concise and easily-digestible form. It seems a lot of the ideas presented have been recycled from the writings of Jason Godesky on the old Anthropik site (which influenced me greatly at the time as well) – for example, ‘Agriculture or Permaculture: Why Words Matter‘. That doesn’t take away from Hemenway’s original style and many pertinent observations, though.

See also his classic 2006 essay, ‘Is Sustainable Agriculture an Oxymoron?’

For (the beginnings of) an interesting discussion on the differences between agricultural and horticultural subsistence strategies and their various merits and drawbacks, see this thread on the rejuvenated rewild forums and this Leaving Babylon post which spurred it.

Acorns & Good Times Bread

November 17, 2011

As promised, I here present Ian’s step-by-step guide for processing acorns. If you like, watch this Ray Mears video to get yourself in the mood (starts at 3:36; continues in pt.2 from 8:34):

Step 1 – Gathering. Find a tree! Not all Oaks will crop heavily (and if it’s not a ‘mast’ year you might struggle to find a single acorn). As previously discussed your best bet will be to find a specimen with lots of space around it and a canopy open to the sun, especially on the South facing side. Stand-alone trees or those on the edge of woodland normally produce more nuts than those in the middle of the deep, dark forest. Some of the best I found this year – a) in front of H’s driveway:

b) a young fella on the common, branches still low enough for me to climb up into him and do a ‘shakedown':

c) a gaggle on a golf course:

d) street-corner guardians:

(I think these were all English/Pedunculate/Common Oaks, Quercus robur, though I’m not sure I could differentiate this from Britain’s other native Sessile Oak, Q. petraea. Not that this would matter particularly as, while more bitter than their managed American or S. European cousins, the acorns of both species are equally edible after processing.) You should be able to find at least one tree that drops a good quantity of large, sound acorns. As you can see from the above pictures, it’s useful if the ground is reasonably clear, but also soft enough to not damage the nuts after their fall from a great height. Tarmac makes things easy, but a lot of the acorns from tree a) and other ‘street trees’ I gathered from had extensive ‘bruising’ where the nutmeat had hardened and blackened at the point of trauma and along fracture lines. This got progressively worse the longer I kept them before processing, I assume because the black colour is caused by oxidation, which is limited when the whole nut still has its thin inner skin surrounding it. I’m not sure if the hardened/blackened acorns are unusable (I spent quite a while cutting out the ‘bad’ bits just in case) but I found they were also the most likely to spoil and/or go mouldy.

Gathering was speediest throwing handfuls onto a tarp or jacket before funneling into a plastic bag, but just placing them in the bag directly worked out fine too. I did try raking directly into bags, leaf-litter, twigs & all, but this just meant I had to pick out the good nuts back at home anyway. It doesn’t matter if the acorns have been lying under the tree for quite a while – the hard outer shells are designed to last them through the winter before weathering finally wears them down enough for the sprouts to push through in the spring. They also protect against insects, moulds, bacteria etc. but not small mammals who sometimes take a nibble (or, if you’re lucky, large ones who eat them whole). A little ‘rain leaching’ might give you a head start for Step 5 too! However watch out for little holes in the acorns – these are the work of the acorn weevil which uses sharp mandibles to chomp into and lay eggs in the acorn when it’s still young & tender. A little white grub then gorges on the nutmeat for the next couple of months before chewing its way out and trying to find somewhere safe to pupate. Sometimes you’ll catch these little blighters in the act – probably giving them the fright of their lives! – inside acorns you previously thought were sound. Unfortunately they don’t leave much for you, but they make a good snack for the birds (or maybe they’d be tasty if you fried them up directly?) Otherwise I tend to only go for the dark brown glossy nuts, just because they somehow look more ‘healthy’ to me, even though they dry to the same light tan colour after a couple of weeks in storage. I also avoid cracked or damaged shells as these won’t keep so well. Here’s a load I picked up just yesterday afternoon from around tree d). It took me around twenty minutes to gather just under 8kg:

Step 2 – Storage.

Keep in a warm, dry place, preferably in open-sided containers that allow air to circulate. If the nuts were particularly sodden when you picked them up, maybe give them a head start against any cheeky moulds by putting them in a low oven or up against a radiator for a spell. If you want to make acorns your staple food you might have to take this part a bit more seriously:

My family and I have been known to gather tons of acorn. In the past my Great Aunt Mary had a room in her house where we would deposit all of the acorn we gathered. This was a 10’x12′ room, with a four foot board across the doorway. This room was always full of acorn. As children we used to fight for the right to jump into the acorn and stir them up. Anyone bigger than a child would crack the hulls. This had to be done twice a week so that moisture didn’t build up and that the acorn dried properly. Traditionally our people stored acorn in ‘Chukas’, acorn graineries made of cedar and California laurel. These are cylinder in shape and raised above the ground on stakes about three feet. Lacking a spare room for my acorn, I store mine in gunny sacks and hang the filled bags from the rafters in my garage. My sisters living on the rez, use the huge army surplus bins my parents bought. They keep them covered and stir them twice a week. No matter how you store your acorn it is essential that you add a generous amount of California laurel with the nuts. Laurel or bay leaf is a natural insect repellent and keeps the bugs away from the acorn. […] We let the acorn dry or season at least for a year, this assures that the nuts are well dried. (Kimberly R. Stevenot, Northern Sierra Miwok – link)

Step 3 – Shelling. This is a pain if you try to do it straight away with fresh acorns. If you let them dry for a bit the nutmeats shrink away from the outer skin, allowing you to open big cracks along the length with a quick hammer-blow to the head, which then makes it easy to prise the innards out whole with a knife. Here’s a picture of my set-up, along with my favourite anvil:

This part of the process takes up the most time. I like sitting down in the evening and listening to music, watching online documentaries or crappy comedy shows on the TV while I do this. It gets nice & hypnotic after a while… Mind your fingers!

Step 4 – Grinding. I ‘cheat’ and use a food processor for this stage. The idea is to increase the overall surface area in preparation for Step 5, which will go faster the finer you grind the acorns. I like to leave them in rough millimetre cubes, as I’ll be fine-grinding them later anyway and hope to keep more of the nutrients intact in the meantime. Of course, I’d prefer to do this part ‘aboriginally’ but on my own it feels too much like hard work. Apparently acorn-based ‘balanocultures’ used social technology to lighten the load:

At the edge of the village a group of women sit together grinding acorns. Holding the mortars between their outstretched legs, they sway back and forth, raising the pestles and letting them fall again. The women are singing together, and the pestles rise and fall in unison. As heavy as the pestles are, they are lifted easily – not so much by muscular effort, but (it seems to the women) by the powerful rhythm of the acorn-grinding songs. The singing of the women and the synchronized thumping of a dozen stone pestles create a familiar background noise – a noise that has been heard by the people of this village every day for hundreds, maybe thousands of years. (Malcolm Margolin, quoted in Suellen Ocean’s Acorns And Eat’em – pdf)

If you want you can keep the nuts whole, as the ancient Europeans appear to have done (ibid.), although this will leave you with a different foodstuff at the end.

Step 5 – Leaching.

Soak the acorn meal in cold-tepid water to leach out the tannins, using a thin-weave material to keep the solids separate (I used an old pillowcase). Change the water twice a day until it stops turning a deep brown and/or the acorns lose their bitterness. This can take from 3 days to over a week. You can speed up the process by using boiling water which you pour off repeatedly, but cooking denatures the starches/sugars, and you’ll also lose much of the oil content, so I prefer not to. Other methods range from dunking the meal in a stream as Ray Mears does in the above video, through burying caches of whole acorns in boggy ground and cooking in a ‘lye’ made from the wood ash of deciduous trees or with iron-rich soils/clays, to putting them in the (cleaned) cistern of a flush toilet 8O Other Native American methods include pouring water onto ground acorns in a sand ‘colander’:

(source)

And this one, which I probably won’t be trying:

The aboriginal people of the Columbia River valley used urine to cure acorns. The settlers of European origin in that region gave the dish the name Chinook Olives. About a bushel of acorns were placed in a hole dug near the entrance of a house. The acorns were then covered with a thin layer of grass and then 6” of earth. Every member of the family regarded this hole as the special place of deposit for his urine, which was on no occasion to be diverted from this legitimate receptacle. In this hole the acorns are allowed to remain four or five months before they are considered fit for use… the product is regarded by them as the greatest of all delicacies. (‘Indigenous Acorn Facts‘)

If you want you can leach the acorns whole, just pouring the water off and re-filling. This will take a lot longer, though (unless you use boiling water). Mine started to bubble and smell slightly ‘fermented’ after about five days, so I finished them off with a slow roast in the oven:

They are tooth-breakingly hard by the end, but cook up to an acceptable squishy texture in porridge (and – I’m guessing – in stews, soups, etc.)

Step 6 – Drying, re-grinding. If you want to keep the acorn matter for a long time and don’t want to use it immediately as a ‘mush’ or in a soup etc. then you’ll need to dry your acorn grounds. If you get freakishly lucky with the Autumn weather you can leave this job to the sun, but mostly I have to put them in a low oven for a couple of hours to speed up the process.

They will tend to clump up during this stage, no matter how finely you ground them originally, so if you want a flour (as opposed to ‘grits’) you’ll have to grind them again. Tip: you can often find old-style manual coffee grinders in charity shops.

Leave out someplace warm & dry for another day or so to evaporate the last bits of moisture, then store in glass jars or paper bags. Some people say the fat/oil content will make the flour go off after a couple of months, but I still have some left over from my first experiments over two years ago, and it still looks and tastes just as good as it did back then. Maybe the final heating in the oven stabilises it somehow?

Step 7 – Eat! Most people say to treat it like corn/maize flour, for example mixing it 50:50 with regular flour to make breads, muffins, pancakes, tortillas…etc. It doesn’t contain gluten so will need to be mixed with something else that does, or with a different ‘sticking agent’ (e.g. egg). It’s a lot denser than wheat flour, so if you’re using it to make bread you’ll need much more yeast to make it rise – my one attempt at a 50:50 loaf two years ago, while deliciously rich & nutty, did not rise at all.

This year I’ve had some success with a recipe for ‘Hard Times Bread’ from The Wildfoods Cookbook by Joy Spoczynska, which she ‘unearthed’ from ‘an eighteenth century cookbook’ that traced the recipe back to ‘early pioneers in America’. She says they turned to it ‘when wheat flour was difficult to obtain or cost more than the pioneers could afford’. I’m guessing they adapted this from the Indians.

Naturally I want to change the name to break the association with famine and last resort measures to stay alive, and present this instead as a desirable alternative to the Staff of Death Bread made from farmed grains. Sure, it takes less effort for us affluent first-worlders to work a wage-job and buy a sack of flour from the supermarket, but this embeds us in an exploitative system whereby someone else, human or non-human (including the chemical remains of long dead non-humans) has been enslaved to do all the work in our stead. It’s easy, from our ‘privileged’ position, to forget just how hard it is to get something resembling food from the annual grains. Try to bake your lawn, or just watch this guy go about his business (sorry about the background music – mute and try this as an alternative):

Then have a look at this and ask yourself where the astronomical quantities of energy have come from to build, operate and maintain all those machines:

Suddenly, simply letting trees grow and crop in the Autumn for you to harvest and process through the above steps doesn’t seem so inefficient or energy-intensive, does it? Yes, you’ll find it hard work if you’ve never had to take care of your own subsistence needs before, but I bet even ‘Seed to Loaf’ Steve would back me up in saying that we miss out on basic feelings of satisfaction from leaving this most fundamental biological activity for other people to sweat over. Also, as the wise people say: No security without food security. In other words, if you depend on getting flour (or any other staple food) from the supermarket, that means they’ve got you by the balls/ovaries – you’ll comply with the demands of whoever controls the price of wheat because you have to eat. Unless you have another option…

So without further ado, here’s my adjusted recipe for ‘Good Times Bread’ – I’ve halved the original quantities:

Ingredients: 250g acorn flour, 50g maize flour, 2 tbsp butter, 1 egg, 1 tsp salt, 150ml buttermilk (or regular milk mixed with 1 tsp lemon juice or vinegar and allowed to sit for 5 minutes), 1/2 tsp baking soda (sodium bicarbonate)
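(If you want the full-size batch, doubling those halved quantities back up should give roughly Spoczynska’s original amounts: 500g acorn flour, 100g maize flour, 4 tbsp butter, 2 eggs, 2 tsp salt, 300ml buttermilk and 1 tsp baking soda – that’s just simple arithmetic on the figures above, not something I’ve checked against her book.)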

Step 1 – Sift the flours, salt and baking soda together, beat the egg and melt the butter in a frying pan (preferably cast iron) or griddle.

Step 2 – Gradually mix in the buttermilk, followed by the butter and lastly, the egg. Knead until ‘of a fairly soft dropping consistency, like a very stiff batter, but not sloppy’ (Spoczynska, p173).

Step 3 – Squish into balls and flatten on a level surface to desired size & thickness. Then add a bit more butter to the pan and cook on a medium-high flame, flipping to the other side after a couple of minutes when it turns slightly golden-brown.

Et voila. More than enough for a hearty breakfast to keep you going through the day:

These, unsurprisingly, had a delicious roast-nuttiness to them and the texture of a heavy scone. The salt made them a bit too savoury for jam, though – a later experiment with added brown sugar, chopped walnuts and dates went down a treat. I’m not sure how long they keep, but I was putting these in the toaster and they were still tasty after three days.

Aboriginally, I would be inclined to roast them by the fire on a flat stone like these guys:

For more inspiration and recipes, check out these sites:

Finally, submit your own acorn experiments to Butterpoweredbike’s ‘foraging recipe challenge’ on the Hunger and Thirst blog (thanks Annie!) Looks like there’s some great stuff up there already – I don’t think my ‘back to basics’ approach stands much of a chance of winning though…

So that’s about all I’ve got for now. Hopefully this didn’t come too late to fire you up in time for this year’s harvest. If you’re visiting SE England, I’ve still got plenty of acorns you can come help me process :) Email address buried in the comments on the ‘About’ page…

Balanophagy for Beginners

November 4, 2011

‘Balanophagy is the practice of eating acorns. Acorns are more than just food for birds, squirrels, and hogs. They have been used for food by millions of humans over the ages. Acorns compare favorably in nutrition with common grains, though acorns contain more fat. (That was not a bad thing during most of human history.) If you have any ancestry among people of the northern hemisphere, there is a reasonable chance that you have some ancestors who ate acorns.’ – Kelli Kallenborn

‘The oak tree, today revered primarily for its beauty, may once have been the central food bearer around which entire societies (balanocultures) built their diet and lifestyle. Recent evidence shows that tools used for grinding and pounding food existed long before corn became popular and may have been used to process acorns into meal. Factors such as the domestication of goats and the burning of oaks for fuel may have contributed to the movement away from balanoculture. By the end of this century severe crises in agriculture world-wide may make a return to some modified form of balanoculture a viable alternative.’ – David Bainbridge (apparent coiner of the term)

So, Bill, you say that the European ‘Dark Ages were ages of forest culture'; that

[…] the trees were highly valued, highly selected, had high yields. You paid for the use of land based on the richness of the tree crop. From the forest, they derived all their bread, all their butter. The butter was made out of beechnuts — highly selected beechnuts. There are still casks and casks of beechnut butter in Europe, buried in the peat, still in good condition. All the bread and cakes in Tuscany and Sardinia and a few other places are still made from chestnuts. Corsican muffins are made of chestnuts, not wheat flour. All the bread was made from the trees, and all the butter was made from the trees. There are your basics.

In your American southwest, the pinion pine nut is a staple Indian food. In one day a family of six can gather thirty bushels of pine nuts, and that’s a year’s supply. In South America, six trees support a family of Indians. Those great supports are a source of staple food. One white oak, in its year, will provide staple food for about six families. A good old American chestnut — how many pounds did we get off one of those trees? At least four or five hundred pounds. There’s a couple of families’ food for a year, with no hacking and digging and sowing and reaping and threshing. Just dash out in autumn, gather the nuts and stack them away. […]

When the forests were managed for their yield and their food equivalence, they were highly managed. Now there are only a few remnants of this in the world, in Portugal, and southern France. In Portugal, you can still find highly selected, highly managed oak trees, often grafted, and olives. The pigs and the goats and the people live together in a very simple little 4,000 yard area in which nobody is racking around with plows. In that economic situation, there is no need for an industrial revolution.

A few of these tree ecologies still remain up on steep mountain slopes, where it has been difficult to get up there to cut the trees down for boat building and industrial uses. The whole of Europe, Poland, and the northern areas once were managed for a tree crop, and the forest supplied all the needs of the people. (from Bill Mollison’s design course, ‘Forests in Permaculture’)

This sounds pretty good to me – something akin to the ‘better reasons’ for preserving woodland I started looking for last December. In what state do we find Quercus robur, the mighty Pedunculate or English Oak – our national emblem – today in ‘the most wooded county in England’ (Surrey – 22.4% coverage, compared to an 11.8% UK average, 8.4% for England and 14.1% for the South East)?

Mostly I find stand-alone specimens like this glorious creature (who I believe substantially predates the ‘development’ now grown around him) in agricultural fields, parks, suburban street corners, some gardens. I don’t know that many places where they’ve been allowed to get together and form communities like they used to. A few golf courses, perhaps, and some patches here & there in the parks and on downland. Beech tends to predominate nowadays near where I live, although I’m told we used to have much more Oak woodland before the ship-builders and iron-smelting industrialists got their way. (Interestingly, current expert opinion suggests that actually Small-Leaved Lime was the most common tree in the Southern Lowland areas of the prehistoric, post-ice-age ‘Wildwood’ of the British Isles, while the big Oak forests lay to the West and to the North.) But now we don’t use them for anything. We get timber mostly from overseas sources, and even then we rarely use it for building, fuel, toolmaking or any of the other myriad uses to which the forest was once put. So the survivors of centuries of over-exploitation are allowed to breathe a sigh of relief, look pretty, grow massive and provide for the 400+ associated species of insect, bird and mammal which we’re willing to tolerate. And yet, perhaps I’m just projecting my own insecurity, but to me they look slightly uneasy – “If the humans aren’t getting anything from us why would they think twice about chopping us down on the flimsiest of pretexts and, especially when times get hard, for the most marginal short-term gain?” I think we need to use – in fact depend on – the trees in order to really safeguard their future. Probably ours too.

Since we’re talking about Balanophagy – ‘a compound formed from the Greek roots βάλανος (bálanos = acorn) and φαγεῖν (phageîn, infinitive of ἔφαγον, used as 2nd aorist of ἐσθίω, meaning to eat)’ (source) – let’s look at some of the edible uses of the Oak tree’s fruit, the humble acorn.

Here’s William Cobbett writing in the early 19th century about one form of Balanophagy previously widespread among European peasantry – processing acorns and other woodland nut-masts through pigs:

The only good purpose that these forests answer is that of furnishing a place of being to labourers’ families on their skirts; and here their cottages are very neat, and the people look hearty and well, just as they do round the forests in Hampshire. Every cottage has a pig or two. These graze in the forest, and, in the fall, eat acorns and beech-nuts and the seed of the ash; for these last, as well as the others, are very full of oil, and a pig that is put to his shifts will pick the seed very nicely out from the husks. Some of these foresters keep cows, and all of them have bits of ground, cribbed, of course, at different times, from the forest: and to what better use can the ground be put? (source – ‘Rural Ride’, Forest of Dean nr. Bollitree, Nov. 14th, cited in Roger Deakin’s Wildwood, p.131)

A more intensive version of this still survives in the Portuguese practice of montado (aka dehesa in Spain) whereby:

Oak tree forests were gradually thinned out and the land was ploughed to provide room for livestock grazing. The oak trees that remained grew larger and produced more acorns, which in turn provided additional food for the grazing animals. To further enhance acorn production, the trees were periodically pruned, and the trimmings were then used as fuel or fodder for the animals. (link)

This works out better for the land than conventional agriculture because the trees ‘protect against soil erosion by decreasing the amount of water runoff as they absorb rainfall; their roots reach nutrients deep in the soil and bring them up closer to the surface, making them accessible to other vegetation; and they also prevent desertification by enhancing the structural complexity of the landscape’ while at the same time maintaining habitat for wildlife. The pigs also presumably get a taste of their wild ancestry which they seem to like, judging by average weight gains of 30kg after living with the trees for one season between October and January.

La Dehesa

Did the peasants ever cut out the middle man, as it were, and eat the acorns directly themselves? In ‘An Iberian perspective on Upper Paleolithic plant consumption‘ Jonathan A. Haws writes:

In his book, “Prehistoric Europe: The Economic Basis” (1952), Grahame Clark discussed prehistoric acorn consumption in the Mediterranean. Citing the geographer, Strabo, he noted the Lusitanians, in what is now Portugal, were observed to eat bread made of ground acorns for three-quarters of the year. Although in later times acorn flour was milled and made into “famine breads” when grains were scarce, many people appear to have subsisted off acorns for centuries (Jørgensen, 1977). Numerous citations from classical sources suggest acorns were viewed as the basis for all of civilization (Clark, 1952; Mason, 1995; Vencl, 1996; Sieso and Gómez, 2002). In fact, the genus name  “Quercus” is derived from two Celtic words meaning “beautiful tree” suggesting its importance in early times (Sánchez Arroyo, 1999). Acorn-eating, or balanophagy, survives today in Iberia where sweets are made from acorns. In Algarve, people eat raw acorns from the evergreen oaks. On Sardinia, local people still gather acorns and process them using traditional methods. Acorns are mixed with a special iron-rich clay and boiled to absorb the tannins (Johns, 1990). In the western Rif of Morocco, acorns are eaten raw, toasted, soaked in water or sun-dried (Peña, 2000). (pp.55-6)

I find it intriguing to speculate that montado/dehesa practices may have hung over from the subsistence economies of earlier cultures. Did the new farmers learn the techniques from the hunter-gatherer peoples they supplanted (viz. Indians teaching the first European colonists how to grow corn)? Or perhaps these were the same people, doing their best to hang on to the proven old ways while the Neolithic revolution swept through them? Haws lays out some tantalising possible scenarios of earlier practices:

Hunter-gatherers incorporating simple forest management techniques such as pruning, burning or possibly intentional planting could have created improved foraging areas for wild boar, deer, chamois and even wild aurochs. Spring pruning in the dehesa/montado is the primary method for increasing acorn yields per tree however this would be difficult if not impossible to detect archaeologically. There is evidence of prehistoric fire management of European woodlands by people during the Mesolithic (Mellars, 1976; Mason, 2000). Much of this burning has been perceived as a means of encouraging new growth for browse to support deer and other ungulates. However, as Mason (2000) points out, burning can encourage the proliferation of desirable forest species for human subsistence. In this case, fire may have been used as a tool to manage oaks or other fruit/nut-bearing vegetation. Fire may permit more light to reach the crown thus increasing acorn yield for individual trees (Mason, 2000). Comparisons between Holm oaks in managed stands and natural forests showed that unmanaged trees are generally shorter, found closer together and have smaller canopies (Pulido et al., 2001). (pp.58-9)

Other extant Balanocultures show similar evidence of burning, pruning and other extensive management to maximise acorn production. In her 2005 book, Tending the Wild, Kat Anderson builds a picture of techniques used by Indians in California, some still within living memory. Acorns provided a ‘principal staple’ for the people there, with records of charred shell remains going back at least 10,000 years (p.287). This sounds like fun:

Individuals of many tribes harvested acorns by climbing the trees and cutting the limbs, a process Galen Clark recorded among the Yosemite Miwok: “In order to get the necessary supply [of acorns] early in the season, before ripe enough to fall, the ends of the branches of the oak trees were pruned off to get the acorns, thus keeping the branches well cut back and not subject to being broken down by heavy snows in the winter and the trees badly disfigured, as is the case since the practice has been stopped.” The Mono elder Lydia Beecher remembered the former pruning of oaks: “My grandpa Jack Littlefield would climb black oak trees and cut the branches off—just the tips so that many more acorns would grow the next year” (p.139)

As with practically all the other plant communities they ‘tended’, the Indians used fire to manage Oak trees. Apparently this served various purposes such as: helping to facilitate gathering, suppressing pests and diseases, encouraging the growth of long, flexible new shoots (useful for basketry etc.), keeping forest debris levels down so fires wouldn’t rage out of control, and fostering the growth of edible grasses, herbs and mushrooms between the trees (pp.288-9). As ‘Klamath River Jack from Del Norte County’ put it:

Fire burn up old acorn that fall on ground. Old acorn on ground have lots of worm; no burn old acorn, no burn old bark, old leaves, bugs and worms come more every year…. Indian burn every year just same, so keep all ground clean, no bark, no dead leaf, no old wood on ground, no old wood on brush, so no bug can stay to eat leaf and no worm can stay to eat berry and acorn. Not much on ground to make hot fire so never hurt big trees, where fire burn. (p.146)

As late as 1991 ‘Rosalie Bethel, North Fork Mono’ could still recall her elders’ stories from the 1800s:

Burning was in the fall of the year when the plants were all dried up when it was going to rain. They’d burn areas when they could see it’s in need. If the brush was too high and too brushy it gets out of control. If the shrubs got two to four feet in height it would be time to burn. They’d burn every two years. Both men and women would set the fires. The flames wouldn’t get very high. It wouldn’t burn the trees, only the shrubs. (p.177)

The resulting ‘Oak Savanna‘ habitats look strikingly similar to the Iberian landscapes pictured above, and were often compared to parkland by early European observers (p.175):

http://oaksavannas.org/photos/savanna-unit12b-0312.jpg

As well as the fact that, ‘Open country is much easier to travel in than country with thick underbrush; it is easier to find game and harder for enemies to sneak up on the camp’ (p.288), fire management would only leave the oldest, most productive trees standing and leave enough space for rounded canopies with more access to the sun (p.179). As I’ve observed over here when on the hunt for acorns and beechnuts, trees in the middle of woodland tend not to crop very heavily, whereas those in clearings, on edges or out on their own are much more likely to carpet the ground with large, sound nuts. Even on individual trees I’ve noticed that the best pickings are usually found on the South-facing (or open-canopy) side. This makes sense from the tree’s point of view too: What’s the point of dropping seeds in the middle of a shady wood? You’re far more likely to succeed in propagating your kind on the edge of the forest or where a fallen tree opens a clearing, allowing more sun in to increase the chances of germination and/or swift, healthy growth.

Unfortunately (for me) there doesn’t seem to be a whole lot of evidence for acorn consumption in pre-agricultural Northern Europe. The abstract of the Mason paper, ‘Fire and Mesolithic subsistence — managing oaks for acorns in northwest Europe?’ cited by Haws above (anyone got access to the full article?), particularly the number of question marks in the subheadings, suggests a fair amount of conjecture, though the attempt ‘to extend and apply the model for Mesolithic burning suggested by Moore (in 1996) to two pollen and microcharcoal sequences from Mesolithic Britain’ sounds fascinating. Haws notes:

In the Near East there is solid evidence that acorns were used as food as early as 19,000 bp at Ohalo II (Kislev et al., 1992). At La Sarga, an Epipaleolithic site in València, a painted rock art scene shows several figures collecting acorns as they fall from the tree (Fortea and Aura, 1987). However, inadequate recovery techniques and/or preservation biases inhibit an understanding of the role acorns may have played in European hunter-gatherer subsistence. (ibid. p.56)

I’m still not clear on how far back acorn remains are found in the archaeological records of the more Northern regions, though. In a 2000 dissertation, ‘Food production and food procurement in the Bronze Age and Early Iron Age’, Anne Evelyne de Hingh writes that:

‘Finds of concentrations of charred acorns are not at all exceptional and occur from the Mesolithic through to historic times throughout Europe. In Northern France, acorns are found from the Mesolithic up until the Middle Ages (Marinval/Ruas 1991, 420). Several authors have listed (pre-)historic finds of acorns in Europe (see e.g. Knörzer 1972; Karg/Haas 1996)’ (from chapter 11, ‘The collection of wild plants: risk reduction?’, p.200 – pdf)

However the table she provides only lists finds back as far as ‘Neolithic’ digs. Now, farming arrived in Greece around 6500 BC, spreading North and West to the British Isles by 4000 BC, yet archaeologists reckon Mesolithic hunter-gatherer cultures continued to occupy land unsuitable for cultivation (eg: mountainous areas), in some places living alongside agriculturalists for upwards of 1,000 years (source: Wikipedia). One way or another it seems the early farmers either acquired or maintained the knowledge of how to subsist on acorns:

Archaeological evidence for the roasting of acorns is known from the German Rhineland. A pit dating from the Late Bronze Age and doubtlessly intended for roasting activities is known from Moers-Hülsdonk in the German Rhineland (Knörzer 1972). The large pit (4 metres wide and 2.4 metres deep) produced burnt loam and other traces of fire in the filling as well as a red-burnt floor surface. Charred remains of apple, hazelnut and large quantities of acorns were found inside the pit. All evidence points towards the interpretation of a roasting or drying pit for the roasting of acorns and other fruits. (p.200)

Interestingly the Northern Europeans all seemed to have preferred this roasting technique (possibly soaking in water or a lye of wood ash beforehand):

The finds of carbonised acorns from our samples consist solely of kernels, often split into halves. […] This proves that in Northwest European prehistory, acorns were roasted before consumption, which contrasts with North American traditional communities for example, where they were cooked or rinsed (p.201)

Where did this knowledge come from? Maybe they sought help from the people in the hills during times of famine? Or maybe crop failures occurred often enough to ensure that these cultures remembered – and continued to practice – their own old ways? I don’t suppose we’ll ever know… De Hingh is of the opinion that ‘The principal role of Quercus in the agricultural regimes of prehistoric communities should be found in its properties as “reserved food”, which can be eaten in cases of an emergency, like major harvest failures.’ (p.201) So the peasants still maintained relationships with the trees, relying on them to diversify their subsistence base as a ‘risk buffering’ strategy.

This association of acorn-eating with famine and ‘hard times’ lives on in the European imagination. Most of the wild food literature talks about ground, roasted acorns being used as a coffee substitute when importing the real stuff got too difficult (eg: during WW2), although one American source suggests that this practice was invented by ‘industrial economists’ of the 19th Century French Consulate who, rather ironically, marketed it as ‘indigenous coffee’. There are also many references to peasants eating acorns during later famines, though these practices sound much more desperate, perhaps owing to the progressive deforestation of Europe, if not the loss of the old knowledge. Here’s a snapshot provided by a letter from the Governor of the Province of the Dauphine to Jean-Baptiste Colbert, the Minister of Finances for King Louis XIV during the French famine of 1675:

Sir, — I can no longer delay in letting you know the poverty to which I see this province reduced; commerce here is absolutely at a standstill, and from all quarters people come to me to let the king know how impossible it is for them to pay the taxes. It is asserted — and I speak to you because I am well informed thereon — that the greater part of the peasants of the said province have lived during the winter only upon bread made from acorns and roots, and that at the present time they may be seen eating the grass of the fields and the bark of the trees. (from The Economic Transition in India by Theodore Morison, p.101 – link)

No commerce, no taxes, subsisting entirely on foraged foods? Sounds like my kind of heaven! It doesn’t look like the peasants had much fun at the time, though… Here’s an account of the earlier 1528 famine:

The stock of provisions was already so far consumed in the first year that people made bread of acorns, and sought with avidity all kinds of harmless roots, merely to appease hunger. These miserable sufferers wandered about, houseless and more like corpses than living beings, and finally, failing even to excite commiseration, perished on dunghills or in out-houses. The larger towns shut their gates against them, and the various charitable institutions proved, of necessity, insufficient to afford relief in this frightful extremity (Justus Friedrich Carl Hecker – The Epidemics of the Middle Ages, p.219 – thanks, e-books!)

(Though in this instance they may have been suffering from ‘trousse galant’ – erroneously attributed to acorn consumption but actually thought to refer to a form of cholera that killed young men – rather than simple starvation.) All of which provides the lesson that you can’t reintroduce a foraging culture at the drop of a hat when your crops fail and expect to support the same population levels for any length of time, especially if the ‘wild’ lands have been depleted by the various impacts of that same population. There has to be a wild food tradition already in place, preferably with management practices already established for maximising yields. As Mark Fisher impressed upon me, we urgently need to restore the ‘devastated landscape’ before sustainable human use becomes possible.

Indeed, shifting our subsistence strategy away from the annual grains and towards perennial plants and trees as the permaculture people suggest strikes me as an obvious first step towards ecosystem restoration without compromising the human food supply. Both Iberian and Californian sylvicultural landscapes host wide diversities of plant and animal life – including endangered species – all while producing human food on land often considered too marginal to support full-scale agriculture. In fact many of the sources I’ve come across compare yields from Oak and other nut trees favourably with those obtained from the common grains, with the bonus that they don’t require yearly ploughing or monocropping (two factors which eventually deplete the soil of essential nutrients) or, in more recent times, regular fertilisation and the chemical extermination of wildlife (aka ‘weeds’ and ‘pests’) with fossil fuel derivatives. In a 1984 Mother Earth News article, ‘Acorns: The Grain That Grows on Trees‘, David Bainbridge made the comparison between Corn and Oak species in terms of blunt productivity:

Corn yields generally range from 2,500 to 10,000 pounds per acre. In comparison, acorn yields in natural forests have been recorded as high as 2,000 pounds per acre from the live oak (Q. virginiana), and—in a good year—I’ve recorded black oak (Q. velutina) yields per tree that would amount to more than 6,000 pounds per acre in a pure stand. And J. Russel Smith, in Tree Crops: A Permanent Agriculture, cited an individual oak that produced a full ton of acorns annually. If a 100-foot spread is assumed for that tree, it seems possible that a yield of 10,000 pounds of acorns per acre could be achieved.
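A rough back-of-envelope check on that last extrapolation (my arithmetic, not Bainbridge’s), taking the 100-foot spread as each tree occupying a 100 × 100 ft plot, the ‘full ton’ as 2,000 lb and an acre as 43,560 sq ft:

$$\frac{43{,}560\ \text{ft}^2}{100\ \text{ft} \times 100\ \text{ft}} \approx 4.4\ \text{trees per acre}, \qquad 4.4 \times 2{,}000\ \text{lb} \approx 8{,}700\ \text{lb per acre}$$

Assume circular canopies packing a little more tightly and the figure rises towards 11,000 lb per acre, so the 10,000 lb he arrives at is the right order of magnitude – at or above the top of the quoted corn range.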

Of course this doesn’t account for all the other productive uses an Oak tree can be put to. I never saw a house built out of the withered remains of harvested corn… Also, if you reinstate Indian-style practices of encouraging the growth of seed-bearing flowers, perennial herbs and other edible plants under the Oaks you can further ramp up food production for years when the trees don’t crop so heavily (Anderson, pp.177-9).

Putting all of this information together you start to wonder how agriculture ever got started in the first place. (As ‘Leavergirl’ noted in a recent overview: ‘In the old days, anthropologists used to ask what took humans so long to become farmers. Now they are asking, what forced our ancestors into this difficult way of life when life as foragers was generally plentiful enough, healthier, and full of leisure compared to the new lifestyle?’) Farmers have spent centuries working hard with their domesticated plants in an effort to maximise the human food they produce, and this has translated into the work-until-you-drop modern insanity of growing economic production at the maximum possible rate, environmental & human costs be damned. But if forager cultures approached similar levels of productivity for thousands of years with a fraction of the effort, surely our end-results-obsessed culture would opt for more intensified versions of their practices rather than sticking with a model that eats the ecology and then fails every other year before finally collapsing in on itself? It doesn’t make sense, given the mantras we hear repeated every day. Unless those in charge are really less interested in total yields than they are in controlling the surpluses and concentrating the subsequent wealth & power… In which case I guess the superior storability (and in the globalised age, transportability) of grains might just give them the edge.

Intriguingly, various scholars have begun to posit that agriculture began among acorn-eating cultures – that the whole project of Civilisation got started when people turned their backs on the trees. This article, for instance, explores the contention that the ‘Natufian’ culture in the Levant, east of the Mediterranean Sea, subsisted on acorns in a similar way to California Indians (they had a similar climate and distribution of forests) before shifting into one of the major global starting points for the agricultural revolution. (Check out this equally interesting reply, which challenges the original on various points.) Here’s David Bainbridge again, writing in another paper I wish I had full access to, ‘The Rise of Agriculture: A New Perspective’:

Interest in and research into the origin and development of agriculture has increased sharply in the last twenty years, yet all of these studies have missed the common link between the areas where agriculture may have begun – the acorn. All three areas considered of significance to date – the Middle East, middle China, and Mexico – are, or were once, characterized by oak woodlands. The experience in California, where ethnographers and anthropologists have been able to study a fully developed balanoculture (from the Greek balanos – acorn) reveals the primacy of acorn use and the complex interaction between people and oak woodlands. The California balanoculture was in fact a very successful agroforestry system that prospered for thousands of years. Balanoculture provided the stable communities necessary for agriculture to develop. The lower time and work cost associated with acorn use suggests agriculture may have evolved as acorns became more scarce from the decline in the oak woodlands brought about by the adverse human impacts resulting from overgrazing, fuel cutting and cutting for timber, and field burning, exacerbated by climatic fluctuation. A reevaluation of the record is in order: agriculture may perhaps be better considered a regressive rather than a progressive evolutionary event.

It occurs to me that a grain-based culture would have a short-term competitive edge over a tree-based culture simply because it doesn’t take so long to establish. If a farming tribe wanted to conquer their balanocultural neighbours, they could cut down their trees, sow seed and be done in a year. If the acorn-eaters wanted to fight back, sure they could burn the wheatfields easily enough, but they’d have to wait several decades before new saplings started to fruit heavily enough to support them again.

Clearly the farmers can’t continue like this forever. You can only fight the inborn tendency of all living things to return to forest (including, it seems, your own – why else do rich people spend their lives cutting down the forests of poorer regions in the name of ‘development’, only to come home and immerse themselves in acres of prime hunting woodland?) for so long. Certainly in temperate Europe the land wants to turn into forest – it’s our ‘climax ecology’. No wonder grain farming takes so much effort… Leave even the most completely altered environment alone for an average human lifetime and, so long as the necessary seeds still exist and can get in from somewhere, the successional stages will have reverted it to woodland by the end. The second we let up on our revolution the Great Rollback begins.

The French Romantic writer François-René de Chateaubriand (1768–1848) wrote that ‘Forests precede civilizations and deserts follow them’. I’d like to see this tide reversed and Civilisation pushed back into the desert of its own sick imagination. I’d like to see human beings allied to this irrepressible riot of diverse lifeforms, reclaiming the continent for our own.

*****

Some ideas for reinstating Balanocultures:

  • Quit throwing acorns away! I know plenty of people who just rake them up from their gardens or driveways and stick them in compost bins for the council to cart away. That’s food you’re wasting! I don’t know what happens to them in the ‘Community Recycling Centres’, but I bet they don’t get ‘recycled’ back into human stomachs, except maybe indirectly through compost. I’m not a fan of big centralised solutions, but if individuals haven’t got the time to organise this among themselves, would it be too hard for these Centres (we used to call them ‘Dumps’) to separate out the acorns and maybe sell them on as feed to local pig- or chicken-farmers?
  • Look at what Oaks you have around you with a view to returning them to management. I’ve often seen farm or pasture fields in England with huge oaks in them (someone told me there was a law about this dating back to shipbuilding times), and I know a few suburban developments that kept the old trees from preceding land uses. These are already in prime condition for heavy acorn cropping – rounded canopy, not too crowded, open to the sun – and I’ve found that they do in fact produce far more acorns, of better quality, than most trees in conventional woodland. I’d say they need a few more brothers and sisters though… Also, I don’t suppose they like being surrounded by all that concrete (acorns bruise like apples, especially if they land on hard surfaces). Even where grasses grow at the base, the habit of raking/blowing/‘tidying’ away the annual leaf litter robs the tree of the nutrients it depends on from its own self-generated ‘mulch’. Either leave the leaves be, or consider introducing small-scale burns in autumn/winter, which release the nutrients much faster and allow other plants to grow from the ashes. Sure, you’d get an unsightly black scorch-mark for a while, but think of all the other interesting plants you could get growing in place of yet-another-boring-lawn by the start of the next season.
  • Get in touch with your inner squirrel and start storing, processing and eating acorns yourself (more on how to do this in a subsequent post) – link your fate co-dependently with the trees.
  • Preserve the f*&%ing forests! When it gets too expensive to pour massive amounts of petroleum-based energy onto the fields, and we run out of imperial leverage over the other countries we rely on to supply our needs, Britain’s crops will fail and famines will return with a vengeance. That will open up more space for agro-forestry techniques to step in and take up the task of food production – but how long will these take to get established? It’s far quicker and easier to step up management of existing trees than to wait for new ones to grow to maturity. This won’t work if we’ve already cut them down for ‘necessities’ like free newspapers, biomass, office/toilet-paper etc…
  • Spread the word!
