By one relatively common view, brain growth was driven by meat consumption. At some point, human forebears grabbed rocks and smashed the bones that other animals wouldn’t eat (except the bearded vulture, which literally dissolves bones for nourishment), releasing the sweet sweet calorie-dense bone marrow within. The rich marrowy goodness provided extra calories that could go towards brain growth, and with that growing brain came ever more complex cognitive skills like tool use, hunting strategies, communication, and harnessing fire—all of which made it easier to find even more calories. Thus did eating meat make us human.
Wrangham’s central argument in Catching Fire is that the explosion in brain growth wasn’t from eating meat, but cooking it.
• • •
Cooking does more than kill bacteria or make food palatable: it also improves metabolic efficiency. Raw foods require more time and more energy to digest than cooked foods, in which the starches have been gelatinized and broken down and the proteins have been denatured (in fact, soft foods are generally easier to digest than hard foods, and given the choice, most animals opt for soft foods over hard ones). Consequently, the body nets more calories from cooked food than from raw: 95% of the protein in cooked eggs is digested, versus just 50% in raw eggs. In dietary studies, people on raw-food-only diets can lose weight and end up malnourished, even while people eating the same number of cooked calories gain weight. In an ancestral environment where food sources were not abundant, maximizing metabolic efficiency was critical—you don’t have to find more food if you can make better use of what you do find.
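The digestibility gap above can be made concrete with a toy calculation. This is just the book’s egg-protein figures (95% cooked vs. 50% raw) treated as a flat energy-digestibility factor—a deliberate simplification for illustration, not a nutritional model:

```python
# Toy illustration of metabolic efficiency: the same "label" calories
# yield different realized calories depending on digestibility.
# The 0.95 / 0.50 fractions come from the cooked vs. raw egg-protein
# figures; applying them to whole-food energy is an assumption.

def realized_kcal(label_kcal, digestibility):
    """Energy the body actually nets from a food."""
    return label_kcal * digestibility

cooked = realized_kcal(100, 0.95)  # cooked egg protein: ~95% digested
raw = realized_kcal(100, 0.50)     # raw egg protein: ~50% digested

print(f"cooked: {cooked:.0f} kcal, raw: {raw:.0f} kcal")
# Same 100 labeled kcal; cooking nearly doubles what the body keeps.
```

Run the numbers the other way and the ancestral stakes are obvious: a raw-food eater would have to find nearly twice as much food to net the same energy.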
Cooking helped our ancestors more efficiently convert potential food energy to realized food energy, which had ancillary effects. Our digestive system shrank, jaw muscles decreased in size and strength, and teeth got smaller, since we no longer needed the anatomical equipment for digesting raw foods (great apes use 10% of their daily energy fueling digestion). We also lost the musculature associated with tree climbing, possibly because fire could deter predators even better than heights.
Meat-eating was also said to have social, not just physiological, consequences. Big burly manly men hunted for elusive but calorie-dense meat, while the delicate womenfolk gathered ubiquitous fruits and grains. Being in groups spread the risk of hunger and starvation across many people, rather than leaving it to the luck of individuals. Thus, human society tended towards a gendered division of labor and large, codependent social communities.
But, Wrangham says, that logic only holds when food is cooked. Why? Because of chewing. Raw food has to be chewed. That’s trivial when you’re consuming a raw banana, but most primates eat hard, pulpy fruits that take time and effort to pick apart, chew, swallow, and digest. Chimps spend more than six hours a day chewing; extrapolating by body size, humans would need to spend 40% of their time chewing to take in a subsistence-level diet of uncooked fruits and grains (though it’s unclear whether this applies to modern humans or to ancestral hominids with digestive systems equipped for raw foods). Without cooking, hunters who came back to camp bereft of mammoth meat might not have enough time to chew all the raw fruits and grains they needed to survive. And because cooking usually meant foods were brought to a central location to be prepared—as opposed to eating berries right off a tree or bone marrow off a found carcass—cooking also created an impetus for group living.
• • •
I’ve been thinking a lot about the idea of metabolic efficiency and nutrition. The calorie counts on foods are potential energy—how much energy is released when the food is incinerated. But there’s no strict correspondence between the calorie counts on packaging and the calories actually realized by your body. Consuming 100 calories’ worth of Oreos requires little digestive effort, and most of those calories are available to the body almost immediately. In contrast, the same number of calories in cooked steak nets the body fewer calories over a longer period. Of course, what this really means is that, in a certain sense, processed foods aren’t so much processed as pre-digested.
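The Oreos-versus-steak point can be sketched the same way, this time subtracting the body’s cost of digestion from the label calories. The cost fractions here are my own illustrative assumptions (processed carbs cost little to digest; whole protein costs considerably more), not figures from the book:

```python
# Sketch of "net" calories after paying the energetic cost of digestion.
# The 0.05 and 0.25 cost fractions are assumed for illustration only:
# pre-digested processed food is cheap to break down, whole protein is not.

def net_kcal(label_kcal, digestion_cost_fraction):
    """Calories left over after the body pays to digest the food."""
    return label_kcal * (1 - digestion_cost_fraction)

oreos = net_kcal(100, 0.05)  # assumed ~5% digestion cost for Oreos
steak = net_kcal(100, 0.25)  # assumed ~25% digestion cost for steak

print(f"oreos net ~{oreos:.0f} kcal, steak net ~{steak:.0f} kcal")
```

Under these toy numbers, identical labels hide a meaningful gap in what the body actually banks—which is the sense in which the label is potential, not realized, energy.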
This all reminds me of something I learned in a long-ago bio class: the folk wisdom about metabolism is backwards. We tend to think thin people who eat a lot have a “fast” metabolism, but really they are inefficient: they aren’t extracting (or storing) as many calories as possible from what they eat. Nutritionists say that weight loss is a matter of calories in being less than calories out—losing weight on an all-Twinkie diet proves it—but maybe the proliferation of different diets (Atkins, South Beach, Zone, juicing) and the variability in their success owes partly to the fact that we rarely account for the difference between the calorie counts on packages and the calories we actually get from consuming the food. Accounting for that might not be entirely possible—we all have different metabolisms, and there’s evidence that metabolic efficiency can change depending on the ratio of proteins, carbs, and fats consumed. But maybe the Atkins diet works not so much because you eat a lot of protein and few carbs per se, but because it’s a diet where your body works harder to metabolize the calories you ingest.
Final musing: When rats eat sugar substitutes but not sugar, their digestive system learns that sweet things have no calories. When later given access to real sugar, these rats go hog-wild and gain all kinds of weight. In fact, their metabolism is altered such that they gain more weight than normal rats eating the same amount of sugar. I wonder whether the ubiquity of processed foods has had conceptually similar effects: that is, do diets high in sugar or processed starches affect the brain’s “expectation” of how much energy is available in food? And if so, does that have long-term consequences for how food is metabolized and how excess energy is stored?
• • •
The book: Catching Fire: How Cooking Made Us Human, Richard Wrangham (2009)