Heads, ancient cities, ideas, and ill-considered TV shows: a roundup of things found…
• • •
1. Here’s a thing that happened: after his death in 1809, composer Joseph Haydn’s head was stolen. CRANIOKLEPTY. Why steal his head? The pair who paid the gravedigger to decapitate the corpse and abscond with Haydn’s brainbox were big fans of phrenology. Just coming into intellectual fashion in the early 1800s—and all the rage at your average ether frolic—the central underpinning of phrenology was a syllogism:
a) Behavioral tendencies and mental abilities owe to differences in brain anatomy;
b) Those brain differences produce noticeable differences in the size, shape, and geography of the skull;
c) Conclusion: one’s mental abilities and faculties can be determined by measuring bumps on the skull.
The skull theft, and phrenology more generally, is an object lesson in how tidy deduction goes wrong when its premises are false. The phrenology enthusiasts conspired to steal the head—“already quite green, but still completely recognizable”—then de-brain it, bleach the bones, and make a study of the skull. They did exactly that, reporting back that indeed, the musical genius Haydn’s “bump of music” was “fully developed,” providing valuable confirmatory evidence for their completely wrong theory.
Thus began a decades-long odyssey for the disembodied head. It was displayed for more than a decade in a museum-quality shrine at the home of one of the thieves; lost; passed to other phrenologists for safekeeping; and finally given to a university professor. Along the way, the thieves had been found out and surrendered a fake skull, which was placed in Haydn’s tomb. The university held the actual skull for more than a century, until 1954, when a full orchestra played for the funeral procession transporting the skull to be reunited with its owner.
• • •
2. In southeastern Turkey, just north of Syria, you can go up on a hill and run across an archaeological excavation that has unearthed hundreds of stone pillars weighing up to 20 tons each, floors of polished lime, remnants of tools, and abstract shapes and animal reliefs carved into rock. These features are scattered across some 22 acres of a ritual site with no evidence that anyone ever lived there, though only about 5% of the site has been investigated. That is Göbekli Tepe.
What’s interesting about Göbekli Tepe is not so much the aesthetics but that it’s really, really old. It dates back about 11,000 years, meaning that it was older to the people who built Stonehenge than Stonehenge is to us. In fact, it’s so old that it challenges some commonly held assumptions about the development of human societies.
A standard narrative holds that major monuments, temples, and religious centers only came about once mass agriculture was developed. According to this narrative, cultivating crops allowed society to transition from bands of active and generally nomadic hunter-gatherers to, in archaeological parlance, “sedentary” farming communities that were larger, more stable, and more centralized. In the optimistic view, this change gave people enough free time to ponder existence and build religious monuments with no immediately practical purpose. In the cynical view, the introduction of mass agriculture created the situations of enforced scarcity and hoarding that suited the development of organized religion and the priestly class. But in either case, the order of causality is the same: religion came after agriculture.
But Göbekli Tepe is so old that it almost certainly predates mass agriculture and animal husbandry, suggesting the causality of that standard narrative may be backwards. Klaus Schmidt, the lead archaeologist excavating the site, put it simply: “first came the temple, then the city.” In fact, DNA analysis suggests that modern cultivated wheat derives from wild wheat that grows near the site—which raises the intriguing possibility that the temple not only predated agriculture but might have been where mass agriculture began: the very seat of the Neolithic revolution.
So perhaps we’ve had it backwards this whole time: humans didn’t build monuments and centralized, complex social structures once agriculture relieved the pressure of subsistence food gathering. The development of complex social structures—the kind that would create a huge monument and temple like this one—was the driving force for the discovery and/or development of grain cultivation and mass agriculture. Of course, this raises the question of why nomadic groups would band together into larger communities in the first place. No one knows, but one possibility is that they gathered to protect a major food source, such as wild grains, from being ravaged by wild animals.
• • •
3. Thomas Malthus’s 1798 essay popularized a concept later called a Malthusian catastrophe: if population increases geometrically but food supply only linearly, population will inevitably outstrip food supply and doom humanity to mass starvation. A clergyman and obviously a fan of Revelation, Malthus thought such inevitable mass death would be punishment for man living unvirtuously. Although Malthus met his maker in 1834, no good theory of human self-immolation ever dies, and concern over looming food shortages has been an omnipresent undercurrent in two centuries of doom-saying, from famed chemist William Crookes in the late 1800s to Paul Ehrlich’s The Population Bomb and Soylent Green in the mid-1900s.
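Malthus’s arithmetic is easy to sketch. The toy snippet below (mine, not Malthus’s, with made-up starting numbers) pits a population that doubles each generation against a food supply that grows by a fixed increment, and reports the first generation of shortfall:

```python
def first_shortfall(pop=100, food=100, food_gain=100, generations=50):
    """Return the first generation in which population exceeds food supply,
    or None if no shortfall occurs within the horizon."""
    for gen in range(1, generations + 1):
        pop *= 2           # geometric growth: population doubles
        food += food_gain  # linear growth: food gains a fixed amount
        if pop > food:
            return gen
    return None

# Even a huge head start for food only delays the crossing:
first_shortfall(pop=1, food=1000, food_gain=1000)  # → 14
```

Whatever the starting values, a doubling sequence eventually overtakes any linear one, which is the whole of Malthus’s (and, as it turns out, Hong’s) worry.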
It turns out Malthus was a few years behind the curve set by Chinese polymath Hong Liangji (1746–1809). When Hong wasn’t busy writing essays so critical of the emperor that he got himself banished, he was noticing that the population of China had roughly tripled in 150 years—thanks to newer, more productive crops—and that such growth was unsustainable:
“Some people may propose that there would be wild land to cultivate and spare space for housing. But they can only be doubled or tripled, or at most increased five times, whereas the population at the same time could be ten to twenty times larger. Therefore housing and crop fields tend to be in scarcity, while the population tends to be excessive at all time.”
I was ready to scream “sic semper eurocentrism!” and jump from the balcony, but this might be less about who writes the history books and more about some common phenomena in the history of knowledge. The first is Stigler’s Law, which holds that no scientific discovery is named for its original discoverer. For example, Venn diagrams are named for John Venn, but Euler was drawing similar diagrams more than a century earlier. Here’s a real mindbender: the Playfair cipher is named for Lord Playfair but was created by Charles Wheatstone, who himself is the namesake of the Wheatstone bridge, which was actually invented by Samuel Christie. Name-jackers all! Wikipedia has a whole list of examples of Stigler’s law.
Stigler cited Robert Merton as the discoverer of Stigler’s Law, making the law an example of itself. Merton had written a seminal 1960 paper (pdf here, it’s a good read) on the idea of “multiple independent discoveries.” For most of the 19th and 20th centuries, the “heroic theory” of development held sway. A variant on the “great man” theory of history, heroic theory holds that great leaps forward in science and technology owe to the work of singularly unique intellects; that great ideas drop “down from heaven through the agency of star-touched genius.” Indeed, according to Carlyle, “the history of the world is but the biography of great men” (all previous and future sexism in this section is [sic]): Darwin thought of evolution, Newton calculus, and Einstein relativity; Nobel prizes go to great men of genius, without whom we would still be in the caves.
The counterargument, put forward by Merton, is that people are products of time and place and context; that new ideas require some element of right place and right time; that even Newton knew he stood on the shoulders of giants; that the steamboat was inevitable once the boat and the steam engine existed. In his paper, Merton showed that ideas discovered independently by multiple people are actually the “dominant” pattern of science, rather than the outlier. The singletons—new ideas from an individual—are vanishingly rare. Newton and Leibniz invented calculus. Wallace and Darwin came up with evolution (and maybe stole it from an arborist?). The invention of the telephone involved a literal race to the patent office like a scene out of Cannonball Run. In his historical census of inventions, Merton even found two discoveries that could be attributed to nine separate people.
As a sort of retroactive proof of its own thesis, even the idea of multiple discoveries has been “discovered” multiple times. Perhaps the most influential of these was a 1922 paper—provocatively titled Are Inventions Inevitable?—which ends with a list of some 150 ideas and inventions discovered multiple times, including: the planet Neptune, sunspots (four times, all in 1611), the telescope, decimals, logarithms, calculus, the microscope, photography, the thermometer, the laws of inertia, the telegraph, the microphone, the telephone, the phonograph, the light bulb, the function of the pancreas, Mendelian inheritance, the sewing machine, steamboats, the typewriter, and gas engines.
This “sociological” theory of invention infuriates some people who believe that the focus on (intellectual) context leaves no place for those with outstanding abilities. One total wiener, in a hilarious paper, attempted to individually debunk every multiple discovery, ending his too-short paper with the too-bold claim “The great man theory is vindicated” (he earlier says, with no apparent irony: “Sexual natural selection is simply the preference of a male for a pretty wife. It seems odd they took so long to think of this!”). Merton, though, anticipated this sort of claim and specifically debunked it: luminaries exist, but what sets them apart is the frequency with which they develop novel ideas, not the magnitude of any one idea. Lord Kelvin, for example, was involved in thirty-two multiple discoveries. “The greatest men of science,” Merton says, “have been involved in a multiplicity of multiples.” Wikipedia again has a good list of multiple discoveries.
Side note: Am I the only one who thought Soylent Green would have been way more powerful if the final shot were Charlton Heston being made into soylent, driving home the futility of his own attempt to save humanity?
• • •
4. Refining raw trivium ore into human-safe trivia nuggets is a complex process requiring distillation, dephlogisticated air, mandrake root, unobtainium, x-ray crystallography, and hot extrusion. I often worry that I am, or will become, Robert Ripley, distributing Flintstones-chewable factoids, now context-free! I also spent a solid three hours last week reading about HR Pufnstuf, but no one’s perfect.
However middling my success in avoiding Ripley’s trap, I am heartened after my discovery of the History Channel series History’s Lost and Found. Each episode features the search for three lost objects from history, with zero consideration of, or total apathy toward, which objects are jammed together, producing some mesmerizing and deeply troubling juxtapositions. The episode listings read like the world’s worst game of TriBond:
- Episode 8: Eva Braun’s home movies; King Herod’s alabaster bathtub; the first issue of Mad
- Episode 12: Stonewall Jackson’s raincoat; the first Monopoly game; the car Franz Ferdinand was assassinated in
- Episode 15: the Hollywood sign; Gandhi’s bloodstained dhoti; the first gas mask
- Episode 38: Hitler’s skull; the original La-Z-Boy recliner; Duke Kahanamoku’s surfboard
- Episode 48: the Lockheed P-38 WWII fighter Glacier Girl; the original Uncle Tom’s Cabin manuscript; Ty Cobb’s dentures (Cobb was so virulently racist that he stood out for being racist in the Deep South in 1900)
- Episode 57: Malcolm X’s diary; Belmont’s private subway car; Ronald Reagan’s favorite restaurant booth