hewing logs and lines

Our brains keep developing well into our teens and remain malleable throughout life, but there are plateaus and critical periods along the way. For example, around the age of six, the neural architecture for language learning begins to crystallize. Once that critical period passes, learning a second language becomes difficult and learning a first language becomes nearly impossible (see, e.g., the tragic case of Genie). English speakers may joke about Japanese speakers whose /l/’s sound like /r/’s, but a Japanese baby that’s never heard English can discriminate /r/ from /l/. A few months later, that ability is gone. People also tend to take longer to recognize faces of a race different from their own (the cross-race effect), but the effect is reduced in people who grew up in ethnically diverse environments. Once we learn how things work in our particular environment, that “open book” closes and learned behaviors (and languages) become more fixed.

Slightly less well known is an apparent critical period in our processing of numbers. Most cultures and school curricula deal with numbers on a linear scale, in which all the numbers are evenly spaced across the number line:

[Figure: a linear number line with evenly spaced tick marks]

On a linear scale, equal differences are equal no matter where they fall: 1 and 2 are as far apart as 101 and 102 or 1001 and 1002. But numerically equivalent differences aren’t always functionally equivalent. For example, the difference between 100 and 110 degrees probably feels much smaller than the difference between 30 and 40 degrees (in Fahrenheit…but then again, maybe in Celsius too).

In contrast, logarithms are built on ratios and orders of magnitude. One simple way to think about a logarithmic scale is that differences between small numbers are magnified and differences between large numbers are compressed: instead of taking evenly spaced steps along the number line, you start with big steps that get progressively smaller. So imagine I show you a number line from 0 to 100 and ask you to mark where 77 should fall. You would probably put it about 3/4 of the way to the right:

0———————————77———100

That’s correct, assuming a linear number line. The cool thing is that before about the second grade, US children will estimate that quantity logarithmically (Booth & Siegler, 2006). They’d answer something like this:

0——————————————————77—100

In linear terms, 77 is 77% of the way to 100. In logarithmic terms, it’s more like 94% of the way to 100, and that’s where young children place it. Testing children at different ages, researchers have even watched them “grow out” of logarithmic estimates and switch to linear scales (or, if you prefer, watched the school system stomp out their innate creativity). But that’s only in industrialized cultures that use the linear scale; in cultures without it, both adults and children estimate logarithmically (Dehaene, Izard, Spelke, & Pica, 2008).
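If you want to check that arithmetic yourself, here’s a quick Python sketch (mine, not from the Booth & Siegler paper) comparing where 77 lands on each scale:

    import math

    def linear_position(n, hi=100):
        """Fraction of the way from 0 to hi on a linear scale."""
        return n / hi

    def log_position(n, hi=100):
        """Fraction of the way to hi on a logarithmic scale.
        (Starts at 1 rather than 0, since log(0) is undefined.)"""
        return math.log10(n) / math.log10(hi)

    print(linear_position(77))  # 0.77 -> about 3/4 of the way along
    print(log_position(77))     # ~0.943 -> about 94% of the way along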

I don’t know why we seem to innately prefer thinking logarithmically. Some of our sensory systems work on logarithmic functions: the Weber-Fechner law, for example, describes a roughly logarithmic relationship between stimulus intensity and our perception of it, meaning that adding a little light to a dark room is perceived as a greater change than adding that same light to an already-bright room. It’s also easy enough to make a just-so story for the evolution of logarithmic thinking, such as the difference between 1 and 2 sabre-toothed tigers being more meaningful to our survival than the difference between 10 and 11 (right? totally makes sense). Whatever the reason, it does suggest that without cultural influences, we’d all be thinking logarithmically.
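To see how a logarithmic response compresses changes at the bright end, here’s a toy Weber-Fechner calculation (the constants are made up for illustration):

    import math

    def perceived(intensity, k=1.0, threshold=1.0):
        """Weber-Fechner: perceived magnitude grows with the log of
        physical intensity (k and threshold are arbitrary here)."""
        return k * math.log(intensity / threshold)

    # The same 10 extra units of light, in a dim room vs. a bright one:
    print(perceived(20) - perceived(10))      # ~0.69: feels like a big change
    print(perceived(1010) - perceived(1000))  # ~0.01: barely noticeable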

Why thinking logarithmically might matter…

It’s easy to be surprised by how early in human history certain concepts were discovered. The Babylonians had mostly figured out algebra, could solve quadratic equations, and knew √2 to five decimal places nearly 4000 years ago, all while doing the calculations on clay tablets (it doesn’t matter mathematically, but they did all that with a base-60 counting system, the basis for dividing hours and minutes into units of 60 and circles into 360 degrees). Egyptians of the same era could solve linear equations, had methods for finding prime numbers, and knew how to brew beer and mead. A little more than a millennium later, the Greeks were figuring out trigonometry, early forms of calculus, precise estimates of pi, and startlingly accurate estimates of the earth’s circumference.

On the other hand, it’s jarring how late some other concepts were recognized. Rules of probability, for example, were not formalized until nearly the same time as calculus. Doctors didn’t realize that maybe they should wash their hands before digging around inside a person until about 150 years ago. Doc Brown didn’t come up with the idea for the flux capacitor until 1955.

Another curious latecomer is the concept of zero. In ancient mathematical systems, zero did not exist as a number. The Babylonians used it as a placeholder (for distinguishing between 12 and 102, for example), but never on its own and never at the end of a number (so 12 and 120 looked identical, which had to be confusing). In other words, the digit zero, as a placeholder, was around long before zero was recognized as a number. By the second century AD, Ptolemy was toying with the idea, but it was roughly another five centuries before Indian mathematicians (Brahmagupta, most famously, around 628 AD) formalized zero as an actual number.

Why was zero so elusive? Perhaps it took so long to “figure out” zero because we were thinking logarithmically. If you’re building a logarithmic scale, in which ratios and orders of magnitude matter, zero (and, in fact, every non-positive number) isn’t really relevant; the log of zero, like the log of any negative number, is undefined. In contrast, on a linear scale, 0 is just another number. In fact, once you start working with negative numbers on a linear scale, 0 is what makes the negative and positive sides symmetric. In a sense, zero has a purpose on linear scales, but not on logarithmic ones. This is more of a musing than an actual hypothesis, but if zero really is incongruous on a logarithmic scale, then I wonder whether children (or adults in non-industrialized cultures) have an intuitive understanding of zero and negative numbers. If they’re thinking logarithmically, they might not.
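Python’s math module makes the asymmetry easy to see: ask for the log of zero (or of any negative number) and you get a domain error, while on a linear scale zero sits happily between the negatives and the positives:

    import math

    for x in (100, 1, 0, -5):
        try:
            print(x, "->", math.log10(x))
        except ValueError:
            print(x, "-> undefined on a log scale (math domain error)")

    # On a linear scale, zero is just the point of symmetry:
    print(abs(-5) == abs(5))  # True: -5 and 5 are equidistant from 0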

Addendum: Radiolab had a show about innate understanding of numbers.
