Brains and technology are interconnected. We create, update, and employ new technologies with brain power, but at the same time those technologies and tools influence and constrain how we think—and that’s especially true for technologies we grow up using. Playing video games, for example, may improve visual acuity, physical dexterity, reaction times, our ability to make statistical judgments, our ability to multitask, and may slow age-related cognitive decline. Billions are spent on the development and study of training and rehabilitation games for everyone from school kids to fighter pilots. Or consider how desktop OSs like Windows encourage multitasking, whereas tablets and phones almost demand mono-tasking (the tablet version of Windows should really be called Microsoft Porthole).
Because technology is ever-evolving, it shouldn’t be surprising that “kids these days” have different cognitive skills and abilities than adults. They grew up using computers that demand multitasking, so it’s likely they’re better at splitting their attention across multiple things than are their parents, who had physical books and three channels of TV. Those parents probably have different skills than their parents. Cognitive skills aren’t fixed or stable, either over lifetimes or over generations. Technology and cognition are linked because brains aren’t “hardwired” and there’s no “default”: the only thing they’re hardwired for is to learn to interact with the world. If that modern world is different than last generation’s, then modern brains will likewise be different.
The consequences of the brain’s plasticity run deep. It might be easy to see how experience and environment would shape “high-level” things like food or music preferences, hobbies, and religious beliefs. But they can also influence simple, basic perception. For example, East Asian individuals are more susceptible to context-based visual illusions like the Titchener illusion than are Westerners.*
*One explanation for this effect rests on the idea that Asian cultures encourage a holistic, collective worldview, whereas the “west” is more individualistic. The illusion is perceived more powerfully by East Asian individuals because it is driven by contextual information (the surrounding circles), whereas Westerners pay more attention to the central point of focus.
Recognizing that cognitive skills may differ between generations or cultures almost inevitably leads to the question of which skills are “better” than others. Is it “bad” that kids have more difficulty focusing on a single task than their parents do, but are proficient multitaskers? Well, Davidson says, the only real way to answer that question is this: it depends on what we ask people to do. There’s no abstract answer for whether monotasking is better than multitasking; there is only the “goodness of fit” between a skill and a task to be accomplished. And when you start thinking of cognition not as a set of abstract abilities, but as deeply contextual, interactive, goal-driven skills, you start to see how many areas of modern society are wedded to the past. Modern kids in industrialized nations are multitasking machines, but schools usually teach subjects one at a time and have students work through assignments the same way. That’s not a good fit: we’re relying on 19th-century conceptions of education that ill suit the modern brain.
Does it make more sense to bend our brains to meet existing conceptions of what’s “best” in schools and workplaces, or to bend those spaces to match how our brains have developed? To Davidson, the answer is clear: schools and workplaces need to be adaptive and flexible and receptive to what we’ve learned about how the brain works and learns. If video games have benefits, let’s figure out how to use them to teach. If kids are good at multi-tasking, then let them do that, rather than forcing them to mono-task because that’s what their great-grandparents did when everyone was writing on chalkboards. If some people see the gorilla and some people don’t, then our differing cognitive skills make collaboration beneficial. There is no one default “best” way to do things.
We can accomplish that flexibility in many ways. Maybe it’s as simple as giving people four jobs to complete in a week, however they want to divide them up, rather than enforcing one-at-a-time. IBM has mostly eliminated the prototypical “PowerPoint presentation” at business meetings, replacing it with a less structured collaborative meeting in which participants engage in a group chat session while the meeting is ongoing. Attendees are able to flesh out nascent ideas without interrupting the speaker, and the format takes advantage of employees’ abilities to multitask and divide attention.
At its best, using brain science to inform flexible “policy” can produce a real synergy (not the tossed-off marketing-speak version), in which technology is employed and environments are structured around the cognitive skills of students and workers.
• • •
There’s a tendency for pop science books to pick up on a finding and stretch it out so that it explains everything, in one easy-to-digest package. Mirror neurons become the magic beans that explain our very humanity; the amygdala explains political differences. Now You See It features some of those same exaggerations and overextensions—mirror neurons included—but they never overshadow the fundamental premise. People criticized Moneyball by focusing on the specific application (finding baseball players with a high on-base percentage) and ignoring the underlying principle driving those decisions (to exploit market inefficiencies and find undervalued players). Focusing on specifics rather than principles is the wrong criticism to make, whether it’s Moneyball or this book. It may be that inattention blindness, mirror neurons, multitasking, and video games aren’t the “best” concepts for reforming education and the workplace. But even if they’re wrong, the foundational principle of Now You See It isn’t: use brain science to inform, or at least make us think critically about, how we structure schools and workplaces. Be flexible, and don’t assume what was right for the world 20 years ago is right now.
If the principle is sound, how is it best applied? The US educational system is virtually defined by underfunded schools and huge achievement gaps for children in poverty, children of color, and non-English-speaking students, and yet most of the interventions Davidson discusses don’t seem plausible or useful for those schools and students (one exception is her argument to eliminate the perverse incentive structure of No Child Left Behind, which is almost diametrically opposed to a principle of greater flexibility). Besides simple morality, there are compelling pragmatic reasons to focus on making even incremental gains in underfunded schools first: namely, it’s more efficient. “Cash for Clunkers” applied only to low-MPG cars, and that’s because if you go from 10 rods/hogshead to 20, you cut gas use by 50%, whereas going from 30 to 40 is only a 25% reduction. Farmers get more from the first ton of fertilizer than the third (diminishing returns). I’m loath to put things in purely economic or statistical terms, but in this case it matters.
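The arithmetic behind that diminishing-returns point is easy to verify: fuel consumed per mile is the reciprocal of fuel economy, so equal gains in MPG buy smaller and smaller savings. A minimal sketch (the function name is my own, purely for illustration):

```python
def fuel_savings(mpg_before: float, mpg_after: float) -> float:
    """Fractional reduction in fuel used per mile when economy
    improves from mpg_before to mpg_after (fuel per mile = 1/MPG)."""
    return 1 - mpg_before / mpg_after

# The same +10 MPG improvement, very different savings:
print(fuel_savings(10, 20))  # 0.5  -> 50% less fuel per mile
print(fuel_savings(30, 40))  # 0.25 -> 25% less fuel per mile
```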
Everyone benefits more by working to pull schools at the bottom up than by trying to squeeze just a bit more out of schools already at the top. Giving an iPod to Duke students might improve learning and pedagogy, but probably only because so many other educational benefits and privileges have already been realized for those students. Giving a drag-reducing suit to an Olympic swimmer shaves a meaningful tenth of a second off their lap time, but the same suit isn’t helping a novice. If a school is short on desks, teachers, computers, textbooks, or hot lunches, it isn’t at a point where a “drag-reducing” iPod matters.
I’m not staking out bold new territory, and my argument is slightly unfair because Davidson never suggests that brain science is some kind of educational cure-all. I agree that applied brain science and greater flexibility in goals and methods can improve education, but that has to be tempered with consideration of who is being helped—we can’t simply cleave off the question of “the science” from its implementation. The latter is the hard part.