Contents

Preface: Kafka's Room
1 Mind Sight
2 The Sum of My Fears
3 Your Attention, Please
4 Survival of the Ticklish
5 The Hormones Talking
6 Scan Thyself
Conclusion: Mind Wide Open
Notes
Bibliography
Acknowledgments
Index
How pathetically scanty my self-knowledge is compared with, say, my knowledge of my room. ... There is no such thing as observation of the inner world, as there is of the outer world.
-- Kafka
The idea for this book began with a nervous joke -- a handful of nervous jokes, to be precise. A few years ago, thanks to a lucky convergence of events and a long-standing curiosity, I found myself in the office of a biofeedback practitioner, lying on a couch with sensors attached to my palms, fingertips, and forehead. As we talked, the two of us stared into a computer monitor, where a series of numbers flashed on the screen like some kind of low-budget version of the CNBC ticker tape. The numbers documented precisely how much I was sweating and updated several times a second. I've never taken a lie detector test, but something about having a stranger ask me questions while keeping a close eye on my sweat glands put me on edge. And so I started making jokes.
Getting a little tense was partly the point of the exercise. The machine I was attached to was tracking changes in my levels of adrenaline, the "fight-or-flight" hormone secreted by the adrenal glands in situations that require a sudden surge of energy. Increased adrenaline can be detected in a number of ways: because the hormone diverts blood from the extremities of the body to the core, drops in temperature at the fingertips and toes often suggest a release of adrenaline (hence the sensors on my fingertips). Sweating is also a telltale sign of heightened adrenaline levels. Because damp skin conducts electricity more effectively than dry skin, the electrodes on my palms could track how much I was sweating by monitoring changes in conductivity over time.
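For readers who like the mechanics spelled out, here is a minimal sketch in Python of how a rig like that might turn raw electrode readings into a conductance trace. The drive voltage, resistor value, and sample readings are invented for illustration; real biofeedback hardware and its calibration will differ.

```python
# Toy sketch of skin-conductance (electrodermal) monitoring. The sensor
# interface and all constants are hypothetical illustrations, not a
# real device's API or calibration.

def conductance_microsiemens(skin_voltage: float, drive_voltage: float = 0.5,
                             series_resistor_ohms: float = 100_000.0) -> float:
    """Infer skin conductance from a simple voltage-divider reading.

    A small constant voltage is applied across the palm electrodes in
    series with a known resistor; damper (sweatier) skin conducts more
    current, so the voltage measured across the skin falls.
    """
    if skin_voltage <= 0:
        return float("inf")
    current = (drive_voltage - skin_voltage) / series_resistor_ohms  # Ohm's law
    return (current / skin_voltage) * 1e6  # siemens -> microsiemens

# Sampled several times per second, like the ticker on the monitor:
samples = [0.31, 0.30, 0.30, 0.24, 0.22, 0.29]  # volts across the skin
trace = [round(conductance_microsiemens(v), 1) for v in samples]
print(trace)  # conductance climbs as the palms sweat, then falls back
```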
Biofeedback systems are designed to give you a new kind of control over your body and mind by making physiological changes visible in a new way. After a few sessions, biofeedback users learn to "drive" their adrenaline levels up or down almost as though they were deciding to lift a finger or bend a knee. The brain, of course, is constantly adjusting adrenaline levels anyway -- it's just that you're not usually aware of the process other than as a background sense of increased energy or calm.
For the first five minutes of the session, my adrenaline levels remained at the midpoint of the scrolling chart, bouncing around ever so slightly, but with no real pronounced variation. And then something in the situation -- I can't remember now what it was -- caused me to make an offhand joke. We both chuckled at my remark and then noticed that a huge spike had appeared on the monitor. Making the joke had triggered a surge of adrenaline in me. Or was it the reverse? Perhaps the rise in adrenaline was me mentally revving the engines before launching my joke into the environment. Whatever the causal chain, my joke-telling and my adrenaline levels were locked in some kind of chemical embrace.
The extent of that link became clear at the end of our session, when the therapist handed me a printout of my adrenaline levels plotted over our thirty-minute encounter. It was, simply put, a timeline of my attempts at humor: a flat line interrupted by five or six dramatic spikes. I looked at that paper and thought: I've caught a glimpse of me here, viewed from an angle that I've never experienced before. I'd known for many years that I had a tendency to crack jokes compulsively in certain social situations, particularly in situations where the formality of the setting made humor a riskier bet. But I'd never thought about those jokes as triggering a chemical reaction in my own head. Suddenly, they seemed less like casual attempts at humor and more like a drug addict's hungering for a new fix.
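The shape of that printout is easy to caricature in code. Here is a toy spike detector over a conductance trace; the window size and threshold factor are arbitrary choices for illustration, not anything the practitioner's software necessarily used.

```python
# Flag samples that jump well above a short running baseline -- the
# "dramatic spikes" on an otherwise flat line. Window and factor are
# arbitrary illustrative values.

def find_spikes(trace, window=20, factor=1.5):
    spikes = []
    for i in range(window, len(trace)):
        baseline = sum(trace[i - window:i]) / window  # recent average
        if trace[i] > baseline * factor:
            spikes.append(i)  # a candidate joke moment
    return spikes

# A flat line interrupted by one dramatic spike:
trace = [5.0] * 30 + [9.0] + [5.0] * 30
print(find_spikes(trace))  # [30]
```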
I knew those adrenaline surges were just the tip of the iceberg. The creation and appreciation of humor is a remarkably complex neurological event, involving many parts of the brain and a host of chemical messengers. Doctors at the University of California Medical School, for example, recently located a small region near the front of the left brain that appears to trigger the feeling of mirth; while treating a sixteen-year-old epileptic patient, they applied a tiny jolt of electric current to the area, which caused the patient to find humor in whatever she happened to be looking at. This wasn't merely a physical reflex of laughter: things genuinely seemed funny to her when the region was stimulated. ("You guys are just so funny -- standing around," she told her startled doctors.) Laughter itself involves a complex array of muscle actions, and there is increasing evidence that it triggers the release of small amounts of endorphins, the brain's natural painkillers. (The next time you visit a comedy club, think "opium den.") But making jokes in conversation also requires a subtle sense of one's audience, a feel for their sense of humor and state of mind. Such outer-directed imagination is itself governed by another part of the brain, a part believed to be damaged in autistic people, which would account for their strained social interactions.
This is what came to my mind as I thought about my nervous jokes on the biofeedback practitioner's couch: that with each of those jokes somewhere in my head there was an elaborate electrochemical ballet unfolding, one that had been evolving since my first smile, or before. And now I had glimpsed a subsection of that inner performance as it happened. I found myself wondering: how many of these little chemical subroutines are running in my brain on any given day? At any given moment? And what would it tell me about myself if I could see them, the way I could see those adrenaline spikes on the printout?
And so biofeedback started me on my quest. I set out to track down as many charts, real-time displays, and 3-D models of my mental life as I could find. I talked to some of the world's leading neuroscientists, asking them the question I'd been asking myself: How has understanding the brain changed the way you think about yourself? I also found technology startups and armchair enthusiasts who had embraced brain science as a tool for self-exploration. It was a propitious time to make this journey. Over the past three decades, science has given us extraordinary glimpses of the brain's inner geography, illuminating the amazing extent to which different tasks activate clearly defined regions: recognizing the face of a loved one, or planning a grocery list, or stringing together a sentence. Thus far, these new scientific tools have been employed mostly to observe people who have suffered neurological damage and to assess the mental maps shared by all human brains. But brains are like fingerprints -- each of us possesses a unique neurological topography. We now have the technology in place to picture that inner landscape as it really is. These are tools, in other words, for exploring our individual minds, with all their quirkiness and inimitability. These are tools for capturing who we are, on the level of synapses and neurotransmitters and brain waves. Every human brain is capable of generating different patterns of electrical and chemical activity. The promise of these new tools involves being able to figure out what your pattern looks like. And then figuring out what that pattern tells you about yourself.
It's likely that you've thought about the patterns of your own brain's wiring before. The general movement of popular psychology over the past century has been one from deeply figurative descriptions of mental traits toward greater physiological specificity: the movement, in a sense, from Oedipus to the neuron. Adrenaline itself has entered our everyday lexicon, as has the notion of our body administering quick chemical fixes purely for pleasure: we do things, we say, for the adrenaline rush, or the endorphin high. Radio ads now tout various wonder drugs' ability to alter our neurotransmitter profiles as though they were selling dandruff shampoo. If you've read Listening to Prozac, you've probably met a person who seemed depressed and thought: hmm, very low serotonin. But such responses are just hunches about our inner physiological states, and crude ones at that. There are dozens of so-called information molecules in your body -- neurotransmitters, hormones, peptides -- each playing a key role in your shifting emotional response to external events, triggering everything from the nurturing instinct in mothers to the agitated surge of a panic attack. Could tools that measure the minute-by-minute levels of those substances in your body and brain teach you something about your own emotional toolbox? Could they help you make sense of your dreams, or your phobias? We've learned to track our mood changes with a statistician's exactitude, to explore our childhood memories, to keep our minds alert with exercise. But your moods and memories and perceptions are themselves derived from electrochemical activity in your brain. What could you learn about yourself if you could catch a glimpse of that activity directly? If you could see what your brain looked like when it was remembering a long-forgotten childhood experience, or listening to a favorite song, or conceiving a good idea?
Brain-imaging tools are miracles of modern science, but they are not the only channels to your mind's inner life. Simply possessing a more informed understanding of your brain's internal architecture can change the way you think about yourself. Part of such a process involves separating out mental routines that you typically experience in unison. If you know nothing about what's actually happening in your head, the neurological activity you experience is invisible: it's just you being yourself. But the more you learn about the brain's architecture, the more you recognize that what happens in your head is more like an orchestra than a soloist, with dozens of players contributing to the overall mix. You can hear the symphony as a unified wash of sound, but you can also distinguish the trombones from the timpani, the violins from the cellos. To come to a comparable understanding of your own head, you don't need a million-dollar imaging machine. You just need to learn something about the brain's components and their typical patterns of activation. Sometimes those components come in the form of specialized brain regions; sometimes they come in the form of chemicals, like serotonin. Invariably, a certain mood that strikes you will contain a mix of both, the result of both neurochemical release and predictable activity in specific regions of your brain.
As you learn to detect these brain components, you start to recognize how much multitasking is really going on in your own head. You realize that the emotion you feel isn't simply a reaction to the world at that moment, but rather something closer to a drug, with a strange life of its own. There's what we used to call a "rational" you and an "emotional" you, and the two aren't always in sync. Brain science has now given us more accurate descriptions of these two sides of a personality, mapped onto specific regions of the brain. Instead of "rational" and "emotional," today we have the "neocortical" you and the "limbic" you.
Consider this situation, which you've probably encountered many times before. You're in a perfectly good mood, having a conversation with a friend or colleague. You're not particularly aware of your emotional state, but it's purring along behind the scenes, making your dialogue free and unencumbered. And then your friend makes a passing reference to something unsettling, maybe a little stressful. Not earth-shattering, not immediately life-jeopardizing, but stressful nonetheless. Maybe he's alluded to some upcoming corporate retreat you haven't been invited to, or a tax deadline you'd forgotten about. Whatever it is, the news triggers a falling sensation in your body; you feel deflated and on edge.
And then your friend says something that surprises or distracts you, and the depressing news flies out of your working memory, replaced by some other thought. At this moment, something uncanny happens in your head, not unlike the feeling of déjà vu. You feel the stress in your body and your head, but you can't remember what triggered it in the first place. The feeling has been separated from the thought. Or put another way, you've lost the thought, but the feeling keeps on churning. Normally in this type of situation you end up rewinding the tape of the conversation in your head -- What were we just talking about? -- and you locate the original item after a few seconds, at which point your mental state seems to snap back into place, just like the feeling of déjà vu lifting and linear time reinstating itself. You're still stressed, but at least you know the reason why.
Discontinuities like this occur because your conscious, second-by-second processing of a verbal conversation happens in one part of your brain, while your emotional evaluations happen somewhere else. Most of your immediate focus on generating and comprehending spoken words takes place, broadly speaking, in the prefrontal lobes of the neocortex, the most evolutionarily modern part of the brain. (Two small regions are particularly crucial: Broca's and Wernicke's areas, the former largely focused on creating speech, the latter on processing incoming words.) But the emotions largely issue forth from areas located below the cortex, the region often called the "limbic system," while some of their bodily effects are triggered one layer below the limbic system, in the brain stem that lies at the top of your spinal column. The activity in the prefrontal lobes consists mostly of the flash of neurons talking to each other in a very small region of your head, while the limbic system starts a cascade of events that lead to the release of chemicals that travel throughout the body, including one called "cortisol" that is responsible for much of the physical damage caused by long-term stress.
So when you hear that stress-inducing sentence, two reactions go off in your head: your language centers and working memory decode the meaning and put it front and center in your consciousness; and a subcortical system triggers the stress response, releasing cortisol and other chemicals throughout your brain and body. The two systems operate at fundamentally different speeds, the prefrontal activity unfolding on the level of milliseconds and the stress system on the level of seconds or even minutes. That's why the two can get out of sync with one another. You think of something stressful and just as quickly forget about it. The prefrontal lobes can move that fast. But your emotional systems lag behind -- there's still cortisol floating in your bloodstream thirty seconds after the news vanishes from your working memory. And so the feeling stays alive in you.
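The timing mismatch can be made concrete with a toy model. The sixty-second half-life below is an invented illustration, not a physiological constant; the point is only that one variable flips instantly while the other decays slowly.

```python
import math

# Toy model of the desync: a thought leaves working memory instantly,
# while the hormonal stress response decays over tens of seconds.
# The half-life is invented for illustration, not physiology.

HALF_LIFE_S = 60.0
DECAY = math.log(2) / HALF_LIFE_S

in_working_memory = False  # the prefrontal side: a fast switch
stress_level = 0.0         # the limbic side: a slowly decaying level

def advance(dt_s):
    """Let dt_s seconds pass; the stress chemistry decays exponentially."""
    global stress_level
    stress_level *= math.exp(-DECAY * dt_s)

in_working_memory, stress_level = True, 1.0  # the bad news lands: both systems fire
in_working_memory = False                    # a distraction evicts the thought at once...
advance(30.0)                                # ...but thirty seconds later:
print(in_working_memory, round(stress_level, 2))  # False 0.71 -- the feeling outlives the thought
```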
The question is: for that moment of disconnect, what exactly is in charge here? Your frontal lobes or your limbic system? And which one should you trust?
Brain science books sometimes suffer from a recurrent problem, one with no small measure of irony. The subject matter of a book about the human brain is, by definition, as close to home as you get. (These books are being read by human brains, after all.) But the deeper you delve into the details of brain anatomy, the higher the ratio of Latinate to English words becomes, and before long the lay reader is struggling to keep track of names like the "cingulate cortex" and the "nucleus accumbens." Some books try to scale this learning curve by starting off with a crash course in neuroanatomy. My approach is different: we'll start instead with a brain in action -- feeling fear, laughing at a joke, coming up with a good idea -- and tease out the underlying mechanisms as we go.
I've also tried to limit the terminology needed to read this book: a half dozen chemicals, a half dozen brain regions, and a rudimentary understanding of the way neurons communicate. It is one of my fundamental assumptions that you can get something useful out of neuroscience with this level of mastery. (For the aficionados and the extracurious, I've included more detailed explanations in the endnotes.) The brain contains multitudes, as Whitman said in another context, but you don't need to memorize them all to be a better user of your brain. If you know the landmarks, you can get your bearings. And when you're navigating a space as complicated as your own brain, getting your bearings can make all the difference.
If you've read a little about the brain over the past decade, you've no doubt encountered two topics that have dominated the public discussion of brain science. The first has to do with explaining consciousness, what the neuroscientist Antonio Damasio calls "the feeling of what happens." The second has to do with the field of evolutionary psychology, which argues that our brains contain a kind of mental toolbox selected over millions of years of evolution to help our ancestors survive and reproduce in challenging environments. Consciousness and evolution are each fascinating avenues for exploration, but this book will try to sidestep both, in slightly different ways.
Let's start with consciousness. Imagine you're seeing the face of a loved one after a long time apart, and feeling the pleasurable emotions triggered by that sight. We know a great deal about the path of incoming visual stimuli, shuttling information about the light bouncing off the contours of the face from your optic nerve to the sensory cortex. We know that this information resonates with memory storage systems controlled by the hippocampus, helping you remember details about your loved one. We also know quite a bit about the chemicals released in your brain that conjure up the feeling of emotional warmth. Thanks both to modern imaging technologies and studies of patients with localized brain damage, we can describe with truly remarkable precision the neurological ballet performed in your head when you gaze at the face of a child or spouse. But our scientific vision grows foggier when we try to explain how those patterns of neurochemical activity somehow create your first-person experience of that gaze: the "faceness" of your loved one's face, the "emotionness" of the emotional feeling. Consciousness theorists call these properties "qualia": the brain's representation of both the external world and the body's internal state -- the taste of red wine, the look of light shimmering on water, the feeling of sudden fear hijacking your body.
It seems preposterous at first, but there is a real question as to why we need qualia at all. We could theoretically have evolved brains capable of the entire range of human mental responses -- processing internal and external stimuli, evaluating situations as either emotionally positive or negative, executing long-term plans -- without actually feeling any of these processes. We'd be like robots or zombies, indistinguishable from normal humans from the outside but empty on the inside. So the question becomes: how did this strange property of mind come about? The brain is ultimately just a big lump of atoms strung together in a particular configuration, no different in this sense from a teakettle or a crown of broccoli. Presumably the teakettle and the broccoli aren't conscious of themselves or their environment, so why should we be?
To simplify almost to the point of parody, there are four competing answers to that question on today's consciousness stage. The first is that the broccoli and the teakettle are conscious in some unimaginably different way from how we are. In other words, qualia is a property of matter itself, and the human brain is simply the most advanced qualia recording apparatus yet evolved. The second answer is that something unique exists in the configuration of cells that makes consciousness happen in brains and not in broccoli, though the nature of that something is a matter of great debate. The third answer implicates a mystery substance not yet understood by science -- quantum behavior, perhaps, or some kind of spiritual life force -- that turns a bunch of interconnected cells into a feeling brain. The fourth is the trick answer, proposing that one of the properties of consciousness is that it can't explain itself, and so we'll never get to the bottom of qualia no matter how scientifically and technologically adept we become.
These are all mesmerizing possibilities, even if they do tend to induce a kind of existential vertigo (or make you a little squeamish the next time you drop a piece of broccoli into a pot of boiling water). I wouldn't be at all surprised if one of the many theories of consciousness proposed in the past decade turns out to be largely correct. But science is very far from a consensus on this question right now, and I suspect it will remain in that state for the foreseeable future.
And so in this book, I've made it a matter of policy to avoid the question of consciousness as often as possible. Running away from the problem of qualia turns out to be a relatively healthy strategy, because there's a huge number of interesting and productive things that you can say about the brain without tackling the question of why consciousness feels the way it does. Think about my biofeedback session and my joke-telling adrenaline fix. Getting even that brief glimpse of my brain's chemical feedback system taught me something new about my personality and my conversational habits, and sharpened my awareness of the way making jokes changed my internal mood. (And explained why I sometimes had a tendency to make jokes inappropriately.) But despite these insights, I have no idea whatsoever why an adrenaline rush feels the way it does. I can describe its edgy uplift, compare it to the effects of exogenous drugs like caffeine, predict the ways it will change my subsequent behavior. But I can't tell you where the qualia of adrenaline comes from. It would be nice to know, of course, but fortunately it's not the only kind of knowledge that neuroscience can impart to us.
Then there's the evolutionary psychology debate, which runs parallel to -- and is often indistinguishable from -- the question of nature and nurture. Are our mental faculties simply the product of evolved genes, or are they shaped by the circumstances of our upbringing? Unlike the mysteries of consciousness, this question has a clear, and I believe convincing, answer: they're both. We are a mix of nature and nurture through and through, and it's precisely the interplay between evolved tools and cultural experience that creates the richness of the human condition.
In this book, I discuss some of the properties of the brain in terms of evolution, because a Darwinian perspective can sometimes illuminate features that might otherwise be shrouded in darkness, or help us understand drives and habits of mind that are unduly powerful or hard to shake. In chapter four, for instance, we'll look more closely at the brain science of laughter, and part of that analysis will touch on why laughter evolved in the first place, which in turn helps us understand something new about when and why we laugh in everyday life. (It has much less to do with humor than you might think.)
So evolutionary explanations will not be entirely absent from the chapters ahead, but neither will they be front and center. You can be agnostic about -- or downright hostile toward -- the premise of the evolved brain and still gain something from modern brain science, because on a basic level, the languages of nature and nurture are written in the same ink. My brain, for instance, may be releasing adrenaline with each successful punch line because millions of years of evolution endowed me with DNA that wired it that way. Or it may be that some unique set of circumstances from my childhood influenced that circuit in my brain. Most likely, of course, it's a bit of both: adrenaline release during laughter may be a common human trait, just a little exaggerated in my case. But whatever the original cause, the wiring is there in my head, releasing its adrenaline like some kind of neurochemical Old Faithful. It's fascinating to speculate whether a specific trait came from your ancestors or your fifth-grade teacher, but you don't need to have a convincing answer to learn about the inner life of your brain.
When public conversation turns to the way our biology shapes our behavior, we often encounter a quick denunciation of the entire premise: someone will claim that talking about minds in biological or Darwinian terms is "biological determinism," a highbrow, sanitized version of the old horrors of racism, eugenics, and social Darwinism. For the most part, these fears are unfounded. Evolutionary psychology addresses the shared characteristics of the human species, what unites us all irrespective of race or culture -- exactly the opposite of what a race-based inquiry into our biological roots would attempt to discover.
Of course, the one place in which the evolutionary psychologists have in fact emphasized differences over commonalities is the fraught world of the sexes. Because so much of natural selection is predicated on reproductive success or failure, and because men and women have such different biological stakes in the act of reproduction, and because the sexual divide has been evolving for hundreds of millions of years, and not hundreds of thousands -- it is inevitable that natural selection would craft slightly different toolboxes for each sex. Viewed with modern imaging technologies, men's and women's brains are nearly as distinct from each other as their bodies are. They have reliably different numbers of neurons and amounts of gray matter; some areas linked with sexuality and aggression are larger in men than in women; the left and right hemispheres are more tightly integrated in women than in men. And of course, those brains -- and the bodies they are attached to -- are partially shaped by two totally different kinds of hormones, the androgens and estrogens, which play a key role both in development and adult life experiences. Men and women are most certainly not from Mars and Venus, but it is entirely fair to say that they are on different drugs. A world in which the sexes were mentally indistinguishable might be a less conflict-ridden world, though also a little duller. But the truth is, it is not the world we inhabit. Writing a book about brain science without describing some of these differences would be an exercise in bad faith, emphasizing politics over science in a way that does injustice to both.
In the past few decades, a certain type of science story has become commonplace in the media. You've probably encountered dozens of renditions of it: scientists announce that they have uncovered the roots of a particular human psychological attribute. The two standard variations of this story are the brain scanning version and the evolutionary psychology version. In the former, scientists pick some trait or behavior -- a craving for sugar, say -- and use a brain-imaging device to scan someone while they're experiencing that craving. The part of the brain that lights up during the scan -- the dorsal striatum, in this case -- is identified as the "craving center" of the brain, and before long a press release is being drafted.
The evolutionary psychology version of the same story follows a different path. Instead of locating neurological roots, the scientists discover historical roots: the evolutionary history of why one trait came to be selected. This is a more speculative science, but a powerful one nonetheless. It takes an explanatory approach, not just a descriptive one, trying to answer the ultimate question of why we are the way we are. So the evolutionary psychologists explain that we have sugar cravings because carbohydrates were rare on the savannahs of Africa where the modern human brain evolved. A rule of thumb that was adaptive in one environment (if you happen to find sugar, eat as much of it as you can) turns out to be maladaptive in an environment where Coca-Cola is practically in the water supply.
These two stories are intriguing ones, and there's much to be learned from both approaches. But neither story tells you something about your own present-tense experience that you don't know already. You're already familiar with your sugar cravings, and while it's nice to learn about their origins, knowing the role of the dorsal striatum won't help much the next time you're salivating over that Mars bar. If science is going to tell you something useful about your brain, it has to go beyond simply explaining the roots of some familiar mental phenomenon. Your brain is filled with a lively cast of characters sharing space inside your cranium, and while it's interesting to find out their exact addresses, that information is ultimately unsatisfying. Call it the "neuromap fallacy." If neuroscience turns out to be mostly good at telling us the location of the "food craving center," or the "jealousy center," then it will be of limited relevance to ordinary people seeking a new kind of self-awareness -- because learning where jealousy lives in your head doesn't make you understand the emotion any more clearly. Those neuromaps will be of great interest to scientists, of course, and doctors. But to the layperson, they'll be little more than trivia.
The best that the brain sciences offer comes in the form of genuine insights, insights in both senses of the word: a looking within and a new way of understanding. To that end, I have applied a test of sorts to the stories I've assembled for this book. I call it the "long-decay" test -- as with a sound wave that takes an extended time to trail off into silence (or a radioactive material with a long half-life). There are insights about the brain that prompt a quick burst of recognition -- "So that's where the food craving comes from!" -- and then just as quickly fade in the mind. These insights fail the long-decay test -- they don't stick with you in any profound way. To pass the test, the insight has to reverberate for weeks or months after you've first encountered it; it has to pop up in conversation or in moments of self-reflection; it may even change your behavior based on what it teaches you about yourself. Long-decay ideas transform as much as they inform.
For the most part, the long-decay ideas I've assembled here have direct relevance to ordinary minds, minds untroubled by the extreme conditions profiled in so much of the scientific literature: amnesia, Parkinson's, Alzheimer's, manic-depression, the many forms of aphasia. The most powerful theories of mind have always had something useful to contribute to generally healthy minds and not just troubled ones. Freud developed his theories partially by analyzing the debilitating disorders of hysterics and schizophrenics, but psychoanalysis ultimately attracted such a large audience because you didn't need to be mentally ill to find something useful in it. You could explore your Oedipal complex and analyze your dreams even if you weren't worried about your sanity. I believe modern neuroscience deserves to be seen the same way: as relevant to the healthy as it is to the ill, as relevant to those of us wrestling with the small triumphs and tragedies of everyday life as it is to those battling more forbidding demons.
Enough disclaimers. I've tried to write what follows not as a polemic or a broadside, but as a kind of appreciation. Think of the way an art historian or a musicologist can help you discern new qualities in a great painting or symphony; your perception widens when you look through their eyes or listen with their ears. Brain experts can help us do the same with our own mental life. Under their tutelage, we start noticing reflexes and patterns hitherto invisible to us. Knowing something about the brain's mechanics -- and particularly your brain's mechanics -- widens your own self-awareness as powerfully as any therapy or meditation or drug. Brain science has become an avenue for introspection, a way of bridging the physiological reality of your brain with the mental life you already inhabit. The science and technology today are no longer limited to telling us how the mind works. They also have something to say about how your mind works.
Unlike so many technoscientific advances, the brain sciences and their imaging technologies are, almost by definition, a kind of mirror. They capture what our brains are doing and reflect that information back to us. You gaze into the glass, and the reflection says to you, "Here is your brain." This book is the story of my journey into that mirror.
Copyright © 2004 by Steven Johnson
Chapter One: Mind Sight
"He that has eyes to see and ears to hear may convince himself that no mortal can keep a secret. If his lips are silent, he chatters with his fingertips; betrayal oozes out of him at every pore."
-- Freud
I'm gazing into a pair of eyes, scanning the arch of the brow, the hooded lids, trying to gauge whether they're signaling defiance or panic. Just a pair of eyes -- no mouth or torso, no hand gestures or vocal inflections. All I have to go on is a rectangular photo of two eyes staring at me from a computer screen. When I've made my judgment -- it's defiance, after all -- another set pops on the screen, and I start my examination all over again.
This reverse eye exam is part of an ingenious psychological test devised by the British psychologist Simon Baron-Cohen. The test presents you with thirty-six different sets of eyes, some crinkled in mirth, others gazing off to the horizon deep in thought. Below each image are four adjectives.
It's your job to choose the adjective that best fits the image. Is that raised eyebrow a sign of doubt? Or is it rebuke? The eyes themselves are a demographic mix: some weathered and ancient, others accented with mascara and eyeliner. The subtlety of the expressions is astonishing; as I scroll from image to image, I'm seeing the human eye with a fresh perspective, feeling a newfound amazement at its communicative range.
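Mechanically, the test is simple enough to sketch in a few lines of code. The items below are invented placeholders, not Baron-Cohen's actual images or word lists.

```python
# Sketch of administering and scoring a Reading-the-Mind-in-the-Eyes-style
# test. Items, choices, and answers are invented placeholders.

ITEMS = [
    {"image": "eyes_01.png",
     "choices": ["defiant", "panicked", "amused", "bored"], "answer": "defiant"},
    {"image": "eyes_02.png",
     "choices": ["doubtful", "reproachful", "serene", "flirtatious"], "answer": "reproachful"},
]

def score(responses):
    """responses: one chosen adjective per item, in order."""
    return sum(1 for item, choice in zip(ITEMS, responses)
               if choice == item["answer"])

print(score(["defiant", "serene"]), "out of", len(ITEMS))  # 1 out of 2
```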
This test, though, is not ultimately about the eye's capacity to signal emotion. It's about something just as impressive, and just as easily overlooked: the brain's ability to read those signals, to peer into the inner landscape of another mind, while relying only on the most transient of cues. You won't find exam questions like these on the IQ test, or the SATs, but the mental skills being measured here are as essential as any in our cognitive toolbox. It turns out that one of the human brain's greatest evolutionary achievements is its ability to model the mental events occurring in other brains.
Chances are you've had an experience roughly like this: you're at a social gathering with colleagues or peers -- say it's an office holiday party -- and you run into a coworker with whom you have an unspoken rivalry. It's one of those relationships that is chummy on the surface, but right beneath there's a competitive energy that neither side acknowledges. When you first encounter your colleague, there's the usual pleasant banter, but before long he's confessed to you that something has gone wrong with his career trajectory: either he's lost a big account at work or the fellowship didn't come through or the last batch of short stories got rejected. Whatever it is, it's bad news. It's the sort of news that a friend should perhaps greet with a concerned, doleful expression, which is exactly the expression that you deliberately contort your face into as he delivers the news.
The trouble is, you're only a friend on the surface. Below the surface, you're a rival, and a rival wants to grin at this news, wants to relish the schadenfreude. And so for a split second, as you're hearing the fateful syllables roll off his tongue, his tone foreshadowing his disappointment before the sentence is even complete, you let out the slightest hint of a grin.
And then an intricate dance begins. As your face wraps itself up in dutiful concern, you detect a flash of something in his face, a momentary startle that says, "Were you just smiling right there?" Perhaps his eyes suddenly lock on to your pupils, or he pauses in midsentence as though something has distracted him. In your mind, an interior closed-captioning emerges: "Did he see that grin?" As you offer your condolences, you can't help wondering if your words sound cruel rather than comforting. "Is he thinking that I'm faking all this sympathy? Maybe I should tone it down a notch just in case."
The silent duet of those two internal monologues should be familiar to you, even if you're the sort of person who never, ever gloats at another's downfall. (Henry James made a literary career out of documenting these subtle interactions.) It needn't be a Cheshire cat grin that provokes the interior monologues: imagine a conversation between two potential lovers, in which one worries that a facial expression has betrayed his love before he has summoned the courage to make a formal declaration. Sometimes the closed-captioning can overshadow the main dialogue, which can make for stilted conversation, with each participant second-guessing the other's thoughts.
This silent conversation -- a passing grin, a sudden look of recognition, a lurking question about another's motivation -- comes so naturally to us that most of the time we're not even aware that we are locked into such a complex exchange. The internal duet comes naturally because it relies on parts of the brain that specialize in precisely this kind of social interaction. Neuroscientists refer to this phenomenon as "mindreading" -- not in the ESP sense, but rather in the more prosaic, but no less impressive, sense of building an educated guess about what someone else is thinking. Mindreading is literally part of our nature. We do it more effortlessly, and with more nuance, than any other species on the planet. We construct working hypotheses about what's going on in other people's heads almost as readily as we convert oxygen into carbon dioxide.
Because mindreading is part of our nature, we don't bother to teach it in schools or test our aptitude for it in placement exams. But it is a skill like any other, a skill that is unevenly distributed throughout the general population. Some people are deft mindreaders, picking up subtle intonational shifts and adjusting their response with imperceptible ease. Others mindread with the subtlety of a Mack truck, constantly second-guessing themselves or interrogating their conversational partners. Some are simply "mindblind," shut off entirely from other people's internal monologues.
Even though we don't teach this particular skill in school, and we barely have a vocabulary to describe it, our mindreading abilities play a key role in our work and relationship successes, our sense of humor, our social ease. But to understand these consequences, you have to stop taking the internal duet for granted. You have to slow it down, explore its underlying processes, recognize the duet for the marvel that it is.
Our growing appreciation for the art of mindreading was accelerated in the late 1990s by the discovery of "mirror neurons" in the brains of monkeys, neurons that fire both when a monkey does a particular task -- grabbing a branch, for example -- and when the monkey sees another monkey do that same task, suggesting that the brain is designed to draw analogies between our own mental and physical states and those of other individuals. At the same time, researchers explored the premise that autistic people suffer from a kind of mindblindness, preventing them from building hypotheses about others' internal monologues. In related studies, evolutionary psychologists began to think about the Darwinian rewards of mindreading in a social species, examining chimp populations for signs of comparable internal duets. Yet other scientists speculated on the connection between mirror neurons and the origins of language, since all forms of communication presuppose a working model of the object you're attempting to communicate with. For language to evolve, humans needed a viable theory about the minds of other people -- otherwise, they'd just be talking to themselves.
Let's now go back to that silent duet at the office party, to the moment that half-concealed grin leaks out of the side of your mouth before you can replace it with the look of sympathy. What's happening here? Most of the time you walk around with the assumption that you're the boss of you, that you have a unified self that controls your actions in a relatively straightforward way. But your telltale grin challenges most of our assumptions about this selfhood, because at that moment at the office party, you are trying your hardest to do the exact opposite of smiling; you're trying to look concerned and upset, full of compassion. But your mouth wants to smile. Whose mouth is it anyway?
The answer is that your mouth has several masters, and some of them are brain subsystems that regulate emotional states. Smiling at times of genuine pleasure is not a learned behavior; every recorded culture on the planet represents the internal mental state of happiness with a smile. Deaf-blind children start smiling on the exact same developmental timetable as children who can see and hear. Cultures certainly differ in their assumptions about what makes people happy, as the popularity of frog's legs and Steven Seagal movies in France will attest. And cultures also differ in their production of fake smiles, as in the beaming "bye-bye nows" of American flight attendants. But genuine happiness -- whatever the details of its origin -- expresses itself as a smile in all normal Homo sapiens.
Ironically, the forced smile of the flight attendant demonstrates just how innate the smiling reflex really is. A century and a half ago, the French neurologist Duchenne de Boulogne began studying the muscular underpinnings of people's facial expressions, using the then-state-of-the-art technologies of photography and electricity. Duchenne photographed his subjects in various emotional states, and tried to simulate their expressions automatically by activating specific muscles with a small jolt of electric current. (The images from Duchenne's experiments look like something from a Nine Inch Nails video.) In 1862, he published his findings in a volume titled Mechanism of the Human Physiognomy, which Darwin drew upon extensively ten years later in his best-selling The Expression of the Emotions in Man and Animals. But Duchenne's research soon fell into oblivion, only to be rediscovered more than a century later by the University of California at San Francisco psychologist Paul Ekman, now generally considered to be the world's leading expert on facial expressions.
The most widely cited discovery in Duchenne's work involved smiling. Using his crude tools, Duchenne established that genuine smiles and fake smiles utilize completely distinct ensembles of facial muscles -- most visible in the eyes, which crinkle in real smiles but remain unchanged in the faux ones. (As a tribute to his long-neglected forebear, Ekman began referring to the genuine article as a "Duchenne smile.") The muscle that controls eye-smiling is called the orbicularis oculi, and its activation has proved to be a reliable indicator of internal happiness or mirth. Modern brain scans show that pleasure centers in the brain light up in sync with the orbicularis oculi, but show no activity during fake smiles created with the mouth alone. The next time you want to know if your beaming waiter truly wants you to have a nice day, check out the outer edges of his eyebrows; if they don't dip slightly when he smiles, he's faking it.
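The muscle pairing lends itself to a toy rule. Here is a minimal sketch that assumes activation readings scaled from zero to one (say, from facial-landmark tracking or electromyography); the threshold is an invented illustration, not Ekman's criterion.

```python
# Toy Duchenne-vs-posed smile classifier, following the muscle pairing
# described above. Activation values (0-1) and the threshold are
# invented for illustration.

def classify_smile(zygomatic_major: float, orbicularis_oculi: float,
                   threshold: float = 0.3) -> str:
    """Mouth corners alone make a smile; the eye crinkle marks it genuine."""
    if zygomatic_major < threshold:
        return "no smile"
    if orbicularis_oculi >= threshold:
        return "Duchenne (genuine) smile"  # mouth plus eye crinkle
    return "posed (non-Duchenne) smile"    # mouth only, eyes unchanged

print(classify_smile(0.8, 0.7))  # Duchenne (genuine) smile
print(classify_smile(0.8, 0.1))  # posed (non-Duchenne) smile
```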
Duchenne's insights into the muscular underpinnings of the smile make it easier to detect counterfeit good cheer, but they also teach us a more important lesson about selfhood and the emotions. Duchenne smiles are not willed deliberately into existence. You can consciously paint a fake smile on your face, but a real one erupts through a process that your conscious mind controls only in part. This is demonstrated most vividly in studies of stroke victims who suffer from a disturbing condition known as central facial paralysis, which prevents them from voluntarily moving either the left or right side of their face, depending on the location of the neurological damage. When these individuals are asked to smile or laugh on command, they produce lopsided grins: one side of the mouth curls up, the other remains frozen. But when they're told a joke or they're tickled, full smiles animate their faces.
This is why the smile has more than one master: sometimes it is triggered by the emotional systems, other times by areas that control voluntary facial movement. (Of course, depending on the brain region, the smile will differ slightly in its expression.) So that inadvertent grin that slips out at the news of your rival's misfortune? It's the result of two brain systems vying for control of the same face. The part of the brain that controls voluntary muscle movement -- called the motor cortex -- sends a command instructing the face to appear sympathetic. But your emotional system is requesting a toothy grin. Your face can't satisfy both requests at the same time, so what results is a little bit of both: a grin that swiftly morphs into an expression of worried sincerity.
And herein lies lesson one of that office party encounter: your brain is not a general-purpose computer with one unified central processor. It is an assemblage of competing subsystems -- sometimes called "modules" -- specialized for particular tasks. Most of the time, we only notice these modules when their goals are out of sync. When they work together, they coalesce into a unified sense of self. The idea of multiple selfhood is not, strictly speaking, a discovery of the brain sciences. There's a long tradition of artists and philosophers documenting how fragmented we are below the surface, most notably in the modernist writers who pried open the psyche a century ago. Here's Virginia Woolf describing the struggle between the two models of self in Mrs. Dalloway:
How many million times she had seen her face, and always with the same imperceptible contraction! She pursed her lips when she looked in the glass. It was to give her face point. That was her self -- pointed; dartlike; definite. That was her self when some effort, some call on her to be her self, drew the parts together, she alone knew how different, how incompatible and composed so for the world only into one centre, one diamond, one woman who sat in her drawing-room and made a meeting-point...
Freud famously envisioned the psyche as a battleground among three competing forces: id, superego, and ego. The modern understanding of the brain shatters that earlier vision into dozens of component parts, some specializing in core survival tasks, such as heartbeat regulation and the fight-or-flight instinct, others focused on more prosaic skills, such as face recognition. Your personality is, in a real sense, the aggregate of the differing strengths of each of these modules -- as they have been shaped both by nature and nurture, by your genes and by your lived experience. In other words: you are the sum of your modules.
If the modular nature of the mind is often hidden from us, how can we see behind the curtain of the unified self and catch a glimpse of those interacting components? Several avenues are available to us. There are the studies of pathological cases popularized by books such as Oliver Sacks's The Man Who Mistook His Wife for a Hat, in which we detect the existence of modules through patients who have suffered targeted brain damage that takes out one or two modules but leaves the rest of the brain functioning normally. Or we can experience the modularity of the brain more directly by taking drugs that throw a monkey wrench into its machinery, causing individual modules to take on a new autonomy (which is why people on drugs often feel as though they hear voices). Or you can gaze inside your brain directly, using today's brain-imaging technologies.
Another more entertaining way into the modular mind is through the back door of illusions and various tricks of the mind. Optical illusions help reveal modules by triggering conflicts between different submodules in the visual system: modules for distinguishing between background and foreground, recognizing borders between objects, or locating objects in 3-D space. Remember the childhood game of spinning in place and then stopping quickly to feel the spinning continue? In this game, as you turn, objects in the room pass by you in a counterclockwise direction. But when you stop, you feel a sense of vertigo, and the room seems to be spinning around you in the reverse direction, as though you were standing at the motionless center of a merry-go-round. Why does the room seem to spin after you've stopped moving? And why does it appear to spin in the other direction?
This staple of early childhood play reveals the brain's modular approach to detecting motion. The part of the brain that evaluates whether you're moving relies on two primary sources: information from the visual field and information from the fluid sloshing around in your inner ear. Most of the time, those two lieutenants concur in their assessments to their commander, but when you stop suddenly after spinning clockwise, the liquid in your inner ear continues to move around for a few seconds more, while your vision responds instantly to the cessation of movement. So the motion-sensing centers of the brain are taking in conflicting data: the inner ear reports you're still moving, while the eyes report that you're at rest. The only way the brain can resolve this conflict is to assume that both reports are correct: you are still spinning, but it doesn't seem that way because the world around you is spinning right along with you. The illusion of the world rotating is actually a brilliant on-the-fly interpretation that your brain makes to reconcile the conflicting data it receives. It's not the correct interpretation, of course, but it's a revealing one.
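That on-the-fly reconciliation can be caricatured in a few lines of code; the signal values and tolerance below are invented, and the point is only the interpretive move of honoring both reports.

```python
# Cartoon of the brain's motion "vote": the inner ear and the eyes each
# report a rotation rate, and a conflict gets explained away as motion
# of the world. All values are invented for illustration.

def interpret_motion(vestibular_dps: float, visual_dps: float,
                     tolerance_dps: float = 5.0) -> str:
    """Both inputs report rotation in degrees per second."""
    if abs(vestibular_dps - visual_dps) <= tolerance_dps:
        rate = (vestibular_dps + visual_dps) / 2
        return f"you are rotating at about {rate:.0f} deg/s"
    # Conflict: trust both reports by attributing the gap to the world.
    gap = vestibular_dps - visual_dps
    return (f"you are rotating at {vestibular_dps:.0f} deg/s and the world "
            f"is spinning along with you at {gap:.0f} deg/s")

print(interpret_motion(90, 90))  # mid-spin: the two reports agree
print(interpret_motion(60, 0))   # just stopped: inner ear lags, room "spins"
```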
Module disagreement is not a bad way of describing the ultimate cause behind that inadvertent grin at the office party: part of your brain wants to smile, and part of it wants to show sympathy. The result is a kind of "slip of the face": the mouth and eyes betraying an emotion that the social self wants suppressed. The lesson here is that the control structures between modules often matter as much as the strength or weakness of each module itself. The brain is a network, and the way that each node in that network communicates with other nodes is an essential part of its higher-level properties. Even among the macrostructures of the brain, the connections made are as important as the individual structures themselves. One notable difference between male and female neuroanatomy is the communication channel that connects the left and right hemispheres, called the corpus callosum, which is much larger in women than in men. We now believe that this increased connectivity enables women to do a better job than men at reconciling the sometimes conflicting interpretations offered up by each hemisphere.
Some people are good at suppressing grins, while others are lousy at it. Some modules are better at overriding other modules; some are more submissive. Understood in the broadest sense, the process of growing up can be seen as the slow subjugation of emotional centers -- such as the amygdala, which plays an essential role in fear responses -- by the more recently evolved regions of the brain located in the prefrontal cortex that control voluntary actions, long-term planning, and other higher functions. Infants are born with relatively well-developed amygdalas, which is why they're so good at being frightened right out of the gate. But their prefrontal regions take most of childhood to mature.
So not only is the mind a network of distinct modules, but those modules sometimes compete with each other. The brain's modular system cannot be imagined as a neurological report card, with a B+ for face recognition and a failing grade for mindreading. This is because the modules interact with each other, sometimes inhibiting, sometimes amplifying, sometimes translating or interpreting in novel ways. The brain is much more like an ecosystem than a list of stable personality traits, with modules simultaneously competing and relying on each other. Hence lesson two: It's a jungle in there.
So if we now understand something about that renegade grin, what can we say about its detection? The silent duet of mindreading begins in your colleague's brain when he first thinks to himself, midsentence, that you might be quietly celebrating his bad news. It's fitting that the telltale sign is the crinkling of your eyes, as your orbicularis oculi betrays your inner state. Mindreading is in many ways a kind of eye-reading -- we learn a great deal about the content of other people's thoughts by watching their eyes. Eyes are essential to building what brain scientists call a "theory of other minds."
The connection between mindreading and eye-reading begins early in child development -- so early, in fact, that it is unlikely to be the product of learned behavior. In their first year, most children will become adept at something called "gaze monitoring": they see you looking off toward the corner of the room; they turn and look in that direction; then they check back to make sure the two of you are looking at the same thing. Because we do it so well, gaze monitoring doesn't seem like much of an accomplishment, but it requires an elaborate understanding of the human visual apparatus, too elaborate to be purely the product of cultural learning.
Think about what's implied in gaze monitoring. First, you have to understand that people have their own perceptions of the world, distinct from yours. Second, some of those perceptions flow into their mind through their eyes. Third, you can determine the objects people perceive by drawing a straight line from the black circles in the middle of their eyes outward. Fourth, when those black circles shift, that means the gaze has shifted to another object. Consequently, if you want to know what another person is perceiving, you follow the movement of those black circles, and then shift your own gaze toward the object they're focused on.
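The third step, in particular, is pure geometry. Here is a two-dimensional toy of following the gaze line outward to an object; the positions and the angular tolerance are invented for illustration.

```python
import math

# Toy gaze-following: draw a line from the eyes along the gaze direction
# and pick the object nearest that line. 2-D positions and the angular
# tolerance are invented for illustration.

def gazed_object(eye_xy, gaze_dir_xy, objects, max_angle_deg=10.0):
    """Return the object closest in angle to the gaze ray, if any is near it."""
    best, best_angle = None, max_angle_deg
    for name, (ox, oy) in objects.items():
        to_obj = (ox - eye_xy[0], oy - eye_xy[1])
        dot = gaze_dir_xy[0] * to_obj[0] + gaze_dir_xy[1] * to_obj[1]
        norms = math.hypot(*gaze_dir_xy) * math.hypot(*to_obj)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

objects = {"door": (5.0, 0.0), "window": (0.0, 5.0)}
print(gazed_object((0.0, 0.0), (1.0, 0.1), objects))  # door
```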
If the gaze-monitoring skill were purely a learned behavior, it would take a month of school and a four-year-old's brain to master it. Infants can barely be taught how to use a spoon, much less how to track retinal movements and deduce inner mental states. They can't learn gaze monitoring, but they do it nonetheless -- because their brains contain a cheat sheet of sorts that prepares them for the underlying principles of gaze monitoring, a kind of psychological physics: people have minds; people's minds perceive different things; part of that perception happens through the eyes; if you want to know what someone's thinking, look at his eyes. These biological cues start early in life: one study found that two-month-old infants were more likely to stare at the eyes than at any other part of the face.
As we grow older, we scrutinize people's eyes for subtler cues: not just what they're looking at, but what they're thinking and feeling. Because our emotional systems are wired directly to our facial muscles, à la the Duchenne smile, we often get accurate portraits of other people's moods just by scanning their eyes or the corners of their mouth. As our office party exchange shows, sometimes that portrait gives a more accurate testimony than people's verbal descriptions of their moods. Who are you going to believe -- me or my lying eyes?
Gaze monitoring and emotional expression recognition are two of the fundamental mindreading systems, but we also use other tricks. We monitor speech intonation carefully for emotional nuance. We put ourselves into other people's mental shoes -- what cognitive scientists call the "simulation theory" of mindreading, according to which your brain effectively runs a mini-simulation of the other person's mind to anticipate how he or she might feel.
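As a toy rendering of that idea -- the appraisal rule below is deliberately crude and entirely invented, a sketch of the concept rather than anything from the cognitive-science literature -- the trick is simply to reuse your own emotional machinery with someone else's inputs:

def appraise(event, stakes):
    # My own first-person appraisal: good or bad events, at high or
    # low stakes, map onto coarse feelings.
    if event == "good":
        return "elated" if stakes == "high" else "pleased"
    return "distraught" if stakes == "high" else "annoyed"

def simulate_other(their_event, their_stakes):
    # Run the very same routine, swapping in the other person's
    # situation for my own.
    return appraise(their_event, their_stakes)

print(simulate_other("bad", "high"))  # -> distraught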
Your brain runs all these routines any time you interact with other people. It takes careful training, or massive distraction, to stop your mind from inferring other people's mental states as you talk to them. Mindreading is a background process that feeds into our foreground processes; we're aware of the insights it gives us but usually not aware of how we're actually getting that information, and how good we are at extracting it.
The sophistication of our mindreading skills is part of our heritage as social primates; our biology contains cheat sheets for building theories about other minds because our brains evolved -- and continue to evolve -- in complex social environments where being able to outfox or cooperate with your fellow humans was essential to survival. So just as some animals evolved nervous systems adapted for sudden movement or sonar, our brains grew increasingly sophisticated at modeling the behavior of other brains. An entire host of neurological systems revolves around the expectation that you will spend much of your life managing social relationships of one sort or another. Your brain is wired to expect an environment with oxygen, gravity, and light. It's also wired to expect an environment populated by other brains. Hence lesson three: Deep down, we're all extroverts.
We're all extroverts, except those of us whose brains have developed without the normal mindreading systems. There are dozens of neurological disorders that compromise social skills, but few are more common than the family of conditions that we generally call "autism."
Autistic people possess many skills lacking in the normal population: they often have nearly photographic memories and astonishing mathematical abilities. Their ease with mechanical systems, including computers, can be extraordinary. But autism impairs social skills dramatically. While autistic people can usually learn and communicate using language, there is something missing in their exchanges with other people, some strange distance in their social demeanor. They seem emotionally remote, disconnected.
Many experts now believe that this distance derives from a distinct neurological condition: autistics are mindreading-impaired. The social distance associated with autism is a vivid example of the brain's modular nature: autistics generally have above-average IQs, and their general logic skills are impeccable. But they lack social intelligence, particularly the ability to make on-the-fly assessments of other people's inner thoughts. Autistic people do have to go to school to read facial expressions -- learning to intuit another person's mood is at least as challenging for them as learning to read is for the rest of us. When you're engaged in conversation, you don't think to yourself, "Aha! His right eyebrow just crinkled up. He must be happy." You just sense that there's a happy expression on his face. But autistics have to perform precisely that kind of deliberate analysis, memorizing which expressions are associated with which emotions and then studying people's faces actively as they talk, looking for signs. One of the early predictors of autism in toddlers is an inability to perform gaze monitoring. It's as though autistics are born without the social physics that the rest of us possess innately, as though they were mindblind.
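To make the contrast concrete, here is a hedged sketch of that deliberate strategy -- an explicit, memorized lookup from facial cues to moods. The cue names and rules are invented for illustration; what matters is that every step is a table consultation rather than an automatic intuition.

# Memorized cue-to-mood rules, consulted one by one.
RULES = {
    frozenset({"eye_crinkle", "mouth_corners_up"}): "happy",
    frozenset({"brow_furrow", "lips_pressed"}): "angry",
    frozenset({"brow_raise", "mouth_open"}): "surprised",
}

def read_expression(observed_cues):
    # Match the observed cues against each memorized pattern; return
    # the first mood whose cues are all present, else "unknown."
    cues = set(observed_cues)
    for pattern, mood in RULES.items():
        if pattern <= cues:
            return mood
    return "unknown"

print(read_expression(["eye_crinkle", "mouth_corners_up"]))  # -> happy

The rest of us arrive at the same answer without ever seeing the table.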
Simon Baron-Cohen believes that the symptoms of autism exist on a continuum: while some people clearly suffer from extreme cases, millions suffer only from minor cases of mindblindness. (Because autism is ten times more likely to develop in boys than girls, Baron-Cohen has argued that the disorder should be considered simply an extreme version of the male brain's tendencies, rather than a disconnected aberration.) The history of mathematics and physics is populated by borderline autistics: people with great number skills but limited social grace. We all know bright people who perform poorly in social situations, seem disengaged in conversation, or fail to pick up on our emotional cues. Even if you're a particularly astute mindreader, you probably have your own "autistic moments" in passing, when you're conducting a conversation on autopilot, lost in your own internal monologue. If you spend enough time with the literature, you can't help dividing up your friends and colleagues into the talented mindreaders and the mind-dyslexics. You start evaluating your own prowess as you engage with other people. Mindreading becomes a part of your basic vocabulary for evaluating yourself and others: some people have a sharp sense of humor, some are quick learners, some are good mindreaders.
If autism exists on a continuum, then it's possible to locate yourself on that continuum. You can take a simple test called the Autism Spectrum Quotient that Baron-Cohen and his colleagues created -- answer fifty questions about yourself on a Web page, and a simple program spits out a number between 1 and 32. The higher the number, the closer you are to autism. (The median result is 16.4.) It's not exactly hard science because it relies on self-evaluation and the questions themselves are relatively broad. But if you trust your ability to assess the general areas of your personality, the test provides a rough sketch of your autism quotient (otherwise known as "AQ").
The questions are phrased as statements with which you can "definitely agree," "slightly agree," "slightly disagree," or "definitely disagree."
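Mechanically, a test like this is easy to score. The sketch below is illustrative only: the statements and their scoring directions are invented placeholders, not Baron-Cohen's actual items or key, and it assumes each autism-consistent answer earns one point, with "slightly" and "definitely" collapsed together.

AGREE = {"definitely agree", "slightly agree"}

# Each item pairs a statement with the answer direction that scores
# a point (True means agreement scores, False means disagreement).
ITEMS = [
    ("I like to plan my day in the same fixed order.", True),
    ("I find it easy to chat with new people.", False),
    # ...the real questionnaire has fifty such items
]

def score(responses):
    # responses: one answer string per item, in order.
    total = 0
    for (statement, agree_scores), answer in zip(ITEMS, responses):
        agreed = answer.lower() in AGREE
        if agreed == agree_scores:
            total += 1
    return total

print(score(["slightly agree", "definitely disagree"]))  # -> 2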
If you've read something about autism, or the theory of other minds, these questions will seem predictable enough. When I took the test -- if you must know, I scored a 15, just slightly less autistic than average -- I flipped through the questions with a kind of jaded awareness: here's the facial expression question, here's the number memory question. It was only when I went back and reviewed the exam that I realized my familiarity with the topic had blinded me to something fascinating about the test itself.
Think about two of the test's statements: "I am not very good at remembering phone numbers" and "I don't usually notice small changes in a situation or a person's appearance." Now, if you come to the test knowing something about autism, you'll instantly deposit those two statements on opposite ends of the AQ spectrum. An autistic person, you'll think, will be good at remembering phone numbers and bad at noticing small changes in someone's appearance. But if you don't know anything about autism, if you're just coming to the test with a commonsense understanding of human psychology, then those two attributes will hardly seem like opposites. You'd probably think someone with a good memory for phone numbers would be more likely to notice small changes in appearance: she'd be detail-oriented, good at keeping track of small things. Certainly these don't seem like traits that would naturally be opposed to one another. But if you know something about the brain science behind autism, the fact that the two traits are inversely related makes perfect sense, because number skills and mindreading skills aren't simply the result of general intelligence; they're specialized modules, modules that for some as-yet-unknown reason have been yoked together in the brain's wiring.
This is one of the key insights that neuroscience brings to our sense of self: stren
Excerpted from Mind Wide Open: Your Brain and the Neuroscience of Everyday Life by Steven Johnson