The Science of the Human Brain

It is the most complex living system in the known universe, built of hundreds of billions of cells, each as complicated as a city.

It is the primary author of the deeply personal story we tell ourselves about who we are and why we are here.

And it never, ever, shows us the world as it truly is — only as we need it to be.

This is the conundrum of the human brain, and it is why understanding its peculiar science is a prerequisite for imagining, and then building, a better world.

Consider this: human beings — like every other life form on Earth — see only what has helped them survive in the past. The way we make sense of the present is still being shaped by the innovations, fears, and assumptions of our ancestors from hundreds, thousands, even millions of years ago. And yet our success as a species has occurred not in spite of our inability to see reality, but because of it.

As neuroscientist Beau Lotto puts it, human beings didn’t evolve to see the world objectively; they evolved to “not die.”  And all living things have managed to “not die” thus far by developing different perceptual overlays on the same planetary backdrop.

Bats developed echolocation.

Cephalopods learned to change colors.

Birds got GPS.

And we developed a wet computer with a parallel processor — otherwise known as a bi-hemispheric brain.

Why did our brains evolve that way, with a singular blueprint stamped out twice? No one knows for sure, although psychiatrist Iain McGilchrist believes it happened to help us “attend to the world in two completely different ways, and in so doing to bring two different worlds into being,” and two different ways of surviving life’s slings and arrows.

“In the one,” he explains, “we experience — the live, complex, embodied, world of individual, always unique beings, forever in flux, a net of interdependencies, forming and reforming wholes, a world in which we are deeply connected. In the other we ‘experience’ our experience in a special way: a ‘re-presented’ version of it, containing now static, separable, bounded, but essentially fragmented entities, grouped into classes, on which predictions can be based.

“These are not different ways of thinking about the world,” McGilchrist claims. “They are different ways of being in the world. If the left hemisphere is the hemisphere of ‘what’, the right hemisphere, with its preoccupation with context, the relational aspects of experience, emotion and the nuances of expression, could be said to be the hemisphere of ‘how.’”

Who we perceive ourselves to be is a mashup of these different perceptual overlays — ostensibly equal parts logic and emotion. And yet the reality of the modern world, says McGilchrist, is that our way of seeing has gotten out of balance.

For a number of reasons, we have become left-hemisphere heavy.

“My thesis is that for us as human beings there are two fundamentally opposed realities, two different modes of existence; that each is of ultimate importance in bringing about the recognizably human world; and that their difference is rooted in the bihemispheric structure of the brain. It follows,” he asserts, “that the hemispheres need to cooperate, but I believe they are in fact involved in a sort of power struggle, and that this explains many aspects of contemporary Western culture.”

Is it possible to bring ourselves back in balance? Harvard neuroscientist Jill Bolte Taylor would say yes — and she would know. In 1996, a blood vessel burst in the left half of her brain. “In the course of four hours,” she explains, “I watched my brain completely deteriorate in its ability to process all information.”

Because Taylor had spent her adult life studying the brain’s intricate architecture, she was uniquely suited to observe how the stroke was affecting her. What she discovered has huge implications for how all of us think about who (and why) we are in the world.

But first, a refresher course:

On one side of our brain, there is the right hemisphere, which is designed to help us remember things as they relate to one another. “To the right mind,” Taylor says, “no time exists other than the present moment, and each moment is vibrant with sensation. By design, our right mind is spontaneous, carefree, and imaginative. It allows our artistic juices to flow free without inhibition or judgment.” And it gives us the sense that we are indistinguishable, infinite, and interconnected.

By contrast, our left hemisphere organizes the world for us in a linear and methodical way. “Here,” says Taylor, “the concept of time is understood as either past, present or future, and our left brain thrives on details. Here, we use words to describe, categorize, define, and communicate about everything we see.” And here, we find the part of ourselves that feels most distinct — the part that proclaims, ‘I am,’ apart from the world.

Because Taylor’s stroke occurred in the left side of her brain, her right hemisphere was left unchecked by its usual counterbalance. As a result, she experienced a drastically different way of seeing the world (and her role in it). “I no longer perceived myself as a single, a solid, an entity with boundaries that separated me from the entities around me. Everything around us, about us, among us, within us, and between us is made up of atoms and molecules vibrating in space. Although the ego center of our language center prefers defining our self as individual and solid, most of us are aware that we are made up of trillions of cells, gallons of water, and ultimately everything about us exists in a constant and dynamic state of activity.

“My left hemisphere had been trained to perceive myself as a solid,” she explained, “separate from others. Now, released from that restrictive circuitry, my right hemisphere relished in its attachment to the eternal flow. I was no longer isolated and alone. My soul was as big as the universe and frolicked with glee in a boundless sea.”

For Taylor, the lesson is not that one hemisphere is better than the other; indeed, there is a reason our brain evolved this way — to integrate two contradictory yet complementary ways of seeing the world: one holistic and boundless, the other segmented and bound.

Rather, the lesson comes when we accept, as Beau Lotto puts it, that “the world out there is really just our three-dimensional screen. Our receptors take the meaningless information they receive; then our brain, through interacting with the world, encodes the historical meaning of that information, and projects our subjective versions of color, shape and distance onto things.

“Meaning is a plastic entity,” he reminds us, “much like the physical nature of the brain, which we shape and reshape through perceptual experiences. It is critical to understand that the meaning of the thing is not the same as the thing itself.”

Believing is seeing, more than seeing is believing. And so we will never change the story of how we learn and live unless we become aware of the way our brains are always trying to ensure that we “not die” — by providing us with a coherent story about our lives, a story that our brain works around the clock to create.

A story that is never, ever, accurate.

To see differently, we must learn to see seeing differently. “The more aware I remain about what my brain is saying and how those thoughts feel inside my body,” Taylor tells us, “the more I own my power in choosing what I want to spend my time thinking about and how I want to feel. If I want to retain my inner peace, I must be willing to consistently and persistently tend the garden of my mind moment by moment, and be willing to make the decision a thousand times a day.”

The clearer we are about the science of the human brain, in other words, the greater the chance we can appreciate the art of individual identity.

It is the most wondrous thing we have discovered in the universe, and it is us.

A Year of Wonder: The Neuroscience of Empathy

By announcing last month that I wanted 2016 to be a year of wonder, I put friendly pressure on myself to pursue all the big questions that occurred to me. We’ll see how well I’m able to sustain the energy over the rest of the year, but my first riddle was this: ‘If empathy is what makes us distinctly human, what do we know about the neuroscience of empathy itself?’

If a person wishes to wonder deeply about the world, which ingredient is more important – the person, or the world?

Until recently, our answer was clearly the latter.

For the great majority of our time on this planet, human beings have viewed the world almost entirely through the prism of “we,” not “me.” As foragers, we lived in unquestioning obedience to the unknowable marvels of the natural world. And in the earliest civilizations, we lived to serve the needs of our Gods in Heaven – and then, later on, their hand-chosen emissaries on Earth.

In these long chapters of the human story – which together make up more than 93% of our history as a species – our ancestors were most likely to find comfort, and a sense of identity, through their ability to fit usefully and invisibly into a larger community.

To stand out from the crowd was undesirable, since, in reality, doing so could mean ostracism or death.

To walk in someone else’s shoes was unnecessary, since, in effect, everyone wore the same shoes.

And to wonder about the world was to focus one’s gaze outward, or upward.

Over time, however, the human gaze has shifted. Beginning with the rise of the great religions, continuing through the citizen revolutions in France and the Americas, and running right up to and through the age of social media and the Selfie Stick, we humans have begun to increasingly look inward – and to find an equally endless source of awe and wonder as we do.

At the same time, a wave of new discoveries in fields ranging from neuroscience to psychology has taught us that our need to wonder is more than just a desire to daydream; it is the way we deepen our empathic capacity to connect with our fellow creatures.

“What do we human beings do all day long?” asks neuroscientist Marco Iacoboni. “We read the world, especially the people we encounter.” And according to Iacoboni and his colleagues, we do so by relying on “mirror neurons” – a special subset of the roughly 86 billion neurons perpetually at work in the most complex structure in the known universe: the human brain.

They’re called mirror neurons because observing the behavior of someone else – from eating a peanut, to yawning, to experiencing sudden pain – can trigger the same brain activity in the observer as in the observed. “Our brains are capable of mirroring the deepest aspects of the minds of others at the fine-grained level of a single brain cell,” Iacoboni explains. “This is utterly remarkable. Equally remarkable is the effortlessness of this simulation. We do not have to draw complex inferences or run complicated algorithms.

“When we look at others, we find both them and ourselves.”

Similarly, a growing chorus of researchers has begun to suggest that empathy is a foundational building block of social cognition. “The brain is a social organ, made to be in relationship,” explains psychiatrist Daniel Siegel. “What happens between brains has a great deal to do with what happens within each individual brain . . . [And] the physical architecture of the brain changes according to where we direct our attention and what we practice doing.”

And yet, as far as words go, empathy is a new one – it didn’t even appear until the early 20th century. It entered English as a translation of the German word Einfühlung, which was used to describe the relationship between a work of art and its subject; it was later expanded to include interactions between people.

Those interactions, according to social theorist Jeremy Rifkin, are what give rise to a deeper human capacity for making sense of the world. “Empathic consciousness starts with awe,” he contends. “When we empathize with another, we are bearing witness to the strange incredible life force that is in us and that connects us to all other living beings.

“It is awe that inspires all human imagination. Without awe, we would be without wonder and without wonder we would have no way to exercise imagination and would therefore be unable to imagine another’s life ‘as if’ it were our own.”

In other words, we have slowly flipped the paradigm of human understanding: strictly speaking, it is not the world that makes us wonder; it is our wondering that makes the world. Or, even more specifically, as the Chilean biologist-philosophers Francisco Varela and Humberto Maturana point out, “the world everyone sees is not the world but a world, which we bring forth with others.”

This epiphany is changing more than just our understanding of the brain. In recent years, scientists in fields ranging from biology to ecology have revised the very metaphors they use to describe their work – from hierarchies to networks – and begun to realize, as physicist Fritjof Capra says, “that partnership – the tendency to associate, establish links, and maintain symbiotic relationships – is one of the hallmarks of life.”

The downside of all this navel-gazing? A heightened risk of narcissism, consumerism, and reality television.

The upside? A steadily increasing empathic capacity, anchored in our development of a shared sense of vulnerability, and a paradoxical desire to seek “universal intimacy” with the world.

“We are learning,” Rifkin writes, “against all of the prevailing wisdom, that human nature is not to seek autonomy – to become an island to oneself – but, rather, to seek companionship, affection, and intimacy. We have been sending out radio communications to the far reaches of the cosmos in the hope of finding some form of intelligent and caring life, only to discover that what we were desperately seeking already exists among us here on Earth.”

The Social Origins of Intelligence

There’s a fascinating new study out in which researchers examined the injuries and aptitudes of Vietnam War veterans who suffered penetrating head wounds. Among their findings? That “the ability to establish social relationships and navigate the social world is not secondary to a more general cognitive capacity for intellectual function, but that it may be the other way around. Intelligence may originate from the central role of relationships in human life and therefore may be tied to social and emotional capacities.”

Let me repeat that: cognitive intelligence is not separate from social intelligence. In fact, our capacity to deepen the former is dependent on our ability to be deeply grounded in the latter.

Anyone who has spent time as an educator has always intuitively known this to be true. As the saying goes, unmet social needs lead to unmet academic needs. Or, put more simply, the three most important words in teaching and learning are Relationships, Relationships, Relationships.

And yet the recent flood of cognitive research that confirms this intuitive truth is striking, especially when one considers how slowly it has made its way into the minds of our nation’s policymakers. Indeed, as lead researcher Aron Barbey put it, “the evidence suggests that there’s an integrated information-processing architecture in the brain, that social problem solving depends upon mechanisms that are engaged for general intelligence and emotional intelligence. This is consistent with the idea that intelligence depends to a large extent on social and emotional abilities, and we should think about intelligence in an integrated fashion rather than making a clear distinction between cognition and emotion and social processing.

“This makes sense,” Barbey continues, “because our lives are fundamentally social. We direct most of our efforts to understanding others and resolving social conflict. And our study suggests that the architecture of intelligence in the brain may be fundamentally social, too.”

So what would the next generation of education policies need to look like in order to be aligned with the emerging consensus about how the brain works, and how people learn?

They would need to start incentivizing the conditions that support holistic child development and growth, and stop disproportionately weighting literacy and numeracy.

They would need to start crafting policies in concert with other departments, from health to housing to labor, as a way to systemically support our country’s poorest families.

They would need to ensure that teacher preparation and evaluation programs are grounded in the latest neuroscience, not our traditional notions of what teaching looks like and requires.

What else?

The Neuroscience of Democracy

In the ideal educational future, is there a single design principle that matters most in establishing the optimal learning environment for children?

That seems like a pretty important question to consider. And if you were to go by today’s leading reform strategies, you might conclude that the answer is, variously, greater accountability, better use of data, more strategic use of technology, or more personalization (all good things, by the way). Yet for my money, the design principle that matters most is the one modern reform efforts care about the least – the extent to which schools are creating true laboratories of democratic practice.

Want to Get Smarter? Be More Childlike.

Interesting piece on NPR this morning in which Shankar Vedantam reviews some of the recent research in neuroscience. You can listen to it here, and you should, because it highlights something simple and significant — that the best way to keep learning over one’s life is to keep hold of the boundless inquiry that characterizes early childhood.

“Using mathematical techniques that allow researchers to disentangle the effects of genetic and environmental influences on individuals,” Vedantam reports, researchers “noticed that kids who had higher IQs to begin with seemed to have an extended period in adolescence during which they retained the ability to learn at a rapid pace, just like much younger children.”

“I found that twins that had a higher IQ were showing a more childlike pattern of influence during adolescence,” said one of the researchers, Penn State’s Angela Brant.
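(A brief aside for the technically curious: the “mathematical techniques” Vedantam mentions are typically variants of the classical twin design, which compares identical and fraternal twins. A rough back-of-the-envelope version is Falconer’s formula — a sketch of the general approach, not necessarily the exact model this study used:

\[
h^{2} \approx 2\,(r_{MZ} - r_{DZ}), \qquad c^{2} \approx 2\,r_{DZ} - r_{MZ}, \qquad e^{2} \approx 1 - r_{MZ}
\]

where r_MZ and r_DZ are the IQ correlations within identical and fraternal twin pairs, and h², c², and e² estimate the shares of variation attributable to genes, shared environment, and unique environment, respectively.)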

If that’s true, it would make sense to structure learning environments for children that are proactively designed to unleash each young person’s inherent sense of wonder and curiosity. And yet, here in DC and elsewhere across the country, we are doing the opposite. It’s true — too many young people are arriving in school with extreme deficits when it comes to literacy and numeracy. And it’s true — those things matter. But the best way to help all children thrive is not by making Kindergarten resemble a 10th grade honors class; it’s by making that 10th grade honors class more like Kindergarten.

That’s something educators have known for a long time. Now they have the research to back it up.

Story Time

The other night, at a friend’s house for an early evening barbeque, I tried and failed repeatedly to get my 3-year-old son to eat his dinner.

It didn’t matter that the other kids at the table were eating. It didn’t matter that these were hot dogs we were talking about. And it definitely didn’t matter whether I pleaded or demanded that Leo fill his belly. He was, quite simply, not having it. And there was nothing I could do to change his mind.

Sensing my exasperation, my friend Jeremy leaned over and whispered: “Watch this.”

“Would anyone like to hear a story?” he asked. Leo stopped what he was doing, nodded, and listened intently as Jeremy spun a tale about a little boy lost in the forest who followed a single firefly, discovered a Sembar tree where all the other fireflies gathered to light up the night sky, and gained entrance to a secret, magical world.

Although there was a moral to Jeremy’s story, its message was not so on-the-nose as to suggest that good boys clean their plates. And yet for the duration of the story, Leo listened, fully engaged in the wonders of an imaginary landscape, and absent-mindedly ate his dinner.

I was grateful for Jeremy’s clever parenting – and annoyed I didn’t think of it myself. After all, a convergence of recent research has confirmed something we have always instinctively known to be true: when we follow the trail of a well-crafted story, our brains light up like a Sembar tree.

Dr. James Zull is a professor of biology at Case Western Reserve University and the author of the book The Art of Changing the Brain. As he puts it, “We judge people by their stories, and we decide they are intelligent when their stories fit with our own stories. Recalling and creating stories are key parts of learning. We remember by connecting things with our stories, we create by connecting our stories together in unique and memorable ways, and we act out our stories in our behaviors.”

Zull says using vivid metaphors is a particularly effective way to foster new connections among the roughly 86 billion neurons in a human brain. These connections are called neuronal networks, and once they’re made, they possess specific physical relationships to one another in the brain, and thus embody the concept of the relationship itself. “If you believe that learning is deepest when it engages the most parts of our brain,” Zull adds, “you can see the value of stories for the teacher. We should tell stories, create stories, and repeat stories, and we should ask our students to do the same.”

Of course, the same can be said for parents, and not just before bedtime. If we want our children to develop the internal hardware to understand the world – and then imagine that world through the eyes and experiences of others – we should help them make sense of their surroundings through the stories we read and share. It is, quite simply, how people learn – and, oh by the way, it may even help your child finish his dinner.