Walk with Me, Talk with Me

In Aaron Sorkin’s acclaimed series, The West Wing, there was a recurring plot device. Characters, when faced with a thorny problem, often went for a walk and talked it out. The camera would capture it all in a long tracking shot.

Sorkin, who penned most of these scenes, used them to highlight the frenetic energy and pace of the White House. The characters exchanged rapid-fire, Sorkinesque dialogue while moving through spaces crammed with busy people buzzing in the background. The technique was both expository and transitional: it moved the story from location to location, often introduced additional characters as they joined the walk, and let someone veer off on urgent business while the dialogue furthered the story line with new details. It was the physical embodiment of multitasking, adding urgency to the pace: there is so much to do and so little time to do it in.

While Aaron Sorkin might not have intended it, there is also some solid neuroscience backing up the practice of walking and talking. And, as it turns out, you don’t even need to be walking with someone else to realize the cognitive benefits of a good stroll around the block.

German philosopher Friedrich Nietzsche wrote, “All truly great thoughts are conceived by walking.” Nietzsche seemed to be on to something. To come up with something new, the brain must do two different types of thinking: divergent and convergent. Divergent thinking is “thinking outside of the box.” Convergent thinking gathers up all those divergent thoughts and stuffs them “back in the box” to settle on the best option. According to a 2014 study from Stanford University, walking gives a significant boost to divergent thinking but is less effective for convergent thinking.

Walking appears to open up the brain to new ideas. There is a positive “mind-body” effect that comes from simply being active while you’re thinking, but walking also puts you in a different environment with varying stimuli. In the Stanford experiment, some participants walked outside and some walked on a treadmill. Those who walked outside showed the greatest creative boost.

But what if you’re walking with someone else? That’s where the benefits of walking really kick into high gear for certain kinds of brain activities. First, both walkers get the creative boost that walking provides. But it also appears that walking connects you with your fellow walker on a physical and psychological plane that operates below conscious awareness.

Another study (Cheng, Kato, Saunders, Tseng, 2020) found that walkers soon synchronize their strides, and this creates a physical bond between them. Those who walked together each evaluated the other person more highly after the walk than those who simply sat in the same room together. And, in case you’re wondering, the two didn’t even need to talk to each other. In this study, both walkers were specifically instructed to stay silent during their walk.

That’s the “walk” part. But what about the “talk” part? As it turns out, walking brings its own benefits there as well, and it’s not just the time-saving multitasking that Aaron Sorkin showed in The West Wing.

Think about where you’re looking when you walk. The person you’re walking with is beside you, but you’re looking ahead. You’re not looking them in the eye. For some types of communication, eye-to-eye might be the optimal mode, but for divergent thinking, this combination of being physically “in step” with the other person while being free to let your eyes and mind wander a bit, enticed by what’s happening around you, turns out to be a very effective creative incubator. Your flow of fresh thoughts is not restricted by picking up negative micro-expressions from the other person. You’re not reading any body language that might cause you to repress a creative idea for fear of rejection. Soon, you’ll start to riff off each other’s ideas, adding to the idea generation process.

There’s one more thing about walking. If you do need to just think for a while to process a new idea, those silences are a lot less awkward if you’re walking than if you’re across from each other at a boardroom table.

When Did the Future Become So Scary?

The TWA Hotel at JFK Airport in New York gives one an acute case of temporal dissonance. It’s a step backwards in time to the “Golden Age of Travel” – the 1960s. But even though you’re transported back 60 years, it seems like you’re looking into the future. The original space – the TWA Flight Center – was designed by Eero Saarinen and opened in 1962. This was a time when America was in love with the idea of the future. Science and technology were going to be our saving grace. The future was going to be a utopian place filled with flying jet cars, benign robots and gleaming, sexy white curves everywhere. The TWA Flight Center was dedicated to that future.

It was part of our love affair with science and technology during the ’60s. Corporate America was falling over itself to bring the space-age future to life as soon as possible. Disney first envisioned the community of tomorrow that would become Epcot. Global expos had pavilions dedicated to what the future would bring. There were four World’s Fairs over 12 years, from 1958 to 1970, each celebrating a bright, shiny white future. There wouldn’t be another for 22 years.

This fascination with the future was mirrored in our entertainment. Star Trek (pilot in 1964, series start in 1966) invited all of us to boldly go where no man had gone before, to a future set roughly three centuries from then. For those of us of a younger age, The Jetsons (original series from 1962 to 1963) indoctrinated an entire generation into this religion of future worship. Yes, tomorrow would be wonderful – just you wait and see!

That was then – this is now. And now is a helluva lot different.

Almost no one – especially in the entertainment industry – is envisioning the future as anything other than an apocalyptic hellhole. We’ve done an about-face and are grasping desperately for the past. The future went from being utopian to dystopian, seemingly in the blink of an eye. What happened?

It’s hard to nail down exactly when we went from eagerly awaiting the future to dreading it, but it appears to be sometime during the last two decades of the 20th century. By the time the clock ticked over to the next millennium, our love affair was over. As Chuck Palahniuk quipped in his 1999 novel Invisible Monsters, “When did the future switch from being a promise to being a threat?”

Our dread of the future might just be a fear of change. As the future we imagined in the 1960s started playing out in real time, perhaps we realized our vision was a little too simplistic. The future came with unintended consequences, including massive societal shifts. It’s like we collectively told ourselves, “Once burned, twice shy.” Maybe it was the uncertainty of the future that scared the bejeezus out of us.

But it could also be how we got our information about the impact of science and technology on our lives. I don’t think it’s a coincidence that our fear of the future coincided with the decline of journalism. Sensationalism and endless punditry replaced real reporting just about the time we started this about-face. When negative things happened, they were amplified. Fear was the natural result. We felt out of control, and we kept telling ourselves that things never used to be this way.

The sum total of all this was the spread of a recognized psychological affliction called Anticipatory Anxiety – the persistent dread that the future is going to bring bad things down upon us. This went from being a localized phenomenon (“my job interview tomorrow is not going to go well”) to a widespread angst (“the world is going to hell in a handbasket”). Call it Existential Anticipatory Anxiety.

Futurists are – by nature – optimists. They believe things will be better tomorrow than they are today. In the Sixties, we all leaned into the future. The opposite of this is something called Rosy Retrospection, and it often comes bundled with Anticipatory Anxiety. It is a known cognitive bias built on a selective memory of the past, tossing out the bad and keeping only the good parts of yesterday. It makes us yearn to return to the past, when everything was better.

That’s where we are today. It explains the worldwide swing to the right. MAGA is really a 4-letter encapsulation of Rosy Retrospection – Make America Great Again! Whether you believe that or not, it’s a message that is very much in sync with our current feelings about the future and the past.

As writer and right-leaning political commentator William F. Buckley said, “A conservative is someone who stands athwart history, yelling Stop!”

Do We Have the Emotional Bandwidth to Stay Curious?

Curiosity is good for the brain. It’s like exercise for the mind. It stretches the prefrontal cortex and whips the higher parts of our brains into gear. Curiosity also nudges our memory-making muscles into action and builds the brain’s capacity to handle uncertain situations.

But it’s hard work – mentally speaking. It takes effort to be curious, especially in situations where curiosity could figuratively “kill the cat.” The more dangerous our environment, the less curious we become.

A while back I talked about why the world no longer seems to make sense. Part of this is tied to our appetite for curiosity. Actively trying to make sense of the world puts us “out there”, leaving the safe space of our established beliefs behind. It is literally the definition of an “open mind” – a mind that has left itself open to being changed. And that’s a very uncomfortable place to be when things seem to be falling down around our ears.

Some of us are naturally more curious than others. Curious people typically achieve higher levels of education (learning and curiosity are two sides of the same coin). They are less likely to accept things at face value. They apply critical thinking to situations as a matter of course. Their brains are wired to be rewarded with a bigger dopamine hit when they learn something new.

Others rely more on what they believe to be true. They actively filter out information that may challenge those beliefs. They double down on what is known and defend themselves from the unknown. For them, curiosity is not an invitation, it’s a threat.

Part of this is a differing tolerance for something neuroscientists call “prediction error” – the difference between what we think will happen and what actually does happen. Non-curious people perceive predictive gaps as threats and respond accordingly, looking for something or someone to blame. In their view, it can’t be a mistaken belief that is at fault; it must be something else that caused the error. Curious people treat prediction errors as continually running scientific experiments, giving them a chance to discover the flaws in their current mental models and update them based on new information.

Our appetite for curiosity has a huge impact on where we turn to be informed. The incurious will turn to information sources that won’t challenge their beliefs. These are people who get their news from either end of the political bias spectrum, either consistently liberal or consistently conservative. Given that, such outlets can’t really be called information sources so much as opinion platforms. Curious people are more willing to be introduced to non-conforming information. In terms of media bias, you’ll find them consuming news from the middle of the spectrum.

Given the current state of the world, more curiosity is needed but is becoming harder to find. When humans (or any animal, really) are threatened, we become less curious. This is a feature, not a bug. A curious brain takes a lot longer to make a decision than a non-curious one. It is the difference between thinking “fast” and thinking “slow,” in the words of psychologist and Nobel laureate Daniel Kahneman. But this feature evolved when threats to humans were usually immediate and potentially fatal. A slow brain is of no benefit if you’re at risk of being torn apart by a pack of jackals. Today, though, our jackal encounters are usually metaphorical, not literal. And that’s a threat of a very different kind.

Whatever the threat, our brain throttles back our appetite for curiosity. Even the habitually curious develop defense mechanisms in an environment of consistently bad news. We seek solace in the trivial and avoid the consequential. We start conserving cognitive bandwidth for whatever impending doom we may be facing. We seek media that affirms our beliefs rather than challenges them.

This is unfortunate, because the threats we face today could use a little more curiosity.