The Cost of Not Being Curious

The world is having a pandemic-proportioned wave of Ostrichitis.

Now, maybe you haven’t heard of Ostrichitis. But I’m willing to bet you’re showing at least some of the symptoms:

  • Avoiding newscasts, especially those that feature objective and unbiased reporting
  • Quickly scrolling past any online news items in your feed that look like they may be uncomfortable to read
  • Dismissing out of hand information coming from unfamiliar sources

These are the signs of Ostrichitis – or the Ostrich Effect – and I have all of them. This is actually a psychological effect, more pointedly called willful ignorance, which I wrote about a few years ago. And from where I’m observing the world, we all seem to have it to one extent or another.

I don’t think this avoidance of information comes as a shock to anyone. The world is a crappy place right now. And we all seem to have gained comfort from adopting the folk wisdom that “no news is good news.” Processing bad news is hard work, and we just don’t have the cognitive resources to crunch through endless cycles of catastrophic news. If the bad news affirms our existing beliefs, it makes us even madder than we already were. If it runs counter to our beliefs, it forces us to spin up our sensemaking mechanisms and reframe our view of reality. Either way, there are way more fun things to do.

A recent study from the University of Chicago attempted to pinpoint when children start avoiding bad news. The research team found that while young children don’t tend to put boundaries around their curiosity, as they age they start avoiding information that challenges their beliefs or their own well-being. The threshold seems to be about 6 years old. Before that, children are actively seeking information of all kinds (as any parent barraged by never-ending “Whys” can tell you). After that, children start strategizing about the types of information they pay attention to.

Now, like everything about humans, curiosity tends to be an individual thing. Some of us are highly curious and some of us religiously avoid seeking new information. But even if we are a curious sort, we may pick and choose what we’re curious about. We may find “safe zones” where we let our curiosity out to play. If things look too menacing, we may protect ourselves by curbing our curiosity.

The unfortunate part of this is that curiosity, in all its forms, is almost always a good thing for humans (even if it can prove fatal to cats).

The more curious we are, the better tied we are to reality. The lens we use to parse the world is something called a sense-making loop. I’ve often referred to this in the past. It’s a processing loop that compares what we experience with what we believe, referred to as our “frame”. For the curious, this frame is often updated to match what we experience. For the incurious, the frame is held on to stubbornly, often by ignoring new information or bending information to conform to their beliefs. A curious brain is a brain primed to grow and adapt. An incurious brain is one that is stagnant and inflexible. That’s why the father of modern-day psychology, William James, called curiosity “the impulse towards better cognition.”

When we think about the world we want, curiosity is a key factor in defining it. Curiosity keeps us moving forward. The lack of curiosity locks us in place or even pushes us backwards, causing the world to regress to a more savage and brutal place. Writers of dystopian fiction knew this. That’s why authors including H.G. Wells, Aldous Huxley, Ray Bradbury and George Orwell all made a lack of curiosity a key part of their bleak future worlds. Our current lack of curiosity is driving our world in the same dangerous direction.

For all these reasons, it’s essential that we stay curious, even if it’s becoming increasingly uncomfortable.

When Did the Future Become So Scary?

The TWA hotel at JFK airport in New York gives one an acute case of temporal dissonance. It’s a step backwards in time to the “Golden Age of Travel” – the 1960s. But even though you’re transported back 60 years, it seems like you’re looking into the future. The original space – the TWA Flight Center – was designed in 1962 by Eero Saarinen. This was a time when America was in love with the idea of the future. Science and technology were going to be our saving grace. The future was going to be a utopian place filled with flying jet cars, benign robots and gleaming, sexy white curves everywhere.  The TWA Flight Center was dedicated to that future.

It was part of our love affair with science and technology during the 60s. Corporate America was falling over itself to bring the space-age-fueled future to life as soon as possible. Disney first envisioned the community of tomorrow that would become Epcot. Global expos had pavilions dedicated to what the future would bring. There were four World’s Fairs over 12 years, from 1958 to 1970, each celebrating a bright, shiny white future. There wouldn’t be another for 22 years.

This fascination with the future was mirrored in our entertainment. Star Trek (pilot in 1964, series start in 1966) invited all of us to boldly go where no man had gone before, namely a future set roughly three centuries from then. For those of us of a younger age, The Jetsons (original series from 1962 to ’63) indoctrinated an entire generation into this religion of future worship. Yes, tomorrow would be wonderful – just you wait and see!

That was then – this is now. And now is a helluva lot different.

Almost no one – especially in the entertainment industry – is envisioning the future as anything other than an apocalyptic hellhole. We’ve done an about-face and are grasping desperately for the past. The future went from being utopian to dystopian, seemingly in the blink of an eye. What happened?

It’s hard to nail down exactly when we went from eagerly awaiting the future to dreading it, but it appears to be sometime during the last two decades of the 20th Century. By the time the clock ticked over to the next millennium, our love affair was over. As Chuck Palahniuk, author of the 1999 novel Invisible Monsters, quipped, “When did the future go from being a promise to a threat?”

Our dread about the future might just be a fear of change. As the future we imagined in the 1960s started playing out in real time, perhaps we realized our vision was a little too simplistic. The future came with unintended consequences, including massive societal shifts. It’s like we collectively told ourselves, “Once burned, twice shy.” Maybe it was the uncertainty of the future that scared the bejeezus out of us.

But it could also be how we got our information about the impact of science and technology on our lives. I don’t think it’s a coincidence that our fear of the future coincided with the decline of journalism. Sensationalism and endless punditry replaced real reporting just about the time we started this about-face. When negative things happened, they were amplified. Fear was the natural result. We felt out of control, and we kept telling ourselves that things never used to be this way.

The sum total of all this was the spread of a recognized psychological affliction called Anticipatory Anxiety – the certainty that the future is going to bring bad things down upon us. This went from being a localized phenomenon (“my job interview tomorrow is not going to go well”) to a widespread angst (“the world is going to hell in a handbasket”). Call it Existential Anticipatory Anxiety.

Futurists are – by nature – optimists. They believe things will be better tomorrow than they are today. In the Sixties, we all leaned into the future. The opposite of this is something called Rosy Retrospection, and it often comes bundled with Anticipatory Anxiety. It is a known cognitive bias that comes with a selective memory of the past, tossing out the bad and keeping only the good parts of yesterday. It makes us yearn to return to the past, when everything was better.

That’s where we are today. It explains the worldwide swing to the right. MAGA is really a 4-letter encapsulation of Rosy Retrospection – Make America Great Again! Whether you believe that or not, it’s a message that is very much in sync with our current feelings about the future and the past.

As writer and right-leaning political commentator William F. Buckley said, “A conservative is someone who stands athwart history, yelling Stop!”

The Double-Edged Sword of a “Doer” Society

Ask anyone who comes from somewhere else to the United States what attracted them. The most common answer is “because anything is possible here.” The U.S. is a nation of “doers”. It has been that promise that has attracted wave after wave of immigration, made up of those chafing at the restraints and restrictions of their homelands. The concept of getting things done was embodied in Robert F. Kennedy’s famous speech: “Some men see things as they are and ask why? I dream of things that never were and ask why not?” The U.S. – more than anywhere else in the world – is the place to make those dreams come true.

But that comes with some baggage. Doers are individualists by definition. They are driven by what they can accomplish, by making something from nothing. And with that comes an obsessive focus on time. When we have so much that we can do, we constantly worry about losing time. Time becomes one of the few constraints in a highly individualistic society.

But the U.S. is not just individualistic. There are other countries that score highly on individualistic traits, including Australia, the U.K., New Zealand and my own home, Canada. But the U.S. is different, in that it’s also vertically individualistic – it is a highly hierarchical society obsessed with personal achievement. And – in the U.S. – achievement is measured in dollars and cents. In a Freakonomics podcast episode, Gert Jan Hofstede, a professor of artificial sociality in the Netherlands, called out this difference: “When you look at cultures like New Zealand or Australia that are more horizontal in their individualism, if you try to stand out there, they call it the tall poppy syndrome. You’re going to be shut down.”

In the U.S., tall poppies are celebrated and given god-like status. The ultra rich are recognized as the ideal to be aspired to. And this creates a problem in a nation of doers. If wealth is the ultimate goal, anything that stands between us and that goal is an obstacle to be eliminated.

When Breaking the Rules becomes The Rule

“Move fast and break things” – Mark Zuckerberg

In most societies, equality and fairness are the guardrails of governance. It was the U.S. that enshrined these in its constitution. Making sure things are fair and equal requires the establishment of rules of law and the setting of social norms. But in the U.S., the breaking of rules is celebrated if it’s required to get things done. From the same Freakonomics podcast, Michele Gelfand, a professor of Organizational Behavior at Stanford, said, “In societies that are tighter, people are willing to call out rule violators. Here in the U.S., it’s actually a rule violation to call out people who are violating norms.”

There is an inherent understanding in the U.S. that sometimes trade-offs are necessary to achieve great things. It’s perhaps telling that Meta CEO Mark Zuckerberg is fascinated by the Roman emperor Augustus, a person generally recognized by history as gaining his achievements by inflicting significant societal costs, including the subjugation of conquered territories and the brutal, systematic elimination of any opponents. This is fully recognized and embraced by Zuckerberg, who has said of his historical hero, “Basically, through a really harsh approach, he established 200 years of world peace. What are the trade-offs in that? On the one hand, world peace is a long-term goal that people talk about today … (but) … that didn’t come for free, and he had to do certain things.”

Slipping from Entrepreneurialism to Entitlement

A reverence for “doing” can develop a toxic side when it becomes embedded in a society. In many cases, entrepreneurialism and entitlement are two sides of the same coin. In a culture where entrepreneurial success is celebrated and iconized by media, the focus of entrepreneurialism can often shift from trying to profitably solve a problem to simply profiting. Chasing wealth becomes the singular focus of “doing.” In a society that has always encouraged everyone to chase their dreams, no matter the cost, this can create an environment where the Tragedy of the Commons is repeated over and over again.

This creates a paradox – a society that celebrates extreme wealth without seeming to realize that the more that wealth is concentrated in the hands of the few, the less there is for everyone else. Simple math is not the language of dreams.

To return to Augustus for a moment, we should remember that he was the one responsible for dismantling an admittedly barely functioning republic and installing himself as the autocratic emperor by doing away with democracy, consolidating power in his own hands and gutting Rome’s constitution.

Face Time in the Real World is Important

For all the advances made in neuroscience, we still don’t fully understand how our brains respond to other people. What we do know is that it’s complex.

Join the Chorus

Recent studies, including this one from the University of Rochester, are showing that when we see someone we recognize, the brain responds with a chorus of neuronal activity. Neurons from different parts of the brain fire in unison, creating a congruent response that may simultaneously pull from memory, from emotion, from the rational regions of our prefrontal cortex and from other deep-seated areas of our brain. The firing of any one neuron may be relatively subtle, but together this chorus of neurons can create a powerful response to a person. This cognitive choir represents our total comprehension of an individual.

Non-Verbal Communication

“You’ll have your looks, your pretty face. – And don’t underestimate the importance of body language!” – Ursula, The Little Mermaid

Given that we respond to people with different parts of the brain, it makes sense that we use parts of the brain we don’t consciously realize we’re using when communicating with someone else. In 1967, psychologist Albert Mehrabian attempted to pin this down with some actual numbers, publishing a paper in which he put forth what became known as Mehrabian’s Rule: 7% of communication is verbal, 38% is tone of voice and 55% is body language.

Like many oft-quoted rules, this one is typically misquoted. It’s not that words are unimportant when we communicate something. Words convey the message. But it’s the non-verbal part that determines how we interpret the message – and whether we trust it or not.

Folk wisdom has told us, “Your mouth is telling me one thing, but your eyes are telling me another.” In this case, folk wisdom is right. We evolved to respond to another person with our whole bodies, with our brains playing the part of conductor. Maybe the numbers don’t exactly add up to Mehrabian’s neat and tidy ratio, but the importance of non-verbal communication is undeniable. We intuitively pick up incredibly subtle hints: a slight tremor in the voice, a bead of sweat on the forehead, a slight turn down of one corner of the mouth, perhaps a foot tapping or a finger trembling, a split-second darting of the eye. All this is subconsciously monitored, fed to the brain and orchestrated into a judgment about a person and what they’re trying to tell us. This is how we evolved to judge whether we should build trust or lose it.

Face to Face vs Face to Screen

Now, we get to the question you knew was coming, “What happens when we have to make these decisions about someone else through a screen rather than face to face?”

Given that we don’t fully understand how the brain responds to people yet, it’s hard to say how much of our ability to judge whether we should convey trust or withhold it is impaired by screen-to-screen communication. My guess is that the impairment is significant, probably well over 50%. It’s difficult to test this in a laboratory setting, given that it generally requires some type of neuroimaging, such as an fMRI scanner. In order to present a stimulus for the brain to respond to when the subject is strapped in, a screen is really the only option. But common sense tells me – given the sophisticated and orchestrated nature of our brain’s social responses – that a lot is lost in translation from a real-world encounter to a screen recording.

New Faces vs Old Ones

If we think of how our brains respond to faces, we realize that in today’s world, a lot of our social judgements are increasingly made without face-to-face encounters. In a case where we know someone, we will pull forward a snapshot of our entire history with that person. The current communication is just another data point in a rich collection of interpersonal experience. One would think that would substantially increase our odds of making a valid judgement.

But what if we must make a judgement on someone we’ve never met before, and have only seen through a screen – be it a TikTok post, an Instagram Reel, a YouTube video or a Facebook post? What if we have to decide whether to believe an influencer when making an important life decision? Are we willing to rely on a fraction of our brain’s capacity when deciding whether to place trust in someone we’ve never met?

Media Modelling of Masculinity       

According to a study just released by the Movember Institute of Men’s Health, nearly two-thirds of 3,000 young men surveyed in the U.S., the U.K. and Australia were regularly engaging with online masculinity influencers. These young men looked to the influencers for inspiration on how to be fitter, more financially successful, and how to increase the quantity and/or quality of their relationships.

Did they find what they were looking for?

It’s hard to say based on the survey results. While these young men said they found these influencers inspiring and were optimistic about their personal circumstances and the future social circumstances of men in general, they said some troubling things about their own mental health. They were less willing to prioritize mental health and were more likely to engage in risky health behaviors such as steroid use or ignoring their own bodies and pushing themselves to exercise too hard. These mixed signals seemed to come from influencers telling them that a man who can’t control his emotions is weak and is not a real man.

And not all the harm inflicted by these influencers was felt by the men in their audience. Those in the study who followed influencers were more likely to report negative and limiting attitudes towards women and what they bring to a relationship. They felt that women were often being rude to them and that women didn’t have the same dating values as men.

Finally, men who followed influencers were almost twice as likely to value traits in their male friends such as ambition, popularity and wealth. They were less likely to look for trustworthiness or kindness in their male friends.

This brings us to a question. Why do young men need influencers to tell them how to be a better man? For that matter, why do any of us, regardless of age or sex, need someone to influence us? Especially if it’s someone whose only qualification to dispense advice is that they happen to have a TikTok account with a million followers.

This is another unfortunate effect of social media. We have evolved to look for role models because to do so gives us a step up. Again, this made sense in our evolutionary past but may not do so today.

When we all belonged to a social group that was geographically bound together, it was advantageous to look at the most successful members of that group and emulate them. When we all competed in the same environment for the same resources, copying the ones that got the biggest share was a pretty efficient way to improve our own fortunes.

There was also a moral benefit to emulating a role model. Increasingly, as our fortunes relied more on creating better relationships with those outside our immediate group, traits like trustworthiness became behaviors we would do well to copy. Also, respect tended to accrue to the elderly. Our first role models were our parents and grandparents. In a community that depended on rules for survival, authority figures were another logical place to look for role models.

Let’s fast forward to today. Our decoupling from our evolutionarily determined, geographically limited idea of community has thrown several monkey wrenches into the delicate machinery of our society. Who we turn to as role models is just one example. As soon as we make the leap from rules based on physical proximity to the lure of mass influence, we inevitably run into problems.

Let’s go back to our masculinity influencers. These online influencers have one goal – to amass as many followers as possible. The economic reality of online influence is this: size of audience x depth of engagement = financial success. And how do you get a ton of followers? By telling them what they want to hear.

Let’s stare down some stark realities – well-adjusted, mentally secure, emotionally mature, self-confident young males are less likely to desperately look for answers in online social media. There is no upside for influencers to go after this market. So they look elsewhere – primarily to young males who are none of the above things. And that audience doesn’t want to hear about emotional vulnerability or realistic appraisals of their dating opportunities. They want to hear that they can have it all – they can be real men. So the message (and the messenger) follows the audience, down a road that leads towards toxic masculinity.

Media provides a very distorted lens through which we might seek our new role models. We will still seek the familiar and the successful, but both of those things are determined by what we see through media, rather than what we observe in real life. There is no proof that these role models’ advice or approach will pay off in the real world, but the implicit logic is that if they have a large following, they must be right.

Also, these are one-way “mentorships”. The influencers may know their audience in the aggregate, if only in terms of a market to be monetized, but they don’t know them individually. These are relationships without any reciprocity. There is no price that will be paid for passing on potentially harmful advice.

If there is damage done, it’s no big deal. It’s just one less follower.

Do We Have the Emotional Bandwidth to Stay Curious?

Curiosity is good for the brain. It’s like exercise for our minds. It stretches the prefrontal cortex and whips the higher parts of our brains into gear. Curiosity also nudges our memory-making muscles into action and builds our brain’s capacity to handle uncertain situations.

But it’s hard work – mentally speaking. It takes effort to be curious, especially in situations where curiosity could figuratively “kill the cat.” The more dangerous our environment, the less curious we become.

A while back I talked about why the world no longer seems to make sense. Part of this is tied to our appetite for curiosity. Actively trying to make sense of the world puts us “out there”, leaving the safe space of our established beliefs behind. It is literally the definition of an “open mind” – a mind that has left itself open to being changed. And that’s a very uncomfortable place to be when things seem to be falling down around our ears.

Some of us are naturally more curious than others. Curious people typically achieve higher levels of education (learning and curiosity are two sides of the same coin). They are less likely to accept things at face value. They apply critical thinking to situations as a matter of course. Their brains are wired to be rewarded with a bigger dopamine hit when they learn something new.

Others rely more on what they believe to be true. They actively filter out information that may challenge those beliefs. They double down on what is known and defend themselves from the unknown. For them, curiosity is not an invitation, it’s a threat.

Part of this is a differing tolerance for something which neuroscientists call “prediction error” – the difference between what we think will happen and what actually does happen. Non-curious people perceive predictive gaps as threats and respond accordingly, looking for something or someone to blame. They believe that it can’t be a mistaken belief that is at fault; it must be something else that caused the error. Curious people look at prediction errors as continually running scientific experiments, giving them a chance to discover the errors in their current mental models and update them based on new information.

Our appetite for curiosity has a huge impact on where we turn to be informed. The incurious will turn to information sources that won’t challenge their beliefs. These are people who get their news from either end of the political bias spectrum, either consistently liberal or consistently conservative. Given that, they can’t really be called information sources so much as opinion platforms. Curious people are more willing to be introduced to non-conforming information. In terms of media bias, you’ll find them consuming news from the middle of the pack.

Given the current state of the world, more curiosity is needed but is becoming harder to find. When humans (or any animal, really) are threatened, we become less curious. This is a feature, not a bug. A curious brain takes a lot longer to make a decision than a non-curious one. It is the difference between thinking “fast” and “slow” – in the words of psychologist and Nobel laureate Daniel Kahneman. But this feature evolved when threats to humans were usually immediate and potentially fatal. A slow brain is not of any benefit if you’re at risk of being torn apart by a pack of jackals. But today, our jackal encounters are usually of the metaphorical type, not the literal one. And that’s a threat of a very different kind.

Whatever the threat, our brain throttles back our appetite for curiosity. Even the habitually curious develop defense mechanisms in an environment of consistently bad news. We seek solace in the trivial and avoid the consequential. We start conserving cognitive bandwidth for whatever impending doom we may be facing. We seek media that affirms our beliefs rather than challenges them.

This is unfortunate, because the threats we face today could use a little more curiosity.

Strategies for Surviving the News

When I started this post, I was going to unpack some of the psychology behind the consumption of the news. I soon realized that the topic is far too big to realistically deal with in the confines of this post. So I narrowed my focus to a question that has been very top of mind for me lately: how do you stay informed without becoming a trembling psychotic mess? How do you arm yourself for informed action rather than being paralyzed into inaction by the recent fire hose of sheer WTF insanity that makes up the average news feed?

Pick Your Battles

There are few things more debilitating to humans than fretting about things we can’t do anything about. Research has found a strong correlation between depression and our locus of control – the term psychologists use for the range of things we feel we can directly impact. There is actually a term for the way a steady diet of bad news can make the world seem so dangerous that you lose the perspective needed to function in your own environment. It’s called Mean World Syndrome.

If effecting change is your goal, decide what is realistically within your scope of control. Then focus your information gathering on those specific things. When it comes to informing yourself to become a better change agent, going deep rather than wide might be a better strategy.

Be Deliberate about Your Information Gathering

The second strategy goes hand in hand with the first. Make sure you’re in the right frame of mind to gather information. There are two ways the brain processes information: top-down and bottom-up. Top-down processing is cognition with purpose – you have set an intent and you’re working to achieve specific goals. Bottom-up processing is passively being exposed to random information and allowing your brain to be stimulated by it. The way you interpret the news will be greatly impacted by whether you’re processing it with a “top-down” intent or letting your brain parse it from the “bottom-up.”

By being more deliberate in gathering information with a specific intent in mind, you completely change how your brain will process the news. It will instantly be put in a context related to your goal rather than rampaging through your brain, triggering your primordial anxiety circuits.

Understand the Difference between Signal and Noise

Based on the top two strategies, you’ve probably already guessed that I’m not a big fan of relying on social media as an information source. And you’re right. A brain doom scrolling through a social media feed is not a brain primed to objectively process the news.

Here is what I did. For the broad context, I picked two international information sources I trust to be objective: The New York Times and The Economist out of the U.K. I subscribed to both because I wanted sources that weren’t totally reliant on advertising as a revenue source (a toxic disease which is killing true journalism). For Americans, I would highly recommend picking at least one source outside the U.S. to counteract the polarized echo chamber that typifies U.S. journalism, especially that which is completely ad supported.

Depending on your objectives, include sources that are relevant to those objectives. If local change is your goal, make sure you are informed about your community. With those bases in place, even if you get sucked down a doom scrolling rabbit hole, at least you’ll have a better context to allow you to separate signal from noise.

Put the Screen Down

I realize that the majority of people (about 54% of US Adults according to Pew Research) will simply ignore all of the above and continue to be informed through their Facebook or X feeds. I can’t really change that.

But for the few of you out there who are concerned about the direction the world seems to be spinning and want to filter and curate your information sources to effect some real change, these strategies may be helpful.

For my part, I’m going to try to be much more deliberate in how I find and consume the news.  I’m also going to be more disciplined about simply ignoring the news when I’m not actively looking for it. Taking a walk in the woods or interacting with a real person are two things I’m going to try to do more.

The Messaging of Climate Change

Eighty-six percent of the world believes that climate change is a real thing. That’s the finding of a massive new megastudy with hundreds of authors (the paper’s author acknowledgement is a page and a half). Some 60,000 participants from 63 countries around the world took part. And, as I said, 86% of them believe in climate change.

Frankly, there’s no surprise there. You just have to look out your window to see it. Here in my corner of the world, wildfires wiped out hundreds of homes last summer and just a few weeks ago, a weird winter whiplash took temperatures from unseasonably warm to deep freeze cold literally overnight. This anomaly wiped out this region’s wine industry. The only thing surprising I find about the 86 percent stat is that 14% still don’t believe. That speaks of a determined type of ignorance.

What is interesting about this study is that it was conducted by behavioral scientists. This is an area that has always fascinated me. From the time I read Richard Thaler and Cass Sunstein’s book, Nudge, I have always been interested in behavioral interventions. What are the most effective “nudges” in getting people to shift their behaviors to more socially acceptable directions?

According to this study, that may not be that easy. When I first dove into this study, my intention was to look at how different messages had different impacts depending on the audience: right wing vs left wing for instance. But in going through the results, what struck me the most was just how poorly all the suggested interventions performed. It didn’t matter if you were liberal or conservative or lived in Italy or Iceland. More often than not, all the messaging fell on deaf ears.

What the study did find is that how you craft your campaign about climate change depends on what you want people to do. Do you want to shift non-believers in climate change towards being believers? Then decrease the psychological distance. More simply put, bring the dangers of climate change to their front doorstep. If you live next to a lot of trees, talk about wildfires. If you live on the coast, talk about flooding. If you live in a rural area, talk about the impacts of drought. But it should be noted that we aren't talking about a massive shift here, with an "absolute effect size of 2.3%." It was the winner by the sheer virtue of sucking the least.

If you want to build support for legislation that mitigates climate change, the best intervention was to encourage people to write a letter to a child close to them, with the intention that the child read it in the future. This forces the writer to put some psychological skin in the game.

Who could write a future letter to someone you care about without making some kind of pledge to make sure there’s still a world they can live in? And once you do that, you feel obligated to follow through. Once again, this had a minimal impact on behaviors, with an overall effect size of 2.6%.

A year and a half ago, I talked about Climate Change messaging, debating Mediapost Editor-in-Chief Joe Mandese about whether a doom and gloom approach would move the needle on behaviors. In a commentary from the summer of 2022, Mandese wrapped up by saying, “What the ad industry really needs to do is organize a massive global campaign to change the way people think, feel and behave about the climate — moving from a not-so-alarmist “change” to an “our house is on fire” crisis.”

In a follow-up, I worried that doom and gloom might backfire on us: "Cranking up the crisis intensity on our messaging might have the opposite effect. It may paralyze us."

So, what does this study say?

The answer, again, is, “it depends.” If we’re talking about getting people to share posts on social media, then Doom and Gloom is the way to go. Of all the various messaging options, this had the biggest impact on sharing, by a notable margin.

This isn’t really surprising. A number of studies have shown that negative news is more likely to be shared on social media than positive news.

But what if we’re asking people to make a change that requires some effort beyond clicking the “share” button? What if they actually have to do something? Then, as I suspected, Doom and Gloom messaging had the opposite effect, decreasing the likelihood that people would make a behavioral change to address climate change (the study used a tree planting initiative as an example). In fact, when asking participants to actually change their behavior in an effortful way, all the tested climate interventions either had no effect or, worse, they “depress(ed) and demoralize(d) the public into inaction”.

That's not good news. It seems that no matter what the message is, or who the messenger is, we're likely to shoot them if they're asking us to do anything beyond burying our heads in the sand.

What's even worse, we may be losing ground. A study from 10 years ago by Yale University had more encouraging results: it showed that effective climate change messaging was able to shift public perceptions by up to 19 percent. While not nearly as detailed as this study, the results seem to indicate a backslide in the effectiveness of climate messaging.

One of the commentators who covered the new worldwide study perhaps summed it up best by saying, "if we're dealing with what is probably the biggest crisis ever in the history of humanity, it would help if we actually could talk about it."