How We Forage for the News We Want

Reuters Institute out of the UK just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll look at the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions – toward less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990, at about 62 and a half million. That’s one subscription for every 4 people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would have been one subscription for every 10 people. Increasingly, we are no longer reading our news in print. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just understand that our brain operates in a significantly different way when it’s reading rather than watching or listening.
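
Just to make the arithmetic concrete, here is a quick back-of-the-envelope check of those per-capita figures. The circulation numbers come from the paragraphs above; the population numbers are my own rough assumptions, not from the Pew data.

```python
# Rough check of the subscriptions-per-person arithmetic.
# Circulation figures are from the column; population figures are rough assumptions.
circulation_1990 = 62_500_000    # ~62.5 million daily subscriptions at the peak
population_1990 = 250_000_000    # approximate U.S. population, 1990

circulation_2018 = 30_000_000    # projected: just under 30 million
population_2018 = 327_000_000    # approximate U.S. population, 2018

print(round(population_1990 / circulation_1990, 1))   # 4.0 - one subscription for every 4 people
print(round(population_2018 / circulation_2018, 1))   # 10.9 - roughly one for every 10 to 11 people
```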

Up to the end of the last century, we generally trusted news destinations. Whether it be a daily newspaper like the New York Times, a news magazine like Time or a nightly newscast such as any of the network news shows, each was a destination that offered one thing above all others – the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two use social media. Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, under-35s get more of their news from social media than from any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information veracity problems that introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quick, calculated guess about which patch is most likely to contain the information it’s looking for. With Information Foraging, the intent we have frames everything that comes after.
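
Pirolli and Card formalized that guess as a rate-of-gain calculation borrowed from optimal foraging theory: we favour the patch where the information we expect to gain, divided by the time it takes to reach and work through the patch, is highest. Here is a minimal sketch of that idea; the patch names, numbers and function are mine, purely for illustration.

```python
# Minimal sketch of the patch-choice idea behind Information Foraging Theory
# (Pirolli & Card). The patches and their numbers are illustrative assumptions.

def rate_of_gain(expected_gain, time_to_reach, time_within):
    """Expected information gained per unit of total foraging time: R = G / (T_between + T_within)."""
    return expected_gain / (time_to_reach + time_within)

patches = {
    # patch: (expected useful items, minutes to reach it, minutes spent working through it)
    "news site":    (6, 1.0, 10.0),
    "social feed":  (3, 0.2, 5.0),
    "search query": (5, 0.5, 6.0),
}

# The brain's "quick, calculated guess": pick the patch with the best expected rate.
best_patch = max(patches, key=lambda p: rate_of_gain(*patches[p]))
print(best_patch)   # "search query" wins in this toy example
```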

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it was started before the COVID pandemic, but the second part of the study was conducted during COVID. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media, but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see in social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest-growing sources for news amongst all age groups – but especially those under 30 – are Instagram, SnapChat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and SnapChat, are dominated by influencers. That means that much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper on our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs in how the world works. This is related to the sensemaking loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by first starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we will update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased to any particular point of view. But this doesn’t jibe with reality at all. As I wrote about in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from news sources that are biased, as long as that bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.

TV and My Generation

My Generation has been a dumpster fire of epic proportions. I am a baby boomer, born in 1961, at the tail end of the boom. And, according to Time magazine, we broke America.  We probably destroyed the planet. And, oh yeah, we’ve also screwed up the economy. I’d like to say it isn’t true, but I’m pretty sure it is. As a generation, we have an extensive rap sheet.

Statistically, baby boomers are one of the most politically polarized generations alive today. So, the vast chasm that exists between the right and the left may also be our fault. 

As I said, we’re a generational dumpster fire. 

A few columns back I said this: “We create the medium — which then becomes part of the environment we adapt to.”  I was referring to social media and its impact on today’s generations. 

But what about us? What about the generation that has wreaked all this havoc? If I am right and the media we make in turn makes us who we are, what the hell happened to our generation?

Television, that’s what. 

There have been innumerable treatises on how baby boomers got to be in the sorry state we’re in. Most blame the post-war affluence of America and the never-ending consumer orgy it sparked. 

But we were also the first generation to grow up in front of a television screen. Surely that must have had some impact. 

I suspect television was one of the factors that started driving the wedge between the right and left halves of our generation, leaving a middle ground in between that couldn’t stretch far enough to span the two. Further, I think it may have been the prime suspect.

Let’s plot the trends of what was on TV against my most influential formative years, and — by extension — my generation. 

When I was 5 years old, in 1966, the most popular TV shows fell into two categories: westerns like “Bonanza” and “Gunsmoke,” or cornfed comedies like “The Andy Griffith Show,” “The Beverly Hillbillies,” “Green Acres” and “Petticoat Junction.” Social commentary and satire were virtually nonexistent on American prime-time TV. The values of America were tightly censored, wholesome and non-confrontational. The only person of color in the line-up was Bill Cosby on “I Spy.” Thanks to “Hogan’s Heroes,” even the Nazis were lovable doofuses. 

I suspect when certain people of my generation want to Make America Great Again, it is this America they’re talking about. It was a white, wholesome America that was seen through the universally rose-colored glasses given to us by the three networks. 

It was also completely fictional, ignoring inconveniences like the civil rights movement, Vietnam and rampant gender inequality. This America never existed. 

When we talk about the cultural environment my generation literally cut our teeth in, this is what we refer to. There was no moral ambiguity. It was clear who the good guys were, because they all wore white hats. 

This moral baseline was spoon-fed to us right when we were first making sense of our own realities. Unfortunately, it bore little to no resemblance to what was actually real.

The fact was, through the late ’60s, America was already increasingly polarized politically. Left and right were drifting apart. Even Bob Hope felt the earth splitting beneath his feet. In November, 1969, he asked all the elected leaders of the country, no matter their politics, to join him in a week of national unity. One of those leaders called it “a time of crisis, greater today perhaps than since the Civil War.” 

But rather than trying to heal the wounds, politicians capitalized on them, further splitting the country apart by affixing labels like Nixon’s “The Silent Majority.” 

Now, let’s move ahead to my teen years. From our mid-teens to our mid-twenties, we create our social identities. Our values and morals take on some complexity. The foundations for our lifelong belief structures are formed during these years. 

In 1976, when I was 15, the TV line-up had become a lot more controversial. We had many shows regularly tackling social commentary: “All in the Family,” “M*A*S*H,” “Sanford and Son,” “Welcome Back, Kotter,” “Barney Miller” and “Good Times.” Of course, we still had heaps of wholesome, thanks to “Happy Days,” “Marcus Welby, M.D.” and “The Waltons.”

Just when my generation was forming the values that would define us, our prime-time line-up was splitting left and right. You had the social moralizing of left-leaning show runners like Norman Lear (“All in the Family”) and Larry Gelbart (“M*A*S*H”) vs the God and Country values of “The Waltons” and “Little House on the Prairie.” 

I don’t know what happened in your hometown, but in mine, we started to be identified by the shows we watched (or, often, what our parents let us watch). You had the “All in the Family” Group and “The Waltons” Group. In the middle, we could generally agree on “Charlie’s Angels” and “The Six Million Dollar Man.” The cracks in the ideologies of my generation were starting to show.

I suspect as time went forward, the two halves of my generation started looking to television with two different intents: either to inform ourselves of the world that is, warts and all — or to escape to a world that never was. As our programming choices expanded, those two halves got further and further apart, and the middle ground disappeared. 

There are other factors, I’m sure. But speaking for myself, I spent an unhealthy amount of time watching TV when I was young. It couldn’t help but partially form the person I am today. And if that is true for me, I suspect it is also true for the rest of my generation.

Crisis? What Crisis?

You would think that a global pandemic would hold our attention for a while.

Nope.

We’re tired of it. We’re moving on.  We’re off to the next thing.

Granted, in this case the next thing deserves to be focused on. It is abysmal that it still exists, and it should hold our attention – probably for the rest of our lives and beyond, until it ceases to be a thing. But it won’t. Soon we’ll be talking about something else.

And that’s the point of this post – our collective inability to remain focused on anything without being distracted by the next breaking story in our news feed. How did we come to this?

I blame memes.

To a certain extent, our culture is the product of who we are and who we are is a product of our culture. Each is shaped by the other, going forward in a constantly improvised pas de deux. Humans create the medium – which then becomes part of the environment we adapt to.

Books and the printed word changed who we were for over five centuries. Cinema has been helping to define us for around 125 years. And radio and television have been moulding us for the past century. Our creations have helped create who we are.

This has never been truer than with social media. Unlike other media which took discrete chunks of our time and attention, social media is ubiquitous and pervasive. According to a recent survey, we spend on average 2 hours and 23 minutes per day on social media. That is about 13% of our waking hours.  Social media has become intertwined with our lives to the point that we had to start qualifying what happens where with labels like “IRL” (In Real Life).

There is another difference between social media and what has come before it. Almost every previous entertainment medium that has demanded our attention has been built on the foundation of a long-form narrative arc. Interacting with each medium has been a process – a commitment to invest a certain amount of time to go on a journey with the storyteller. The construction of a story depends on patterns we instantly recognize. Once we identify them, we are invested in discovering the outcome. We understand that our part of the bargain is to exchange our time and attention. The payoff is the joy that comes from making sense of a new world or situation, even if it is imaginary.

But social media depends on a different exchange. Rather than tapping into our inherent love of the structure of a story, it depends on something called variable intermittent rewards. Essentially, it’s the same hook that casinos use to keep people at a slot machine or table. Not only is it highly addictive, it also pushes us to continually scroll to the next thing. It completely bypasses the thinking part of our brains and connects directly to the reward center buried in our limbic system. Rather than ask for our time and attention, social media dangles a never-ending array of bright, shiny memes that ask nothing from us: no thinking, almost no attention and a few seconds of our time at most. For a lazy brain, this is the bargain of a lifetime.
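
To see how little machinery that hook actually needs, here is a toy sketch of a variable intermittent reward schedule. The 1-in-12 odds and the notion of an occasional "great post" are invented for illustration, not taken from any platform.

```python
import random

# Toy sketch of a variable intermittent reward schedule (the slot-machine hook).
# The odds and the idea of a "great post" are illustrative assumptions.
def scroll_session(items=100, hit_probability=1 / 12, seed=7):
    random.seed(seed)
    hits = []
    for position in range(items):
        if random.random() < hit_probability:   # the payoff is unpredictable
            hits.append(position)               # an occasional "great post"
    return hits

# Rewards arrive at irregular, unpredictable intervals: exactly the schedule
# that keeps us scrolling for the next one.
print(scroll_session())
```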

It’s probably not a coincidence that the media that are most dependent on advertising are also the media that avoid locking our attention on a single topic for an extended period. This makes social media the perfect match for interruptive ad forms. They are simply slotted into the never-ending scroll of memes.

Social media has only been around for a little over 2 decades. It has been a significant part of our lives for half that time. If even a little bit of what I suspect is happening is indeed taking place, that scares the hell out of me. It would mean that no other medium has changed us so much and so quickly.

That is something worth paying attention to. 

A.I. and Our Current Rugged Landscape

In evolution, there’s something called the adaptive landscape. It’s a complex concept, but in the smallest nutshell possible, it refers to how fit species are for a particular environment. In a relatively static landscape, status quos tend to be maintained. It’s business as usual. 

But a rugged adaptive landscape — one beset by disruption and adversity — drives evolutionary change through speciation, the introduction of new and distinct species. 

The concept is not unique to evolution. Adapting to adversity is a feature in all complex, dynamic systems. Our economy has its own version. Economist Joseph Schumpeter called them Gales of Creative Destruction.

The same is true for cultural evolution. When shit gets real, the status quo crumbles like a sandcastle at high tide. When it comes to life today and everything we know about it, we are definitely in a rugged landscape. COVID-19 might be driving us to our new future faster than we ever suspected. The question is, what does that future look like?

Homo Deus

In his follow-up to his best-seller “Sapiens: A Brief History of Humankind,” author Yuval Noah Harari takes a shot at predicting just that. “Homo Deus: A Brief History of Tomorrow” looks at what our future might be. Written well before the pandemic (in 2015), the book deals frankly with the impending irrelevance of humanity. 

The issue, according to Harari, is the decoupling of intelligence and consciousness. Once we break the link between the two, the human vessels that have traditionally carried intelligence become superfluous. 

In his book, Harari foresees two possible paths: techno-humanism and Dataism. 

Techno-humanism

In this version of our future, we humans remain essential, but not in our current form. Thanks to technology, we get an upgrade and become “super-human.”

Dataism

Alternatively, why do we need humans at all? Once intelligence becomes decoupled from human consciousness, will it simply decide that our corporeal forms are a charming but antiquated oddity and just start with a clean slate?

Our Current Landscape

Speaking of clean slates, many have been talking about the opportunity COVID-19 has presented to us to start anew. As I was writing this column, I received a press release from MIT promoting a new book, “Building the New Economy,” edited by Alex Pentland. I haven’t read it yet, but based on the first two lines in the release, it certainly seems to be following this type of thinking: “With each major crisis, be it war, pandemic, or major new technology, there has been a need to reinvent the relationships between individuals, businesses, and governments. Today’s pandemic, joined with the tsunami of data, crypto and AI technologies, is such a crisis.”

We are intrigued by the idea of using the technologies we have available to us to build a societal framework less susceptible to inevitable Black Swans. But is this just an invitation to pry open Pandora’s box and usher in the future Yuval Noah Harari is warning us about?

The Debate 

Harari isn’t the only one seeing the impending doom of the human race. Elon Musk has been warning us about it for years. As we race to embrace artificial intelligence, Musk sees the biggest threat to human existence we have ever faced. 

“I am really quite close, I am very close, to the cutting edge in AI and it scares the hell out of me,” warns Musk. “It’s capable of vastly more than almost anyone knows and the rate of improvement is exponential.”

There are those that pooh-pooh Musk’s alarmism, calling it much ado about nothing. Noted Harvard cognitive psychologist and author Steven Pinker, whose rose-colored vision of humanity’s future reliably trends up and to the right, dismissed Musk’s warnings with this: “If Elon Musk was really serious about the AI threat, he’d stop building those self-driving cars, which are the first kind of advanced AI that we’re going to see.”

In turn, Musk puts Pinker’s Pollyanna perspective down to human hubris: “This tends to plague smart people. They define themselves by their intelligence and they don’t like the idea that a machine could be way smarter than them, so they discount the idea — which is fundamentally flawed.”

From Today Forward

This brings us back to our current adaptive landscape. It’s rugged. The peaks and valleys of our day-to-day reality are more rugged than they have ever been — at least in our lifetimes. 

We need help. And when you’re dealing with a massive threat that involves probability modeling and statistical inference, more advanced artificial intelligence is a natural place to look. 

Would we trade more invasive monitoring of our own bio-status, and the aggregation of that data, for the prevention of more deaths? In a heartbeat.

Would we put our trust in algorithms that can instantly crunch vast amounts of data our own brains couldn’t possibly comprehend? We already have.

Would we adopt connected devices that constantly stream the bits of data that define our existence to some corporate third party or government agency, in return for a promise of better odds that we can extend that existence? Sign us up.

We are willingly tossing the keys to our future to the Googles, Apples, Amazons and Facebooks of the world. As much as the present may be frightening, we should consider the steps we’re taking carefully.

If we continue rushing down the path towards Yuval Noah Harari’s Dataism, we should be prepared for what we find there: “This cosmic data-processing system would be like God. It will be everywhere and will control everything, and humans are destined to merge into it.”

Media’s Mea Culpa Moment

It’s hard to see when you’re stuck inside. And I’m not talking about self-isolating during a pandemic. I’m talking about our perspective of the media landscape.

The Problem with Politics

Currently, the concept of “Us vs Them” is embedded into our modern idea of politics. Populist politics, by its very nature, needs an enemy to blame. It forces you to pick sides. It creates a culture of antagonism, eroding social capital and dismantling any bipartisan trust. We are far down this path. Perhaps too far to turn back. But we have to realize that no nation or region in modern history has ever prospered in the long term by wantonly destroying social capital. There are many examples of how regionalism, xenophobia and populism have caused nations to regress. There is no example of these things leading to prosperity and long-term success. Not one. Yet this is the path we seem to have chosen.

If you look at the media, it’s politicians who are to blame for all our problems, whether they’re on the right or the left. Based on most mainstream media, with its inherent left-wing bias, there is a personification of the problem, primarily in the President. “If Trump wasn’t there, things would be better.” But the problem would still persist. Much as we left-leaning individuals found Obama a more palatable choice for president, the problem was here then as well. That’s how we got to where we are today.

The sad truth is, Trump didn’t cause the problem. He just capitalized on it. So we have to look elsewhere for where the problem originated. And that leads us to an uncomfortable reality. We are the problem – we, the media, particularly in the U.S. But it’s hard to see that when you’re looking from the inside. So last week I changed my perspective.

Because of COVID-19, we should all be focused on the same story, perhaps for the first time in our lives. This gives us an unprecedented opportunity to compare the media landscapes against what should be a fairly objective baseline.

The Canadian Litmus Test

I’m Canadian — and for Americans, I know that living next to Canada is like having “The Simpsons'” Ned Flanders for a neighbor. We seem nice and polite, but you can’t help feeling that we’re constantly judging you. 

But Canada does offer Americans the chance to compare cultures that have much in common but with some key critical differences. It was this comparison that geographer, historian and anthropologist Jared Diamond employed in his latest book, “Upheaval: Turning Points for Nations in Crisis.”

“Many of Canada’s social and political practices are drastically different from those of the U.S., such as with regards to national health plans, immigration, education, prisons, and balance between community and individual interests,” he writes. “Some problems that Americans regard as frustratingly insoluble are solved by Canadians in ways that earn widespread public support.”

As a case in point, Canada has handled COVID-19 in a notably different way. Our pandemic response has been remarkably non-partisan. For example, we have the unusual spectacle of our most Trump-like politician, Ontario Conservative Premier Doug Ford, stepping up as a compassionate leader who is working effectively with Liberal Prime Minister Justin Trudeau and his own opposition.

The Myth of Impartial Reporting

This is not the case in the U.S. Because of America’s political divides, it can’t even agree on what should be a simple presentation of fact on the story that affects us all equally. 

A recent Pew study found that where you turn for your news will significantly impact your understanding of things like when a vaccine will be ready or whether coronavirus came about naturally. 

To check this out, I did a comparison of the three most popular U.S. news sites on April 29.

Let’s start with CNN.  Of the 28 news items featured on the home page “above the fold,” 16 had an overt left bias. The most prominent was  inflammatory, dealing with Trump’s handling of the pandemic and his blowing up at press criticism. A Biden story on the Tara Reade accusations was buried in small print links near the bottom.

Now let’s go to the other side of the spectrum: Fox News, which also featured 28 news items “above the fold.” Of these, 14 had an overt right bias. Again, the headline was inflammatory, calling out Biden on the Tara Reade allegations. There was no mention of any Trump temper tantrums on the home page. 

Finally, MSNBC’s headline story was actually focused on COVID-19 and the Remdesivir trial results and had no political bias. The site only had nine news items above the fold. Four of these had a left-leaning bias. 

The home pages bore almost no resemblance to each other. You would be hard-pressed to understand that each of these sites represented the news from the same country on the same day.

Now, let’s compare with Canada’s top two news sites, CBC and Global News. 

About 60% of the stories covered were the same on both sites and given roughly the same priority. The same lead story was featured on both — about a missing Canadian military helicopter. On CBC, only one story appeared to have any political bias at all, and it was definitely not explicit, while none of Global’s did.

That’s in comparison to the American news sites, where over half the stories featured — and all the lead ones — were designed and written to provoke anger, pitting “us” against “them.”

Once mainstream media normalizes this antagonistic approach, it then gets shunted over to social media, where it’s stripped of context, amplified and shared. Mainstream media sets the mood of the nation, and that mood is anger. Social media then whips it into a frenzy. 

Both left- and right-wing media outlets are equally guilty. CNN’s overriding editorial tone is, “Can you believe how stupid they are?” Fox’s is, “They think you’re stupid and they’re trying to pull a fast one on you.” No wonder there is no common ground where public discourse across the political divide can begin.

Before COVID-19, perhaps we could look at this with a certain amount of resignation and even bemusement. If you’re “us” there is a certain satisfaction in vilifying “them.” But today, the stakes are too high. People are dying because of it. Somehow, the media has to turn America’s ideological landscape from a war zone into a safe space.

Quant vs. Qual in the Time of Crisis

Digesting reality is becoming more and more difficult. I often find myself gagging on it. Last Friday was a good example. I have been limiting my news intake for my own sanity, but Friday morning I went down the rabbit hole. Truth be told, I started doing some research for the post I was intending to write (which I will probably get to next week) and I was soon overwhelmed with what I was reading.

I’m beginning to suspect that we’re getting an extra dump of frightening news on Fridays as officials realize that it’s more difficult to enforce social distancing on weekends. Whether this is the case or not, I found my chest tightening from anxiety. My hands got shaky as I found myself clicking on frightening link after frightening link. Predictions scared the shit out of me. I was worried for my community and country. I was worried for myself. But most of all, I was worried for my kids, my wife, my dad, my in-laws and my family.

Fear and anxiety swamped my normally rational side. Intellect gave way to despair. That’s not a good mode for me. I have to run cool – I need to be rational to function. Emotions mentally shut me down.

So I retreated to the numbers. My single best source throughout this has been the posts from Tomas Pueyo – the VP of Growth at Course Hero. They are exhaustively researched statistical analyses and “what-if” models assembled by an ad-hoc team of rockstar quants. In his first post, on March 10 – “Coronavirus: Why You Must Act Now” – Pueyo and his team nailed it. If everyone had listened and followed his advice, we wouldn’t be where we are now. Similarly, his post on March 19 – “Coronavirus: The Hammer and The Dance” – gave a tough but rational prescription to follow. His latest – “Coronavirus: Out of Many, One” – drills down on a state-by-state analysis of COVID in the US.

I’m not going to blow smoke here. These are tough numbers to read. Even the best-case scenarios would have been impossible to imagine just a few weeks ago. But the worst-case scenarios are exponentially more frightening. And if you – like me – need to retreat to reason in order to keep functioning, this is the best rationale I’ve found for dealing with COVID-19. It’s not what we want to hear, but it’s what we must listen to.

In my marketing life, I always encouraged a healthy mix of both quantitative and qualitative perspectives in trying to understand what is real. I’ve said in the past: “Quantitative is watching the dashboard while you drive. Qualitative is looking out the windshield.”

I often find that marketers tend to focus too much on the numbers and not enough on the people on the other side of those numbers. We were an industry deluged with data and it made us less human.

Ironically, I now find myself on the other side of that argument. We have to understand that even our most trustworthy media sources are going to be telling us the stories that have the most impact on us. Whether we turn to Fox or CNN as our news source, we will be getting soundbites out of context that are – by design – sensational in nature. They may differ in their editorial slants, but – right or left – we can’t consider them representational of reality. They are the outliers.

Being human, we can’t help but apply these to our current reality. It’s called availability bias. In the simplest terms possible, it means that those things that are most in our face become our understanding of any given situation.

In normal times, these individual examples can heighten our humanity and make us a little less numb. They remind us of the relevance of the individual experience – the importance of every life and the tragedy of even one person suffering.

“If only one man dies of hunger, that is a tragedy.
If millions die, that’s only a statistic.”

– Joseph Stalin

Normally, I would never dream of quoting Joe Stalin in a post. But these are not normal times. And the fact is, Stalin was right. When we start looking at statistics and mathematical modelling, our brain works differently. It forces us to use a more rational cognitive mechanism, one less likely to be influenced by emotion. And in responding to a crisis, this is exactly the type of reasoning required.

This is a time unlike anything any of us has experienced. In times like this, actions should be based on the most accurate and scientific information possible. We need the cold, hard logic of math as a way to not become swamped by the wave of our own emotions. In order to make really difficult decisions for the greater good, we need to distance ourselves from our own little bubbles of reality, especially when that reality is made up of non-representative examples streamed to us through media channels.

Whipped Into a Frenzy

Once again, we’re in unprecedented territory. According to the CDC, COVID-19 is the first global pandemic since the 2009 H1N1 outbreak. While Facebook was around in 2009, it certainly wasn’t as pervasive or impactful as it is today. Neither – for that matter – was H1N1 when compared to COVID-19. That would make COVID-19 the first true pandemic in the age of social media.

While we’re tallying the rapidly mounting human and economic costs of the pandemic on a day-by-day basis, there is a third type of damage to consider. There will be a cognitive cost to this as well.

So let’s begin by unpacking the psychology of a pandemic. Then we’ll add the social media lens to that.

Emotional Contagion aka “The Toilet Paper Syndrome”

Do you have toilet paper at your local store? Me neither. Why?

The short answer is that there is no rational answer. There is no disruption in the supply chain of toilet paper. If you were inclined to stock up on something to battle COVID-19, hand sanitizer would be a much better choice. Search as you might, there is no logical reason why people should be pulling toilet paper by the palletful out of their local Costco.

There is really only one explanation: panic is contagious. It’s called emotional contagion. And there is an evolutionary explanation for it. We evolved as herd animals, and when our threats came from the environment around us, it made sense to panic when you saw your neighbor panicking. Those that were on the flanks of the herd acted as an early warning system for the rest. When you saw panic close to you, the odds were very good that you were about to be eaten, trampled or buried under a rockslide. We’re hardwired to live by the principle of “Monkey see, monkey do.”

Here’s the other thing about emotional contagion. It doesn’t work very well if you have to take time to think about it. Panicked responses to threats from your environment will only save your life if they happen instantly. Natural selection has ensured they bypass the slower and more rational processing loops of our brain.

But now let’s apply the social media lens to this. Before modern communication tools were invented, emotional contagion was limited by the constraints of physical proximity. It was the original application of social distancing. Emotions could spread to a social node linked by physical proximity, but it would seldom jump across ties to another node that was separated by distance.

Then came Facebook, a platform perfectly suited to emotional contagion. Through it, emotionally charged messages can spread like wildfire regardless of where the recipients might be – creating cascades of panic across all nodes in a social network.
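
To make that difference concrete, here is a toy sketch comparing panic that can only spread between physical neighbours with panic spreading across a network that has a few long-distance ties. The graph itself is an invented illustration, not data.

```python
# Toy sketch: how quickly panic reaches everyone when emotion can only spread
# door to door vs. when a few long-distance (social media) ties exist.
# The "street" and its ties are illustrative assumptions.

def rounds_to_reach_all(neighbours, start=0):
    panicked, rounds = {start}, 0
    while len(panicked) < len(neighbours):
        newly = {n for node in panicked for n in neighbours[node]}
        panicked |= newly
        rounds += 1
    return rounds

def build_graph(edges, n=12):
    nbrs = {i: set() for i in range(n)}
    for a, b in edges:
        nbrs[a].add(b)
        nbrs[b].add(a)
    return nbrs

# A "street" of 12 households: panic can only travel next door.
proximity = [(i, i + 1) for i in range(11)]

# The same street, plus three Facebook-style ties to distant households.
social = proximity + [(0, 6), (3, 11), (1, 9)]

print(rounds_to_reach_all(build_graph(proximity)))   # 11 rounds, door to door
print(rounds_to_reach_all(build_graph(social)))      # 4 rounds: the cascade jumps
```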

Now we have cascades of panic causing – by definition – irrational responses. And that’s dangerous. As Wharton management professor Sigal Barsade said in a recent podcast, “I would argue that emotional contagion, unless we get a hold on it, is going to greatly amplify the damage caused by COVID-19.”

Why We Need to Keep Calm and Carry On

Keep Calm and Carry On – the famous slogan from World War II Britain – is more than just a platitude that looks good on a t-shirt. It’s a sound psychological strategy for survival, especially when faced with threats in a complex environment. We need to think with our whole brain and we can only do that when we’re not panicking.

Again, Dr. Barsade cautions us: “One of the things we also know from the research literature is that negative emotions, particularly fear and anxiety, cause us to become very rigid in our decision-making. We’re not creative. We’re not as analytical, so we actually make worse decisions.”

Let’s again consider the Facebook Factor (in this case, Facebook being my proxy for all social media). Negative emotional messages driven by fear get clicked and shared a lot on social media. Unfortunately, much of that messaging is – at best – factually incomplete or – at worst – a complete fabrication. A 2018 study from MIT showed that false news spreads six times faster on social media than factual information.

It gets worse. According to Pew Research, one in five Americans said that social media is their preferred source for news, surpassing newspapers. Among those 18 to 29, it was the number one source. When you consider the inherent flaws in the methodology of a voluntary questionnaire, you can bet the actual number is a lot higher.

Who Can You Trust?

Let’s assume we can stay calm. Let’s further assume we can remain rational. In order to make rational decisions, you need factual information.

Before 2016, you could generally rely on government sources to provide trustworthy information. But that was then. Now, we live in the reality distortion field that daily spews forth fabricated fiction from the Twitter account of Donald J. Trump, aka the President of the United States.

The intentional manipulation of the truth by those we should trust has a crippling effect on our ability to respond as a cohesive and committed community. As recently as just a week and a half ago, a poll found that Democrats were twice as likely as Republicans to say that COVID-19 posed an imminent threat to the U.S. By logical extension, that means that Republicans were half as likely to do something to stop the spread of the disease.

My Plan for the Pandemic

Obviously, we live in a world of social media. COVID-19 or not, there is no going back. And while I have no idea what will happen regarding the pandemic, I do have a pretty good guess how this will play out on social media. Our behaviours will be amplified through social media and there will be a bell curve of those behaviours stretching from assholes to angels. We will see the best of ourselves – and the worst – magnified through the social media lens.

Given that, here’s what I’m planning to do. One I already mentioned. I’m going to keep calm. I’m going to do my damnedest to make calm, rational decisions based on trusted information (i.e. not from social media or the President of the United States) to protect myself, my loved ones and anyone else I can.

The other plan? I’m going to reread everything from Nassim Nicholas Taleb. This is a good time for all of us to brush up on our understanding of robustness and antifragility.

A Troubling Prognostication

It’s that time of year again. My inbox is jammed with pitches from PR flacks trying to get some editorial love for their clients. In all my years of writing, I think I have actually taken the bait maybe once or twice. That is an extremely low success rate. So much for targeting.

In early January, many of the pitches offer either reviews of 2019 or predictions for 2020.  I was just about to hit the delete button on one such pitch when something jumped out at me: “The number-one marketing trend for 2020 will be CDPs: customer data platforms.”

I wasn’t surprised by that. It makes sense. I know there’s a truckload of personal data being collected from everyone and their dog. Marketers love platforms. Why wouldn’t these two things come together?

But then I thought more about it — and immediately had an anxiety attack. This is not a good thing. In fact, this is a catastrophically terrible thing. It’s right up there with climate change and populist politics as the biggest world threats that keep me up at night.

To close out 2019,  fellow Insider Maarten Albarda gave you a great guide on where not to spend your money. In that column, he said this: “Remember when connected TVs, Google Glass and the Amazon Fire Phone were going to provide break-through platforms that would force mass marketing out of the box, and into the promised land of end-to-end, personalized one-on-one marketing?”

Ah, marketing nirvana: the Promised Land! The Holy Grail of personalized marketing. A perfect, friction-free direct connection between the marketer and the consumer.

Maarten went on to say that social media is one of the channels you shouldn’t be throwing money into, saying, “It’s also true that we have yet to see a compelling case where social media played a significant role in the establishment or continued success of a brand or service.”

I’m not sure I agree with this, though I admit I don’t have the empirical data to back up my opinion. But I do have another, darker reason why we should shut off the taps providing the flow of revenue to the usual social suspects. Social media based on an advertising revenue model is a cancerous growth — and we have to shut off its blood flow.

Personalized one-to-one marketing — that Promised Land —  cannot exist without a consistent and premeditated attack on our privacy. It comes at a price we should not be prepared to pay.

It depends on us trusting profit-driven corporations that have proven again and again that they shouldn’t be trusted. It is fueled by our darkest and least admirable motives.

The ecosystem that is required to enable one-to-one marketing is a cesspool of abuse and greed. In a pristine world of marketing with players who sport shiny ideals and rock-solid ethics, maybe it would be okay. Maybe. Personally, I wouldn’t take that bet. But in the world we actually live and work in, it’s a sure recipe for disaster.

To see just how subversive data-driven marketing can get, read “Mindf*ck” by Christopher Wylie. If that name sounds vaguely familiar to you, let me jog your memory. Wylie is the whistleblower who first exposed the Cambridge Analytica scandal. An openly gay, liberal, pink-haired Canadian, he seems an unlikely candidate to be the architect of the data-driven “Mindf*ck” machine that drove Trump into office and the Brexit vote over the 50% threshold.

Wylie admits to being blinded by the tantalizing possibilities of what he was working on at Cambridge Analytica: “Every day, I overlooked, ignored, or explained away warning signs. With so much intellectual freedom, and with scholars from the world’s leading universities telling me we were on the cusp of ‘revolutionizing’ social science, I had gotten greedy, ignoring the dark side of what we were doing.”

But Wylie is more than a whistleblower. He’s a surprisingly adept writer who has a firm grasp on not just the technical aspects, but also the psychology behind the weaponization of data. If venture capitalist Roger McNamee’s tell-all exposé of Facebook, “Zucked,” kept you up at night, “Mindf*ck” will give you screaming night terrors.

I usually hold off jumping on the year-end prognostication bandwagon, because I’ve always felt it’s a mug’s game. I would like to think that 2020 will be the year when the world becomes “woke” to the threat of profit-driven data abuse — but based on our collective track record of ignoring inconvenient truths, I’m not holding my breath.

The Ruts of Our Brain

We are not – by nature – open-minded. In fact, as we learn something, the learning creates neural pathways in our brain that we tend to stick to. In other words, the more we learn, the bigger the ruts get.

Our brains are this way by design. At its core, the brain is an energy-saving device. If there are two options open to it, one requiring more cognitive processing and one requiring less, the brain will default to the less resource-intensive option.

This puts expertise into an interesting new perspective. In a recent study, researchers from Cold Spring Harbor Laboratory, Columbia University, University College London and Flatiron Institute found that when mice learn a new task, the neurons in their brain actually change as they move from being a novice to an expert. At the beginning, as they’re learning the task, the required neurons don’t “fire” until the brain makes a decision. But, as expertise is gained, those same neurons start responding before they’re even needed. It’s essentially Hebbian theory (named after psychologist Donald Hebb) in action: the neurons that fire together eventually wire together.
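
For what it’s worth, the “fire together, wire together” rule can be written as a one-line update: the connection between two neurons gets a little stronger every time they are active at the same time. Here is a toy sketch, with made-up numbers (the learning rate and activity values are assumptions for illustration).

```python
# Toy sketch of the Hebbian rule: a connection weight grows each time the
# pre- and post-synaptic neurons are co-active. Values are illustrative assumptions.

def hebbian_update(weight, pre_activity, post_activity, learning_rate=0.1):
    """delta_w = learning_rate * pre_activity * post_activity"""
    return weight + learning_rate * pre_activity * post_activity

weight = 0.05                     # a faint initial pathway
for repetition in range(100):     # practising the same task over and over
    weight = hebbian_update(weight, pre_activity=1.0, post_activity=1.0)

print(round(weight, 2))           # ~10.05: the "rut" gets deeper with every repetition
```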

We tend to think of experts as bringing a well-honed subset of intellectual knowledge to a question. And that is true, as long as the question is well within their area of expertise. But the minute an expert ventures outside of their “rut,” they begin to flounder. In fact, even when they are in their area of expertise but are asked to predict where that path may lead in the future – beyond their current rut – their expertise doesn’t help them. In 2005, psychologist Philip Tetlock published “Expert Political Judgment” – a book showing the results of a 20-year-long study on the prediction track record of experts. It wasn’t good. According to a New Yorker review of the book, “Human beings who spend their lives studying the state of the world…are poorer forecasters than dart-throwing monkeys.”

Why? Well, just like those mice in the above-mentioned study, once we have a rut, our brains like to stick to the rut. It’s just easier for us. And experts have very deep ruts. The deeper the rut, the more effort it takes to peer above it. As Tetlock found, when it comes to predicting what might happen in some area in the future, even if you happen to be an expert in that area, you’d probably be better off flipping a coin than relying on your brain.

By the way, for most of human history, this has been a feature, not a bug. Saving cognitive energy is a wonderful evolutionary advantage. If you keep doing the same thing over and over, eventually the brain pre-lights the neuronal path required, saving itself time and energy. The brain is directing anticipated traffic at faster than the speed of thought. And it’s doing it so well, it would take a significant amount of cognitive horsepower to derail this action.

Like I said, in a fairly predictable world of cause and effect, this system works. But in an uncertain world full of wild-card complexity, it can be crippling.

Complex worlds require foxes, not hedgehogs. This analogy also comes from Tetlock’s book. According to an old Greek fable, “The fox knows many things but the hedgehog knows just one thing.” To that I would add: the fox knows a little about many things, but the hedgehog knows a lot about one thing. In other words, the hedgehog is an expert.

In Tetlock’s study, people with “fox” qualities had a significantly better track record than “hedgehogs” when it came to predicting the future. Their brains were better able to take the time to synthesize the various data inputs required to deal with the complexity of crystal-balling the future because they weren’t barrelling down a pre-ordained path that had been carved by years of accumulated expertise.

But it’s not just expertise that creates these ruts in our brains. The same pattern plays out when we look at the role our beliefs play in how open-minded we are. The stronger the belief, the deeper the rut.

Again, we have to remember that this tendency of our brains to form well-travelled grooves over time has been crafted by the blind watchmaker of evolution. But that doesn’t make it any less troubling when we think about the limitations it imposes in a more complex world. This is especially true when new technologies deliberately leverage our vulnerability in this area. Digital platforms ruthlessly eliminate the real estate that lies between perspectives. The ideological landscape in which foxes can effectively operate is disappearing. Increasingly we grasp for expertise – whether it’s on the right or left of any particular topic – with the goal of preserving our own mental ruts.

And as the ruts get deeper, foxes are becoming an endangered species.

Just in Time for Christmas: More Search Eye-Tracking

The good folks over at the Nielsen Norman Group have released a new search eye-tracking report. The findings are quite similar to those of a study my former company — Mediative — did a number of years ago (this link goes to a write-up about the study; unfortunately, the link to the original study is broken. *Insert head smack here).

In the Nielsen Norman study, the two authors — Kate Moran and Cami Goray — looked at how a more visually rich and complex search results page would impact user interaction with the page. The authors of the report called the sum of participant interactions a “Pinball Pattern”: “Today, we find that people’s attention is distributed on the page and that they process results more nonlinearly than before. We observed so much bouncing between various elements across the page that we can safely define a new SERP-processing gaze pattern — the pinball pattern.”

While I covered this at some length when the original Mediative report came out in 2014 (in three separate columns: 1,2 & 3), there are some themes that bear repeating. Unfortunately, I found the study’s authors missed what I think are some of the more interesting implications. 

In the days of the “10 Blue Links” search results page, we used the same scanning strategy no matter what our intent was. In an environment where the format never changes, you can afford to rely on a stable and consistent strategy. 

In our first eye-tracking study, published in 2004, this consistent strategy led to something we called the Golden Triangle. But those days are over.

Today, when every search result can look a little bit different, it comes as no surprise that every search “gaze plot” (the path the eyes take through the results page) will also be different. Let’s take a closer look at the reasons for this. 

SERP Eye Candy

In the Nielsen Norman study, the authors felt “visual weighting” was the main factor in creating the “Pinball Pattern”: “The visual weight of elements on the page drives people’s scanning patterns. Because these elements are distributed all over the page and because some SERPs have more such elements than others, people’s gaze patterns are not linear. The presence and position of visually compelling elements often affect the visibility of the organic results near them.”

While the visual impact of the page elements is certainly a factor, I think it’s only part of the answer. I believe a bigger, and more interesting, factor is how the searcher’s brain and its searching strategies have evolved in lockstep with a more visually complex results page. 

The Importance of Understanding Intent

The reason why we see so much variation in scan patterns is that there is also extensive variation in searchers’ intent. The exact same search query could be used by someone intent on finding an online or physical place to purchase a product, comparing prices on that product, looking to learn more about the technical specs of that product, looking for how-to videos on the use of the product, or looking for consumer reviews on that product.

It’s the same search, but with many different intents. And each of those intents will result in a different scanning pattern. 

Predetermined Page Visualizations

I really don’t believe we start each search page interaction with a blank slate, passively letting our eyes be dragged to the brightest, shiniest object on the page. I think that when we launch the search, our intent has already created an imagined template for the page we expect to see. 

We have all used search enough to be fairly accurate at predicting what the page elements might be: thumbnails of videos or images, a map showing relevant local results, perhaps a Knowledge Graph result in the right-hand column. 

Yes, the visual weighting of elements acts as an anchor to draw the eye, but I believe the eye is using this anticipated template to efficiently parse the results page. 

I have previously referred to this behavior as a “chunking” of the results page. And we already have an idea of what the most promising chunks will be when we launch the search. 

It’s this chunking strategy that’s driving the “pinball” behavior in the Nielsen Norman study.  In the Mediative study, it was somewhat surprising to see that users were clicking on a result in about half the time it took in our original 2005 study. We cover more search territory, but thanks to chunking, we do it much more efficiently.

One Last Time: Learn Information Scent

Finally, let me drag out a soapbox I haven’t used for a while. If you really want to understand search interactions, take the time to learn about Information Scent and how our brains follow it (Information Foraging Theory — Pirolli and Card, 1999 — the link to the original study is also broken. *Insert second head smack, this one harder.). 

This is one area where the Nielsen Norman Group and I are totally aligned. In 2003, Jakob Nielsen — the first N in NNG — called the theory “the most important concept to emerge from human-computer interaction research since 1993.”

On that we can agree.