Tired of Reality? Take 2 Full-Strength Schitt’s Creeks

“Schitt’s Creek” stormed the Emmys by winning awards in every comedy series category — a new record. It was co-creators Dan and Eugene Levy’s gift to the world: a warm bowl of cultural soup, brimming with life-affirming values, acceptance and big-hearted Canadian corniness.

It was the perfect entertainment solution to an imperfect time. It was good for what ails us.

It’s not the first time we’ve turned to entertainment for comfort. In fact, if there is anything as predictable as death and taxes, it’s that during times of trial, we need to be entertained.

There is a direct correlation between feel-good fantasy and feel-shitty reality. The worse things get, the more we want to escape them.

But the ways we choose to be entertained have changed. And maybe — just maybe — the media channels we’re looking to for our entertainment are adding to the problem. 

The Immersiveness of Media

A medium’s ability to distract us from reality depends on how much it removes us from that reality.

Our media channels have historically been quite separate from the real world. Each channel offered its own opportunity to escape. But as the technology we rely on to be entertained has become more capable of doing multiple things, that escape from the real world has become more difficult.

Books, for example, require a cognitive commitment unlike any other form of entertainment. When we read a book, we — in effect — enter into a co-work partnership with the author. Our brains have to pick up where theirs left off, and we together build a fictional world to which we can escape. 

As the science of interpreting our brain’s behavior has advanced, we have discovered that our brains actually change while we read.

Maryanne Wolf explains in her book, “Proust and the Squid: The Story and Science of the Reading Brain”: “Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species. . . . Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be reshaped by experience.”

Even movies, which dramatically lowered the bar for the cognitive commitments they ask by supplying content specifically designed for two of our senses, do so by immersing us in a dedicated single-purpose environment. The distraction of the real world is locked outside the theater doors.

But today’s entertainment media platforms not only live in the real world, they are the very same platforms we use to function in said world. They are our laptops, our tablets, our phones and our connected TVs.

It’s hard to ignore that world when the flotsam and jetsam of reality are constantly bumping into us. And that brings us to the problem of the multitasking myth.

Multitasking Anxiety

The problem is not so much that we can’t escape from the real world for a brief reprieve in a fictional one. It’s that we don’t want to.

Even if we’re watching our entertainment in our home theater room on a big screen, the odds are very good that we have a small screen in our hands at the same time. We mistakenly believe we can successfully multitask, and our mental health is paying the price for that mistake.

Research has found that trying to multitask brings on a toxic mix of social anxiety, depression, a lessening of our ability to focus attention, and a sociopsychological impairment that impacts our ability to have rewarding relationships. 

When we use the same technology to be entertained that we use to stay on top of our social networks, we fall prey to the fear of missing out.

It’s called Internet Communication Disorder, and it’s an addictive need to continually scroll through Facebook, Twitter, WhatsApp and our other social media platforms. It’s these same platforms that are feeding us a constant stream of the very things we’re looking to escape from. 

It may be that laughter is the best medicine, but the efficacy of that medicine is wholly dependent on where we get our laughs.

The ability of entertainment to smooth the jagged edges of reality depends on our being able to shift our minds off the track that leads to chronic anxiety and depression — and successfully escape into a kinder, gentler, funnier fictional world.

For entertainment to be a beneficial distraction, we first have to mentally disengage from the real world, and then fully engage in the fictional one.

That doesn’t work nearly as well when our entertainment delivery channel also happens to be the same addictive channel that is constantly tempting us to tiptoe through the anxiety-strewn landscape that is our social media feed. 

In other words, before going to “Schitt’s Creek,” unpack your other shit and leave it behind. I guarantee it will be waiting for you when you get back.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, more people in the U.S. — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014; one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without us intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top-down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we’ll do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feeds, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only shift to distrust when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation, during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook with this: “Do you have confidence in me to loan me this [item] until tomorrow?” The success of the scam depended entirely on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of it on the part of the target.

It is ironic that in an era where it’s easier than ever to fact check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full-time fact-checking is necessary. On July 7, New York Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: first, only if Trump releases his tax returns, and second, only if there is a non-partisan, real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.

What Would Aaron Do?

I am a big Aaron Sorkin fan. And before you rain on my parade, I say that fully understanding that he epitomizes the liberal intellectual elitist, sanctimonious cabal that has helped cleave American culture in two. I get that. And I don’t care.

I get that his message is from the left side of the ideological divide. I get that he is preaching to the choir. And I get that I am part of the choir. Still, given the times, I felt that a little Sorkin sermon was just what I needed. So I started rewatching Sorkin’s HBO series “The Newsroom.”

If you aren’t part of this particular choir, let me bring you up to speed. The newsroom in this case is at the fictional cable network ACN. One of the primary characters is lead anchor Will McAvoy (played by Jeff Daniels), who has built his audience by being noncontroversial and affable — the Jay Leno of journalism.

This brings us to the entrance of the second main character: MacKenzie McHale, played by Emily Mortimer. Exhausted from years as an embedded journalist covering multiple conflicts in Afghanistan, Pakistan and Iraq, she comes on board as McAvoy’s new executive producer (and also happens to be his ex-girlfriend).

In typical Sorkin fashion, she goads everyone to do better. She wants to reimagine the news by “reclaiming journalism as an honorable profession,” with “civility, respect, and a return to what’s important; the death of bitchiness; the death of gossip and voyeurism; speaking truth to stupid.”

I made it to episode 3 before becoming profoundly sad and world-weary. Sorkin’s sermon from 2012 — just eight years ago — did not age well. It certainly didn’t foreshadow what was to come.

Instead of trying to be better, the news business — especially cable news — has gone in exactly the opposite direction, heading straight for Aaron Sorkin’s worst-case scenario. That scenario formed part of a Will McAvoy speech in that third episode: “I’m a leader in an industry that miscalled election results, hyped up terror scares, ginned up controversy, and failed to report on tectonic shifts in our country — from the collapse of the financial system to the truths about how strong we are to the dangers we actually face.”

That pretty much sums up where we are. But even Sorkin couldn’t anticipate what horrors social media would throw into the mix. The reality is actually worse than his worst-case scenario. 

Sorkin’s appeal for me was that he always showed what “better” could be. That was certainly true in his breakthrough political hit “The West Wing.” 

He brought the same message to the jaded world of journalism in “The Newsroom.” He was saying, “Yes, we are flawed people working in a flawed system set in a flawed nation. But it can be better….Our future is in our hands. And whatever that future may be, we will be held accountable for it when it happens.”

This message is not new. It was the blood and bones of Abraham Lincoln’s annual address to Congress on December 1, 1862, just one month before the Emancipation Proclamation was issued. Lincoln was preparing the nation for the choice of a path that may have been unprecedented and unimaginably difficult, but would ultimately prove to be the more moral one: “It is not ‘can any of us imagine better?’ but, ‘can we all do better?’ The dogmas of the quiet past, are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise — with the occasion.”

“The Newsroom” was Sorkin’s last involvement with a continuing TV series. He was working on his directorial movie debut, “Molly’s Game,” when Trump was elected.

Since then, he has adapted Harper Lee’s “To Kill a Mockingbird” for Broadway, with Jeff Daniels of “The Newsroom” as Atticus Finch.

Sorkin being Sorkin, he ran into a legal dispute with Lee’s estate when he updated the source material to be a little more open about the racial tension that underlies the story. Aaron Sorkin is not one to let sleeping dogmas lie. 

Aaron Sorkin also wrote a letter to his daughter and wife on the day after the 2016 election, a letter that perhaps says it all.

It began, “Well the world changed late last night in a way I couldn’t protect us from.”

He was saying that as a husband and father. But I think it was a message for us all — a message of frustration and sadness. He closed the letter by saying “I will not hand [my daughter] a country shaped by hateful and stupid men. Your tears last night woke me up, and I’ll never go to sleep on you again.”

Yes, Sorkin was preaching when he was scripting “The Newsroom.” But he was right. We should do better. 

In that spirit, I’ll continue to dissect the Reuters study on the current state of journalism I mentioned last week. And I’ll do this because I think we have to hold our information sources to “doing better.” We have to do a better job of supporting those journalists that are doing better. We have to be willing to reject the “dogmas of the quiet past.” 

One of those dogmas is news supported by advertising. The two are mutually incompatible. Ad-supported journalism is a popularity contest, with the end product a huge audience custom sliced, diced and delivered to advertisers — instead of a well-informed populace.

We have to do better than that.

How We Forage for the News We Want

The Reuters Institute, out of the UK, just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll take the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions: toward less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990, at about 62.5 million. That’s one subscription for every four people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would be one subscription for every 10 people. We are no longer reading our news in print. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just note that our brain operates in a significantly different way when it’s reading rather than watching or listening.
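The per-capita figures above are easy to sanity-check. Here is a quick sketch; the population numbers are approximate U.S. census estimates that I have supplied for illustration, not figures from the Pew data:

```python
# Back-of-envelope check of the Pew circulation figures cited above.
# Population figures are approximate census estimates, not from the study.

def people_per_subscription(circulation, population):
    """How many people share one daily-newspaper subscription, rounded."""
    return round(population / circulation)

# 1990: ~62.5 million daily circulation; U.S. population roughly 250 million.
print(people_per_subscription(62_500_000, 250_000_000))   # -> 4

# 2018: projected circulation under 30 million; population roughly 327 million.
print(people_per_subscription(30_000_000, 327_000_000))   # -> 11
```

At roughly 11 people per remaining subscription in 2018, the “one for every 10” figure is, if anything, slightly generous.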

Up to the end of the last century, we generally trusted news destinations. Whether it was a daily newspaper like The New York Times, a news magazine like Time or a nightly newscast such as any of the network news shows, each was a destination that offered one thing above all others: the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two uses social media. Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, those under 35 get more of their news from social media than from any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information veracity problems that introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quick, calculated guess about which patch is most likely to contain the information it’s looking for. With information foraging, the intent we have frames everything that comes after.
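For those who like the foraging metaphor made concrete, here is a minimal sketch of the patch-choice calculation: estimate the expected useful information per unit of search cost, and go to the best patch. The patch names, relevance scores and search costs are invented for illustration, not drawn from any study:

```python
# Illustrative information-foraging sketch: pick the "patch" with the
# highest expected information gained per unit of search effort.
# All names and numbers below are invented for the example.

patches = {
    "official_health_site": {"relevance": 0.9, "search_cost": 5.0},
    "cable_news_site":      {"relevance": 0.6, "search_cost": 2.0},
    "social_feed":          {"relevance": 0.2, "search_cost": 0.5},
}

def rate_of_gain(patch):
    # Expected useful information per unit of effort spent searching.
    return patch["relevance"] / patch["search_cost"]

best = max(patches, key=lambda name: rate_of_gain(patches[name]))
print(best)  # -> social_feed
```

Note what falls out of even this toy model: because the social feed costs almost nothing to check, it wins the rate calculation despite having the lowest relevance, which is precisely the dynamic this column is describing.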

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it was started before the COVID pandemic, but the second part of the study was conducted during COVID. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media, but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see on social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest-growing sources of news among all age groups — but especially those under 30 — are Instagram, Snapchat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and Snapchat, are dominated by influencers. That means much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper for our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs about how the world works. This is related to the sensemaking loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased toward any particular point of view. But this doesn’t jibe with reality at all. As I wrote in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from biased news sources, as long as the bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.

Crisis? What Crisis?

You would think that a global pandemic would hold our attention for a while.

Nope.

We’re tired of it. We’re moving on.  We’re off to the next thing.

Granted, in this case the next thing deserves to be focused on. It is abysmal that it still exists. So it should be focused on. Probably for the rest of our lives and beyond – going forward until it ceases to be a thing. But it won’t be. Soon we’ll be talking about something else.

And that’s the point of this post – our collective inability to remain focused on anything without being distracted by the next breaking story in our news feed. How did we come to this?

I blame memes.

To a certain extent, our culture is the product of who we are, and who we are is a product of our culture. Each is shaped by the other, going forward in a constantly improvised pas de deux. Humans create the medium, which then becomes part of the environment we adapt to.

Books and the printed word changed who we were for over five centuries. Cinema has been helping to define us for well over a century. And radio and television have been molding us for the past hundred years. Our creations have helped create who we are.

This has never been truer than with social media. Unlike other media, which took discrete chunks of our time and attention, social media is ubiquitous and pervasive. According to a recent survey, we spend on average 2 hours and 23 minutes per day on social media — roughly 15% of our waking hours, assuming eight hours of sleep. Social media has become so intertwined with our lives that we had to start qualifying what happens where with labels like “IRL” (in real life).

There is another difference between social media and what has come before it. Almost every previous entertainment medium that has demanded our attention has been built on the foundation of a long form narrative arc. Interacting with each medium has been a process – a commitment to invest a certain amount of time to go on a journey with the storyteller. The construction of a story depends on patterns that are instantly recognized by us. Once we identify them, we are invested in discovering the outcome. We understand that our part of the bargain is to exchange our time and attention. The payoff is the joy that comes from us making sense of a new world or situation, even if it is imaginary.  

But social media depends on a different exchange. Rather than tapping into our inherent love of the structure of a story, it depends on something called variable intermittent rewards. Essentially, it’s the same hook casinos use to keep people at a slot machine or table. Not only is it highly addictive, it also pushes us to continually scroll to the next thing. It completely bypasses the thinking part of our brains and connects directly to the reward center buried in our limbic system. Rather than asking for our time and attention, social media dangles a never-ending array of bright, shiny memes that ask nothing from us: no thinking, almost no attention and a few seconds of our time at most. For a lazy brain, this is the bargain of a lifetime.
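A variable intermittent (variable-ratio) schedule is simple to simulate. This is an illustrative sketch only; the reward probability is invented, and no platform publishes its actual numbers:

```python
import random

# Sketch of a variable-ratio ("variable intermittent") reward schedule,
# the reinforcement pattern described above. Each post independently has
# a small chance of being a "hit" that triggers the reward center.
# The 15% reward probability is an invented, illustrative figure.

def scroll_feed(posts=200, reward_probability=0.15, seed=42):
    """Scroll through `posts` items and return the gaps (in posts)
    between rewarding hits."""
    rng = random.Random(seed)
    gaps, since_last = [], 0
    for _ in range(posts):
        since_last += 1
        if rng.random() < reward_probability:
            gaps.append(since_last)
            since_last = 0
    return gaps

gaps = scroll_feed()
print(min(gaps), max(gaps))  # hits keep arriving, but the spacing is unpredictable
```

The point is the spread: a rewarding post might be one scroll away or twenty, and it is exactly that unpredictability, not the rewards themselves, that makes stopping feel like walking away from a slot machine mid-session.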

It’s probably not a coincidence that the media most dependent on advertising are also the media that avoid locking our attention on a single topic for an extended period. This makes social media the perfect match for interruptive ad forms. They are simply slotted into the never-ending scroll of memes.

Social media has only been around for a little over two decades, and it has been a significant part of our lives for half that time. If even a little of what I suspect is happening is indeed taking place, that scares the hell out of me. It would mean that no other medium has changed us so much, so quickly.

That is something worth paying attention to. 

How Social Media is Rewiring our Morality

Just a few short months ago, I never dreamed that one of the many fault lines in our society would be who wore a face mask and who didn’t. But on one day last week, most of the stories on CNN.com were about just that topic.

For reasons I’ll explain at the end of this post, the debate has some interesting moral and sociological implications. But before we get to that, let’s address this question: What is morality anyway?

Who’s On First?

In the simplest form possible, there is one foundational evolutionary spectrum to our morality: Are we more inclined to worry about ourselves or about others? Each of us plots our own morals somewhere on this spectrum.

At one end we have the individualist, the one who continually puts “me first.” Typically, the morals of those focused only on themselves concern individual rights, freedoms and beliefs specific to them. That concern does not extend to anyone outside their own “in” group.

As we move across the spectrum, we next find the familial moralist: those who worry first about their own kin. Morality is always based on “family first.”

Next come those who are more altruistic, as long as that altruism is directed at those who share common ground with them. You could call this the “we first” group.

Finally, we have the true altruist, who believes in a universal altruism: that a rising tide truly lifts all boats.

This concept of altruism has always been a bit of a puzzle for early evolutionists. In sociological parlance, it’s called proactive prosociality — doing something nice for someone who is not closely related to you without being asked. It seems at odds with the concept of the Selfish Gene, first introduced by evolutionary biologist Richard Dawkins in his book of the same name in 1976.

But as Dawkins has clarified over and over again since the publication of the book, selfish genes and prosociality are not mutually exclusive. They are, in fact, symbiotic.

Moral Collaboration

We have spent about 95% of our entire time as a species as hunter-gatherers. If we have evolved a mechanism of morality, it would make sense for it to be most functional in that type of environment.

Hunter-gatherer societies need to collaborate. This is where the seeds of reciprocal altruism can be found. A group of people who work together to ensure continued food supplies will outlive and out-reproduce a group of people who don’t.  From a selfish gene perspective, collaboration will beat stubborn individualism.

But this type of collaboration comes with an important caveat: It only applies to individuals that live together in the same communal group.

Social conformity acts as a manual override on our own moral beliefs. Even when we start with a belief about what is right and wrong, most of us will end up going with what the crowd is doing.

It’s an evolutionary version of the wisdom of crowds. But our evolved social conformity safety net rests on its own assumption: that everyone in the group is in the same physical location and dealing with the same challenge.

There is also a threshold effect that determines how likely we are to conform. How we act in any given situation depends on a number of factors: how strong our existing beliefs are, the situation we’re in, and how the crowd is behaving. This makes sense: our conformity is inversely related to our level of perceived knowledge. The more we think we know, the less likely we are to conform to what the crowd is doing.
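This threshold dynamic has a standard toy model in sociology: Granovetter’s threshold model of collective behavior. A minimal sketch (my illustration, not the author’s) shows how a crowd with a spread of thresholds cascades into full conformity, while the same crowd minus its low-threshold instigators never budges:

```python
def conformity_cascade(thresholds):
    """Granovetter-style threshold model.

    Each person conforms once the fraction of the crowd already
    conforming meets their personal threshold. Strong existing
    beliefs mean a high threshold; weak beliefs mean a low one.
    """
    n = len(thresholds)
    adopters = 0
    while True:
        new_adopters = sum(1 for t in thresholds if t <= adopters / n)
        if new_adopters == adopters:
            return adopters  # stable: no one else will conform
        adopters = new_adopters

# Thresholds spread evenly from 0 to 0.99: each conformist tips the next.
print(conformity_cascade([i / 100 for i in range(100)]))  # prints 100

# Shift everyone's threshold up by 0.2 and the cascade never starts.
print(conformity_cascade([0.2 + i / 100 for i in range(100)]))  # prints 0
```

The striking part is that the two crowds have nearly identical beliefs; only the presence of a few easily swayed members determines whether everyone ends up conforming.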

We should expect that a reasonably “rugged” evolutionary environment where survival is a continual struggle would tend to produce an optimal moral framework somewhere in the middle of familial and community altruism, where the group benefits from collaboration but does not let its guard down against outside threats.

But something interesting happens when the element of chronic struggle is removed, as it is in our culture. It appears that our morality tends to polarize to opposite ends of the spectrum.

Morality Rewired

What happens when our morality becomes our personal brand, part of who we believe we are? When that happens, our sense of morality migrates from the evolutionary core of our limbic brain to our cortex, the home of our personal brand. And our morals morph into a sort of tribal identity badge.

In this case, social media can short-circuit the evolutionary mechanisms of morality.

For example, there is a well-documented correlation between prosociality and the “watching eyes” effect: we are more likely to be good people when we have an audience.

But social media twists the concept of audience and can nudge our behavior from the prosocial to the more insular and individualistic end of the spectrum.

The success of social conformity and the wisdom of crowds depends on a certain heterogeneity in the ideological makeup of the crowd. The filter bubble of social media strips this from our perceived audience, as I have written before. It reinforces our moral beliefs by surrounding us with an audience that already shares them. The confidence that comes from this pushes us away from the middle ground of conformed morality and toward outlier territory. Perhaps this is why the polarization of morality is all too evident today.
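That stripping of heterogeneity can be made concrete with a small simulation (a hypothetical sketch; `bubble_strength` is an invented knob, not a real platform metric): when a feed preferentially surfaces agreeing voices, an evenly split population looks like near-unanimous consensus.

```python
import random

def perceived_consensus(my_view, population, bubble_strength,
                        sample_size=200, seed=1):
    """Fraction of a sampled feed that agrees with you.

    bubble_strength is the probability that the feed shows someone
    who already shares your view instead of a random member of the
    population (purely illustrative).
    """
    rng = random.Random(seed)
    agree_pool = [v for v in population if v == my_view]
    agrees = 0
    for _ in range(sample_size):
        if agree_pool and rng.random() < bubble_strength:
            shown = rng.choice(agree_pool)   # the bubble: guaranteed agreement
        else:
            shown = rng.choice(population)   # an unfiltered draw
        if shown == my_view:
            agrees += 1
    return agrees / sample_size

# A population split 50/50, seen without and with a strong filter bubble:
views = ["A"] * 500 + ["B"] * 500
print(perceived_consensus("A", views, bubble_strength=0.0))  # roughly 0.5
print(perceived_consensus("A", views, bubble_strength=0.9))  # roughly 0.95
```

A reader in the strong bubble would rationally conclude that nearly everyone agrees with them, which is exactly the misplaced confidence the paragraph describes.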

As I mentioned at the beginning, there may never have been a more observable indicator of our own brand of morality than the current face-mask debate.

In an article on Business Insider, Daniel Ackerman compared it to the crusade against seat belts in the 1970s. Certainly, when it comes to our perceived individual rights and not wanting to be told what to do, there are similarities. But there is one crucial difference: you wear a seat belt to save your own life. You wear a face mask to save other lives.

We’ve been told repeatedly that the main purpose of a face mask is to stop you from spreading the virus to others, not the other way around. That makes the decision to wear a face mask or not the ultimate indicator of your openness to reciprocal altruism.

The cultural crucible in which our morality is formed has changed. Our own belief structure of right and wrong is becoming more inflexible. And I have to believe that social media may be the culprit.

Media’s Mea Culpa Moment

It’s hard to see when you’re stuck inside. And I’m not talking about self-isolating during a pandemic. I’m talking about our perspective of the media landscape.

The Problem with Politics

Currently, the concept of “Us vs Them” is embedded in our modern idea of politics. Populist politics, by its very nature, needs an enemy to blame. It forces you to pick sides. It creates a culture of antagonism, eroding social capital and dismantling any bipartisan trust. We are far down this path. Perhaps too far to turn back. But we have to realize that no nation or region in modern history has ever prospered in the long term by wantonly destroying social capital. There are many examples of regionalism, xenophobia and populism causing nations to regress. There is no example of these things leading to prosperity and long-term success. Not one. Yet this is the path we seem to have chosen.

If you look at the media, politicians are to blame for all our problems, whether they’re on the right or the left. In most mainstream media, with its inherent left-wing bias, the problem is personified, primarily in the President: “If Trump wasn’t there, things would be better.” But the problem would still persist. Much as we left-leaning individuals found Obama a more palatable choice for president, the problem was here then as well. That’s how we got to where we are today.

The sad truth is, Trump didn’t cause the problem. He just capitalized on it. So we have to look elsewhere for where the problem originated. And that leads us to an uncomfortable reality: we are the problem. By “we,” I mean the media, particularly in the U.S. But it’s hard to see that when you’re looking from the inside. So last week I changed my perspective.

Because of COVID-19, we should all be focused on the same story, perhaps for the first time in our lives. This gives us an unprecedented opportunity to compare the media landscapes against what should be a fairly objective baseline.

The Canadian Litmus Test

I’m Canadian — and for Americans, I know that living next to Canada is like having “The Simpsons'” Ned Flanders for a neighbor. We seem nice and polite, but you can’t help feeling that we’re constantly judging you. 

But Canada does offer Americans the chance to compare cultures that have much in common, but with some critical differences. It was this comparison that geographer, historian and anthropologist Jared Diamond employed in his latest book, “Upheaval: Turning Points for Nations in Crisis.”

“Many of Canada’s social and political practices are drastically different from those of the U.S., such as with regards to national health plans, immigration, education, prisons, and balance between community and individual interests,” he writes. “Some problems that Americans regard as frustratingly insoluble are solved by Canadians in ways that earn widespread public support.”

As a case in point, Canada has handled COVID-19 in a notably different way. Our pandemic response has been remarkably non-partisan. For example, we have the unusual spectacle of our most Trump-like politician, Ontario Conservative Premier Doug Ford, stepping up as a compassionate leader who is working effectively with Liberal Prime Minister Justin Trudeau and his own opposition.

The Myth of Impartial Reporting

This is not the case in the U.S. Because of its political divides, America can’t even agree on what should be a simple presentation of fact on the story that affects us all equally.

A recent Pew study found that where you turn for your news significantly affects your understanding of things like when a vaccine will be ready or whether the coronavirus came about naturally.

To check this out, I compared the three most popular U.S. news sites on April 29.

Let’s start with CNN. Of the 28 news items featured on the home page “above the fold,” 16 had an overt left bias. The most prominent was inflammatory, dealing with Trump’s handling of the pandemic and his blowing up at press criticism. A Biden story on the Tara Reade accusations was buried in small-print links near the bottom.

Now let’s go to the other side of the spectrum: Fox News, which also featured 28 news items “above the fold.” Of these, 14 had an overt right bias. Again, the headline was inflammatory, calling out Biden on the Tara Reade allegations. There was no mention of any Trump temper tantrums on the home page.

Finally, MSNBC’s headline story was actually focused on COVID-19 and the Remdesivir trial results and had no political bias. The site only had nine news items above the fold. Four of these had a left-leaning bias. 

The home pages bore almost no resemblance to each other. You would be hard-pressed to understand that each of these sites represented the news from the same country on the same day.

Now, let’s compare with Canada’s top two news sites, CBC and Global News. 

About 60% of the stories covered were the same on both sites, and they were given roughly the same priority. The same lead story was featured on both — about a missing Canadian military helicopter. On CBC, only one story appeared to have any political bias at all, and it was far from explicit; none of Global’s did.

That’s in comparison to the American news sites, where over half the stories featured — and all the lead ones — were designed and written to provoke anger, pitting “us” against “them.”

Once mainstream media normalizes this antagonistic approach, it then gets shunted over to social media, where it’s stripped of context, amplified and shared. Mainstream media sets the mood of the nation, and that mood is anger. Social media then whips it into a frenzy. 

Both left- and right-wing media outlets are equally guilty. CNN’s overriding editorial tone is, “Can you believe how stupid they are?” Fox’s is, “They think you’re stupid and they’re trying to pull a fast one on you.” No wonder there is no common ground where public discourse across the political divide can begin.

Before COVID-19, perhaps we could look at this with a certain amount of resignation and even bemusement. If you’re “us” there is a certain satisfaction in vilifying “them.” But today, the stakes are too high. People are dying because of it. Somehow, the media has to turn America’s ideological landscape from a war zone into a safe space.

Looking Back at a Decade That’s 99.44% Done

Remember 2010? For me, that was a pretty important year. It was the year I sold my digital marketing business. While I would continue to work actively in the industry for another three years, for me things were never the same as they were in 2010. And – looking back – I realize that’s pretty well true for most of us. We were more innocent and more hopeful. We still believed that the Internet would be the solution, not the problem.

In 2010, two big trends were jointly reshaping our notions of being connected. Early in the year, former Morgan Stanley analyst Mary Meeker laid them out for us in her “State of the Internet” report. Back then, just three years after the introduction of the iPhone, internet usage from mobile devices hadn’t even reached double digits as a percentage of overall traffic. Meeker knew this was going to change, and quickly. She saw mobile adoption on track to be the steepest tech adoption curve in history. She was right. Today, over 60% of internet usage comes from a mobile device.

The other defining trend was social media. Even then, Facebook had about 600 million users, or just under 10% of the world’s population. When you had a platform that big – connecting that many people – you just knew the consequences would be significant. There were some pretty rosy predictions for the impact of social media.

Of course, it’s the stuff you can’t predict that will bite you. Like I said, we were a little naïve.

One trend that Meeker didn’t predict was the nasty issue of data ownership. We were just starting to become aware of the looming spectre of privacy.

The biggest Internet-related story of 2010 was WikiLeaks. In February, Julian Assange’s site started releasing 260,000 sensitive diplomatic cables sent to it by Chelsea Manning, a U.S. soldier stationed in Iraq. According to the governments of the world, this was an illegal release of classified material, tantamount to an act of espionage. According to public opinion, this was shit finally rolling uphill. We revelled in the revelations. WikiLeaks and Julian Assange were taking it to the man.

That budding sense of optimism continued throughout the year. By December of 2010, the Arab Spring had begun. This was our virtual vindication – the awesome power of social media shining a blinding light into the darkest nooks and crannies of despotism and tyranny. The digital future was clear and bright. We would triumph thanks to technology. The Internet had helped put Obama in the White House. It had toppled corrupt regimes.

A decade later, we’re shell-shocked to discover that the Internet is the source of a whole new kind of corruption.

The rigidly digitized ideals of Zuckerberg, Page, Brin et al seemed to be a call to arms: transparency, the elimination of bureaucracy, a free and open friction-free digital market, the sharing economy, a vast social network that would connect humanity in ways never imagined, connected devices in our pockets – in 2010 all things seemed possible. And we were naïve enough to believe that those things would all be good and moral and in our best interests.

But soon, we were smelling the stench coming from Silicon Valley. Those ideals were subverted into an outright attack on our privacy. Democratic elections were sold to the highest bidder. Principles evaporated under the pressure of profit margins and expanding power. The impossibly bright, impossibly young billionaire CEOs of ten years ago are now testifying in front of Congress. The corporate culture of many tech companies reeks like a frat house on Sunday morning.

Is there a lesson to be learned? I hope so. I think it’s this. Technology won’t do the heavy lifting for us. It is a tool that is subject to our own frailty. It amplifies what it is to be human. It won’t eliminate greed or corruption unless we continually steer it in that direction. 

And I use the term “we” deliberately. We have to hold tech companies to a higher standard. We have to be more discerning of what we agree to. We have to start demanding better treatment and not be willing to trade our rights away with the click of an accept button. 

A lot of what could have been slipped through our fingers in the last 10 years.  It shouldn’t have happened. Not on our watch.

This Election, Canucks were “Zucked”

Note: I originally wrote this before results were available. Today, we know Trudeau’s Liberals won a minority government, but the Conservatives actually won the popular vote: 34.4% vs. 33.1% for the Liberals. It was a very close election.

As I write this, Canadians are going to the polls in our national election. When you read this, the outcome will have been decided. I won’t predict — because this one is going to be too close to call.

For a nation that is often satirized for our tendencies to be nice and polite, this has been a very nasty campaign. So nasty, in fact, that in focusing on scandals and personal attacks, it forgot to mention the issues.

Most of us are going to the polls today without an inkling of who stands for what. We’re basically voting for the candidate we hate the least. In other words, we’re using the same decision strategy we used to pick the last guest at our grade 6 birthday party.

The devolution of democracy has now hit the Great White North, thanks to Facebook and Mark Zuckerberg.

While the amount of viral vitriol I have seen here is still a pale shadow of what I saw from south of the 49th in 2016, it’s still jarring to witness. Canucks have been “Zucked.” We’re so busy slinging mud that we’ve forgotten to care about the things that are essential to our well being as a nation.

It should come as news to no one that Facebook has been wantonly trampling the tenets of democracy. Elizabeth Warren recently ran a fake ad on Facebook just to show she could. Then Mark Zuckerberg defended Facebook last week when he said: “While I certainly worry about an erosion of truth, I worry about living in a world where you can only post things that tech companies decide to be 100 per cent true.”

Zuckerberg believes the onus lies with the Facebook user to judge what is false and what is not. This is a suspiciously convenient defense of Facebook’s revenue model wrapped up as a defense of freedom of speech. At best, it’s naïve, not to mention hypocritical, given that what we see is determined by Facebook’s algorithm. At worst, it’s misleading and malicious.

Hitting hot buttons tied to emotions is nothing new in politics. Campaign runners have been drawing out and sharpening the long knives for decades now. TV ads added a particularly effective weapon into the political arsenal. In the 1964 presidential campaign, it even went nuclear with Lyndon Johnson’s famous “Daisy” Ad.

But this is different. For many reasons.

First of all, there is the question of trust in the channel. We have been raised in a world where media channels historically take some responsibility to delineate between what they say is factual (i.e., the news) and what is paid persuasion (i.e., the ads).

In his statement, Zuckerberg is essentially telling us that giving us some baseline of trust in political advertising is not Facebook’s job and not their problem. We should know better.

But we don’t. It’s a remarkably condescending and convenient excuse for Zuckerberg to appear to be telling us “You should be smarter than this” when he knows that this messaging has little to do with our intellectual horsepower.

This is messaging that is painstakingly designed to be mentally processed before the rational part of our brain even kicks in.

In a recent survey, three out of four Canadians said they had trouble telling which social media accounts were fake. And 40% of Canadians said they had found links to stories on current affairs that were obviously false. Those were only the links they knew were fake; I assume many more snuck through their factual filters. By the way, people of my generation are the worst at sniffing out fake news.

We’ve all seen it, but only one-third of Canadians 55 and over realize it. We can’t all be stupid.

Because social media runs on open platforms, with very few checks and balances, it’s wide open for abuse. Fake accounts, bots, hacks and other digital detritus litter the online landscape. There has been little effective policing of this. The issue is that cracking down on this directly impacts the bottom line. As Upton Sinclair said: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

Even given these two gaping vulnerabilities, the biggest shift when we think of social media as an ad platform is that it is built on the complexity of a network. The things that come with this — things like virality, filter bubbles, threshold effects — have no corresponding rule book to play by. It’s like playing poker with a deck full of wild cards.

Now — let’s talk about targeting.

When you take all of the above and then factor in the data-driven targeting that is now possible, you light the fuse on the bomb nestled beneath our democratic platforms. You can now segment out the most vulnerable, gullible, volatile sectors of the electorate. You can feed them misinformation and prod them to action. You can then sit back and watch as the network effects play themselves out. Fan — meet shit. Shit — meet fan.

It is this that Facebook has wrought, and then Mark Zuckerberg feeds us some holier-than-thou line about freedom of speech.

Mark, I worry about living in a world where false — and malicious — information can be widely disseminated because a tech company makes a profit from it.