Friendship: Uncoupled

This probably won’t come as a shock to anyone reading this: A recent study says that it’s not whether you use social media that determines your happiness, but how you use it.

Derrick Wirtz, an associate professor of teaching in psychology at the University of British Columbia Okanagan, took a close look at how people use three major social platforms—Facebook, Twitter and Instagram—and whether the way you use them can make you happier or sadder.

As I said, most of you probably said to yourself, “Yeah, that checks out.” But this study does bring up an interesting nuance with some far-reaching implications. 

In today’s world, we’re increasingly using Facebook to maintain our social connections. And, according to Facebook’s mission statement, that’s exactly what’s supposed to happen: “People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”

The interesting thing in this study is the divide between our social activities — those aimed at bonding versus those aimed at gaining status — and how that impacts our moods and behaviors. It’s difficult to untangle the effect of those two factors, because they are so intertwined in our psyches. But Dr. Wirtz found that some of us are spending far more time on social media “status-checking” than actually tending to our friendships.

“Passive use, scrolling through others’ posts and updates, involves little person-to-person reciprocal interaction while providing ample opportunity for upward comparison,” says Wirtz. 

We can scroll our newsfeed without any actual form of engagement — but that’s not what we were designed to do. Our social skills evolved to develop essential mutually beneficial bonds in a small group setting.

Friendship is meant to be nurtured and tended to organically and intimately in a face-to-face environment.  But the distal nature of social media is changing the dynamics of how we maintain relationships in our network. 

Take how we first establish friendships, for instance. When you meet someone for the very first time, how do you decide whether you’re going to become friendly or not? The answer, not surprisingly, is complex and nuanced. Our brain works overtime to determine whether we should bond or not. But, also not surprisingly, almost none of that work is based on rational thought.

UCLA psychologist Dr. Elizabeth Laugeson teaches young adults with social challenges, such as those on the autism spectrum, how to take those very first steps toward friendship when meeting a stranger. If you can’t pick up the face-to-face nuances of body language and unspoken social cues intuitively, becoming friends can be incredibly difficult. Essentially, we are constantly scanning the other person for small signs of common interest from which we can start working toward building trust. 

Even if you clear this first hurdle, it’s not easy to build an actual friendship. It requires a massive investment of our time and energy. A recent study from the University of Kansas found it takes about 50 hours of socializing just to go from acquaintance to casual friend. 

Want to make a “real” friend? Tack another 40 hours onto that. And if your goal is to become a “close” friend, you’d better be prepared to invest a total of at least 200 hours.

So that raises the question: Why would we make this investment in the first place? Why do we need friends? And why do we need at least a handful of really close friends? The answer lies in the concept of reciprocity.

From an evolutionary perspective, having friends made it easier to survive and reproduce. We didn’t have to go it alone. We could help each other past the rough spots, even if we weren’t related to each other. Having friends stacked the odds in our favor. 

This is when our investment in all those hours of building friendships paid off. Again, this takes us back to the intimate and organic roots of friendship. 

Our brains, in turn, reinforced this behavior by making sure that having friends made us happy. 

Of course, like most human behaviors, it’s not nearly that simple or benign. Our brains also entwine the benefits of friendship with the specter of social status, making everything much more complicated. 

Status also confers an evolutionary advantage. For many generations, we have trod this fine line between being a true friend and being obsessed with our own status in the groups where we hang out.

And then came social media.

As Wirtz’s study shows, we now have this dangerous uncoupling between these two sides of our nature. With social media, friendship is now many steps removed from its physical, intimate and organic roots. It is stripped of the context in which it evolved. And, it appears, the intertwined strands of friendship and social status are unraveling. When this happens, time on social media can reap the anxiety and jealousy of status-checking without any of the joy that comes from connecting with and helping a friend. 

On a person-to-person basis, this uncoupling can be disturbing and unfortunate. But consider what may happen when these same tendencies are amplified and magnified through a massive, culture-wide network.

Analyzing the Problem with News “Analysis”

Last week, I talked about the Free News problem. In thinking about how to follow that up, I ran across an interesting study published earlier this year in the journal Science Advances. One of the authors was Duncan Watts, whom I’ve mentioned repeatedly in previous columns.

In the study, the research team tackled the problem of “fake news,” which is – of course – another symptom of the creeping malaise afflicting the journalism industry. It certainly has become a buzzword in the last few years. But the team found that fake news may not be much of a problem at all. It makes up just 0.15% of our entire daily media diet. In fact, across all ages in the study, any type of news makes up – at the most – just 14.2% of our total media consumption.

The problem may be our overuse of the term “news” – applying it to things we think are news but are actually just content meant to drive advertising revenues. In most cases, this is opinion (sometimes informed but often not) masquerading as news in order to generate a lot of monetizable content. Once again, to get to the root of the problem, we have to follow the money.

If we look again at the Ad Fontes Media Bias chart, it’s not “news” that’s the problem. Most acknowledged leaders in true journalism are tightly clustered in the upper middle of the chart, which is where we want our news sources to be. They’re reliable and unbiased.

If we follow the two legs of the chart down to the right or left into the unreliable territory where we might encounter “fake” news, we find from the study mentioned above that this makes up an infinitesimal percentage of the media most of us actually pay attention to. The problem here can be found in the middle regions of the chart. This is where we find something called analysis. And that might just be our problem.

Again, we have to look at the creeping poison of incentive here. Some past students from Stanford University wrote an interesting essay about the economics of journalism that shows how cable TV and the internet have disrupted the tenuous value chain of news reporting.

The profitability of hard reporting was defined in the golden age of print journalism – specifically, newspapers. The problem with reporting as a product is twofold. One, news is non-excludable: once news is reported, anyone can use it. Two, while reporting is expensive, the cost of distribution is independent of the cost of reporting. The cost of getting the news out is the same, regardless of how much news is produced.

While newspapers were the primary source of news, these two factors could be worked around. Newspapers came with a built-in 24-hour time lag. If you could get a one-day jump on the competition, you could be very profitable indeed.

Secondly, the fixed distribution costs made newspapers a very cost-effective ad delivery vehicle. It cost the newspapers next to nothing to add advertising to the paper, thereby boosting revenues.

But these two factors were turned around by the internet and cable news. If a newspaper bore the bulk of the costs by breaking a story, cable TV and the internet could immediately jump on board and reap the benefits of using content they didn’t have to pay for.

And that brings us to the question of news “analysis.” Business models that rely on advertising need eyeballs. And those eyeballs need content. Original content – in the form of real reporting – is expensive and eats into profit. But analysis of news that comes from other sources costs almost nothing. You load up on talking heads and have them talk endlessly about the latest story. You can spin off never-ending reams of content without having to invest anything in actually breaking the story.

This type of content has another benefit: customers love analysis. Real news can be tough to swallow. If done correctly, it should be objective and based on fact. Sometimes it will force us to reconsider our beliefs. As is often the case with news, we may not like what we hear.

Analysis – or opinion – is much more palatable. It can be either partially or completely set free from facts and swayed and colored to match the audience’s beliefs and biases. It scores highly on the confirmation bias scale. It hits all the right (or left) emotional buttons. And by doing this, it stands a better chance of being shared on social media feeds. Eyeballs beget eyeballs. The gods of corporate finance smile benignly on analysis content because of its effectiveness at boosting profitability.

By understanding how the value chain of good reporting has broken down due to this parasitic piling on by online and cable platforms in the pursuit of profit, we begin to understand how we can perhaps save journalism. There is simply too much analytical superstructure built on top of the few real journalists that are doing real reporting. And the business model that once supported that reporting is gone.

The further that analysis gets away from the facts that fuel it, the more dangerous it becomes. At some point it crosses the line from analysis to opinion to propaganda. The one thing it’s not is “news.” We need to financially support, through subscription, the few that are still reporting on the things that are actually happening.

Why Free News is (usually) Bad News

Pretty much everything about the next week will be unpredictable. But whatever happens on Nov. 3, I’m sure there will be much teeth-gnashing and navel-gazing about the state of journalism in the election aftermath.

And there should be. I have written much about the deplorable state of that particular industry. Many, many things need to be fixed. 

For example, let’s talk about the extreme polarization of both the U.S. population and their favored news sources. Last year about this time, the Pew Research Center released a study showing that over 30% of Americans distrust their news sources. 

But what’s more alarming is, when we break this down by Republicans versus Democrats, only 27% of Democrats didn’t trust the news for information about politics or elections. With Republicans, that climbed to a whopping 67%. 

The one news source Republicans do trust? Fox News. Sixty-five percent of them say Fox is reliable. 

And that’s a problem.

Earlier this year, Ad Fontes Media came out with its Media Bias Chart. It charts major news and media channels on two axes: source reliability and political bias. The correlation between bias and reliability is almost perfect. The further a news source is out to the right or left, the less reliable it is.

How does Fox fare? Not well. Ad Fontes separates Fox TV from Fox Online. Fox Online lies on the border between being “reliable for news, but high in analysis/opinion content” and “some reliability issues and/or extremism.” Fox TV falls squarely in the second category.

I’ve written before that media bias is not just a right-wing problem. Outlets like CNN and MSNBC show a significant left-leaning bias. But CNN Online, despite its bias, still falls within the “Most Reliable for News” category. According to Ad Fontes, MSNBC has the same reliability issues as Fox.

The question that has to be asked is “How did we get here?”  And that’s the question tackled head-on in a new book, “Free is Bad,” by John Marshall.

I’ve known Marshall for ages. He has covered a lot of the things I’ve been writing about in this column. 

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” 

Upton Sinclair

The problem here is one of incentive. Our respective media heads didn’t wake up one morning and say, “You know what we need to be? A lot more biased!” They have walked down that path step by step, driven by the need to find a revenue model that meets their need for profitability. 

When we talk about our news channels, the obvious choice to be profitable is to be supported by ads. And to be supported by ads, you have to be able to target those ads. One of the most effective targeting strategies is to target by political belief, because it comes reliably bundled with a bunch of other beliefs that make it very easy to predict behaviors. And that makes these ads highly effective in converting prospects.

This is how we got to where we are. But there are all types of ways to prop up your profit through selling ads. Some are pretty open and transparent. Some are less so. And that brings us to a particularly interesting section of Marshall’s book. 

John Marshall is a quant geek at heart. He has been a serial tech entrepreneur — and, in one of those ventures, built a very popular web analytics platform. He also has intimate knowledge of how the sausages are made in the ad-tech business. He knows sketchy advertising practices when he sees them. 

Given all of this, Marshall was able to undertake a fascinating analysis of the ads we see on various news platforms that dovetails nicely with the Ad Fontes chart. 

Marshall created the Ad Shenanigans chart. Basically, he did a forensic analysis of the advertising approaches of various online news platforms. He was looking for those that gathered data about their users, sold traffic to multiple networks, featured clickbait chumboxes and engaged in other unsavory practices. Then he ranked them accordingly.

Not surprisingly, there’s a pretty strong correlation between reputable reporting and business ethics. Highly biased and less reputable sites on the Ad Fontes chart (Breitbart, NewsMax and Fox News) can all also be found near the top of Marshall’s Ad Shenanigans chart. Those that do seem to have some ethics when it comes to the types of ads they run also seem to take objective journalism seriously. Cases in point: The Guardian in the U.K. and ProPublica in the U.S.

The one anomaly in the group seems to be CNN. While it does fare relatively well on reputable reporting according to Ad Fontes, CNN appears to be willing to do just about anything to turn a buck. It ranks just a few slots below Fox in terms of “ad shenanigans.”

Marshall also breaks out those platforms that have a mix of paywalls and advertising. While there are some culprits in the mix, such as the Daily Caller, Slate and the National Review, most sites that have some sort of subscription model seem to be far less likely to fling the gates of their walled gardens open to the ethically challenged advertising hordes. 

All of this drives home Marshall’s message: When it comes to the quality of your news sources, free is bad. As soon as something costs you nothing, you are no longer the customer. You’re the product. Invisible hand market forces are no longer working for you. They are working for the advertiser. And that means they’re working against you if you’re looking for an unbiased, quality news source.

Lockdown Advice For A Long Winter

No matter where you live in the world, it’s likely you’re going to be forced to spend a lot of time at home. And if that home includes others — like your beloved life partner — living under the same roof, you may experience a little friction now and again. In anticipation of this, I thought I’d share a few insights on what might come.

There is No Gender Equality with COVID

A recent study by Oxford, Cambridge and Zurich universities found that women’s sense of mental well-being took a more significant drop than men’s due to COVID. The researchers speculated on a number of reasons for this, but were unable to narrow it down to any single identifiable factor. Perhaps, they reasoned, it had something to do with women losing jobs at a greater rate than men, taking on a greater share of the burden of home schooling, or the fact that even when both men and women were home all the time, women still did more than their fair share of domestic chores. But no: even when controlling for these factors, none of them explained why women were becoming more distressed than men. 

Maybe it was something else.

Warriors and Worriers: Two Approaches to Survival

In 2014, psychologist Joyce Benenson published her book “Warriors and Worriers: The Survival of the Sexes.” Taking an evolutionary perspective, she has spent years studying children and primates, looking for innate rather than socialized differences between the sexes. 

Her findings turned conventional wisdom on its head. Women may not be more sociable than men, and men may not be more competitive than women. It’s just that they define those things differently. 

Men are quite comfortable forming packs of convenience to solve a particular problem, whether it is defending against an enemy or winning a pick-up basketball game. This could explain why team sports entertainment always seems to have a male bias.

Women, on the other hand, have fewer but much more complex relationships that they deem essential to their survival as the primary caregiver for their family. The following is from the abstract of a 1990 study by Benenson: “Although males and females did not differ in the number of best friends they reported, males were found to have larger social networks than females. Further, for males, position in a social network was more highly linked with acceptance by the peer group. Finally, males were concerned with attributes that could be construed as important for status in the peer group, and females were concerned with attributes that appeared essential to relationships with a few friends.”

If we apply this to the current COVID situation, we begin to see why women might be struggling more with lockdown than men. A male’s idea of socializing might be more easily met with a Zoom call or another type of digital connection, such as online gaming. But connecting in these ways lacks the bandwidth necessary to properly convey the complexity of a female relationship. 

Introverts and Extroverts Revisited

Of course, gender isn’t the only variable at play here. I’ve written before about what happens when an extrovert and an introvert are locked down in the same house together (those being my wife and myself). One of the things I’ve noticed is the different levels of comfort we have with being left alone with our thoughts. 

Because I have always been a writer of one kind or another, I require time to ruminate on a fairly frequent basis. I am a little (lot?) dictatorial in my requirements for this: my environment needs to be silent and free from interruption. When the weather is good outside, this is fairly easy. I can grab my laptop and go outside. But in the winter, it’s a different story. My wife is subjected to forced silence so I can have my quiet time.

My wife functions best when there is some type of sensory stimuli, especially the sound of voices. She doesn’t have the same need to sit in silence and be alone with her thoughts. 

And she’s not unique in that. A 2014 study found that most of us fall into the same category. In fact, the researchers found that, “many of the people studied, particularly the men, chose to give themselves a mild electric shock rather than be deprived of external sensory stimuli.”

A Difference in Distraction

When we do look for distraction, we can also have different needs and approaches. Another area I’ve touched on in a past post is how our entertainment delivery platforms have now become entangled with multitasking. 

I like an immersive, interruption-free entertainment experience. The bigger the screen and the louder the sound, the better. I suspect this may be another “male” thing.  Again, this preference tends to cast a dictatorial tone on our household, so I generally retreat to my media cave in the basement. I also tuck my phone away while I’m watching. 

My wife prefers to multiscreen when watching TV and to do so in the highest traffic area of our house. For her, staying connected is more important than being immersed in whatever she might be watching. 

These differences in our entertainment preferences often mean we’re not together when we seek distraction. 

I don’t think this is a bad thing. In a normal world filled with normal activities, this balancing of personal preference is probably accommodated by our normal routines. But in a decidedly abnormal world where we spend every minute together in the same house, these differences become more noticeable.

Try a Little Friluftsliv

In the end, winter is going to be long, lonely and cold for many of us. So we may just want to borrow a strategy from Norwegians: friluftsliv. Basically, it means “open-air living.” Most winters, my main activity is complaining. But this year, I’m going to get away from the screens and social media, strap on a pair of snowshoes and embrace winter.

Amazon Prime: Buy Today, Pay Tomorrow?

This column goes live on the most eagerly anticipated day of the year. My neighbor, who has a never-ending parade of delivery vans stopping in front of her door, has it circled on her calendar. At least one of my daughters has been planning for it for several months. Even I, who tend to take a curmudgeonly view of many celebrations, have a soft spot in my heart for this particular one.

No, it’s not the day after Canadian Thanksgiving. This, my friends, is Amazon Prime Day!

Today, in our COVID-clouded reality, the day will likely hit a new peak of “Prime-ness.” Housebound and tired of being bludgeoned to death by WTF news headlines, we will undoubtedly treat ourselves with an unprecedented orgy of one-click shopping. And who can blame us? We can’t go to Disneyland, so leave me alone and let me order that smart home toilet plunger and the matching set of Fawlty Towers tea towels that I’ve been eyeing. 

Of course, me being me, I do think about the consequences of Amazon’s rise to retail dominance. 

I think we’re at a watershed moment in our retail behaviors, and this moment has been driven forward precipitously by the current pandemic. Being locked down has forced many of us to make Amazon our default destination for buying. Speaking solely as a sample of one, I now check Amazon first and then use that as my baseline for comparison shopping. But I do so for purely selfish reasons – buying stuff on Amazon is as convenient as hell!

I don’t think I’m alone. We do seem to love us some Amazon. In a 2018 survey conducted by Recode, respondents said that Amazon had the most positive impact on society of any major tech company. And that was pre-pandemic. I suspect this halo effect has only increased since Amazon became the consumer lifeline for a world forced to stay at home.

As I give in to the siren call of Bezos and Co., I wonder what forces I might be unleashing. What unintended consequences might come home to roost in years hence? Here are a few possibilities. 

The Corporate Conundrum

First of all, let’s not kid ourselves. Amazon is a for-profit corporation. It has shareholders that demand results. The biggest of those shareholders is Jeff Bezos, who is the world’s richest man. 

But amazingly, not all of Amazon’s shareholders are focused on the quarterly financials. Many of them – with an eye to the long game – are demanding that Amazon adopt a more ethical balance sheet. At the 2019 annual shareholder meeting, a list of 12 resolutions was brought forward to be voted on. The recommendations included zero tolerance for sexual harassment and hate speech, curbing Amazon’s facial recognition technology, and addressing climate change and Amazon’s own environmental impact. These last two were supported by a letter signed by 7,600 of Amazon’s own employees. 

The result? Amazon strenuously fought every one of them and none were adopted. So, before we get all warm and gooey about how wonderful Amazon is, let’s remember that the people running the joint have made it very clear that they will absolutely put profit before ethics. 

A Dagger in the Heart of Our Communities

For hundreds of years, we have been building a supply chain that was bound by the realities of geography. That supply chain required some type of physical presence within a stone’s throw of where we live. Amazon has broken that chain and we are beginning to feel the impact of that. 

Community shopping districts around the world were being gutted by the “Amazon Effect” even before COVID. In the last six months, that dangerous trend has accelerated dramatically. In a 2018 commentary for CNBC, venture capitalist Alan Patricof worried about the social impact of losing our community gathering spots: “This decline has brought a deterioration in places where people congregated, socialized, made friends and were greeted by a friendly face offering an intangible element of belonging to a community.”

The social glue that held us together has been dissolving over the past two decades. Whether you’re a fan of shopping malls or not (I fall into the “not” category) they were at least a common space where you might run into your neighbor. In his book Bowling Alone, from 2000, Harvard political scientist Robert Putnam documented the erosion of social capital in America. We are now 20 years hence and Putnam’s worst case scenario seems quaintly optimistic now. With the loss of our common ground – in the most literal sense – we increasingly retreat to the echo chambers of social media. 

Frictionless Consumerism

This last point is perhaps the most worrying. Amazon has made it stupid simple to buy stuff. They have relentlessly squeezed every last bit of friction out of the path to purchase. That worries me greatly.

If we could rely on a rational marketplace filled with buyers acting in the best homo economicus tradition, then perhaps I could rest easier, knowing that there was some type of intelligence driving Adam Smith’s Invisible Hand. But experience has shown that is not the case. Rampant consumerism appears to be one of the horsemen of the modern apocalypse. And, if this is true, then Amazon has put us squarely in their path. 

This is not to even mention things like Amazon’s emerging monopoly-like dominance in a formerly competitive marketplace, the relentless downward pressure it exerts on wages within its supply chain, the evaporation of jobs outside its supply chain or the privacy considerations of Alexa. 

Still, enjoy your Amazon Prime Day. I’m sure everything will be fine.

How to Look Past the Nearest Crisis

I was talking to someone the other day who was trying to make plans for 2021. Those plans were dependent on the plans of others. In the course of our conversation, she said something interesting: “It’s so hard to plan because most of the people I’m talking to can’t see past COVID.” 

If anything sums up our current reality, it might be that. We’re all having a lot of trouble seeing past COVID. Or the upcoming U.S. election. Or catastrophic weather events. Or an impending economic crisis. Take your pick. There are so many looming storm clouds on the horizon that it’s difficult to even make out that horizon any more. 

We humans are pretty dependent on the past to tell us what may be happening in the future. We evolved in an environment that — thanks to its stability — was reasonably predictable. In evolutionary survival terms, it was smart to hedge our bets on the future by glancing over our shoulders at the past. If a saber-toothed tiger was likely to eat you yesterday, the odds were very much in favor of it also wanting to eat you tomorrow. 

But our ability to predict things gets thrown for a loop in the face of the kind of uncertainty we’re currently processing. There are just too many variables forced into the equation for us to be able to rely on what has happened in the past. Both the number of variables and the range of variation push the probability of prediction error past the breaking point. 

When it comes to planning for the future, we become functionally paralyzed and start living day to day, waiting for the proverbial “other shoe to drop.” 

The bigger problem, however, is that when the world is going to hell in a handbasket, we don’t realize that the past is a poor foundation on which to build our future. Evolved habits die hard, and so we continue to use hindsight to try to move forward. 

And by “we,” I mean everyone — most especially the leaders we elect and the experts we rely on to point us in the right direction.  Many seem to think that a post-COVID world will snap back to be very much like a pre-COVID world.

And that, I’m afraid, may be the biggest problem. You’d think that when worrying about an uncertain future is above our pay grade, there would be someone wiser and smarter than us to rely on and save our collective asses. But if common folk tend to consistently bet on the past as a guide to our future, it’s been shown that people we think of as “experts” double down on that bet. 

A famous study by Philip Tetlock showed just how excruciatingly awful experts were at predicting the future. He assembled a group of 284 experts and got them to make predictions about future events, including those that fell into their area of expertise. Across the board, he found their track record of being correct was only slightly ahead of a random coin toss or a troupe of chimpanzees throwing darts. The more famous the expert, the worse their track record.

Expertise is rooted in experience. Both words spring from the same root: the Latin experiri, for “try.” Experience is gained in the past. For experts, their worth comes from their experience in one particular area, so they are highly unlikely to ignore it when predicting the future. They are like the hedgehog in Isaiah Berlin’s famous essay “The Hedgehog and the Fox”: They “know one important thing.”

But when it comes to predicting the future, Tetlock found it’s better to be a fox: to “know many little things.” In a complex, highly uncertain world, it’s the generalist  who thrives. 

The reason is pretty simple. In an uncertain world, we have to be more open to sense making in the classic cognitive sense. We have to be attuned to the signals that are playing out in real time and not be afraid to consider new information that may conflict with our current beliefs.

This is how generalists operate. It’s also how science is supposed to operate. Our view of the future should be no more than a hypothesis that we’re willing to have proven wrong. Hedgehogs dig in when their expertise about “one big thing” is questioned. Foxes use it as an opportunity to update their take on reality. 

Foxes have another advantage over hedgehogs. They tend to be dilettantes, spreading their interest over a wide range of topics without diving too deeply into any of them. This keeps their network diverse and expansive, giving them the opportunity to synthesize their sense of reality from the broadest range of signals possible. 

In a world that depends on being nimble enough to shift directions depending on the input you receive, this stacks the odds in favor of the fox. 

Still, it’s against human nature to be so cavalier about our future. We like certainty. We crave predictability. We are big fans of transparent causes and effects. If those things are clouded by complexity and uncertainty, we start constructing our own narratives. Hence the current spike of conspiracy theories, as I noted previously. This is especially true when the stakes are as high as they are now. 

I don’t blame those having a very hard time looking past COVID — or any other imminent disaster. But someone should be. 

It’s time to start honing those fox instincts. 

Tired of Reality? Take 2 Full-Strength Schitt’s Creeks

“Schitt’s Creek” stormed the Emmys by winning awards in every comedy series category — a new record. It was co-creators Dan and Eugene Levy’s gift to the world: a warm bowl of cultural soup, brimming with life-affirming values, acceptance and big-hearted Canadian corniness.

It was the perfect entertainment solution to an imperfect time. It was good for what ails us.

It’s not the first time we’ve turned to entertainment for comfort. In fact, if there is anything as predictable as death and taxes, it’s that during times of trial, we need to be entertained.

There is a direct correlation between feel-good fantasy and feeling-shitty reality. The worse things get, the more we want to escape it.

But the ways we choose to be entertained have changed. And maybe — just maybe — the media channels we’re looking to for our entertainment are adding to the problem. 

The Immersiveness of Media

A medium’s ability to distract us from reality depends on how much it removes us from that reality.

Our media channels have historically been quite separate from the real world. Each channel offered its own opportunity to escape. But as the technology we rely on to be entertained has become more capable of doing multiple things, that escape from the real world has become more difficult.

Books, for example, require a cognitive commitment unlike any other form of entertainment. When we read a book, we — in effect — enter into a co-work partnership with the author. Our brains have to pick up where theirs left off, and we together build a fictional world to which we can escape. 

As the science of interpreting our brain’s behavior has advanced, we have discovered that our brains actually change while we read.

Maryanne Wolf explains in her book, “Proust and the Squid: The Story and Science of the Reading Brain”: “Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species. . . . Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be reshaped by experience.”

Even movies, which dramatically lowered the bar for the cognitive commitments they ask by supplying content specifically designed for two of our senses, do so by immersing us in a dedicated single-purpose environment. The distraction of the real world is locked outside the theater doors.

But today’s entertainment media platforms not only live in the real world, they are the very same platforms we use to function in said world. They are our laptops, our tablets, our phones and our connected TVs.

It’s hard to ignore that world when the flotsam and jetsam of reality are constantly bumping into us. And that brings us to the problem of the multitasking myth.

Multitasking Anxiety

The problem is not so much that we can’t escape from the real world for a brief reprieve in a fictional one. It’s that we don’t want to.

Even if we’re watching our entertainment in our home theater room on a big screen, the odds are very good that we have a small screen in our hands at the same time. We mistakenly believe we can successfully multitask, and our mental health is paying the price for that mistake.

Research has found that trying to multitask brings on a toxic mix of social anxiety, depression, a lessening of our ability to focus attention, and a sociopsychological impairment that impacts our ability to have rewarding relationships. 

When we use the same technology to be entertained that we use to stay on top of our social networks, we fall prey to the fear of missing out.

It’s called Internet Communication Disorder, and it’s an addictive need to continually scroll through Facebook, Twitter, WhatsApp and our other social media platforms. It’s these same platforms that are feeding us a constant stream of the very things we’re looking to escape from. 

It may be that laughter is the best medicine, but the efficacy of that medicine is wholly dependent on where we get our laughs.

The ability of entertainment to smooth the jagged edges of reality depends on our being able to shift our minds off the track that leads to chronic anxiety and depression — and successfully escape into a kinder, gentler, funnier fictional world.

For entertainment to be a beneficial distraction, we first have to mentally disengage from the real world, and then fully engage in the fictional one.

That doesn’t work nearly as well when our entertainment delivery channel also happens to be the same addictive channel that is constantly tempting us to tiptoe through the anxiety-strewn landscape that is our social media feed. 

In other words, before going to “Schitt’s Creek,” unpack your other shit and leave it behind. I guarantee it will be waiting for you when you get back.

The Day My Facebook Bubble Popped

I learned this past week just how ideologically homogenous my Facebook bubble usually is. Politically, I lean left of center. Almost all the people in my bubble are the same.

Said bubble has been built from the people I have met in the past 40 years or so. Most of these people are in marketing, digital media or tech. I seldom see anything in my feed I don’t agree with — at least to some extent.

But before all that, I grew up in a small town in a very right-wing part of Alberta, Canada. Last summer, I went to my 40-year high-school reunion. Many of my fellow graduates stayed close to our hometown for those 40 years. Some are farmers. Many work in the oil and gas industry. Most of them would fall somewhere to the right of where I sit in my beliefs and political leanings.

At the reunion, we did what people do at such things — we reconnected. Which in today’s world meant we friended each other on Facebook. What I didn’t realize at the time is that I had started a sort of sociological experiment. I had poked a conservative pin into my liberal social media bubble.

Soon, I started to see posts that were definitely coming from outside my typical bubble. But most of them fell into the “we can agree to disagree” camp of political debate. My new Facebook friends and I might not see eye-to-eye on certain things, but hell — you are good people, I’m good people, we can all live together in this big ideological tent.

On May 1, 2020, things began to change. That was when Canadian Prime Minister Justin Trudeau announced that 1,500 models of “assault-style” weapons would be classified as prohibited, effective immediately. This came after Gabriel Wortman killed 22 people in Nova Scotia, making it Canada’s deadliest shooting spree. Now, suddenly posts I didn’t politically agree with were hitting a very sensitive raw nerve. Still, I kept my mouth shut. I believed arguing on Facebook was pointless.

Through everything that’s happened in the four months since (it seems like four decades), I have resisted commenting when I see posts I don’t agree with. I know how pointless it is. I realize that I am never going to change anyone’s mind through a comment on a Facebook post.

I understand this is just an expression of free speech, and we are all constitutionally entitled to exercise it. I stuck with the Facebook rule I imposed for myself — keep scrolling and bite your tongue. Don’t engage.

I broke that rule last week. One particular post did it. This post asked why, with a COVID-19 survival rate of almost 100%, we needed a vaccine at all. I knew better, but I couldn’t help it.

I engaged. It was limited engagement to begin with. I posted a quick comment suggesting that with 800,000 (and counting) already gone, saving hundreds of thousands of lives might be a pretty good reason. Right or left, I couldn’t fathom anyone arguing with that.

I was wrong. Oh my God, was I wrong. My little comment unleashed a social media shit storm. Anti-vaxxing screeds, mind-control plots via China, government conspiracies to artificially over-count the death toll and call-outs of the sheer stupidity of people wearing face masks proliferated in the comment string for the next five days. I watched the comment string grow in stunned disbelief. I had never seen this side of Facebook before.

Or had I? Perhaps the left-leaning posts I am used to are just as conspiratorial, but I don’t realize it because I happen to agree with them. I hope not, but perspective does strange things to our grasp of the things we believe to be true. Are we all — right or left — just exercising our right to free speech through a new platform? And — if we are — who am I to object?

Free speech is held up by Mark Zuckerberg and others as hallowed ground in the social-media universe. In a speech last fall at Georgetown University, Zuckerberg said: “The ability to speak freely has been central in the fight for democracy worldwide.”

It’s hard to argue with that. The ability to publicly disagree with the government or any other holder of power over you is much better than any alternative. And the drafters of the U.S. Bill of Rights agreed. Freedom of speech was enshrined in the First Amendment. But the authors of that amendment — perhaps presciently — never defined exactly what constituted free speech. Maybe they knew it would be a moving target.

Over the history of the First Amendment, it has been left to the courts to decide what the exceptions would be.

In general, they have tightened the definition around one area — what types of expression constitute a “clear and present danger” to others. Currently, unless you’re specifically inciting someone to break the law in the very near future, you’re protected under the First Amendment.

But is there a bigger picture here — one very specific to social media? Yes, legally in the U.S. (or Canada), you can post almost anything on Facebook.

Certainly, taking a stand against face masks and vaccines would qualify as free speech. But it’s not only the law that keeps society functioning. Most of the credit for that falls to social norms.

Social norms are the unwritten laws that govern much of our behavior. They are the “soft guard rails” of society that nudge us back on track when we veer off-course. They rely on us conforming to behaviors accepted by the majority.

If you agree with social norms, there is little nudging required. But if you happen to disagree with them, your willingness to follow them depends on how many others also disagree with them.

Famed sociologist Mark Granovetter showed in his Threshold Models of Collective Behavior that there can be tipping points in groups. If there are enough people who disagree with a social norm, it will create a cascade that can lead to a revolt against the norm.

Prior to social media, the thresholds for this type of behavior were quite high. Even if some of us were quick to act anti-socially, we were generally acting alone.

Most of us felt we needed a substantial number of like-minded people before we were willing to upend a social norm. And when our groups were determined geographically and composed of ideologically diverse members, this was generally sufficient to keep things on track.

But your social-media feed dramatically lowers this threshold.

Suddenly, all you see are supporting posts of like-minded people. It seems that everyone agrees with you. Emboldened, you are more likely to go against social norms.
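Granovetter's threshold model lends itself to a few lines of code. Here's a minimal sketch, not his original formulation: assume each person acts once the number of others already acting meets their personal threshold, and iterate until nothing changes.

```python
# Minimal sketch of a Granovetter-style threshold cascade.
# Each person has a threshold: the number of others who must already
# be acting before they will join in. (Illustrative only.)

def cascade_size(thresholds):
    """Iterate until no new thresholds are met; return how many act."""
    acting = 0
    while True:
        newly_acting = sum(1 for t in thresholds if t <= acting)
        if newly_acting == acting:
            return acting
        acting = newly_acting

# Granovetter's classic example: thresholds 0, 1, 2, ..., 99.
# The person with threshold 0 acts, which satisfies threshold 1,
# and so on -- the whole crowd cascades.
print(cascade_size(list(range(100))))              # 100

# Change a single threshold (1 -> 2) and the cascade stops at one person.
print(cascade_size([0, 2] + list(range(2, 100))))  # 1
```

The point about lowered thresholds maps directly onto this sketch: when your feed makes it look like plenty of people are already acting, your effective threshold is met before you even weigh in, and the cascade starts far more easily.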

The problem here is that social norms are generally there because they are in the best interests of the majority of the people in society. If you go against them by refusing a vaccine or a face mask, thereby allowing a disease to spread, you endanger others. Perhaps it doesn’t meet the legal definition of “imminent lawlessness,” but it does present a “clear and present danger.”

That’s a long explanation of why I broke my rule about arguing on Facebook.

Did I change anyone’s mind? No. But I did notice that the person who made the original post has changed their settings, so I don’t see the political ones anymore. I just see posts about grandkids and puppies.

Maybe it’s better that way.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, in the U.S. more people — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014; one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters Report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without us intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feeds, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only shift to distrust when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation, during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook with this: “Do you have confidence in me to loan me this [item] til tomorrow?” The success of this scam was totally dependent on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of confidence on the part of the target.

It is ironic that in an era where it’s easier than ever to fact check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full time fact checking is necessary. On July 7, NY Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: First, only if Trump releases his tax returns, and second, only if there is a non-partisan real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.