You Said, ‘Why Public Broadcasting?’ I Still Say, ‘Why Not?’

It appears my column a few weeks ago on public broadcasting hit a few raw nerves. Despite my trying to stickhandle around the emotionally charged use of the word “socialism,” there were a few comments asking, in essence: Why should taxpayers have to support broadcasting when there are private and corporate donors willing to do so? Why would we follow a socialist approach to ensuring fair and responsible journalism? We are the land of the free and open market. Let’s just let it do its job.

One commenter suggested that if people want to support responsible journalism, let them become subscribers. Make it a Netflix-based model for journalism. That is one solution put forward in my friend John Marshall’s new book, “Free is Bad.”

It’s not wrong. It’s certainly one approach. I would encourage everyone to subscribe to at least one news publication that still practices real journalism.

Another commenter suggested that as long as there are donors who believe in journalism and are willing to put their money where their mouth is, we can let them carry the load. That’s another approach. 

Case in point, ProPublica. 

ProPublica is a nonprofit newsroom funded by donations. The quality of its reporting has garnered it six Pulitzers, five Peabodys, three Emmys and a number of other awards. It can certainly be pointed to as a great example of high-quality reporting that doesn’t rely on advertising dollars. But ProPublica has been around since 2008 and it only has a little over 100 journalists on the payroll. I’m sure its principals would love to hire more. They just don’t have enough money. 

The problem here — the one that prompted my suggestion to consider public broadcasting as an alternative — is that both subscriber- and donor-based approaches are like trying to kill the elephant in the room with a flyswatter. The economics are hopelessly imbalanced and just can’t work.

Journalism is in full-scale attrition because its revenue model is irretrievably broken. Here’s why it’s broken: The usual winner in competitions based on capitalism is what’s most popular, not what’s best. It’s a race to the shallow end of the pool.

And that’s what’s happened to real news reporting. Staying shallow in an advertising-supported marketplace is the best way to ensure profitability. 

But even the shallow end needs some water; there needs to be some news to act as the raw material for opinion and analysis content. In the news business, that water is the overflow from the deep end. And someone — somewhere — has to keep refilling the deep end.

In a market that is determined to cling to free-market capitalism, no one is willing to invest in the type of journalism required to keep the deep end full. It’s the Tragedy of the Commons, applied to journalism. There are too many taking, and no one is giving back. Incentives and required outcomes are not only not aligned, they are pointed in opposite directions. 

But, as my commenters noted, that is where subscriptions and donations can come in. Obviously, a subscriber-based model has worked very well for streaming services like Netflix. Why couldn’t the same be true for journalism? 

I don’t believe the same approach will work, for a few reasons. 

First, Netflix has the advantage of exclusivity. You have to subscribe to access their content. Journalism doesn’t work that way. Once a news story has broken, there is a whole downstream gaggle of news channels that will jump on it and endlessly spin and respin it with their own analysis and commentary.  

This respun content will always be more popular than the original story, because it’s been predigested to align with the target audience’s own beliefs and perspectives. As I’ve said before, when it comes to news, we have a junk food habit. And why would you buy broccoli when you can get a cheeseburger for free?

This exclusivity also gives Netflix the ability to program both for quality and popularity. For every “Queen’s Gambit,” there are dozens of “Tiger Kings” and other junk-food snacks for the brain. When all the money is being dumped into the same pool, it can fill both the shallow and deep ends at the same time.

But perhaps the biggest misconception about Netflix’s success is that it hasn’t actually been determined whether Netflix is, in fact, successful. It is still a model in transition, relying heavily on licensed content to prop up the profitability of its original programming. When it comes to successfully transitioning the majority of viewer streams to its own programming, the jury is still very much out, as this analysis notes.

There are more reasons why I don’t think a subscription model is the best answer to journalism attrition, but we’ll leave it there for now. 

But what about donor-based journalism, like that found on PBS affiliates or ProPublica? While I don’t doubt their intentions or the quality of the reporting, I do have issues with the scale. There are simply not enough donor dollars flowing into these organizations to fund the type of expensive journalism that we need. 

And these donor dollars are largely missing in local markets, where the attrition of true news reporting is progressing at an even faster rate. In the big picture — and to return to our previous analogy — this represents a mere trickle into the deep end. 

There are just some things that shouldn’t exist in a for-profit setting. The dynamics of capitalism and the way it aligns incentives just don’t work for them. These things are almost always social obligations that we must meet, but that require a commitment usually amounting to personal sacrifice.

This is the basis of a social democracy where personal sacrifice is typically exacted through taxation. While you may not like it, taxation is still the best way we’ve found to prevent the Tragedy of the Commons. 

We are now to the point where access to true and reliable information has become a social obligation. And much as we may not like it, we all need to sacrifice a little bit to make sure we don’t lose it forever.

Friendship: Uncoupled

This probably won’t come as a shock to anyone reading this: A recent study says it’s not whether you use social media that determines your happiness, but how you use it.

Derrick Wirtz, an associate professor of teaching in psychology at the University of British Columbia-Okanagan, took a close look at how people use three major social platforms—Facebook, Twitter and Instagram—and at whether how they use them makes them happier or sadder.

As I said, most of you probably said to yourself, “Yeah, that checks out.” But this study does bring up an interesting nuance with some far-reaching implications. 

In today’s world, we’re increasingly using Facebook to maintain our social connections. And, according to Facebook’s mission statement, that’s exactly what’s supposed to happen: “People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”

The interesting thing in this study is the divide between our social activities — those aimed at bonding versus those aimed at gaining status — and how that impacts our moods and behaviors. It’s difficult to untangle the effect of those two factors, because they are so intertwined in our psyches. But Wirtz found that some of us are spending far more time on social media “status-checking” than actually tending to our friendships.

“Passive use, scrolling through others’ posts and updates, involves little person-to-person reciprocal interaction while providing ample opportunity for upward comparison,” says Wirtz. 

We can scroll our newsfeed without any actual form of engagement — but that’s not what we were designed to do. Our social skills evolved to develop essential mutually beneficial bonds in a small group setting.

Friendship is meant to be nurtured and tended to organically and intimately in a face-to-face environment.  But the distal nature of social media is changing the dynamics of how we maintain relationships in our network. 

Take how we first establish friendships, for instance. When you meet someone for the very first time, how do you decide whether you’re going to become friendly or not? The answer, not surprisingly, is complex and nuanced. Our brain works overtime to determine whether we should bond or not. But, also not surprisingly, almost none of that work is based on rational thought.

UCLA psychologist Dr. Elizabeth Laugeson teaches young adults with social challenges, such as those on the autism spectrum, how to take those very first steps toward friendship when meeting a stranger. If you can’t pick up the face-to-face nuances of body language and unspoken social cues intuitively, becoming friends can be incredibly difficult. Essentially, we are constantly scanning the other person for small signs of common interest from which we can start working toward building trust. 

Even if you clear this first hurdle, it’s not easy to build an actual friendship. It requires a massive investment of our time and energy. A recent study from the University of Kansas found it takes about 50 hours of socializing just to go from acquaintance to casual friend. 

Want to make a “real” friend? Tack another 40 hours onto that. And if your goal is to become a “close” friend, you’d better be prepared to invest at least a total of 200 hours.

So that raises the question: why would we make this investment in the first place? Why do we need friends? And why do we need at least a handful of really close friends? The answer lies in the concept of reciprocity.

From an evolutionary perspective, having friends made it easier to survive and reproduce. We didn’t have to go it alone. We could help each other past the rough spots, even if we weren’t related to each other. Having friends stacked the odds in our favor. 

This is when our investment in all those hours of building friendships paid off. Again, this takes us back to the intimate and organic roots of friendship. 

Our brains, in turn, reinforced this behavior by making sure that having friends made us happy. 

Of course, like most human behaviors, it’s not nearly that simple or benign. Our brains also entwine the benefits of friendship with the specter of social status, making everything much more complicated. 

Status also confers an evolutionary advantage. For many generations, we have trod this fine line between being a true friend and being obsessed with our own status in the groups where we hang out.

And then came social media.

As Wirtz’s study shows, we now have this dangerous uncoupling between these two sides of our nature. With social media, friendship is now many steps removed from its physical, intimate and organic roots. It is stripped of the context in which it evolved. And, it appears, the intertwined strands of friendship and social status are unraveling. When this happens, time on social media can reap the anxiety and jealousy of status-checking without any of the joy that comes from connecting with and helping a friend. 

On a person-to-person basis, this uncoupling can be disturbing and unfortunate. But consider what may happen when these same tendencies are amplified and magnified through a massive, culture-wide network.

Why Free News is (usually) Bad News

Pretty much everything about the next week will be unpredictable. But whatever happens on Nov. 3, I’m sure there will be much teeth-gnashing and navel-gazing about the state of journalism in the election aftermath.

And there should be. I have written much about the deplorable state of that particular industry. Many, many things need to be fixed. 

For example, let’s talk about the extreme polarization of both the U.S. population and their favored news sources. Last year about this time, the Pew Research Center released a study showing that over 30% of Americans distrust their news sources.

But what’s more alarming is what happens when we break this down by party: only 27% of Democrats didn’t trust the news for information about politics or elections. Among Republicans, that number climbed to a whopping 67%.

The one news source Republicans do trust? Fox News. Sixty-five percent of them say Fox is reliable. 

And that’s a problem.

Earlier this year, Ad Fontes Media came out with its Media Bias Chart. It charts major news and media channels on two axes: source reliability and political bias. The correlation between bias and reliability is almost perfect. The further a news source is out to the right or left, the less reliable it is.
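
Seeing that relationship as a number may help. Below is a minimal sketch of the kind of check anyone could run, with one big caveat: the outlet names and scores are hypothetical placeholders loosely modeled on the chart’s two axes, not Ad Fontes’ actual ratings or methodology. The idea is to correlate each outlet’s distance from the political center with its reliability score.

```python
# Hypothetical illustration only -- not Ad Fontes' data or methodology.
# Each outlet gets a made-up political-bias score (negative = left,
# positive = right) and a made-up reliability score (higher = better).
from statistics import correlation  # requires Python 3.10+

outlets = {                       # (bias, reliability)
    "Wire service":        (  0, 60),
    "Center-left daily":   ( -8, 52),
    "Center-right daily":  (  8, 51),
    "Partisan cable show": (-24, 30),
    "Hyperpartisan site":  ( 30, 16),
}

distance_from_center = [abs(bias) for bias, _ in outlets.values()]
reliability = [rel for _, rel in outlets.values()]

# A coefficient near -1 is the pattern the chart describes: the further
# out to the right or left an outlet sits, the less reliable it is.
print(f"corr(|bias|, reliability) = {correlation(distance_from_center, reliability):+.2f}")
```

On numbers like these, the coefficient comes out strongly negative, which is the statistical version of the chart’s visual story.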

How does Fox fare? Not well. Ad Fontes separates Fox TV from Fox Online. Fox Online lies on the border between being “reliable for news, but high in analysis/opinion content” and “some reliability issues and/or extremism.” Fox TV falls squarely in the second category.

I’ve written before that media bias is not just a right-wing problem. Outlets like CNN and MSNBC show a significant left-leaning bias. But CNN Online, despite its bias, still falls within the “Most Reliable for News” category. According to Ad Fontes, MSNBC has the same reliability issues as Fox.

The question that has to be asked is “How did we get here?”  And that’s the question tackled head-on in a new book, “Free is Bad,” by John Marshall.

I’ve known Marshall for ages. He has covered a lot of the things I’ve been writing about in this column. 

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” 

Upton Sinclair

The problem here is one of incentive. Our respective media heads didn’t wake up one morning and say, “You know what we need to be? A lot more biased!” They have walked down that path step by step, driven by the need to find a revenue model that meets their need for profitability. 

When we talk about our news channels, the obvious way to be profitable is to be supported by ads. And to be supported by ads, you have to be able to target those ads. One of the most effective targeting strategies is to target by political belief, because it comes reliably bundled with a bunch of other beliefs that make it very easy to predict behaviors. And that makes these ads highly effective in converting prospects.

This is how we got to where we are. But there are all types of ways to prop up your profit through selling ads. Some are pretty open and transparent. Some are less so. And that brings us to a particularly interesting section of Marshall’s book. 

John Marshall is a quant geek at heart. He has been a serial tech entrepreneur — and, in one of those ventures, built a very popular web analytics platform. He also has intimate knowledge of how the sausages are made in the ad-tech business. He knows sketchy advertising practices when he sees them. 

Given all of this, Marshall was able to undertake a fascinating analysis of the ads we see on various news platforms that dovetails nicely with the Ad Fontes chart. 

Marshall created the Ad Shenanigans chart. Basically, he did a forensic analysis of the advertising approaches of various online news platforms, looking for those that gathered data about their users, sold traffic to multiple networks, featured clickbait chumboxes and engaged in other unsavory practices. Then he ranked them accordingly.

Not surprisingly, there’s a pretty strong correlation between reputable reporting and business ethics. Highly biased and less reputable sites on the Ad Fontes Bias Chart (Breitbart, Newsmax and Fox News) can all also be found near the top of Marshall’s Ad Shenanigans chart. Those that do seem to have some ethics when it comes to the types of ads they run also seem to take objective journalism seriously. Case in point, The Guardian in the UK and ProPublica in the U.S.

The one anomaly in the group seems to be CNN. While it does fare relatively well on reputable reporting according to Ad Fontes, CNN appears to be willing to do just about anything to turn a buck. It ranks just a few slots below Fox in terms of “ad shenanigans.”

Marshall also breaks out those platforms that have a mix of paywalls and advertising. While there are some culprits in the mix, such as the Daily Caller, Slate and the National Review, most sites that have some sort of subscription model seem to be far less likely to fling the gates of their walled gardens open to the ethically challenged advertising hordes.

All of this drives home Marshall’s message: When it comes to the quality of your news sources, free is bad. As soon as something costs you nothing, you are no longer the customer. You’re the product. Invisible hand market forces are no longer working for you. They are working for the advertiser. And that means they’re working against you if you’re looking for an unbiased, quality news source.

Amazon Prime: Buy Today, Pay Tomorrow?

This column goes live on the most eagerly anticipated day of the year. My neighbor, who has a never-ending parade of delivery vans stopping in front of her door, has it circled on her calendar. At least one of my daughters has been planning for it for several months. Even I, who tend to take a curmudgeonly view of many celebrations, have a soft spot in my heart for this particular one.

No, it’s not the day after Canadian Thanksgiving. This, my friends, is Amazon Prime Day!

Today, in our COVID-clouded reality, the day will likely hit a new peak of “Prime-ness.” Housebound and tired of being bludgeoned to death by WTF news headlines, we will undoubtedly treat ourselves with an unprecedented orgy of one-click shopping. And who can blame us? We can’t go to Disneyland, so leave me alone and let me order that smart home toilet plunger and the matching set of Fawlty Towers tea towels that I’ve been eyeing. 

Of course, me being me, I do think about the consequences of Amazon’s rise to retail dominance. 

I think we’re at a watershed moment in our retail behaviors, and this moment has been driven forward precipitously by the current pandemic. Being locked down has forced many of us to make Amazon our default destination for buying. Speaking solely as a sample of one, I now check Amazon first and then use that as my baseline for comparison shopping. But I do so for purely selfish reasons – buying stuff on Amazon is as convenient as hell!

I don’t think I’m alone. We do seem to love us some Amazon. In a 2018 survey conducted by Recode, respondents said that Amazon had the most positive impact on society out of any major tech company. And that was pre-pandemic. I suspect this halo effect has only increased since Amazon has become the consumer lifeline for a world forced to stay at home.

As I give in to the siren call of Bezos and Co., I wonder what forces I might be unleashing. What unintended consequences might come home to roost in the years hence? Here are a few possibilities.

The Corporate Conundrum

First of all, let’s not kid ourselves. Amazon is a for-profit corporation. It has shareholders that demand results. The biggest of those shareholders is Jeff Bezos, who is the world’s richest man. 

But amazingly, not all of Amazon’s shareholders are focused on the quarterly financials. Many of them – with an eye to the long game – are demanding that Amazon adopt a more ethical balance sheet. At the 2019 Annual Shareholder Meeting, a list of 12 resolutions was brought forward to be voted on. The recommendations included zero tolerance for sexual harassment and hate speech, curbing Amazon’s facial recognition technology, and addressing climate change and Amazon’s own environmental impact. These last two were supported by a letter signed by 7,600 of Amazon’s own employees.

The result? Amazon strenuously fought every one of them and none were adopted. So, before we get all warm and gooey about how wonderful Amazon is, let’s remember that the people running the joint have made it very clear that they will absolutely put profit before ethics. 

A Dagger in the Heart of Our Communities

For hundreds of years, we have been building a supply chain that was bound by the realities of geography. That supply chain required some type of physical presence within a stone’s throw of where we live. Amazon has broken that chain and we are beginning to feel the impact of that. 

Community shopping districts around the world were being gutted by the “Amazon Effect” even before COVID. In the last six months, that dangerous trend has accelerated exponentially. In a 2018 commentary for CNBC, venture capitalist Alan Patricof worried about the social impact of losing our community gathering spots: “This decline has brought a deterioration in places where people congregated, socialized, made friends and were greeted by a friendly face offering an intangible element of belonging to a community.”

The social glue that held us together has been dissolving over the past two decades. Whether you’re a fan of shopping malls or not (I fall into the “not” category), they were at least a common space where you might run into your neighbor. In his 2000 book “Bowling Alone,” Harvard political scientist Robert Putnam documented the erosion of social capital in America. We are now 20 years hence, and Putnam’s worst-case scenario seems quaintly optimistic. With the loss of our common ground – in the most literal sense – we increasingly retreat to the echo chambers of social media.

Frictionless Consumerism

This last point is perhaps the most worrying. Amazon has made it stupid simple to buy stuff. They have relentlessly squeezed every last bit of friction out of the path to purchase. That worries me greatly.

If we could rely on a rational marketplace filled with buyers acting in the best homo economicus tradition, then perhaps I could rest easier, knowing that there was some type of intelligence driving Adam Smith’s Invisible Hand. But experience has shown that is not the case. Rampant consumerism appears to be one of the three horsemen of the modern apocalypse. And, if this is true, then Amazon has put us squarely in their path.

This is not to even mention things like Amazon’s emerging monopoly-like dominance in a formerly competitive marketplace, the relentless downward pressure it exerts on wages within its supply chain, the evaporation of jobs outside its supply chain or the privacy considerations of Alexa. 

Still, enjoy your Amazon Prime Day. I’m sure everything will be fine.

How to Look Past the Nearest Crisis

I was talking to someone the other day who was trying to make plans for 2021. Those plans were dependent on the plans of others. In the course of our conversation, she said something interesting: “It’s so hard to plan because most of the people I’m talking to can’t see past COVID.” 

If anything sums up our current reality, it might be that. We’re all having a lot of trouble seeing past COVID. Or the upcoming U.S. election. Or catastrophic weather events. Or an impending economic crisis. Take your pick. There are so many looming storm clouds on the horizon that it’s difficult to even make out that horizon any more. 

We humans are pretty dependent on the past to tell us what may be happening in the future. We evolved in an environment that — thanks to its stability — was reasonably predictable. In evolutionary survival terms, it was smart to hedge our bets on the future by glancing over our shoulders at the past. If a saber-toothed tiger was likely to eat you yesterday, the odds were very much in favor of it also wanting to eat you tomorrow. 

But our ability to predict things gets thrown for a loop in the face of uncertainty like we’re currently processing. There are just too many variables forced into the equation for us to be able to rely on what has happened in the past. Both the number of variables and the range of variation push our probability of prediction error past the breaking point.

When it comes to planning for the future, we become functionally paralyzed and start living day to day, waiting for the proverbial “other shoe to drop.” 

The bigger problem, however, is that when the world is going to hell in a handbasket, we don’t realize that the past is a poor foundation on which to build our future. Evolved habits die hard, and so we continue to use hindsight to try to move forward.

And by “we,” I mean everyone — most especially the leaders we elect and the experts we rely on to point us in the right direction.  Many seem to think that a post-COVID world will snap back to be very much like a pre-COVID world.

And that, I’m afraid, may be the biggest problem. You’d think that when worrying about an uncertain future is above our pay grade, there would be someone wiser and smarter than us to rely on and save our collective asses. But if common folk tend to consistently bet on the past as a guide to our future, it’s been shown that people we think of as “experts” double down on that bet. 

A famous study by Philip Tetlock showed just how excruciatingly awful experts were at predicting the future. He assembled a group of 284 experts and got them to make predictions about future events, including those that fell into their area of expertise. Across the board, he found their track record of being correct was only slightly ahead of a random coin toss or a troupe of chimpanzees throwing darts. The more famous the expert, the worse their track record.

Expertise is rooted in experience. Both words spring from the same root: the Latin experiri, “to try.” Experience is gained in the past. For experts, their worth comes from their experience in one particular area, so they are highly unlikely to ignore it when predicting the future. They are like the hedgehog in Isaiah Berlin’s famous essay “The Hedgehog and the Fox”: They “know one important thing.”

But when it comes to predicting the future, Tetlock found it’s better to be a fox: to “know many little things.” In a complex, highly uncertain world, it’s the generalist who thrives.

The reason is pretty simple. In an uncertain world, we have to be more open to sense making in the classic cognitive sense. We have to be attuned to the signals that are playing out in real time and not be afraid to consider new information that may conflict with our current beliefs.

This is how generalists operate. It’s also how science is supposed to operate. Our view of the future should be no more than a hypothesis that we’re willing to have proven wrong. Hedgehogs dig in when their expertise about “one big thing” is questioned. Foxes use it as an opportunity to update their take on reality. 

Foxes have another advantage over hedgehogs. They tend to be dilettantes, spreading their interest over a wide range of topics without diving too deeply into any of them. This keeps their network diverse and expansive, giving them the opportunity to synthesize their sense of reality from the broadest range of signals possible. 

In a world that depends on being nimble enough to shift directions depending on the input you receive, this stacks the odds in favor of the fox.

Still, it’s against human nature to be so cavalier about our future. We like certainty. We crave predictability. We are big fans of transparent causes and effects. If those things are clouded by complexity and uncertainty, we start constructing our own narratives. Hence the current spike of conspiracy theories, as I noted previously. This is especially true when the stakes are as high as they are now. 

I don’t blame those having a very hard time looking past COVID — or any other imminent disaster. But someone should be. 

It’s time to start honing those fox instincts. 

Tired of Reality? Take 2 Full-Strength Schitt’s Creeks

“Schitt’s Creek” stormed the Emmys by winning awards in every comedy series category — a new record. It was co-creators Dan and Eugene Levy’s gift to the world: a warm bowl of hot cultural soup, brimming with life-affirming values, acceptance and big-hearted Canadian corniness.

It was the perfect entertainment solution to an imperfect time. It was good for what ails us.

It’s not the first time we’ve turned to entertainment for comfort. In fact, if there is anything as predictable as death and taxes, it’s that during times of trial, we need to be entertained.

There is a direct correlation between feel-good fantasy and feeling-shitty reality. The worse things get, the more we want to escape it.

But the ways we choose to be entertained have changed. And maybe — just maybe — the media channels we’re looking to for our entertainment are adding to the problem. 

The Immersiveness of Media

A medium’s ability to distract us from reality depends on how much it removes us from that reality.

Our media channels have historically been quite separate from the real world. Each channel offered its own opportunity to escape. But as the technology we rely on to be entertained has become more capable of doing multiple things, that escape from the real world has become more difficult.

Books, for example, require a cognitive commitment unlike any other form of entertainment. When we read a book, we — in effect — enter into a co-work partnership with the author. Our brains have to pick up where theirs left off, and together we build a fictional world to which we can escape.

As the science of interpreting our brain’s behavior has advanced, we have discovered that our brains actually change while we read.

Maryanne Wolf explains in her book, “Proust and the Squid: The Story and Science of the Reading Brain”: “Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species. . . . Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be reshaped by experience.”

Even movies, which dramatically lowered the bar for the cognitive commitment they ask of us by supplying content specifically designed for two of our senses, still remove us from reality by immersing us in a dedicated, single-purpose environment. The distraction of the real world is locked outside the theater doors.

But today’s entertainment media platforms not only live in the real world, they are the very same platforms we use to function in said world. They are our laptops, our tablets, our phones and our connected TVs.

It’s hard to ignore that world when the flotsam and jetsam of reality is constantly bumping into us. And that brings us to the problem of the multitasking myth.

Multitasking Anxiety

The problem is not so much that we can’t escape from the real world for a brief respite in a fictional one. It’s that we don’t want to.

Even if we’re watching our entertainment in our home theater room on a big screen, the odds are very good that we have a small screen in our hands at the same time. We mistakenly believe we can successfully multitask, and our mental health is paying the price for that mistake.

Research has found that trying to multitask brings on a toxic mix of social anxiety, depression, a lessening of our ability to focus attention, and a sociopsychological impairment that impacts our ability to have rewarding relationships. 

When we use the same technology to be entertained that we use to stay on top of our social networks, we fall prey to the fear of missing out.

It’s called Internet Communication Disorder, and it’s an addictive need to continually scroll through Facebook, Twitter, WhatsApp and our other social media platforms. It’s these same platforms that are feeding us a constant stream of the very things we’re looking to escape from. 

It may be that laughter is the best medicine, but the efficacy of that medicine is wholly dependent on where we get our laughs.

The ability of entertainment to smooth the jagged edges of reality depends on our being able to shift our minds off the track that leads to chronic anxiety and depression — and successfully escape into a fictional kinder, gentler, funnier world.

For entertainment to be a beneficial distraction, we first have to mentally disengage from the real world, and then fully engage in the fictional one.

That doesn’t work nearly as well when our entertainment delivery channel also happens to be the same addictive channel that is constantly tempting us to tiptoe through the anxiety-strewn landscape that is our social media feed. 

In other words, before going to “Schitt’s Creek,” unpack your other shit and leave it behind. I guarantee it will be waiting for you when you get back.

The Fickle Fate of Memes

“In the future, everyone will be world-famous for 15 minutes.”

Attributed to Andy Warhol

If your name is Karen, I’m sorry. The internet has not been kind to you over the past two years. You’re probably to the point where you hesitate before you tell people your name. And it’s not your fault that your name has become meme-famous as a synonym for bitchy white privilege.

The odds are that you’re a nice person. I know several Karens and not one of them is a “Karen.” On the other hand, I do know a few “Karens” (as my Facebook adventure from last week makes clear) and not one of them is named Karen.

But that’s the way memes roll. You’re not at the wheel. The trolling masses have claimed your fate and you just have to go along for the ride. That’s true for Karen, where there doesn’t seem to be an actual “Karen” to which the meme can be attributed. But it’s also true when the meme starts with an actual person – like Rebecca Black.

Remember Rebecca Black? No?  I’ll jog your memory –

Yesterday was Thursday, Thursday
Today it is Friday, Friday (partyin’)
We-we-we so excited
We so excited
We gonna have a ball today

Rebecca Black

Yes, that Rebecca Black – star of “Friday,” which for many years was the most hated video in YouTube history (it still ranks at number 15, according to Wikipedia).

Admit it: when you remembered Rebecca Black, you did not do so fondly. But you know nothing about Rebecca Black. Memes seldom come bundled with a backstory. So here are a few facts about “Friday” you didn’t know.

  • Black didn’t write the song. It was written by two LA music producers
  • Black was 13 at the time the video was shot
  • She had no input into the production or the heavy use of Autotune on her vocals
  • She didn’t see the video or hear the final version of the song before it was posted to YouTube

Although Black was put front and center into the onslaught of negativity the video produced, she had very little to do with the finished product. She was just a 13-year-old girl who was hoping to become a professional singer. And suddenly, she was one of the most hated and ridiculed people in the world. The trolls came out in force. And, unsurprisingly, they were merciless. But then mainstream media jumped on the bandwagon. Billboard and Time magazines, CNN, Stephen Colbert, Jimmy Fallon, Jimmy Kimmel and more all heaped ridicule on Black.

That’s a lot for any 13-year-old to handle. To understand the impact a meme can have, take 11 minutes to watch Vice’s video about Black. Black seems to have emerged from the experience as a pretty well-adjusted 22-year-old who is still hoping to turn the fame she got into a positive. She is – more than anything – just trying to regain control of her own story.

The fame Rebecca Black found may have turned out to be of the caustic kind, but at least she was looking for it. Ghyslain Raza never asked for it and never wanted it. He became a meme by accident.

Ghyslain who? Allow your memory to be jogged once again. You probably know Raza better as the Star Wars Kid.

In 2002, Ghyslain Raza was a shy 14-year-old from Quebec who liked to make videos. One of those videos was shot in the school AV room while Raza was “goofing around,” wielding a makeshift light saber he had made from a golf ball retriever. That video fell into the hands of a classmate, who – with all the restraint middle schoolers are known for – promptly posted it online. Soon, a torrent of cyberbullying was unleashed on Raza as views climbed into the tens of millions.

The online comments were hurtful enough. More than a few commenters suggested that Raza commit suicide. Some offered to help. But it was no better for Raza in his real life. He had to change schools when what few friends he had evaporated. At the new school, it got worse: “In the common room, students climbed onto tabletops to insult me.”

Imagine yourself, for a moment, being 14 and dealing with this. Hell, imagine it at the age you are now. Life would be hell. It certainly was for Raza. In an interview with a Canadian news magazine, he said, “No matter how hard I tried to ignore people telling me to commit suicide, I couldn’t help but feel worthless, like my life wasn’t worth living.”

Both Black and Raza survived their ordeals. Aleksey Vayner wasn’t so lucky. The over-the-top video resume he made in 2006, “Impossible is Nothing,” also became a meme when it was posted online without his permission. Actor Michael Cera was one of the many who did a parody. Like Black and Raza, Vayner battled to get his life back. He lost that battle in 2013. He died from a heart attack that a relative has said was brought on by an overdose of medication.

In our culture, online seems to equal open season. Everyone – even celebrities who should know better – seems to think it’s okay to parody, ridicule, bully or even threaten death. What we conveniently forget is that there is a very real person with very real feelings on the other side of the meme. No one deserves that kind of fame.

Even if their name is Karen.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, the news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, more people in the U.S. — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media, at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media have basically swapped spots, their respective lines crossing in 2014, one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we’ll do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feed, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only go to not trusting when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation, during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook with this: “Do you have confidence in me to loan me this [item] til tomorrow?” The success of this scam was totally dependent on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of confidence on the part of the target.

It is ironic that in an era when it’s easier than ever to fact-check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full time fact checking is necessary. On July 7, NY Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: First, only if Trump releases his tax returns, and second, only if there is a non-partisan real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.

How We Forage for the News We Want

The Reuters Institute, out of the UK, just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll look at the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions – toward less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990, at about 62.5 million. That’s one subscription for every four people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would have been one subscription for every 10 people. We were no longer reading our news in print. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just understand that our brain operates in a significantly different way when it’s reading rather than watching or listening.

Up to the end of the last century, we generally trusted news destinations. Whether it be a daily newspaper like the New York Times, a news magazine like Time or a nightly newscast such as any of the network news shows, each was a destination that offered one thing above all others – the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two uses social media. Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, under-35s get more of their news from social media than from any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information veracity problems that introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quickly calculated guess about which patch is most likely to contain the information it’s looking for. With information foraging, the intent we have frames everything that comes after.
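
For the technically curious, here’s a rough sketch of what that calculated guess looks like, in the spirit of Pirolli and Card’s information foraging theory. Everything in it is illustrative: the patch names, gains and costs are numbers I made up for the example, not figures from the Reuters study. The forager simply picks the patch with the best expected rate of gain: the value it expects to find, divided by the time it costs to get there and dig through it.

```python
# A toy model of information-patch choice (after Pirolli & Card's
# information foraging theory). All numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Patch:
    name: str
    expected_gain: float  # estimated value of relevant info found (arbitrary units)
    access_cost: float    # time to reach the patch, in seconds
    within_cost: float    # time spent searching inside it, in seconds

    def rate_of_gain(self) -> float:
        # Expected value per unit of total time invested.
        return self.expected_gain / (self.access_cost + self.within_cost)

patches = [
    Patch("Official health-agency site", expected_gain=9.0, access_cost=15, within_cost=120),
    Patch("News-first destination",      expected_gain=6.0, access_cost=10, within_cost=60),
    Patch("Social media feed",           expected_gain=2.0, access_cost=1,  within_cost=5),
]

# The feed "wins" on rate of gain simply because its costs are near zero,
# even though it promises the least valuable information.
best = max(patches, key=Patch.rate_of_gain)
print(f"forage in: {best.name} (rate = {best.rate_of_gain():.3f})")
```

Run with these made-up numbers, the social feed wins handily, not because it’s the best source but because it’s nearly free to access. Raise the stakes, as COVID did, and the expected value of reliable information goes up, the calculation flips, and we head for official sources instead, which squares with what the study found about COVID information.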

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it was started before the COVID pandemic, but the second part of the study was conducted during COVID. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media, but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see in social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest-growing sources for news amongst all age groups – but especially those under 30 – are Instagram, Snapchat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and Snapchat, are dominated by influencers. That means that much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper for our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs in how the world works. This is related to the sense making loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by first starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we will update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased toward any particular point of view. But this doesn’t jibe with reality at all. As I wrote in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from news sources that are biased, as long as the bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.