Happy New Year?

“Speaking of the happy new year, I wonder if any year ever had less chance of being happy. It’s as though the whole race were indulging in a kind of species introversion — as though we looked inward on our neuroses. And the thing we see isn’t very pretty… So we go into this happy new year, knowing that our species has learned nothing, can, as a race, learn nothing — that the experience of ten thousand years has made no impression on the instincts of the million years that preceded.”

That sentiment, relevant as it is to today, was not written about 2021. It was actually written 80 years ago — in 1941 — by none other than John Steinbeck.

John was feeling a little down. I’m sure we can all relate.

It’s pretty easy to say that we have hopefully put the worst year ever behind us. I don’t know about your news feed, but mine has been like a never-ending bus tour of Dante’s nine circles of Hell — and I’m sitting next to the life insurance salesman from Des Moines who decided to have a Caesar salad for lunch.

An online essay by Umair Haque kind of summed up 2020 for me: “The Year of the Idiot.” In it, Haque doesn’t pull any punches,

“It was the year that a pandemic searched the ocean of human stupidity, and found, to its gleeful delight, that it appeared to be bottomless. 2020 was the year that idiots wrecked our societies.”

In case you’re not catching the drift yet, Haque goes on to say, “The average person is a massive, gigantic, malicious, selfish idiot.”

Yeah. That pretty much covers it.

Or does it? Were our societies wrecked? Is the average person truly that shitty? Is the world a vast, soul-sucking, rotten-cabbage-reeking dumpster fire? Or is it just the lens we’re looking through?

If you search hard enough, you can find those who are looking through a different lens — one that happens to be backed by statistical evidence rather than what bubbles to the top of our newsfeed. One of those people is Ola Rosling. He’s carrying on the mission of his late father, Hans Rosling, who was working on the book “Factfulness” when he passed away in 2017. Bill Gates called it “one of the most educational books I’ve ever read.” And Bill reads a lot of books!

Believe it or not, if you remove a global pandemic from the equation (which, admittedly, is a whole new scale of awful) the world may actually be in better shape than it was 12 months ago. And even if you throw the pandemic into the mix, there are some glimmers of silver peeking through the clouds.

Here are some things you may have missed in your news feed:

Wild polio was eradicated from Africa. That’s big news. It’s a massive achievement that had its to-do box ticked last August. And I’m betting you never heard about it.

Also, the medical and scientific world has never before mobilized and worked together on a project like the new COVID mRNA vaccines now rolling out. Again, this is a huge step forward that will have far reaching impacts on healthcare in the future. But that’s not what the news is talking about.

Here’s another thing. At long last, it looks like the world may finally be ready to start tearing apart the layers that hide systemic racism. What we’re learning is that it may not be the idiots  — and, granted, there are many, many idiots — who are the biggest problem. It may be people like me, who have unknowingly perpetuated the system and are finally beginning to see the endemic bias baked into our culture.

These are just three big steps forward that happened in 2020. There are others. We just aren’t talking about them.

We always look on the dark side. We’re a “glass half-empty” species. That’s what Rosling’s book is about: our tendency to skip over the facts to rush to the worst possible view of things. We need no help in that regard — but we get it anyway from the news business, which, run by humans and aimed at humans, amplifies our proclivity for pessimism.

I’m as glad as anyone to see 2020 in my rear-view mirror. But I am carrying something of that year forward with me: a resolution to spend more time looking for facts and relying less on media “spun” for profit to understand the state of the world.

As we consume media, we have to remember that good news is just not as profitable as bad news. We need to broaden our view to find the facts. Hans Rosling warned us, “Forming your worldview by relying on the media would be like forming your view about me by looking only at a picture of my foot.”

Yes, 2020 was bad, but it was also good. And because there are forces that swing the pendulum both ways, many of the things that were good may not have happened without the bad. In the same letter in which Steinbeck expressed his pessimism about 1941, he went on to say this:

“Not that I have lost any hope. All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die. I don’t know why we should expect it to. It seems fairly obvious that two sides of a mirror are required before one has a mirror, that two forces are necessary in man before he is man.”

There are two sides to every story, even when it’s a horror story like 2020.

Second Thoughts about the Social Dilemma

Watched “The Social Dilemma” yet? I did, a few months ago. The Netflix documentary sets off all kinds of alarms about social media and how it’s twisting the very fabric of our society. It’s a mix of standard documentary fodder — a lot of tell-all interviews with industry insiders and activists — with an (ill-advised, in my opinion) dramatization of the effects of social media addiction in one particular family.

The one most affected is a male teenager who is suddenly drawn, zombie-like, by his social media feed into an ultra-polarized political activist group. Behind the scenes, operating in a sort of evil-empire control room setting, there are literally puppet masters pulling his strings.

It’s scary as hell. But should we be scared? Or — at least — should we be that scared?

Many of us are sounding alarms about social media and how it nets out to be a bad thing. I’m one of the worst. I am very concerned about the impact of social media, and I’ve said so many, many times in this column. But I also admit that this is a social experiment playing out in real time, so it’s hard to predict what the outcome will be. We should keep our minds open to new evidence.

I’ve also said that younger generations seem to be handling this in stride. At least, they seem to be handling it better than those in my generation. They’re quicker to adapt and to use new technologies natively to function in their environments, rather than fumble as we do, searching for some corollary to the world we grew up in.

I’ve certainly had pushback on this observation. Maybe I’m wrong. Or maybe, like so many seemingly disastrous new technological trends before it, social media may turn out to be neither bad nor good. It may just be different.

That certainly seems to be the case if you read a new study from the Institute for Family Studies at Brigham Young University’s Wheatley Institution.  

One of the lead authors of the study, Jean Twenge, previously rang the alarm bells about how technology was short-circuiting the mental wiring of our youth. In a 2017 article in The Atlantic titled “Have Smartphones Destroyed a Generation?” she made this claim:

“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”

The article describes a generation of zombies mentally hardwired to social media through their addiction to their iPhone. One of the more startling claims was this:

“Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan.”

Again, scary as hell, right? This sounds frighteningly similar to the scenarios laid out in “The Social Dilemma.”

But what if you take this same group and this same author, fast-forward three years to the middle of the worst pandemic in our lifetimes, and check in with over 1,500 teens to see how they’re doing at a time when they have every right to be depressed? Not only are they locked inside, they’re also processing societal upheavals and existential threats like systemic racial inequality, alt-right political populism and climate change. If 2017-2018 was scary for them, 2020 is a dumpster fire.

Surprisingly, those same teens appear to be doing better than they were two years ago. The study had four measures of ill-being: loneliness, life dissatisfaction, unhappiness and depression. The results were counterintuitive, to say the least. The number of teens indicating they were depressed actually dropped substantially, from 27% in 2018 to 17% among those quarantined during the school year in 2020. Fewer said they were lonely as well.

The study indicated that the reasons for this could be that teens were getting more sleep and spending more time with family.

But what about smartphones and social media? Wouldn’t a quarantined teen (a Quaran-teen?) be spending even more time on his or her phone and social media? 

Well, yes – and no. The study found screen time didn’t really go up, but the way that time was spent did shift. Surprisingly, time spent on social media went down, but time spent on video chats with friends or watching online streaming entertainment went up.

As I shared in my column a few weeks ago, this again indicates that it’s not how much time we spend on social media that determines our mental state. It’s how we spend that time. If we spend it looking for connection, rather than obsessing over social status, it can be a good thing. 

Another study, from the University of Oxford, examined data on more than 300,000 adolescents and found that increased screen time has no more impact on teenagers’ mental health than eating more potatoes. Or wearing glasses.

If you’re really worried about your teen’s mental health, make sure they have breakfast. Or get enough sleep. Or just spend more time with them. All those things are going to have a lot more impact than the time they spend on their phone.

To be clear, this is not me becoming a fan of Facebook or social media in general. There are still many things to be concerned about. But let’s also realize that technology — any technology — is a tool. It is not inherently good or evil. Those qualities can be found in how we choose to use technology.

Have More People Become More Awful?

Is it just me, or do people seem a little more awful lately? There seems to be a little more ignorance in the world, a little less compassion, a little more bullying and a lot less courtesy.

Maybe it’s just me.

It’s been a while since I’ve checked in with eternal optimist Steven Pinker.  The Harvard psychologist is probably the best-known proponent of the argument that the world is consistently trending towards being a better place.  According to Pinker, we are less bigoted, less homophobic, less misogynist and less violent. At least, that’s what he felt pre-COVID lockdown. As I said, I haven’t checked in with him lately, but I suspect he would say the long-term trends haven’t appreciably changed. Maybe we’re just going through a blip.

Why, then, does the world seem to be going to hell in a hand cart?  Why do people — at least some people — seem so awful?

I think it’s important to remember that our brain likes to play tricks on us. It’s in a never-ending quest to connect cause and effect. Sometimes, to do so, the brain jumps to conclusions. Unfortunately, it is aided in this tendency by a couple of accomplices — namely news reporting and social media. Even if the world isn’t getting shittier, it certainly seems to be.

Let me give you one example. In my local town, an anti-masking rally was recently held at a nearby shopping mall. Local news outlets jumped on it, with pictures and video of non-masked, non-socially distanced protesters carrying signs and chanting about our decline into Communism and how their rights were being violated.

What a bunch of boneheads — right? That was certainly the consensus in my social media circle. How could people care so little about the health and safety of their community? Why are they so awful?

But when you take the time to unpack this a bit, you realize that everyone is probably overplaying their hands. I don’t have exact numbers, but I don’t think there were more than 30 or 40 protesters at the rally. The population of my city is about 150,000. Those protesters represented about 0.03% of the total population.

Let’s say for every person at the rally, there were 10 that felt the same way but weren’t there. That’s still less than 1%. Even if you multiplied the number of protesters by 100, it would still be just 3% of my community. We’re still talking about a tiny fraction of all the people who live in my city. 
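For anyone who wants to check my back-of-envelope math, here’s a quick sketch of the same calculation in Python (the 40 protesters and the 150,000 population are the rough figures from above; everything else is just scaling):

protesters = 40          # rough head count at the rally
population = 150_000     # approximate population of my city

for multiplier in (1, 10, 100):
    sympathizers = protesters * multiplier
    share = sympathizers / population * 100
    print(f"{multiplier}x -> {sympathizers} people, {share:.2f}% of the city")

# Prints roughly 0.03%, 0.27% and 2.67%, which is where the "still just 3%" figure comes from.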

But both the news media and my social media feed have ensured that these people are highly visible. And because they are, our brain likes to use that small and very visible sample and extrapolate it to the world in general. It’s called availability bias, a cognitive shortcut where the brain uses whatever’s easy to grab to create our understanding of the world.

But availability bias is nothing new. Our brains have always done this. So, what’s different about now?

Here, we have to understand that the current reality may be leading us into another “mind-trap.” A 2018 study from Harvard introduced something called “prevalence-induced concept change,” which gives us a better understanding of how the brain focuses on signals in a field of noise. 

Basically, when signals of bad things become less common, the brain works harder to find them. We expand our definition of what is “bad” to include more examples so we can feel more successful in finding them.

I’m probably stretching beyond the limits of the original study here, but could this same thing be happening now? Are we all super-attuned to any hint of what we see as antisocial behavior so we can jump on it? 

If this is the case, again social media is largely to blame. It’s another example of our current toxic mix of dog whistles, cancel culture, virtue signaling and pseudo-reality that is being driven by social media. 

That’s two possible things that are happening. But if we add one more, it becomes a perfect storm of perceived awfulness. 

In a normal world, we all have different definitions of the ethical signals we’re paying attention to. What you are focused on right now in your balancing of what is right and wrong is probably different from what I’m currently focused on. I may be thinking about gun control while you’re thinking about reducing your carbon footprint.

But now, we’re all thinking about the same thing: surviving a pandemic. And this isn’t just some theoretical mind exercise. This is something that surrounds us, affecting us every single day. When it comes to this topic, our nerves have been rubbed raw and our patience has run out. 

Worst of all, we feel helpless. There seems to be nothing we can do to edge the world toward being a less awful place. Behaviors that in another reality and on another topic would have never crossed our radar now have us enraged. And, when we’re enraged, we do the one thing we can do: We share our rage on social media. Unfortunately, by doing so, we’re not part of the solution. We are just pouring fuel on the fire.

Yes, some people probably are awful. But are they more awful than they were this time last year? I don’t think so. I also can’t believe that the essential moral balance of our society has collectively nosedived in the last several months. 

What I do believe is that we are living in a time where we’re facing new challenges in how we perceive the world. Now, more than ever before, we’re on the lookout for what we believe to be awful. And if we’re looking for it, we’re sure to find it.

You Said, ‘Why Public Broadcasting?’ I Still Say, ‘Why Not?’

It appears my column a few weeks ago on public broadcasting hit a few raw nerves. Despite my trying to stickhandle around the emotionally charged use of the word “socialism,” there were a few comments saying, in essence: why should taxpayers have to support broadcasting when there are private and corporate donors willing to do so? Why would we follow a socialist approach to ensuring fair and responsible journalism? We are the land of the free and open market. Let’s just let it do its job.

One commenter suggested that if people want to support responsible journalism, let them become subscribers. Make it a Netflix-based model for journalism. That is one solution put forward in my friend John Marshall’s  new book, “Free is Bad.”

It’s not wrong. It’s certainly one approach. I would encourage everyone to subscribe to at least one news publication that still practices real journalism.

Another commenter suggested that as long as there are donors who believe in journalism and are willing to put their money where their mouth is, we can let them carry the load. That’s another approach. 

Case in point, ProPublica. 

ProPublica is a nonprofit newsroom funded by donations. The quality of its reporting has garnered it six Pulitzers, five Peabodys, three Emmys and a number of other awards. It can certainly be pointed to as a great example of high-quality reporting that doesn’t rely on advertising dollars. But ProPublica has been around since 2008 and it only has a little over 100 journalists on the payroll. I’m sure its principals would love to hire more. They just don’t have enough money. 

The problem here — the one that prompted my suggestion to consider public broadcasting as an alternative — is that both subscriber and donor-based approaches are like trying to kill the elephant in the room with a flyswatter. The economics are hopelessly imbalanced and just can’t work.

Journalism is in full-scale attrition because its revenue model is irretrievably broken. Here’s why it’s broken: The usual winner in competitions based on capitalism is what’s most popular, not what’s the best. It’s a race to the shallow end of the pool.

And that’s what’s happened to real news reporting. Staying shallow in an advertising-supported marketplace is the best way to ensure profitability. 

But even the shallow end needs some water; there needs to be some news to act as the raw material for opinion and analysis content. In the news business, that water is the overflow from the deep end. And someone — somewhere — has to keep refilling the deep end.

In a market that is determined to cling to free-market capitalism, no one is willing to invest in the type of journalism required to keep the deep end full. It’s the Tragedy of the Commons, applied to journalism. There are too many taking, and no one is giving back. Incentives and required outcomes are not only not aligned, they are pointed in opposite directions. 

But, as my commenters noted, that is where subscriptions and donations can come in. Obviously, a subscriber-based model has worked very well for streaming services like Netflix. Why couldn’t the same be true for journalism? 

I don’t believe the same approach will work, for a few reasons. 

First, Netflix has the advantage of exclusivity. You have to subscribe to access their content. Journalism doesn’t work that way. Once a news story has broken, there is a whole downstream gaggle of news channels that will jump on it and endlessly spin and respin it with their own analysis and commentary.  

This respun content will always be more popular than the original story, because it’s been predigested to align with the target audience’s own beliefs and perspectives. As I’ve said before, when it comes to news, we have a junk food habit. And why would you buy broccoli when you can get a cheeseburger for free?

This exclusivity also gives Netflix the ability to program both for quality and popularity. For every “Queen’s Gambit,” there are dozens of “Tiger Kings” and other junk-food snacks for the brain. When all the money is being dumped into the same pool, it can fill both the shallow and deep ends at the same time.

But perhaps the biggest misconception about Netflix’s success is that it’s not yet clear whether Netflix is, in fact, successful. It is still a model in transition and is still relying heavily on licensed content to prop up the profitability of its original programming. When it comes to successfully transitioning the majority of viewer streams to its own programming, the jury is still very much out, as this analysis notes.

There are more reasons why I don’t think a subscription model is the best answer to journalism attrition, but we’ll leave it there for now. 

But what about donor-based journalism, like that found on PBS affiliates or ProPublica? While I don’t doubt their intentions or the quality of the reporting, I do have issues with the scale. There are simply not enough donor dollars flowing into these organizations to fund the type of expensive journalism that we need. 

And these donor dollars are largely missing in local markets, where the attrition of true news reporting is progressing at an even faster rate. In the big picture — and to return to our previous analogy — this represents a mere trickle into the deep end. 

There are just some things that shouldn’t exist in a for-profit setting. The dynamics of capitalism and how it aligns incentive just don’t work for these examples. These things are almost always social obligations that we must have but that require a commitment that usually represents personal sacrifice. 

This is the basis of a social democracy where personal sacrifice is typically exacted through taxation. While you may not like it, taxation is still the best way we’ve found to prevent the Tragedy of the Commons. 

We are now to the point where access to true and reliable information has become a social obligation. And much as we may not like it, we all need to sacrifice a little bit to make sure we don’t lose it forever.

Analyzing the Problem with News “Analysis”

Last week, I talked about the Free News problem. In thinking about how to follow that up, I ran across an interesting study that was published earlier this year in the journal Science Advances. One of the authors was Duncan Watts, who I’ve mentioned repeatedly in previous columns.

In the study, the research team tackled the problem of “Fake News,” which is – of course – another symptom of the creeping malaise that is striking the industry of journalism. It certainly has become a buzzword in the last few years. But the team found that the problem of fake news may not be much of a problem at all. It makes up just 0.15% of our entire daily media diet. In fact, across all ages in the study, any type of news is – at the most – just 14.2% of our total media consumption.

The problem may be our overuse of the term “news” – applying it to things we think are news but are actually just content meant to drive advertising revenues. In most cases, this is opinion (sometimes informed but often not) masquerading as news in order to generate a lot of monetizable content. Once again, to get to the root of the problem, we have to follow the money.

If we look again at the Ad Fontes Media Bias chart, it’s not “news” that’s the problem. Most acknowledged leaders in true journalism are tightly clustered in the upper middle of the chart, which is where we want our news sources to be. They’re reliable and unbiased.

If we follow the two legs of the chart down to the right or left into the unreliable territory where we might encounter “fake” news, we find from the study mentioned above that this makes up an infinitesimal percentage of the media most of us actually pay attention to. The problem here can be found in the middle regions of the chart. This is where we find something called analysis. And that might just be our problem.

Again, we have to look at the creeping poison of incentive here. Some past students from Stanford University have written an interesting essay about the economics of journalism that shows how cable TV and online platforms have disrupted the tenuous value chain of news reporting.

The profitability of hard reporting was defined in the golden age of print journalism – specifically newspapers. The problem with reporting as a product is twofold. One is that news is non-excludable. Once news is reported, anyone can use it. And two is that while reporting is expensive, the cost of distribution is independent of the cost of reporting. The cost of getting the news out is the same, regardless of how much news is produced.

While newspapers were the primary source of news, these two factors could be worked around. Newspapers came with a built-in 24-hour time lag. If you could get a one-day jump on the competition, you could be very profitable indeed.

Secondly, the fixed distribution costs made newspapers a very cost-effective ad delivery vehicle. It cost the newspapers next to nothing to add advertising to the paper, thereby boosting revenues.

But these two factors were turned around by the internet and cable news. If a newspaper bore the bulk of the costs by breaking a story, cable TV and the internet could immediately jump on board and rake in the benefits of using content they didn’t have to pay for.

And that brings us to the question of news “analysis.” Business models that rely on advertising need eyeballs. And those eyeballs need content. Original content – in the form of real reporting – is expensive and eats into profit. But analysis of news that comes from other sources costs almost nothing. You load up on talking heads and have them talk endlessly about the latest story. You can spin off never-ending reams of content without having to invest anything in actually breaking the story.

This type of content has another benefit: customers love analysis. Real news can be tough to swallow. If done correctly, it should be objective and based on fact. Sometimes it will force us to reconsider our beliefs. As is often the case with news, we may not like what we hear.

Analysis – or opinion – is much more palatable. It can be either partially or completely set free from facts and swayed and colored to match the audience’s beliefs and biases. It scores highly on the confirmation bias scale. It hits all the right (or left) emotional buttons. And by doing this, it stands a better chance of being shared on social media feeds. Eyeballs beget eyeballs. The gods of corporate finance smile benignly on analysis content because of its effectiveness at boosting profitability.

By understanding how the value chain of good reporting has broken down due to this parasitic piling on by online and cable platforms in the pursuit of profit, we begin to understand how we can perhaps save journalism. There is simply too much analytical superstructure built on top of the few real journalists that are doing real reporting. And the business model that once supported that reporting is gone.

The further that analysis gets away from the facts that fuel it, the more dangerous it becomes. At some point it crosses the lines from analysis to opinion to propaganda. The one thing it’s not is “news.” We need to financially support through subscription the few that are still reporting on the things that are actually happening.

Why Free News is (usually) Bad News

Pretty much everything about the next week will be unpredictable. But whatever happens on Nov. 3, I’m sure there will be much teeth-gnashing and navel-gazing about the state of journalism in the election aftermath.

And there should be. I have written much about the deplorable state of that particular industry. Many, many things need to be fixed. 

For example, let’s talk about the extreme polarization of both the U.S. population and their favored news sources. Last year about this time, the Pew Research Center released a study showing that over 30% of Americans distrust their news sources. 

But what’s more alarming is, when we break this down by Republicans versus Democrats, only 27% of Democrats didn’t trust the news for information about politics or elections. With Republicans, that climbed to a whopping 67%. 

The one news source Republicans do trust? Fox News. Sixty-five percent of them say Fox is reliable. 

And that’s a problem.

Earlier this year, Ad Fontes Media came out with its Media Bias Chart. It charts major news and media channels on two axes: source reliability and political bias. The correlation between bias and reliability is almost perfect. The further a news source is out to the right or left, the less reliable it is.

How does Fox fare? Not well. Ad Fontes separates Fox TV from Fox Online. Fox Online lies on the border between being “reliable for news, but high in analysis/opinion content” and “some reliability issues and/or extremism.” Fox TV falls squarely in the second category.

I’ve written before that media bias is not just a right-wing problem. Outlets like CNN and MSNBC show a significant left-leaning bias. But CNN Online, despite its bias, still falls within the “Most Reliable for News” category. According to Ad Fontes, MSNBC has the same reliability issues as Fox.

The question that has to be asked is “How did we get here?”  And that’s the question tackled head-on in a new book, “Free is Bad,” by John Marshall.

I’ve known Marshall for ages. He has covered a lot of the things I’ve been writing about in this column. 

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” 

Upton Sinclair

The problem here is one of incentive. Our respective media heads didn’t wake up one morning and say, “You know what we need to be? A lot more biased!” They have walked down that path step by step, driven by the need to find a revenue model that meets their need for profitability. 

When we talk about our news channels, the obvious choice to be profitable is to be supported by ads. And to be supported by ads, you have to be able to target those ads. One of the most effective targeting strategies is to target by political belief, because it comes reliably bundled with a bunch of other beliefs that make it very easy to predict behaviors. And that makes these ads highly effective in converting prospects. 

This is how we got to where we are. But there are all types of ways to prop up your profit through selling ads. Some are pretty open and transparent. Some are less so. And that brings us to a particularly interesting section of Marshall’s book. 

John Marshall is a quant geek at heart. He has been a serial tech entrepreneur — and, in one of those ventures, built a very popular web analytics platform. He also has intimate knowledge of how the sausages are made in the ad-tech business. He knows sketchy advertising practices when he sees them. 

Given all of this, Marshall was able to undertake a fascinating analysis of the ads we see on various news platforms that dovetails nicely with the Ad Fontes chart. 

Marshall created the Ad Shenanigans chart. Basically, he did a forensic analysis of the advertising approaches of various online news platforms. He was looking for those that gathered data about their users, sold traffic to multiple networks, featured clickbait chumboxes and other unsavory practices. Then he ranked them accordingly.

Not surprisingly, there’s a pretty strong correlation between reputable reporting and business ethics. Highly biased and less reputable sites on the Ad Fontes Bias Chart (Breitbart, NewsMax, and Fox News) all can also be found near the top of Marshall’s Ad Shenanigans Chart. Those that do seem to have some ethics when it comes to the types of ads they run also seem to take objective journalism seriously. Case in point, The Guardian in the UK and ProPublica in the U.S.

The one anomaly in the group seems to be CNN. While it does fare relatively well on reputable reporting according to Ad Fontes, CNN appears to be willing to do just about anything to turn a buck. It ranks just a few slots below Fox in terms of “ad shenanigans.”

Marshall also breaks out those platforms that have a mix of paywalls and advertising. While there are some culprits in the mix such as the Daily Caller, Slate and the National Review, most sites that have some sort of subscription model seem to be far less likely to fling the gates of their walled gardens open to the ethically challenged advertising hordes. 

All of this drives home Marshall’s message: When it comes to the quality of your news sources, free is bad. As soon as something costs you nothing, you are no longer the customer. You’re the product. Invisible hand market forces are no longer working for you. They are working for the advertiser. And that means they’re working against you if you’re looking for an unbiased, quality news source.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, in the U.S. more people — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014; one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters Report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without us intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is going to be when we do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently a thing in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feed, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only go to not trusting when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook with this: “Do you have confidence in me to loan me this [item] till tomorrow?” The success of this scam was totally dependent on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of confidence on the part of the target.

It is ironic that in an era where it’s easier than ever to fact check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full time fact checking is necessary. On July 7, NY Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: First, only if Trump releases his tax returns, and second, only if there is a non-partisan real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.

How We Forage for the News We Want

The Reuters Institute, based in the UK, just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll look at the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions – less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990, at about 62 and a half million. That’s one subscription for every 4 people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would have been one subscription for every 10 people. We were no longer reading our news in print. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just understand that our brain operates in a significantly different way when it’s reading rather than watching or listening.

Up to the end of the last century, we generally trusted news destinations. Whether it be a daily newspaper like the New York Times, a news magazine like Time or a nightly newscast such as any of the network news shows, each was a destination that offered one thing above all others – the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two use social media.  Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, under-35s get more of their news from Social Media than any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information veracity problems that introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quickly calculated guess about which patch is most likely to contain the information it’s looking for. With information foraging, the intent we have frames everything that comes after.

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it was started before the COVID pandemic, but the second part of the study was conducted during COVID. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see in social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest growing sources for news amongst all age groups – but especially those under 30 – are Instagram, SnapChat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and SnapChat, are dominated by influencers. That means that much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper on our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs in how the world works. This is related to the sensemaking loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by first starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we will update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased to any particular point of view. But this doesn’t jibe with reality at all. As I wrote about in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from news sources that are biased, as long as that bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.

TV and My Generation

My Generation has been a dumpster fire of epic proportions. I am a baby boomer, born in 1961, at the tail end of the boom. And, according to Time magazine, we broke America.  We probably destroyed the planet. And, oh yeah, we’ve also screwed up the economy. I’d like to say it isn’t true, but I’m pretty sure it is. As a generation, we have an extensive rap sheet.

Statistically, baby boomers are one of the most politically polarized generations alive today. So, the vast chasm that exists between the right and the left may also be our fault. 

As I said, we’re a generational dumpster fire. 

A few columns back I said this: “We create the medium — which then becomes part of the environment we adapt to.”  I was referring to social media and its impact on today’s generations. 

But what about us? What about the generation that has wreaked all this havoc? If I am right and the media we make in turn makes us who we are, what the hell happened to our generation?

Television, that’s what. 

There have been innumerable treatises on how baby boomers got to be in the sorry state we’re in. Most blame the post-war affluence of America and the never-ending consumer orgy it sparked. 

But we were also the first generation to grow up in front of a television screen. Surely that must have had some impact. 

I suspect television was one of the factors that started driving the wedge between the right and left halves of our generation, leaving a middle ground that couldn’t stretch to span the divide. Further, I think it may have been the prime suspect.

Let’s plot the trends of what was on TV against my most influential formative years, and — by extension — my generation. 

When I was 5 years old, in 1966, the most popular TV shows fell into two categories: westerns like “Bonanza” and “Gunsmoke,” or cornfed comedies like “The Andy Griffith Show,” “The Beverly Hillbillies,” “Green Acres” and “Petticoat Junction.” Social commentary and satire were virtually nonexistent on American prime-time TV. The values of America were tightly censored, wholesome and non-confrontational. The only person of color in the line-up was Bill Cosby on “I Spy.” Thanks to “Hogan’s Heroes,” even the Nazis were lovable doofuses. 

I suspect when certain people of my generation want to Make America Great Again, it is this America they’re talking about. It was a white, wholesome America that was seen through the universally rose-colored glasses given to us by the three networks. 

It was also completely fictional, ignoring inconveniences like the civil rights movement, Vietnam and rampant gender inequality. This America never existed. 

When we talk about the cultural environment my generation literally cut our teeth in, this is what we refer to. There was no moral ambiguity. It was clear who the good guys were, because they all wore white hats. 

This moral baseline was spoon-fed to us right when we were first making sense of our own realities. Unfortunately, it bore little to no resemblance to what was actually real.

The fact was, through the late ’60s, America was already increasingly polarized politically. Left and right were drifting apart. Even Bob Hope felt the earth splitting beneath his feet. In November 1969, he asked all the elected leaders of the country, no matter their politics, to join him in a week of national unity. One of those leaders called it “a time of crisis, greater today perhaps than since the Civil War.” 

But rather than trying to heal the wounds, politicians capitalized on them, further splitting the country apart by affixing labels like Nixon’s “The Silent Majority.” 

Now, let’s move ahead to my teen years. From our mid-teens to our mid-twenties, we create our social identities. Our values and morals take on some complexity. The foundations for our lifelong belief structures are formed during these years. 

In 1976, when I was 15, the TV line-up had become a lot more controversial. We had many shows regularly tackling social commentary: “All in the Family,” “M*A*S*H,” “Sanford and Son,” “Welcome Back, Kotter,” “Barney Miller” and “Good Times.” Of course, we still had heaps of wholesome, thanks to “Happy Days,” “Marcus Welby, M.D.” and “The Waltons.”

Just when my generation was forming the values that would define us, our prime-time line-up was splitting left and right. You had the social moralizing of left-leaning show runners like Norman Lear (“All in the Family”) and Larry Gelbart (“M*A*S*H”) vs the God and Country values of “The Waltons” and “Little House on the Prairie.” 

I don’t know what happened in your hometown, but in mine, we started to be identified by the shows we watched (or, often, what our parents let us watch). You had the “All in the Family” Group and “The Waltons” Group. In the middle, we could generally agree on “Charlie’s Angels” and “The Six Million Dollar Man.” The cracks in the ideologies of my generation were starting to show.

I suspect as time went forward, the two halves of my generation started looking to television with two different intents: either to inform ourselves of the world that is, warts and all — or to escape to a world that never was. As our programming choices expanded, those two halves got further and further apart, and the middle ground disappeared. 

There are other factors, I’m sure. But speaking for myself, I spent an unhealthy amount of time watching TV when I was young. It couldn’t help but partially form the person I am today. And if that is true for me, I suspect it is also true for the rest of my generation.