Friendship: Uncoupled

This probably won’t come as a shock to anyone reading this: A recent study says that it’s not if you use social media that determines your happiness, but how you use social media. 

Derrick Wirtz, an associate professor of teaching in psychology at the University of British Columbia-Okanagan, took a close look at how people use three major social platforms—Facebook, Twitter and Instagram—and whether the way you use them can make you happier or sadder.

As I said, most of you probably said to yourself, “Yeah, that checks out.” But this study does bring up an interesting nuance with some far-reaching implications. 

In today’s world, we’re increasingly using Facebook to maintain our social connections. And, according to Facebook’s mission statement, that’s exactly what’s supposed to happen: “People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.”

The interesting thing in this study is the divide between our social activities — those aimed at bonding versus those aimed at gaining status — and how that impacts our moods and behaviors. It’s difficult to untangle the effect of those two factors, because they are so intertwined in our psyches. But according to this study, Dr. Wirtz found that some of us are spending far more time on social media “status-checking” than actually tending to our friendships.

“Passive use, scrolling through others’ posts and updates, involves little person-to-person reciprocal interaction while providing ample opportunity for upward comparison,” says Wirtz. 

We can scroll our newsfeed without any actual form of engagement — but that’s not what we were designed to do. Our social skills evolved to develop essential mutually beneficial bonds in a small group setting.

Friendship is meant to be nurtured and tended to organically and intimately in a face-to-face environment.  But the distal nature of social media is changing the dynamics of how we maintain relationships in our network. 

Take how we first establish friendships, for instance. When you meet someone for the very first time, how do you decide whether you’re going to become friendly or not? The answer, not surprisingly, is complex and nuanced. Our brain works overtime to determine whether we should bond or not. But, also not surprisingly, almost none of that work is based on rational thought.

UCLA psychologist Dr. Elizabeth Laugeson teaches young adults with social challenges, such as those on the autism spectrum, how to take those very first steps toward friendship when meeting a stranger. If you can’t pick up the face-to-face nuances of body language and unspoken social cues intuitively, becoming friends can be incredibly difficult. Essentially, we are constantly scanning the other person for small signs of common interest from which we can start working toward building trust. 

Even if you clear this first hurdle, it’s not easy to build an actual friendship. It requires a massive investment of our time and energy. A recent study from the University of Kansas found it takes about 50 hours of socializing just to go from acquaintance to casual friend. 

Want to make a “real” friend? Tack another 40 hours onto that. And if your goal is to become a “close” friend, you’d better be prepared to invest at least a total of 200 hours. 

So that raises the question: why would we make this investment in the first place? Why do we need friends? And why do we need at least a handful of really close friends? The answer lies in the concept of reciprocity. 

From an evolutionary perspective, having friends made it easier to survive and reproduce. We didn’t have to go it alone. We could help each other past the rough spots, even if we weren’t related to each other. Having friends stacked the odds in our favor. 

This is when our investment in all those hours of building friendships paid off. Again, this takes us back to the intimate and organic roots of friendship. 

Our brains, in turn, reinforced this behavior by making sure that having friends made us happy. 

Of course, like most human behaviors, it’s not nearly that simple or benign. Our brains also entwine the benefits of friendship with the specter of social status, making everything much more complicated. 

Status also confers an evolutionary advantage. For many generations, we have trod this fine line between being a true friend and being obsessed with our own status in the groups where we hang out.

And then came social media.

As Wirtz’s study shows, we now have this dangerous uncoupling between these two sides of our nature. With social media, friendship is now many steps removed from its physical, intimate and organic roots. It is stripped of the context in which it evolved. And, it appears, the intertwined strands of friendship and social status are unraveling. When this happens, time on social media can reap the anxiety and jealousy of status-checking without any of the joy that comes from connecting with and helping a friend. 

On a person-to-person basis, this uncoupling can be disturbing and unfortunate. But consider what may happen when these same tendencies are amplified and magnified through a massive, culture-wide network.

Why Free News is (usually) Bad News

Pretty much everything about the next week will be unpredictable. But whatever happens on Nov. 3, I’m sure there will be much teeth-gnashing and navel-gazing about the state of journalism in the election aftermath.

And there should be. I have written much about the deplorable state of that particular industry. Many, many things need to be fixed. 

For example, let’s talk about the extreme polarization of both the U.S. population and their favored news sources. Last year about this time, the Pew Research Center released a study showing that over 30% of Americans distrust their news sources. 

But what’s more alarming is what we see when we break this down by party. Only 27% of Democrats didn’t trust the news for information about politics or elections. With Republicans, that climbed to a whopping 67%. 

The one news source Republicans do trust? Fox News. Sixty-five percent of them say Fox is reliable. 

And that’s a problem.

Earlier this year, Ad Fontes Media came out with its Media Bias Chart. It plots major news and media channels on two axes: source reliability and political bias. The correlation between the two is almost perfectly inverse: the further a news source leans out to the right or left, the less reliable it is.

How does Fox fare? Not well. Ad Fontes separates Fox TV from Fox Online. Fox Online lies on the border between being “reliable for news, but high in analysis/opinion content” and “some reliability issues and/or extremism.” Fox TV falls squarely in the second category.

I’ve written before that media bias is not just a right-wing problem. Outlets like CNN and MSNBC show a significant left-leaning bias. But CNN Online, despite its bias, still falls within the “Most Reliable for News” category. According to Ad Fontes, MSNBC has the same reliability issues as Fox.

The question that has to be asked is “How did we get here?”  And that’s the question tackled head-on in a new book, “Free is Bad,” by John Marshall.

I’ve known Marshall for ages. He has covered a lot of the things I’ve been writing about in this column. 

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” 

— Upton Sinclair

The problem here is one of incentive. Our respective media heads didn’t wake up one morning and say, “You know what we need to be? A lot more biased!” They have walked down that path step by step, driven by the need to find a revenue model that meets their need for profitability. 

When we talk about our news channels, the obvious choice to be profitable is to be supported by ads. And to be supported by ads, you have to be able to target those ads. One of the most effective targeting strategies is to target by political belief, because it comes reliably bundled with a bunch of other beliefs that make it very easy to predict behaviors. And that makes these ads highly effective in converting prospects.

This is how we got to where we are. But there are all types of ways to prop up your profit through selling ads. Some are pretty open and transparent. Some are less so. And that brings us to a particularly interesting section of Marshall’s book. 

John Marshall is a quant geek at heart. He has been a serial tech entrepreneur — and, in one of those ventures, built a very popular web analytics platform. He also has intimate knowledge of how the sausages are made in the ad-tech business. He knows sketchy advertising practices when he sees them. 

Given all of this, Marshall was able to undertake a fascinating analysis of the ads we see on various news platforms that dovetails nicely with the Ad Fontes chart. 

Marshall created the Ad Shenanigans chart. Basically, he did a forensic analysis of the advertising approaches of various online news platforms. He was looking for those that gathered data about their users, sold traffic to multiple networks, featured clickbait chumboxes or engaged in other unsavory practices. Then he ranked them accordingly.

Not surprisingly, there’s a pretty strong correlation between reputable reporting and business ethics. The highly biased and less reputable sites on the Ad Fontes Bias Chart (Breitbart, NewsMax and Fox News) can all also be found near the top of Marshall’s Ad Shenanigans chart. Those that do seem to have some ethics when it comes to the types of ads they run also seem to take objective journalism seriously. Cases in point: The Guardian in the UK and ProPublica in the U.S.

The one anomaly in the group seems to be CNN. While it does fare relatively well on reputable reporting according to Ad Fontes, CNN appears to be willing to do just about anything to turn a buck. It ranks just a few slots below Fox in terms of “ad shenanigans.”

Marshall also breaks out those platforms that have a mix of paywalls and advertising. While there are some culprits in the mix, such as the Daily Caller, Slate and the National Review, most sites that have some sort of subscription model seem to be far less likely to fling the gates of their walled gardens open to the ethically challenged advertising hordes. 

All of this drives home Marshall’s message: When it comes to the quality of your news sources, free is bad. As soon as something costs you nothing, you are no longer the customer. You’re the product. Invisible hand market forces are no longer working for you. They are working for the advertiser. And that means they’re working against you if you’re looking for an unbiased, quality news source.

Lockdown Advice For A Long Winter

No matter where you live in the world, it’s likely you’re going to be forced to spend a lot of time at home. And if that home includes others  — like your beloved life partner — living under the same roof, you may experience a little friction now and again. In anticipation of this, I thought I’d share a few insights on what might come.

There is No Gender Equality with COVID

A recent study by Oxford, Cambridge and Zurich Universities found that women’s sense of mental well-being took a more significant drop than men’s due to COVID. The researchers speculated on a number of reasons for this, but were unable to narrow it down to any identifiable factor. Perhaps, they reasoned, it had something to do with women losing jobs at a greater rate than men, taking on a greater share of the burden of home schooling — or the fact that even when both men and women were home all the time, women still did more than their fair share of domestic chores. But no: even when controlling for these factors, the researchers couldn’t explain why women were becoming more distressed than men. 

Maybe it was something else.

Warriors and Worriers: Two Approaches to Survival

In 2014, psychologist Joyce Benenson published her book “Warriors and Worriers: The Survival of the Sexes.” Taking an evolutionary perspective, she has spent years studying children and primates, looking for innate rather than socialized differences between the sexes. 

Her findings turned conventional wisdom on its head. Women may not be more sociable than men, and men may not be more competitive than women. It’s just that they define those things differently. 

Men are quite comfortable forming packs of convenience to solve a particular problem, whether it is defending against an enemy or winning a pick-up basketball game. This could explain why team sports entertainment always seems to have a male bias.

Women, on the other hand, have fewer but much more complex relationships that they deem essential to their survival as the primary caregiver for their family. The following is from the abstract of a 1990 study by Benenson: “Although males and females did not differ in the number of best friends they reported, males were found to have larger social networks than females. Further, for males, position in a social network was more highly linked with acceptance by the peer group. Finally, males were concerned with attributes that could be construed as important for status in the peer group, and females were concerned with attributes that appeared essential to relationships with a few friends.”

If we apply this to the current COVID situation, we begin to see why women might be struggling more with lockdown than men. A male’s idea of socializing might be more easily met with a Zoom call or another type of digital connection, such as online gaming. But connecting in these ways lacks the bandwidth necessary to properly convey the complexity of a female relationship. 

Introverts and Extroverts Revisited

Of course, gender isn’t the only variable at play here. I’ve written before about what happens when an extrovert and introvert are locked down in the same house together (those being my wife and myself). One of the things I’ve noticed is a different level of comfort we have at being left alone with our thoughts. 

Because I have always been a writer of one kind or another, I require time to ruminate on a fairly frequent basis. I am a little (lot?) dictatorial in my requirements for this: my environment needs to be silent and free from interruption. When the weather is good outside, this is fairly easy. I can grab my laptop and go outside. But in the winter, it’s a different story. My wife is subjected to forced silence so I can have my quiet time.

My wife functions best when there is some type of sensory stimuli, especially the sound of voices. She doesn’t have the same need to sit in silence and be alone with her thoughts. 

And she’s not unique in that. A 2014 study found that most of us fall into the same category. In fact, the researchers found that, “many of the people studied, particularly the men, chose to give themselves a mild electric shock rather than be deprived of external sensory stimuli.”

A Difference in Distraction

When we do look for distraction, we can also have different needs and approaches. Another area I’ve touched on in a past post is how our entertainment delivery platforms have now become entangled with multitasking. 

I like an immersive, interruption-free entertainment experience. The bigger the screen and the louder the sound, the better. I suspect this may be another “male” thing.  Again, this preference tends to cast a dictatorial tone on our household, so I generally retreat to my media cave in the basement. I also tuck my phone away while I’m watching. 

My wife prefers to multiscreen when watching TV and to do so in the highest traffic area of our house. For her, staying connected is more important than being immersed in whatever she might be watching. 

These differences in our entertainment preferences often mean we’re not together when we seek distraction. 

I don’t think this is a bad thing. In a normal world filled with normal activities, this balancing of personal preference is probably accommodated by our normal routines. But in a decidedly abnormal world where we spend every minute together in the same house, these differences become more noticeable.

Try a Little Friluftsliv

In the end, winter is going to be long, lonely and cold for many of us. So we may just want to borrow a strategy from Norwegians: friluftsliv. Basically, it means “open-air living.” Most winters, my main activity is complaining. But this year, I’m going to get away from the screens and social media, strap on a pair of snowshoes and embrace winter.

How to Look Past the Nearest Crisis

I was talking to someone the other day who was trying to make plans for 2021. Those plans were dependent on the plans of others. In the course of our conversation, she said something interesting: “It’s so hard to plan because most of the people I’m talking to can’t see past COVID.” 

If anything sums up our current reality, it might be that. We’re all having a lot of trouble seeing past COVID. Or the upcoming U.S. election. Or catastrophic weather events. Or an impending economic crisis. Take your pick. There are so many looming storm clouds on the horizon that it’s difficult to even make out that horizon any more. 

We humans are pretty dependent on the past to tell us what may be happening in the future. We evolved in an environment that — thanks to its stability — was reasonably predictable. In evolutionary survival terms, it was smart to hedge our bets on the future by glancing over our shoulders at the past. If a saber-toothed tiger was likely to eat you yesterday, the odds were very much in favor of it also wanting to eat you tomorrow. 

But our ability to predict things gets thrown for a loop in the face of uncertainty like we’re currently processing. There are just too many variables forced into the equation for us to be able to rely on what has happened in the past. Both the number of variables and their range of variation push the probability of prediction error past the breaking point. 

When it comes to planning for the future, we become functionally paralyzed and start living day to day, waiting for the proverbial “other shoe to drop.” 

The bigger problem, however, is that when the world is going to hell in a hand basket, we don’t realize that the past is a poor foundation on which to build our future. Evolved habits die hard, and so we continue to use hindsight to try to move forward. 

And by “we,” I mean everyone — most especially the leaders we elect and the experts we rely on to point us in the right direction.  Many seem to think that a post-COVID world will snap back to be very much like a pre-COVID world.

And that, I’m afraid, may be the biggest problem. You’d think that when worrying about an uncertain future is above our pay grade, there would be someone wiser and smarter than us to rely on and save our collective asses. But if common folk tend to consistently bet on the past as a guide to our future, it’s been shown that people we think of as “experts” double down on that bet. 

A famous study by Philip Tetlock showed just how excruciatingly awful experts were at predicting the future. He assembled a group of 284 experts and got them to make predictions about future events, including those that fell into their area of expertise. Across the board, he found their track record of being correct was only slightly ahead of a random coin toss or a troupe of chimpanzees throwing darts. The more famous the expert, the worse their track record.

Expertise is rooted in experience. Both words spring from the same root: the Latin experiri, “to try.” Experience is gained in the past. For experts, their worth comes from their experience in one particular area, so they are highly unlikely to ignore it when predicting the future. They are like the hedgehog in Isaiah Berlin’s famous essay “The Hedgehog and the Fox”: they “know one big thing.”

But when it comes to predicting the future, Tetlock found it’s better to be a fox: to “know many little things.” In a complex, highly uncertain world, it’s the generalist  who thrives. 

The reason is pretty simple. In an uncertain world, we have to be more open to sense making in the classic cognitive sense. We have to be attuned to the signals that are playing out in real time and not be afraid to consider new information that may conflict with our current beliefs.

This is how generalists operate. It’s also how science is supposed to operate. Our view of the future should be no more than a hypothesis that we’re willing to have proven wrong. Hedgehogs dig in when their expertise about “one big thing” is questioned. Foxes use it as an opportunity to update their take on reality. 

Foxes have another advantage over hedgehogs. They tend to be dilettantes, spreading their interest over a wide range of topics without diving too deeply into any of them. This keeps their network diverse and expansive, giving them the opportunity to synthesize their sense of reality from the broadest range of signals possible. 

In a world that depends on being nimble enough to shift directions depending on the input you receive, this stacks the odds in favor of the fox. 

Still, it’s against human nature to be so cavalier about our future. We like certainty. We crave predictability. We are big fans of transparent causes and effects. If those things are clouded by complexity and uncertainty, we start constructing our own narratives. Hence the current spike of conspiracy theories, as I noted previously. This is especially true when the stakes are as high as they are now. 

I don’t blame those having a very hard time looking past COVID — or any other imminent disaster. But someone should be looking past them. 

It’s time to start honing those fox instincts. 

Why The World is Conspiring Against Us

With all the other things 2020 will go down in history for, it has also proven to be a high-water mark for conspiracy theories. And that shouldn’t surprise us. Science has shown that when the going gets tough, the paranoid get weirder. Add to this the craziness multiplier effect of social media, and it’s no wonder that 2020 has given us a bumper crop of batshit crazy. 

As chronicled for you, my dear reader, I kicked over my own little hornet’s nest of conspiracy craziness a few weeks ago. I started by probing a little COVID anti-vaxxing lunacy right here in my home and native land, Canada. The next thing I knew, the QAnoners were lurching out of the woodwork like the coming of the zombie apocalypse.

I have since run for cover.

But as I was running, I noticed two things. One, most of the people sharing the theories were from the right side of the political spectrum. And two, while they’ve probably always been inclined to indulge in conspiratorial thinking, it seems (anecdotally, anyway) that it’s getting worse.

So I decided to dig a little deeper to find the answers to two questions: Why them, and why now?

Let’s start with why them?

My Facebook experience started with the people I grew up with in a small town in Alberta. It’s hard to imagine a more conservative place. The primary industries are oil, gas and farming. Cowboys — real cowboys wearing real Levi jeans — still saunter down Main Street. This was the first place in Western Canada to elect a representative whose goal was to take Western Canada out of a liberal (and Eastern intellectual elitist)-dominated confederation. If you wanted to find the equivalent of Trumpism in Canada, you’d stand a damn good chance of finding it in this part of Alberta. 

So I wondered: what is it about conservatives, especially those on the extreme right, that makes them more susceptible to spreading conspiracy theories?

It turns out it’s not just the extreme right that believes in conspiracies. According to one study, those on the extreme right or left are more apt to believe in conspiracies. It’s just that it happens more often on the right.

And that could be explained by looking at the types of personalities who tend to believe in conspiracies. According to a 2017 analysis of U.S. data by Daniel Freeman and Richard Bentall, over a quarter of the American population are convinced that “there is a conspiracy behind many things in the world.” 

Not surprisingly, when you dig down to the roots of these beliefs, it comes down to a crippling lack of trust, closely tying those ideas to paranoia. Freeman and Bentall noted, “Unfounded conspiracy beliefs and paranoid ideas are both forms of excessive mistrust that may be corrosive at both an individual and societal level.”

So, if one out of every four people in the U.S. (and apparently a notable percentage of Canadians) lean this way, what are these people like? It turns out there is a cluster of personality traits likely to lead to belief in conspiracy theories.  

First, these people tend to be anxious about things in general. They have a lower level of education and are typically in the bottom half of income ranges. More than anything, they feel disenfranchised and that the control that once kept their world on track has been lost. 

Because of this, they feel victimized by a powerful elite. They have a high “BS receptivity.” And they believe that only they and a small minority of the like-minded know the real truth. In this way, they gain back some of the individual control they feel they’ve lost.

Given the above, you could perhaps understand why, during the Obama years, conspiracy theorists tended to lean to the right. But if anything, there are more conspiratorial conservatives than ever after almost four years of Trump. Those in power were put there by people who don’t trust those in power. So that brings us to the second question: Why now?

Obviously, it’s been a crappy year that has cranked up everybody’s anxiety level. But the conspiracy wave was already well-established when COVID-19 came along. And that wave started when Republicans (and hard right-wing politicians worldwide) decided to embrace populism as a strategy. 

The only way a populist politician can win is by dividing the populace. Populism is, by its nature, antagonistic. There needs to be an enemy, and that enemy is always on the other side of the political divide. As Ezra Klein points out in his book “Why We’re Polarized,” population density and the U.S. Electoral College system make populism a pretty effective strategy for the right.

This is why Republicans are actually stoking the conspiracy fires, including outright endorsement of the QAnon-sense. Amazing as it seems, Republicans are like Rocky Balboa: Even when they win, they seem able to continue being the underdog. 

The core that has been whipped up by populism keeps shadow boxing with their avowed enemy: the liberal elite. This political weaponization of conspiracy theories continues to find a willing audience who eagerly amplify it through social media. There is some evidence to show that extreme conservatives are more willing than extreme liberals to embrace conspiracies, but the biggest problem is that there is a highly effective conspiracy machine continually pumping out right-targeted theories.

It seems there were plenty of conspiracy theories making the rounds well before now. The shitstorm that became known as the year 2020 is simply adding fuel to an already raging fire.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, more people in the U.S. — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014; one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we’ll do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feeds, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only shift to distrust when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation, during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook: “Do you have confidence in me to loan me this [item] until tomorrow?” The success of this scam was totally dependent on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of confidence on the part of the target.

It is ironic that in an era where it’s easier than ever to fact check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full time fact checking is necessary. On July 7, NY Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: First, only if Trump releases his tax returns, and second, only if there is a non-partisan real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.
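That predator-prey intuition can be sketched as a toy model. To be clear, this is my own illustrative sketch with arbitrary payoff numbers, not anything from a published study: a liar only profits when paired with someone who still trusts, so as liars multiply, their advantage evaporates.

```python
def liar_payoff(liar_fraction, advantage=2.0):
    # A liar only profits off people who still trust; the more liars
    # there are, the fewer trusting "marks" remain to exploit.
    return advantage * (1.0 - liar_fraction)

def honest_payoff(baseline=1.0):
    # Honesty pays a steady, modest return regardless of who you meet.
    return baseline

def step(liar_fraction, rate=0.1):
    # Crude replicator update: lying spreads when it out-earns honesty
    # and recedes when liars are mostly left dealing with other liars.
    edge = liar_payoff(liar_fraction) - honest_payoff()
    liar_fraction += rate * liar_fraction * (1.0 - liar_fraction) * edge
    return min(max(liar_fraction, 0.0), 1.0)

x = 0.01  # start with 1% liars
for _ in range(500):
    x = step(x)
# x settles near 0.5, the point where lying stops paying better than honesty
```

With these (arbitrary) numbers, lying spreads quickly at first, then stalls: the equilibrium sits exactly where a liar's expected payoff equals the honest baseline. The strategy limits itself.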

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.

How We Forage for the News We Want

The Reuters Institute in the UK has just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll take the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions – less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990 at about 62.5 million. That’s one subscription for every four people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would have been one subscription for every 10 people. We were, increasingly, no longer reading our news in print. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just note that our brain operates in a significantly different way when it’s reading rather than watching or listening.

Up to the end of the last century, we generally trusted news destinations. Whether it was a daily newspaper like the New York Times, a news magazine like Time or a nightly newscast from one of the networks, each was a destination that offered one thing above all others – the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two use social media. Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, under-35s get more of their news from social media than from any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information-veracity problems that this introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quick, calculated guess about which patch is most likely to contain the information it’s looking for. With information foraging, the intent we have frames everything that comes after.

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News, but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it started before the COVID pandemic, while the second part of the study was conducted during it. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media, but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see on social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest growing sources for news amongst all age groups – but especially those under 30 – are Instagram, Snapchat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and Snapchat, are dominated by influencers. That means that much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper on our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs about how the world works. This is related to the sensemaking loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by first starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we will update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased to any particular point of view. But this doesn’t jibe with reality at all. As I wrote about in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from news sources that are biased, as long as that bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.

Are We Killing Politeness?

One of the many casualties of our changing culture seems to be politeness. When the President of the United States is the poster child for rude behavior, it’s tough for politeness to survive. This is especially true in the no-holds-barred, digitally distanced world of social media.

I consider myself to be reasonably polite. Being so, I also expect this in others. Mild rudeness makes me anxious. Excessive rudeness makes me angry. This being the case, I am troubled by the apparent decline of civility. So today I wanted to take a look at politeness and why it might be slipping away from us.

First of all, we have to understand that politeness is not universal. What is considered polite in one culture is not in another.

Secondly, being polite is not the same as being friendly. Or empathetic. Or being respectful of others. Or being compassionate, according to this post from The Conversation. There is a question of degree and intent here. Being polite is a rather unique behavior that encompasses both desirable and less desirable qualities. And that raises the question: What is the purpose of politeness? Is a less-polite world a good or a bad thing?

First, let’s look at the origin of the word. It comes from the Latin “politus,” meaning “polished — made smooth.” Just in case you’re wondering, “politics” does not come from the same root. That comes from the Greek word for “citizen” — “polites.”

One last etymological nugget. The closest comparison to polite may be “nice,” which originates from the Latin “nescius,” meaning “ignorant”. Take that for what it’s worth.

This idea of politeness as a type of social “polish” really comes from Europe — and especially Britain. There, politeness was linked with class hierarchies. Being polite was a sign of good breeding — a dividing line between the high-born and the riffraff. This class-bound definition came along with the transference of the concept to North America.

Canada is typically considered one of the most polite nations in the world. As a Canadian who has traveled a fair amount, I would say that’s probably true.

But again, there are variations in the concept of politeness and how it applies to both Canadians and Americans.

When you consider the British definition of politeness, you begin to see how Americans and Canadians might respond differently to it. To understand that is to understand much of what makes up our respective characters.

As a Canadian doing much of my business in the U.S. for many years, I was always struck by the difference in approaches I found north and south of the 49th parallel. Canadian businesses we met with were unfailingly polite, but seldom bought anything. Negotiating the prospect path in Canada was a long and often frustrating journey.

American businesses were much more likely to sign a contract. On the whole, I would also say they were friendlier in a more open and less-guarded way. I have to admit that in a business setting, I preferred the American approach.

According to anthropologists Penelope Brown and Stephen Levinson, who have extensively researched politeness, there is negative and positive politeness. Negative politeness is concerned with adhering to social norms, often by deferring to someone or something else.

This is Canadian politeness personified. Our entire history is one of deference to greater powers, first to our colonial masters — the British and French — and, more recently, from our proximity to the cultural and economic master that is the U.S.

For Canadians, deferral is survival. As former Prime Minister Pierre Trudeau once said about the U.S., “Living next to you is in some ways like sleeping with an elephant. No matter how friendly and even-tempered is the beast, if I can call it that, one is affected by every twitch and grunt.”

Negative politeness is a way to smooth out social friction, but there is good and bad here. Ideally it should establish a baseline of trust, respect and social capital. But ultimately, politeness is a consensus of compromise.  And that’s why Canadians are so good at it.  Negative politeness wants everything to be fair.

But then there is positive politeness, which is more American in tone and nature. This is a desire to help others, making it more closely linked to compassion. But in this noble motive there is also a unilateral defining of what is right and wrong. Positive politeness tries to make everything right, based on the protagonist’s definition of what that is.

The two sides of politeness actually come from different parts of the brain. Negative politeness comes from the part of the brain that governs aggression. It is all about applying brakes to our natural instincts. But positive politeness comes from the part of the brain that regulates social bonding and affiliation.

When you understand this, you understand the difference between Canadians and Americans in what we consider polite. For the former, our definition comes handed down from the British class-linked origins, and has morphed into a culture of compromise and deferral.

The American definition comes from many generations of being the de facto moral leaders of the free world.

We (Canadians) want to be nice. You (Americans) want to be right. The two are not mutually exclusive, but they are also not the same thing. Not by a long shot.

What Trump has done (with a certain kind of perverse genius) is play on this national baseline of compassion. He has wantonly discarded any vestiges of politeness and split the nation on what it means to be right.

But by eliminating politeness, you have also eliminated that governor of our behavior. Reactions about what is right and wrong are now immediate, rough and unfiltered.

The polish that politeness brings — that deferral of spoken judgement for even a brief moment in order to foster cooperation — is gone. We have no opportunity to consider other perspectives. We have no motive to cooperate. This is abundantly apparent on every social media platform.

In game theory, politeness is a highly successful strategy commonly called “tit for tat.” It starts by assuming a default position of fairness from the other party, continues to cooperate if this proves to be true, and escalates to retaliation if it’s not. But this tactic evolved in a world of face-to-face encounters. Somehow, it seems less needed in a divided world where rudeness and immediate judgement are the norm.
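For the curious, the rule is simple enough to sketch in a few lines of code. This is a minimal illustration of tit for tat; the function names here are my own:

```python
def tit_for_tat(my_history, their_history):
    # Start polite: cooperate on the first encounter.
    if not their_history:
        return "cooperate"
    # Afterwards, mirror whatever the other party did last: keep
    # cooperating with cooperators, retaliate against defectors.
    return their_history[-1]

def play(strategy_a, strategy_b, rounds):
    # Run two strategies against each other, each seeing both histories.
    hist_a, hist_b = [], []
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        hist_a.append(move_a)
        hist_b.append(move_b)
    return hist_a, hist_b

# Against a consistently rude opponent, tit for tat offers cooperation
# exactly once, then matches the rudeness from that point on.
always_defect = lambda mine, theirs: "defect"
ours, theirs = play(tit_for_tat, always_defect, 4)
# ours == ["cooperate", "defect", "defect", "defect"]
```

The strategy's strength is exactly the "politeness" described above: it pays a small upfront cost of trust, rewards fairness and never lets exploitation go unanswered.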

Still, I will cling to my notions of politeness. Yes, sometimes it seems to get in the way of definitive action. But on the whole, I would rather live in a world that’s a little nicer and a little more polite, even if that seems foolish to some of you.

Crisis? What Crisis?

You would think that a global pandemic would hold our attention for a while.

Nope.

We’re tired of it. We’re moving on.  We’re off to the next thing.

Granted, in this case the next thing deserves to be focused on. It is abysmal that it still exists, so it should be focused on – probably for the rest of our lives and beyond, until it ceases to be a thing. But it won’t be. Soon we’ll be talking about something else.

And that’s the point of this post – our collective inability to remain focused on anything without being distracted by the next breaking story in our news feed. How did we come to this?

I blame memes.

To a certain extent, our culture is the product of who we are, and who we are is a product of our culture. Each is shaped by the other, going forward in a constantly improvised pas de deux. Humans create the medium – which then becomes part of the environment we adapt to.

Books and the printed word changed who we were for over five centuries. Cinema has been helping to define us for almost 150 years. And radio and television have been moulding us for the past century. Our creations have helped create who we are.

This has never been truer than with social media. Unlike other media, which took discrete chunks of our time and attention, social media is ubiquitous and pervasive. According to a recent survey, we spend on average 2 hours and 23 minutes per day on social media. That is about 13% of our waking hours. Social media has become intertwined with our lives to the point that we had to start qualifying what happens where with labels like “IRL” (In Real Life).

There is another difference between social media and what has come before it. Almost every previous entertainment medium that has demanded our attention has been built on the foundation of a long-form narrative arc. Interacting with each medium has been a process – a commitment to invest a certain amount of time to go on a journey with the storyteller. The construction of a story depends on patterns that are instantly recognized by us. Once we identify them, we are invested in discovering the outcome. We understand that our part of the bargain is to exchange our time and attention. The payoff is the joy that comes from making sense of a new world or situation, even if it is imaginary.

But social media depends on a different exchange. Rather than tapping into our inherent love of the structure of a story, it depends on something called variable intermittent rewards. Essentially, it’s the same hook that casinos use to keep people at a slot machine or table. Not only is it highly addictive, it also pushes us to continually scroll to the next thing. It completely bypasses the thinking part of our brains and connects directly to the reward center buried in our limbic system. Rather than asking for our time and attention, social media dangles a never-ending array of bright, shiny memes that ask nothing from us: no thinking, almost no attention and a few seconds of our time at most. For a lazy brain, this is the bargain of a lifetime.
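The contrast between a predictable reward and a variable intermittent one can be sketched like this. It is a toy illustration; the payout probabilities are arbitrary assumptions on my part, not figures from any real platform:

```python
import random

def fixed_schedule(scrolls, every=5):
    # Predictable payout: a reward on every fifth scroll. The pattern
    # is easy to recognize, so it's easy to stop after a payout.
    return [i % every == 0 for i in range(1, scrolls + 1)]

def variable_schedule(scrolls, p=0.2, seed=1):
    # Variable intermittent payout: the same average rate, but the next
    # reward could always be just one more scroll away. This is the
    # slot-machine schedule that keeps us pulling the lever.
    rng = random.Random(seed)
    return [rng.random() < p for _ in range(scrolls)]
```

Both schedules pay out at roughly the same average rate, but behavioral research going back to B.F. Skinner's work on reinforcement found that the variable one produces far more persistent responding, which is exactly the hook described above.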

It’s probably not a coincidence that the media most dependent on advertising are also the media that avoid locking our attention on a single topic for an extended period. This makes social media the perfect match for interruptive ad forms. They are simply slotted into the never-ending scroll of memes.

Social media has only been around for a little over two decades. It has been a significant part of our lives for half that time. If even a little bit of what I suspect is happening is indeed taking place, it scares the hell out of me. It would mean that no other medium has changed us so much and so quickly.

That is something worth paying attention to.