Amazon Prime: Buy Today, Pay Tomorrow?

This column goes live on the most eagerly anticipated day of the year. My neighbor, who has a never-ending parade of delivery vans stopping in front of her door, has it circled on her calendar. At least one of my daughters has been planning for it for several months. Even I, who tend to take a curmudgeonly view of many celebrations, have a soft spot in my heart for this particular one.

No, it’s not the day after Canadian Thanksgiving. This, my friends, is Amazon Prime Day!

Today, in our COVID-clouded reality, the day will likely hit a new peak of “Prime-ness.” Housebound and tired of being bludgeoned to death by WTF news headlines, we will undoubtedly treat ourselves to an unprecedented orgy of one-click shopping. And who can blame us? We can’t go to Disneyland, so leave me alone and let me order that smart home toilet plunger and the matching set of Fawlty Towers tea towels that I’ve been eyeing.

Of course, me being me, I do think about the consequences of Amazon’s rise to retail dominance. 

I think we’re at a watershed moment in our retail behaviors, and this moment has been driven forward precipitously by the current pandemic. Being locked down has forced many of us to make Amazon our default destination for buying. Speaking solely as a sample of one, I now check Amazon first and then use that as my baseline for comparison shopping. But I do so for purely selfish reasons – buying stuff on Amazon is as convenient as hell!

I don’t think I’m alone. We do seem to love us some Amazon. In a 2018 survey conducted by Recode, respondents said that Amazon had the most positive impact on society out of any major tech company. And that was pre-Pandemic. I suspect this halo effect has only increased since Amazon has become the consumer lifeline for a world forced to stay at home.

As I give in to the siren call of Bezos and Co., I wonder what forces I might be unleashing. What unintended consequences might come home to roost in the years ahead? Here are a few possibilities.

The Corporate Conundrum

First of all, let’s not kid ourselves. Amazon is a for-profit corporation. It has shareholders that demand results. The biggest of those shareholders is Jeff Bezos, who is the world’s richest man. 

But amazingly, not all of Amazon’s shareholders are focused on the quarterly financials. Many of them – with an eye to the long game – are demanding that Amazon adopt a more ethical balance sheet. At the 2019 Annual Shareholder Meeting, a list of 12 resolutions was brought forward to be voted on. The resolutions included zero tolerance for sexual harassment and hate speech, curbs on Amazon’s facial recognition technology, and action on climate change and Amazon’s own environmental impact. These last two were supported by a letter signed by 7,600 of Amazon’s own employees.

The result? Amazon strenuously fought every one of them and none were adopted. So, before we get all warm and gooey about how wonderful Amazon is, let’s remember that the people running the joint have made it very clear that they will absolutely put profit before ethics. 

A Dagger in the Heart of Our Communities

For hundreds of years, we have been building a supply chain that was bound by the realities of geography. That supply chain required some type of physical presence within a stone’s throw of where we live. Amazon has broken that chain and we are beginning to feel the impact of that. 

Community shopping districts around the world were being gutted by the “Amazon Effect” even before COVID. In the last six months, that dangerous trend has accelerated exponentially. In a commentary from CNBC in 2018, venture capitalist Alan Patricof worried about the social impact of losing our community gathering spots: “This decline has brought a deterioration in places where people congregated, socialized, made friends and were greeted by a friendly face offering an intangible element of belonging to a community.”

The social glue that held us together has been dissolving over the past two decades. Whether you’re a fan of shopping malls or not (I fall into the “not” category), they were at least a common space where you might run into your neighbor. In his 2000 book Bowling Alone, Harvard political scientist Robert Putnam documented the erosion of social capital in America. Twenty years on, Putnam’s worst-case scenario seems quaintly optimistic. With the loss of our common ground – in the most literal sense – we increasingly retreat to the echo chambers of social media.

Frictionless Consumerism

This last point is perhaps the most worrying. Amazon has made it stupid simple to buy stuff. They have relentlessly squeezed every last bit of friction out of the path to purchase. That worries me greatly.

If we could rely on a rational marketplace filled with buyers acting in the best homo economicus tradition, then perhaps I could rest easier, knowing that there was some type of intelligence driving Adam Smith’s Invisible Hand. But experience has shown that is not the case. Rampant consumerism appears to be one of the three horsemen of the modern apocalypse. And, if this is true, then Amazon has put us squarely in their path.

This is not to even mention things like Amazon’s emerging monopoly-like dominance in a formerly competitive marketplace, the relentless downward pressure it exerts on wages within its supply chain, the evaporation of jobs outside its supply chain or the privacy considerations of Alexa. 

Still, enjoy your Amazon Prime Day. I’m sure everything will be fine.

How to Look Past the Nearest Crisis

I was talking to someone the other day who was trying to make plans for 2021. Those plans were dependent on the plans of others. In the course of our conversation, she said something interesting: “It’s so hard to plan because most of the people I’m talking to can’t see past COVID.” 

If anything sums up our current reality, it might be that. We’re all having a lot of trouble seeing past COVID. Or the upcoming U.S. election. Or catastrophic weather events. Or an impending economic crisis. Take your pick. There are so many looming storm clouds on the horizon that it’s difficult to even make out that horizon any more. 

We humans are pretty dependent on the past to tell us what may be happening in the future. We evolved in an environment that — thanks to its stability — was reasonably predictable. In evolutionary survival terms, it was smart to hedge our bets on the future by glancing over our shoulders at the past. If a saber-toothed tiger was likely to eat you yesterday, the odds were very much in favor of it also wanting to eat you tomorrow. 

But our ability to predict things gets thrown for a loop in the face of uncertainty like we’re currently processing. There are just too many variables forced into the equation for us to be able to rely on what has happened in the past. Both the number of variables and the range of variation push the probability of prediction error past the breaking point.

When it comes to planning for the future, we become functionally paralyzed and start living day to day, waiting for the proverbial “other shoe to drop.” 

The bigger problem, however, is that when the world is going to hell in a hand basket, we don’t realize that the past is a poor foundation on which to build our future. Evolved habits die hard, and so we continue to use hindsight to try to move forward. 

And by “we,” I mean everyone — most especially the leaders we elect and the experts we rely on to point us in the right direction.  Many seem to think that a post-COVID world will snap back to be very much like a pre-COVID world.

And that, I’m afraid, may be the biggest problem. You’d think that when worrying about an uncertain future is above our pay grade, there would be someone wiser and smarter than us to rely on and save our collective asses. But if common folk tend to consistently bet on the past as a guide to our future, it’s been shown that people we think of as “experts” double down on that bet. 

A famous study by Philip Tetlock showed just how excruciatingly awful experts were at predicting the future. He assembled a group of 284 experts and got them to make predictions about future events, including those that fell into their area of expertise. Across the board, he found their track record of being correct was only slightly ahead of a random coin toss or a troupe of chimpanzees throwing darts. The more famous the expert, the worse their track record.

Expertise is rooted in experience. Both words spring from the same root: the Latin experiri, “to try.” Experience is gained in the past. For experts, their worth comes from their experience in one particular area, so they are highly unlikely to ignore it when predicting the future. They are like the hedgehog in Isaiah Berlin’s famous essay “The Hedgehog and The Fox”: they “know one big thing.”

But when it comes to predicting the future, Tetlock found it’s better to be a fox: to “know many little things.” In a complex, highly uncertain world, it’s the generalist  who thrives. 

The reason is pretty simple. In an uncertain world, we have to be more open to sense-making, in the classic cognitive meaning of the term. We have to be attuned to the signals that are playing out in real time and not be afraid to consider new information that may conflict with our current beliefs.

This is how generalists operate. It’s also how science is supposed to operate. Our view of the future should be no more than a hypothesis that we’re willing to have proven wrong. Hedgehogs dig in when their expertise about “one big thing” is questioned. Foxes use it as an opportunity to update their take on reality. 

Foxes have another advantage over hedgehogs. They tend to be dilettantes, spreading their interest over a wide range of topics without diving too deeply into any of them. This keeps their network diverse and expansive, giving them the opportunity to synthesize their sense of reality from the broadest range of signals possible. 

In a world that depends on being nimble enough to shift directions depending on the input you receive, this stacks the odds in favor of the fox.

Still, it’s against human nature to be so cavalier about our future. We like certainty. We crave predictability. We are big fans of transparent causes and effects. If those things are clouded by complexity and uncertainty, we start constructing our own narratives. Hence the current spike of conspiracy theories, as I noted previously. This is especially true when the stakes are as high as they are now. 

I don’t blame those having a very hard time looking past COVID — or any other imminent disaster. But someone should be. 

It’s time to start honing those fox instincts. 

Tired of Reality? Take 2 Full-Strength Schitt’s Creeks

“Schitt’s Creek” stormed the Emmys by winning awards in every comedy series category — a new record. It was co-creators Dan and Eugene Levy’s gift to the world: a warm bowl of hot cultural soup, brimming with life-affirming values, acceptance and big-hearted Canadian corniness.

It was the perfect entertainment solution to an imperfect time. It was good for what ails us.

It’s not the first time we’ve turned to entertainment for comfort. In fact, if there is anything as predictable as death and taxes, it’s that during times of trial, we need to be entertained.

There is a direct correlation between feel-good fantasy and feeling-shitty reality. The worse things get, the more we want to escape it.

But the ways we choose to be entertained have changed. And maybe — just maybe — the media channels we’re looking to for our entertainment are adding to the problem. 

The Immersiveness of Media

A medium’s ability to distract us from reality depends on how much it removes us from that reality.

Our media channels have historically been quite separate from the real world. Each channel offered its own opportunity to escape. But as the technology we rely on to be entertained has become more capable of doing multiple things, that escape from the real world has become more difficult.

Books, for example, require a cognitive commitment unlike any other form of entertainment. When we read a book, we — in effect — enter into a co-work partnership with the author. Our brains have to pick up where theirs left off, and we together build a fictional world to which we can escape. 

As the science of interpreting our brain’s behavior has advanced, we have discovered that our brains actually change while we read.

Maryanne Wolf explains in her book, “Proust and the Squid: The Story and Science of the Reading Brain”: “Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species. . . . Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be reshaped by experience.”

Even movies, which dramatically lowered the bar for the cognitive commitment they ask of us by supplying content specifically designed for two of our senses, still pull us out of reality by immersing us in a dedicated, single-purpose environment. The distraction of the real world is locked outside the theater doors.

But today’s entertainment media platforms not only live in the real world, they are the very same platforms we use to function in said world. They are our laptops, our tablets, our phones and our connected TVs.

It’s hard to ignore that world when the flotsam and jetsam of reality is constantly bumping into us. And that brings us to the problem of the multitasking myth.

Multitasking Anxiety

The problem is not so much that we can’t escape from the real world for a brief reprieve in a fictional one. It’s that we don’t want to.

Even if we’re watching our entertainment in our home theater room on a big screen, the odds are very good that we have a small screen in our hands at the same time. We mistakenly believe we can successfully multitask, and our mental health is paying the price for that mistake.

Research has found that trying to multitask brings on a toxic mix of social anxiety, depression, a lessening of our ability to focus attention, and a sociopsychological impairment that impacts our ability to have rewarding relationships. 

When we use the same technology to be entertained that we use to stay on top of our social networks, we fall prey to the fear of missing out.

It’s called Internet Communication Disorder, and it’s an addictive need to continually scroll through Facebook, Twitter, WhatsApp and our other social media platforms. It’s these same platforms that are feeding us a constant stream of the very things we’re looking to escape from. 

It may be that laughter is the best medicine, but the efficacy of that medicine is wholly dependent on where we get our laughs.

The ability of entertainment to smooth the jagged edges of reality depends on our being able to shift our minds off the track that leads to chronic anxiety and depression — and successfully escape into a fictional kinder, gentler, funnier world.

For entertainment to be a beneficial distraction, we first have to mentally disengage from the real world, and then fully engage in the fictional one.

That doesn’t work nearly as well when our entertainment delivery channel also happens to be the same addictive channel that is constantly tempting us to tiptoe through the anxiety-strewn landscape that is our social media feed. 

In other words, before going to “Schitt’s Creek,” unpack your other shit and leave it behind. I guarantee it will be waiting for you when you get back.

The Day My Facebook Bubble Popped

I learned this past week just how ideologically homogenous my Facebook bubble usually is. Politically, I lean left of center. Almost all the people in my bubble are the same.

Said bubble has been built from the people I have met in the past 40 years or so. Most of these people are in marketing, digital media or tech. I seldom see anything in my feed I don’t agree with — at least to some extent.

But before all that, I grew up in a small town in a very right-wing part of Alberta, Canada. Last summer, I went to my 40-year high-school reunion. Many of my fellow graduates stayed close to our hometown for those 40 years. Some are farmers. Many work in the oil and gas industry. Most of them would fall somewhere to the right of where I sit in my beliefs and political leanings.

At the reunion, we did what people do at such things — we reconnected. Which in today’s world meant we friended each other on Facebook. What I didn’t realize at the time is that I had started a sort of sociological experiment. I had poked a conservative pin into my liberal social media bubble.

Soon, I started to see posts that were definitely coming from outside my typical bubble. But most of them fell into the “we can agree to disagree” camp of political debate. My new Facebook friends and I might not see eye-to-eye on certain things, but hell — you are good people, I’m good people, we can all live together in this big ideological tent.

On May 1, 2020, things began to change. That was when Canadian Prime Minister Justin Trudeau announced that 1,500 models of “assault-style” weapons would be classified as prohibited, effective immediately. This came after Gabriel Wortman killed 22 people in Nova Scotia, making it Canada’s deadliest shooting spree. Now, suddenly posts I didn’t politically agree with were hitting a very sensitive raw nerve. Still, I kept my mouth shut. I believed arguing on Facebook was pointless.

Through everything that’s happened in the four months since (it seems like four decades), I have resisted commenting when I see posts I don’t agree with. I know how pointless it is. I realize that I am never going to change anyone’s mind through a comment on a Facebook post.

I understand this is just an expression of free speech, and we are all constitutionally entitled to exercise it. I stuck with the Facebook rule I imposed for myself — keep scrolling and bite your tongue. Don’t engage.

I broke that rule last week. One particular post did it. It asked why, with a COVID-19 survival rate of almost 100%, we needed a vaccine at all. I knew better, but I couldn’t help it.

I engaged. It was limited engagement to begin with. I posted a quick comment suggesting that with 800,000 (and counting) already gone, saving hundreds of thousands of lives might be a pretty good reason. Right or left, I couldn’t fathom anyone arguing with that.

I was wrong. Oh my God, was I wrong. My little comment unleashed a social media shit storm. Anti-vaxxing screeds, mind-control plots via China, government conspiracies to artificially over-count the death toll, and call-outs of the sheer stupidity of people wearing face masks proliferated in the comment string for the next five days. I watched the comment string grow in stunned disbelief. I had never seen this side of Facebook before.

Or had I? Perhaps the left-leaning posts I am used to are just as conspiratorial, but I don’t realize it because I happen to agree with them. I hope not, but perspective does strange things to our grasp of the things we believe to be true. Are we all — right or left — just exercising our right to free speech through a new platform? And — if we are — who am I to object?

Free speech is held up by Mark Zuckerberg and others as hallowed ground in the social-media universe. In a speech last fall at Georgetown University, Zuckerberg said: “The ability to speak freely has been central in the fight for democracy worldwide.”

It’s hard to argue with that. The ability to publicly disagree with the government or any other holder of power over you is much better than any alternative. And the drafters of the U.S. Bill of Rights agreed. Freedom of speech was enshrined in the First Amendment. But the authors of that amendment — perhaps presciently — never defined exactly what constituted free speech. Maybe they knew it would be a moving target.

Over the history of the First Amendment, it has been left to the courts to decide what the exceptions would be.

In general, the courts have tightened the definitions around one area — what types of expression constitute a “clear and present danger” to others. Currently, unless you’re specifically asking someone to break the law in the very near future, you’re protected under the First Amendment.

But is there a bigger picture here —one very specific to social media? Yes, legally in the U.S. (or Canada), you can post almost anything on Facebook.

Certainly, taking a stand against face masks and vaccines would qualify as free speech. But it’s not only the law that keeps society functioning. Most of the credit for that falls to social norms.

Social norms are the unwritten laws that govern much of our behavior. They are the “soft guard rails” of society that nudge us back on track when we veer off-course. They rely on us conforming to behaviors accepted by the majority.

If you agree with social norms, there is little nudging required. But if you happen to disagree with them, your willingness to follow them depends on how many others also disagree with them.

Famed sociologist Mark Granovetter showed in his Threshold Models of Collective Behavior that there can be tipping points in groups. If there are enough people who disagree with a social norm, it will create a cascade that can lead to a revolt against the norm.

Prior to social media, the thresholds for this type of behavior were quite high. Even if some of us were quick to act anti-socially, we were generally acting alone.

Most of us felt we needed a substantial number of like-minded people before we were willing to upend a social norm. And when our groups were determined geographically and comprised of ideologically diverse members, this was generally sufficient to keep things on track.

But your social-media feed dramatically lowers this threshold.

Suddenly, all you see are supporting posts of like-minded people. It seems that everyone agrees with you. Emboldened, you are more likely to go against social norms.
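If you want to see the mechanics of Granovetter’s idea, here is a minimal Python sketch of a threshold cascade. Everything in it – the population size, the spread of thresholds, the size of the “filtered feed” effect – is an invented assumption for illustration only; the point is simply to show how lowering individual thresholds can tip an entire group.

```python
import random

def cascade(thresholds):
    """Granovetter-style cascade: a person defies the norm once the share of
    people already defying it meets or exceeds their personal threshold."""
    n = len(thresholds)
    defying = [t <= 0 for t in thresholds]  # zero-threshold instigators start it
    changed = True
    while changed:
        changed = False
        share = sum(defying) / n
        for i, t in enumerate(thresholds):
            if not defying[i] and share >= t:
                defying[i] = True
                changed = True
    return sum(defying) / n

random.seed(1)
thresholds = [random.uniform(0.05, 0.6) for _ in range(1000)]
thresholds[:5] = [0.0] * 5  # a handful of people who will defy the norm regardless

print("share defying the norm, diverse group:", cascade(thresholds))
# A feed full of like-minded people acts roughly like lowering everyone's
# threshold; the same handful of instigators can now tip the whole group.
print("share defying the norm, filtered feed:",
      cascade([max(0.0, t - 0.3) for t in thresholds]))
```

With the original thresholds, the instigators stay isolated. Shave everyone’s threshold down – which is roughly what a feed of like-minded people does – and those same instigators set off a cascade that sweeps the whole population.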

The problem here is that social norms are generally there because they are in the best interests of the majority of the people in society. If you go against them by refusing a vaccine or refusing to wear a face mask, thereby allowing a disease to spread, you endanger others. Perhaps it doesn’t meet the legal definition of “imminent lawlessness,” but it does present a “clear and present danger.”

That’s a long explanation of why I broke my rule about arguing on Facebook.

Did I change anyone’s mind? No. But I did notice that the person who made the original post has changed their settings, so I don’t see the political ones anymore. I just see posts about grandkids and puppies.

Maybe it’s better that way.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue here. According to the Reuters study, more people in the U.S. — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Study 2020 – Sources of News in US

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014, one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters report, 72% of us (all ages, all markets) get our news through the “side door.” This means we are delivered news — primarily through social media and search — without us intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping between our fingers and we don’t seem to care. We say we want objective, non-biased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between telling our doctor what we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation. We hope there’s something in the highly addictive endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain.  Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

These are just two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

Playing Fast and Loose with the Truth

A few months ago, I was having a conversation with someone and they said something that I was pretty sure was not true. I don’t know if it was a deliberate lie. It may have just been that this particular person was uninformed. But they said it with the full confidence that what they said was true. I pushed back a little and they instantly defended their position.

My first instinct was just to let it go. I typically don’t go out of my way to cause friction in social settings. Besides, it was an inconsequential thing. I didn’t really care about it. But I was feeling a little pissy at the time, so I fact-checked them by looking it up on my phone. And I was right. They had stated something that wasn’t true and then doubled down on it.

Like I said, it was inconsequential – a trivial conversation point. But what if it wasn’t? What if there was a lot riding on whether or not what they said was true? What if this person was in a position of power, like – oh, I don’t know – the President of the United States?

The role of truth in our social environment is currently in flux. I cannot remember a time when we have been more suspicious of what we see, read and hear on a daily basis. As I mentioned a couple of weeks ago, less than 40% of us trust what we hear on the news. And when that news comes through our social media feed, the level of distrust jumps to a staggering 80%.

Catching someone in a lie has significant social and cognitive implications. We humans like to start from a default position of trust. If we can do that, it eliminates a lot of social friction and cognitive effort. We only go to not trusting when we have to protect ourselves.

Our proclivity for trust is what has made global commerce and human advancement possible. But, unfortunately, it does leave us vulnerable. Collectively, we usually play by the same playbook I was initially going to use in my opening example. It’s just easier to go along with what people say, even if we may doubt that it’s true. This is especially so if the untruth is delivered with confidence. We humans love confidence in others because it means we don’t have to work as hard. Confidence is a signal we use to decide to trust, and trust is always easier than distrust. The more confident the delivery, the less likely we are to question it.

It’s this natural human tendency that put the “con” in “con artist.” “Con” is short for confidence, and it originates with an individual named William Thompson, who plied the streets of New York in the 1840s. He would walk up to a total stranger who was obviously well off and greet them like a long-lost friend. After a few minutes of friendly conversation during which the target would be desperately trying to place this individual, Thompson would ask for the loan of something of value. He would then set his hook with this: “Do you have confidence in me to loan me this [item] til tomorrow?” The success of this scam was totally dependent on an imbalance of confidence: extreme confidence on the part of the con artist and a lack of confidence on the part of the target.

It is ironic that in an era where it’s easier than ever to fact check, we are seeing increasing disregard for the truth. According to the Washington Post, Donald Trump passed a misinformation milestone on July 9, making 20,000 false or misleading claims since he became President. He surged past that particular post when he lied 62 times on that day alone. I don’t even think I talk 62 times per day.

This habit of playing fast and loose with the truth is not Trump’s alone. Unfortunately, egregious lying has been normalized in today’s world. We have now entered an era where full time fact checking is necessary. On July 7, NY Times columnist Thomas Friedman said we need a Biden-Trump debate, but only on two conditions: First, only if Trump releases his tax returns, and second, only if there is a non-partisan real-time fact-checking team keeping the debaters accountable.

We have accepted this as the new normal. But we shouldn’t. There is an unacceptable cost we’re paying by doing so. And that cost becomes apparent when we think about the consequence of lying on a personal basis.

If we catch an acquaintance in a deliberate lie, we put them in the untrustworthy column. We are forced into a default position of suspicion whenever we deal with them in the future. This puts a huge cognitive load on us. As I said before, it takes much more effort to not trust someone. It makes it exponentially harder to do business with them. It makes it more difficult to enjoy their company. It introduces friction into our relationship with them.

Even if the lie is not deliberate but stated with confidence, we label them as uninformed. Again, we trust them less.

Now multiply this effort by everyone. You quickly see where the model breaks down. Lying may give the liar a temporary advantage, but it’s akin to a self-limiting predator-prey model. If it went unchecked, soon the liars would only have other liars to deal with. It’s just not sustainable.
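To make that analogy concrete, here is a toy, discrete-time sketch in Python. The starting shares and the rates are pure assumptions, not data; it only illustrates the self-limiting dynamic described above, where lying pays only while there are trusting people left to exploit.

```python
# Toy model: "trusting" people can be exploited; some who are burned turn wary,
# and some are tempted into lying themselves when lying visibly pays.
trusting, liars = 0.95, 0.05  # shares of the population (the remainder are wary)
for step in range(12):
    payoff_to_lying = trusting              # lies only land on the trusting
    burned = 0.2 * trusting * liars         # some of the trusting turn wary after being burned
    converts = 0.5 * trusting * liars       # visible success tempts others into lying
    trusting = max(0.0, trusting - burned - converts)
    liars = min(1.0, liars + converts)
    print(f"step {step:2d}  trusting={trusting:.2f}  liars={liars:.2f}  "
          f"payoff to lying={payoff_to_lying:.2f}")
# The payoff to lying collapses as the pool of trust is drained; the strategy
# eats the very thing it feeds on, which is the sense in which it is self-limiting.
```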

Truth exists for a reason. It’s the best social strategy for the long term. We should fight harder for it.

How We Forage for the News We Want

The Reuters Institute in the UK has just released a comprehensive study looking at how people around the world are finding their news. There is a lot here, so I’ll break it into pieces over a few columns and look at the most interesting aspects. Today, I’ll look at the 50,000-foot view, which can best be summarized as a dysfunctional relationship between our news sources and ourselves. And like most dysfunctional relationships, the culprit here is a lack of trust.

Before we dive in, we should spend some time looking at how the way we access news has changed over the last several years.

Over my lifetime, we have trended in two general directions – less cognitively demanding news channels and less destination-specific news sources. The most obvious shift has been away from print. According to Journalism.org and the Pew Research Center, circulation of U.S. daily newspapers peaked around 1990, at about 62 and a half million. That’s one subscription for every four people in the country at that time.

In 2018, it was projected that circulation had dropped more than 50%, to less than 30 million. That would have been one subscription for every 10 people. We were no longer reading our news on paper. And that may have a significant impact on our understanding of the news. I’ll return to this in another column, but for now, let’s just understand that our brain operates in a significantly different way when it’s reading rather than watching or listening.

Up to the end of the last century, we generally trusted news destinations. Whether it be a daily newspaper like the New York Times, a news magazine like Time or a nightly newscast such as any of the network news shows, each was a destination that offered one thing above all others – the news. And whether you agreed with them or not, each had an editorial process that governed what news was shared. We had a loyalty to our chosen news destinations that was built on trust.

Over the past two decades, this trust has broken down due to one primary factor – our continuing use of social media. And that has dramatically shifted how we get our news.

In the US, three out of every four people use online sources to get their news. One in two use social media. Those aged 18 to 24 are more than twice as likely to rely on social media. In the UK, under-35s get more of their news from social media than from any other source.

Also, influencers have become a source of news, particularly amongst young people. In the US, a quarter of those 18 to 24 used Instagram as a source of news about COVID.

This means that most times, we’re getting our news through a social media lens. Let’s set aside for a moment the filtering and information veracity problems that introduces. Let’s just talk about intent for a moment.

I have talked extensively in the past about information foraging when it comes to search. When information is “patchy” and spread diversely, the brain has to make a quick, calculated guess about which patch is most likely to hold the information it’s looking for. With information foraging, the intent we have frames everything that comes after.
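As a rough sketch of that patch calculation, here are a few lines of Python. The patches and the numbers attached to them are entirely hypothetical; the mechanic is simply the expected value of the information divided by the cost of going to get it and digesting it.

```python
# Hypothetical information "patches" with made-up values: how useful the
# information there is likely to be, and how long it takes to get and digest it.
patches = {
    "news site":   {"expected_value": 8.0, "cost_seconds": 120},
    "search":      {"expected_value": 6.0, "cost_seconds": 45},
    "social feed": {"expected_value": 2.5, "cost_seconds": 10},
}

def rate_of_gain(patch):
    # The forager's quick mental math: expected value per second of effort.
    return patch["expected_value"] / patch["cost_seconds"]

for name, patch in patches.items():
    print(f"{name:12s} expected gain per second: {rate_of_gain(patch):.3f}")

print("patch chosen:", max(patches, key=lambda name: rate_of_gain(patches[name])))
# With numbers like these, the low-value but nearly free feed wins, which is one
# way to read why so much of our news now arrives through the "side door."
```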

In today’s digital world, information sources have disaggregated into profoundly patchy environments. We still go to news-first destinations like CNN or Fox News but we also get much of our information about the world through our social media feeds. What was interesting about the Reuters report was that it was started before the COVID pandemic, but the second part of the study was conducted during COVID. And it highlights a fascinating truth about our relationship with the news when it comes to trust.

The study shows that the majority of us don’t trust the news we get through social media, but most times, we’re okay with that. Less than 40% of people trust the news in general, and even when we pick a source, less than half of us trust that particular channel. Only 22% indicated they trust the news they see in social media. Yet half of us admit we use social media to get our news. The younger we are, the more reliant we are on social media for news. The fastest-growing sources for news amongst all age groups – but especially those under 30 – are Instagram, Snapchat and WhatsApp.

Here’s another troubling fact that fell out of the study. Social platforms, especially Instagram and Snapchat, are dominated by influencers. That means that much of our news comes to us by way of a celebrity influencer reposting it on their feed. This is a far cry from the editorial review process that used to act as a gatekeeper on our trusted news sources.

So why do we continue to use news sources we admit we don’t trust? I suspect it may have to do with something called the Meaning Maintenance Model. Proposed in 2006 by Heine, Proulx and Vohs, the model speculates that a primary driver for us is to maintain our beliefs in how the world works. This is related to the sense making loop (Klein, Moon and Hoffman) I’ve also talked about in the past. We make sense of the world by first starting with the existing frame of what we believe to be true. If what we’re experiencing is significantly different from what we believe, we will update our frame to align with the new evidence.

What the Meaning Maintenance Model suggests is that we will go to great lengths to avoid updating our frame. It’s much easier just to find supposed evidence that supports our current beliefs. So, if our intent is to get news that supports our existing world view, social media is the perfect source. It’s algorithmically filtered to match our current frame. Even if we believe the information is suspect, it still comforts us to have our beliefs confirmed. This works well for news about politics, societal concerns and other ideologically polarized topics.

We don’t like to admit this is the case. According to the Reuters study, 60% of us indicate we want news sources that are objective and not biased to any particular point of view. But this doesn’t jibe with reality at all. As I wrote about in a previous column, almost all mainstream news sources in the US appear to have a significant bias to the right or left. If we’re talking about news that comes through social media channels, that bias is doubled down on. In practice, we are quite happy foraging from news sources that are biased, as long as that bias matches our own.

But then something like COVID comes along. Suddenly, we all have skin in the game in a very real and immediate way. Our information foraging intent changes and our minimum threshold for the reliability of our news sources goes way up. The Reuters study found that when it comes to sourcing COVID information, the most trusted sources are official sites of health and scientific organizations. The least trusted sources are random strangers, social media and messaging apps.

It requires some reading between the lines, but the Reuters study paints a troubling picture of the state of journalism and our relationship with it. Where we get our information directly impacts what we believe. And what we believe determines what we do.

These are high stakes in an all-in game of survival.

Are We Killing Politeness?

One of the many casualties of our changing culture seems to be politeness. When the President of the United States is the poster child for rude behavior, it’s tough for politeness to survive. This is especially true in the no-holds-barred, digitally distanced world of social media.

I consider myself to be reasonably polite. Being so, I also expect this in others. Mild rudeness makes me anxious. Excessive rudeness makes me angry. This being the case, I am troubled by the apparent decline of civility. So today I wanted to take a look at politeness and why it might be slipping away from us.

First of all, we have to understand that politeness is not universal. What is considered polite in one culture is not in another.

Secondly, being polite is not the same as being friendly. Or empathetic. Or respectful of others. Or compassionate, according to this post from The Conversation. There is a question of degree and intent here. Being polite is a rather unique behavior that encompasses both desirable and less desirable qualities. And that raises the question: What is the purpose of politeness? Is a less-polite world a good or a bad thing?

First, let’s look at the origin of the word. It comes from the Latin “politus,” meaning “polished — made smooth.” Just in case you’re wondering, “politics” does not come from the same root. That comes from the Greek word for “citizen” — “polites.”

One last etymological nugget. The closest comparison to polite may be “nice,” which originates from the Latin “nescius,” meaning “ignorant”. Take that for what it’s worth.

This idea of politeness as a type of social “polish” really comes from Europe — and especially Britain. There, politeness was linked with class hierarchies. Being polite was a sign of good breeding — a dividing line between the high-born and the riffraff. This class-bound definition came along with the transference of the concept to North America.

Canada is typically considered one of the most polite nations in the world. As a Canadian who has traveled a fair amount, I would say that’s probably true.

But again, there are variations in the concept of politeness and how it applies to both Canadians and Americans.

When we consider the British definition of politeness, you begin to see how Americans and Canadians might respond differently to it. To understand that is to understand much of what makes up our respective characters.

As a Canadian doing much of my business in the U.S. for many years, I was always struck by the difference in approaches I found north and south of the 49th parallel. Canadian businesses we met with were unfailingly polite, but seldom bought anything. Negotiating the prospect path in Canada was a long and often frustrating journey.

American businesses were much more likely to sign a contract. On the whole, I would also say they were friendlier in a more open and less-guarded way. I have to admit that in a business setting, I preferred the American approach.

According to anthropologists Penelope Brown and Stephen Levinson, who have extensively researched politeness, there are two kinds of politeness: negative and positive. Negative politeness is concerned with adhering to social norms, often by deferring to someone or something else.

This is Canadian politeness personified. Our entire history is one of deference to greater powers, first to our colonial masters — the British and French — and, more recently, from our proximity to the cultural and economic master that is the U.S.

For Canadians, deferral is survival. As former Prime Minister Pierre Trudeau once said about the U.S., “Living next to you is in some ways like sleeping with an elephant. No matter how friendly and even-tempered is the beast, if I can call it that, one is affected by every twitch and grunt.”

Negative politeness is a way to smooth out social friction, but there is good and bad here. Ideally it should establish a baseline of trust, respect and social capital. But ultimately, politeness is a consensus of compromise.  And that’s why Canadians are so good at it.  Negative politeness wants everything to be fair.

But then there is positive politeness, which is more American in tone and nature. This is a desire to help others, making it more closely linked to compassion. But in this noble motive there is also a unilateral defining of what is right and wrong. Positive politeness tries to make everything right, based on the protagonist’s definition of what that is.

The two sides of politeness actually come from different parts of the brain. Negative politeness comes from the part of the brain that governs aggression. It is all about applying brakes to our natural instincts. But positive politeness comes from the part of the brain that regulates social bonding and affiliation.

When you understand this, you understand the difference between Canadians and Americans in what we consider polite. For the former, our definition comes handed down from the British class-linked origins, and has morphed into a culture of compromise and deferral.

The American definition comes from many generations of being the de facto moral leaders of the free world.

We (Canadians) want to be nice. You (Americans) want to be right. The two are not mutually exclusive, but they are also not the same thing. Not by a long shot.

What Trump has done (with a certain kind of perverse genius) is play on this national baseline of compassion. He has wantonly discarded any vestiges of politeness and split the nation on what it means to be right.

But by eliminating politeness, he has also eliminated that governor of our behavior. Reactions about what is right and wrong are now immediate, rough and unfiltered.

The polish that politeness brings — that deferral of spoken judgement for even a brief moment in order to foster cooperation — is gone. We have no opportunity to consider other perspectives. We have no motive to cooperate. This is abundantly apparent on every social media platform.

In game theory, politeness is a highly successful strategy commonly called “tit for tat.” It starts from assuming a default position of fairness from the other party, continuing to cooperate if this proves to be true, and escalating to retaliation if it’s not. But this tactic evolved in a world of face-to-face encounters. Somehow, it seems less needed in a divided world where rudeness and immediate judgement are the norm.
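For the curious, here is a bare-bones Python sketch of tit for tat in an iterated prisoner’s dilemma, using the standard payoffs. The “always rude” opponent is my own stand-in for the norm described above; none of this comes from the column itself, it just makes the game-theory point concrete.

```python
# Standard prisoner's dilemma payoffs: (my points, their points) per round.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_history, their_history):
    # Start polite, then mirror whatever the other side did last round.
    return "C" if not their_history else their_history[-1]

def always_rude(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=20):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print("polite vs polite:", play(tit_for_tat, tit_for_tat))  # (60, 60)
print("polite vs rude:  ", play(tit_for_tat, always_rude))  # (19, 24)
print("rude vs rude:    ", play(always_rude, always_rude))  # (20, 20)
```

Mutual politeness is the best outcome for both sides; tit for tat gives up a little the first time it meets rudeness and then stops being exploited. That is the cooperative logic the strategy evolved for, and the logic that gets lost when every encounter starts from rudeness.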

Still, I will cling to my notions of politeness. Yes, sometimes it seems to get in the way of definitive action. But on the whole, I would rather live in a world that’s a little nicer and a little more polite, even if that seems foolish to some of you.

The Potential Woes of Working from Home

Many of you have now had a few months under your belt working virtually from home rather than going to the office. At least some of you are probably considering continuing to do so even after COVID recedes and the all clear is given to return to normal. A virtual workplace makes all kinds of rational sense – both for employees and employers. But there are irrational reasons why you might want to think twice before you fully embrace going virtual.

About a decade ago, my company went with a hybrid virtual/physical workplace. As the CEO, I found a lot to like about it. It was a lot more economical than leasing more office space. It gave us the flexibility to recruit top talent in areas where we had no physical presence. And it seemed that technology was up to the task of providing the communication and work-flow tools we needed to support our virtual members.

On the whole, our virtual employees also seemed to like it. It gave them more flexibility in their workday. It also made it less formal. If you wanted to work in pajamas and bunny slippers, so be it. And with a customer base spread across many time zones, it also made it easier to shift client calls to times that were mutually acceptable.

It seemed to be a win-win. For a while. Then we noticed that all was not wonderful in work-from-home land.

I can’t say productivity declined. We were always a results-based workplace, so as long as the work got done, we were happy. But we started to feel a shift in our previously strong corporate culture. We saw team-member complaints about seemingly minor things skyrocket. We found less cohesion across teams. Finally – and most critically – it started to impact our relationships with our customers.

Right about the time all this was happening, we were acquired by a much bigger company. One of the dictates that was handed down from the new owners was that we establish physical offices and bring our virtual employees back to the mothership for the majority of their work-week. At the time, I wasn’t fully aware of the negative consequences of going virtual so I initially fought the decision. But to be honest, I was secretly happy. I knew something wasn’t quite right. I just wasn’t sure what it was. I suspected it might have been our new virtual team members.

The move back to a physical workplace was a tough one. Our virtual team members were very vocal about how this was a loss of their personal freedom. New HR fires were erupting daily and I spent much of my time fighting them. This, combined with the inevitable cultural consequences of being acquired, often made me shake my head in bewilderment. Life in our company was turning into a shit-show.

I wish I could say that after we all returned to the same workplace, we joined hands and sang a rousing chorus of Kumbaya. We didn’t. The damage had been done. Many of the disgruntled former virtual team members ended up moving on. The cultural core of the company remained with our original team members who had worked in the same office location for several years. I eventually completed my contract and went my own way.

I never fully determined what the culprit was. Was it our virtual team members? Or was it the fact that we embraced a virtual workplace without considering the unintended consequences? I suspected it was a little of both.

Like I said, that was a decade ago. From a rational perspective, all the benefits of a virtual workplace seem even more enticing than they did then. But in the last 10 years, there has been research done on those irrational factors that can lead to the cracks in a corporate culture that we experienced.

Mahdi Roghanizad is an organizational behavior specialist from Ryerson University in Toronto. He has long looked at the limitations of computerized communication. And his research provides a little more clarity into our failed experiment with a virtual workplace.

Roghanizad has found that without real-life contact, the parts of our brain that provide us with the connections needed to build trust never turn on. In order to build a true relationship with another person, we need something called the Theory of Mind. According to Wikipedia, “Theory of mind is necessary to understand that others have beliefs, desires, intentions, and perspectives that are different from one’s own.”

But unless we’re physically face-to-face with another person, our brain doesn’t engage in this critical activity. “Eye contact is required to activate that theory of mind and when the eye contact is not there, the whole other signal information is not processed by our brain,” said Roghanizad. Even wearing a pair of sunglasses is enough to short circuit the process. Relegating contact to a periodic Zoom call guarantees that this empathetic part of our brains will never kick in.

But it’s not just being eye-ball to eye-ball. There are other non-verbal cues we rely on to connect with other people and create a Theory of Mind. Other research has shown the importance of pheromones and physical gestures like crossing your arms and leaning forward or back. This is why we subconsciously start to physically imitate people we’re talking to. The stronger the connection with someone, the more we imitate them.

This all comes back to the importance of bandwidth in the real world. A digital connection cannot possibly incorporate all the nuance of a face-to-face connection. And whether we realize it or not, we rely on that bandwidth to understand other people. From that understanding comes the foundations of trusted relationships. And trusted relationships are the difference between a high-functioning work team and a dysfunctional one.

I wish I had known that ten years ago.

Crisis? What Crisis?

You would think that a global pandemic would hold our attention for a while.

Nope.

We’re tired of it. We’re moving on.  We’re off to the next thing.

Granted, in this case the next thing deserves to be focused on. It is abysmal that it still exists. So it should be focused on – probably for the rest of our lives and beyond, until it ceases to be a thing. But it won’t be. Soon we’ll be talking about something else.

And that’s the point of this post – our collective inability to remain focused on anything without being distracted by the next breaking story in our news feed. How did we come to this?

I blame memes.

To a certain extent, our culture is the product of who we are, and who we are is a product of our culture. Each is shaped by the other, going forward in a constantly improvised pas de deux. Humans create the medium – which then becomes part of the environment we adapt to.

Books and the printed word changed who we were for over five centuries. Cinema has been helping to define us for about 125 years. And radio and television have been moulding us for the past century. Our creations have helped create who we are.

This has never been truer than with social media. Unlike other media which took discrete chunks of our time and attention, social media is ubiquitous and pervasive. According to a recent survey, we spend on average 2 hours and 23 minutes per day on social media. That is about 13% of our waking hours.  Social media has become intertwined with our lives to the point that we had to start qualifying what happens where with labels like “IRL” (In Real Life).

There is another difference between social media and what has come before it. Almost every previous entertainment medium that has demanded our attention has been built on the foundation of a long form narrative arc. Interacting with each medium has been a process – a commitment to invest a certain amount of time to go on a journey with the storyteller. The construction of a story depends on patterns that are instantly recognized by us. Once we identify them, we are invested in discovering the outcome. We understand that our part of the bargain is to exchange our time and attention. The payoff is the joy that comes from us making sense of a new world or situation, even if it is imaginary.  

But social media depends on a different exchange. Rather than tapping into our inherent love of the structure of a story, it depends on something called variable intermittent rewards. Essentially, it’s the same hook that casinos use to keep people at a slot machine or table. Not only is it highly addictive, it also pushes us to continually scroll to the next thing. It completely bypasses the thinking part of our brains and connects directly to the reward center buried in our limbic system. Rather than ask for our time and attention, social media dangles a never-ending array of bright, shiny memes that asks nothing from us: no thinking, almost no attention and a few seconds of our time at most. For a lazy brain, this is the bargain of a lifetime.

It’s probably not a coincidence that the media most dependent on advertising are also the media that avoid locking our attention on a single topic for an extended period. This makes social media the perfect match for interruptive ad forms. They are simply slotted into the never-ending scroll of memes.

Social media has only been around for a little over 2 decades. It has been a significant part of our lives for half that time. If even a little bit of what I suspect is happening is indeed taking place, that scares the hell out of me. It would mean that no other medium has changed us so much and so quickly.

That is something worth paying attention to.