Lockdown Advice For A Long Winter

No matter where you live in the world, it’s likely you’re going to be forced to spend a lot of time at home. And if that home includes others — like your beloved life partner — living under the same roof, you may experience a little friction now and again. In anticipation of this, I thought I’d share a few insights on what might come.

There is No Gender Equality with COVID

A recent study by Oxford, Cambridge and Zurich Universities found that women’s sense of mental well-being took a more significant drop than men’s due to COVID. The researchers speculated on a number of reasons for this, but were unable to narrow it down to any identifiable factor. Perhaps, they reasoned, it had something to do with women losing jobs at a greater rate than men, taking on a greater share of the burden of home schooling — or the fact that even when both men and women were home all the time, women still did more than their fair share of domestic chores. But no: even when the researchers controlled for these factors, the question of why women were becoming more distressed than men remained unanswered.

Maybe it was something else.

Warriors and Worriers: Two Approaches to Survival

In 2014, psychologist Joyce Benenson published her book “Warriors and Worriers: The Survival of the Sexes.” Taking an evolutionary approach, she has spent years studying children and primates, looking for innate rather than socialized differences between the sexes.

Her findings turned conventional wisdom on its head. Women may not be more sociable than men, and men may not be more competitive than women. It’s just that they define those things differently. 

Men are quite comfortable forming packs of convenience to solve a particular problem, whether it is defending against an enemy or winning a pick-up basketball game. This could explain why team sports entertainment always seems to have a male bias.

Women, on the other hand, have fewer but much more complex relationships that they deem essential to their survival as the primary caregiver for their family. The following is from the abstract of a 1990 study by Benenson: “Although males and females did not differ in the number of best friends they reported, males were found to have larger social networks than females. Further, for males, position in a social network was more highly linked with acceptance by the peer group. Finally, males were concerned with attributes that could be construed as important for status in the peer group, and females were concerned with attributes that appeared essential to relationships with a few friends.”

If we apply this to the current COVID situation, we begin to see why women might be struggling more with lockdown than men. A male’s idea of socializing might be more easily met with a Zoom call or another type of digital connection, such as online gaming. But connecting in these ways lacks the bandwidth necessary to properly convey the complexity of a female relationship.

Introverts and Extroverts Revisited

Of course, gender isn’t the only variable at play here. I’ve written before about what happens when an extrovert and an introvert are locked down in the same house together (those being my wife and myself). One of the things I’ve noticed is that we have different levels of comfort with being left alone with our thoughts.

Because I have always been a writer of one kind or another, I require time to ruminate on a fairly frequent basis. I am a little (lot?) dictatorial in my requirements for this: my environment needs to be silent and free from interruption. When the weather is good, this is fairly easy: I grab my laptop and head outside. But in the winter, it’s a different story. My wife is subjected to forced silence so I can have my quiet time.

My wife functions best when there is some type of sensory stimuli, especially the sound of voices. She doesn’t have the same need to sit in silence and be alone with her thoughts. 

And she’s not unique in that. A 2014 study found that most of us fall into the same category. In fact, the researchers found that “many of the people studied, particularly the men, chose to give themselves a mild electric shock rather than be deprived of external sensory stimuli.”

A Difference in Distraction

When we do look for distraction, we can also have different needs and approaches. Another area I’ve touched on in a past post is how our entertainment delivery platforms have now become entangled with multitasking. 

I like an immersive, interruption-free entertainment experience. The bigger the screen and the louder the sound, the better. I suspect this may be another “male” thing. Again, this preference tends to set a dictatorial tone in our household, so I generally retreat to my media cave in the basement. I also tuck my phone away while I’m watching.

My wife prefers to multiscreen when watching TV and to do so in the highest traffic area of our house. For her, staying connected is more important than being immersed in whatever she might be watching. 

These differences in our entertainment preferences often mean we’re not together when we seek distraction.

I don’t think this is a bad thing. In a normal world filled with normal activities, this balancing of personal preference is probably accommodated by our normal routines. But in a decidedly abnormal world where we spend every minute together in the same house, these differences become more noticeable.

Try a Little Friluftsliv

In the end, winter is going to be long, lonely and cold for many of us. So we may just want to borrow a strategy from Norwegians: friluftsliv. Basically, it means “open-air living.” Most winters, my main activity is complaining. But this year, I’m going to get away from the screens and social media, strap on a pair of snowshoes and embrace winter.

Amazon Prime: Buy Today, Pay Tomorrow?

This column goes live on the most eagerly anticipated day of the year. My neighbor, who has a never-ending parade of delivery vans stopping in front of her door, has it circled on her calendar. At least one of my daughters has been planning for it for several months. Even I, who tend to take a curmudgeonly view of many celebrations, have a soft spot in my heart for this particular one.

No, it’s not the day after Canadian Thanksgiving. This, my friends, is Amazon Prime Day!

Today, in our COVID-clouded reality, the day will likely hit a new peak of “Prime-ness.” Housebound and tired of being bludgeoned to death by WTF news headlines, we will undoubtedly treat ourselves to an unprecedented orgy of one-click shopping. And who can blame us? We can’t go to Disneyland, so leave me alone and let me order that smart home toilet plunger and the matching set of Fawlty Towers tea towels that I’ve been eyeing.

Of course, me being me, I do think about the consequences of Amazon’s rise to retail dominance. 

I think we’re at a watershed moment in our retail behaviors, and this moment has been driven forward precipitously by the current pandemic. Being locked down has forced many of us to make Amazon our default destination for buying. Speaking solely as a sample of one, I now check Amazon first and then use that as my baseline for comparison shopping. But I do so for purely selfish reasons – buying stuff on Amazon is as convenient as hell!

I don’t think I’m alone. We do seem to love us some Amazon. In a 2018 survey conducted by Recode, respondents said that Amazon had the most positive impact on society out of any major tech company. And that was pre-pandemic. I suspect this halo effect has only increased since Amazon has become the consumer lifeline for a world forced to stay at home.

As I give in to the siren call of Bezos and Co., I wonder what forces I might be unleashing. What unintended consequences might come home to roost in years hence? Here are a few possibilities.

The Corporate Conundrum

First of all, let’s not kid ourselves. Amazon is a for-profit corporation. It has shareholders that demand results. The biggest of those shareholders is Jeff Bezos, who is the world’s richest man. 

But amazingly, not all of Amazon’s shareholders are focused on the quarterly financials. Many of them – with an eye to the long game – are demanding that Amazon adopt a more ethical balance sheet. At the 2019 Annual Shareholder Meeting, a list of 12 resolutions was brought forward to be voted on. The recommendations included zero tolerance for sexual harassment and hate speech, curbing Amazon’s facial recognition technology, and addressing climate change and Amazon’s own environmental impact. These last two were supported by a letter signed by 7,600 of Amazon’s own employees.

The result? Amazon strenuously fought every one of them and none were adopted. So, before we get all warm and gooey about how wonderful Amazon is, let’s remember that the people running the joint have made it very clear that they will absolutely put profit before ethics. 

A Dagger in the Heart of Our Communities

For hundreds of years, we have been building a supply chain that was bound by the realities of geography. That supply chain required some type of physical presence within a stone’s throw of where we live. Amazon has broken that chain, and we are beginning to feel the impact.

Community shopping districts around the world were being gutted by the “Amazon Effect” even before COVID. In the last six months, that dangerous trend has accelerated exponentially. In a 2018 commentary for CNBC, venture capitalist Alan Patricof worried about the social impact of losing our community gathering spots: “This decline has brought a deterioration in places where people congregated, socialized, made friends and were greeted by a friendly face offering an intangible element of belonging to a community.”

The social glue that held us together has been dissolving over the past two decades. Whether you’re a fan of shopping malls or not (I fall into the “not” category), they were at least a common space where you might run into your neighbor. In his 2000 book “Bowling Alone,” Harvard political scientist Robert Putnam documented the erosion of social capital in America. Twenty years on, Putnam’s worst-case scenario seems quaintly optimistic. With the loss of our common ground – in the most literal sense – we increasingly retreat to the echo chambers of social media.

Frictionless Consumerism

This last point is perhaps the most worrying. Amazon has made it stupid simple to buy stuff. They have relentlessly squeezed every last bit of friction out of the path to purchase. That worries me greatly.

If we could rely on a rational marketplace filled with buyers acting in the best homo economicus tradition, then perhaps I could rest easier, knowing that there was some type of intelligence driving Adam Smith’s Invisible Hand. But experience has shown that is not the case. Rampant consumerism appears to be one of the three horsemen of the modern apocalypse. And, if that is true, then Amazon has put us squarely in their path.

This is not to even mention things like Amazon’s emerging monopoly-like dominance in a formerly competitive marketplace, the relentless downward pressure it exerts on wages within its supply chain, the evaporation of jobs outside its supply chain or the privacy considerations of Alexa. 

Still, enjoy your Amazon Prime Day. I’m sure everything will be fine.

How to Look Past the Nearest Crisis

I was talking to someone the other day who was trying to make plans for 2021. Those plans were dependent on the plans of others. In the course of our conversation, she said something interesting: “It’s so hard to plan because most of the people I’m talking to can’t see past COVID.” 

If anything sums up our current reality, it might be that. We’re all having a lot of trouble seeing past COVID. Or the upcoming U.S. election. Or catastrophic weather events. Or an impending economic crisis. Take your pick. There are so many looming storm clouds on the horizon that it’s difficult to even make out that horizon any more. 

We humans are pretty dependent on the past to tell us what may be happening in the future. We evolved in an environment that — thanks to its stability — was reasonably predictable. In evolutionary survival terms, it was smart to hedge our bets on the future by glancing over our shoulders at the past. If a saber-toothed tiger was likely to eat you yesterday, the odds were very much in favor of it also wanting to eat you tomorrow. 

But our ability to predict things gets thrown for a loop in the face of uncertainty like we’re currently processing. There are just too many variables forced into the equation for us to be able to rely on what has happened in the past. Both the number of variables and the range of variation push our probability of prediction error past the breaking point.

When it comes to planning for the future, we become functionally paralyzed and start living day to day, waiting for the proverbial “other shoe to drop.” 

The bigger problem, however, is that when the world is going to hell in a handbasket, we don’t realize that the past is a poor foundation on which to build our future. Evolved habits die hard, and so we continue to use hindsight to try to move forward.

And by “we,” I mean everyone — most especially the leaders we elect and the experts we rely on to point us in the right direction. Many seem to think that a post-COVID world will snap back to look very much like the pre-COVID one.

And that, I’m afraid, may be the biggest problem. You’d think that when worrying about an uncertain future is above our pay grade, there would be someone wiser and smarter than us to rely on and save our collective asses. But if common folk tend to consistently bet on the past as a guide to our future, it’s been shown that people we think of as “experts” double down on that bet. 

A famous study by Philip Tetlock showed just how excruciatingly awful experts were at predicting the future. He assembled a group of 284 experts and got them to make predictions about future events, including those that fell into their area of expertise. Across the board, he found their track record of being correct was only slightly ahead of a random coin toss or a troupe of chimpanzees throwing darts. The more famous the expert, the worse their track record.

Expertise is rooted in experience. Both words spring from the same root: the Latin experiri, “to try.” Experience is gained in the past. For experts, their worth comes from their experience in one particular area, so they are highly unlikely to ignore it when predicting the future. They are like the hedgehog in Isaiah Berlin’s famous essay “The Hedgehog and the Fox”: they “know one big thing.”

But when it comes to predicting the future, Tetlock found it’s better to be a fox: to “know many little things.” In a complex, highly uncertain world, it’s the generalist who thrives.

The reason is pretty simple. In an uncertain world, we have to be more open to sensemaking, in the classic cognitive sense of the term. We have to be attuned to the signals playing out in real time and not be afraid to consider new information that may conflict with our current beliefs.

This is how generalists operate. It’s also how science is supposed to operate. Our view of the future should be no more than a hypothesis that we’re willing to have proven wrong. Hedgehogs dig in when their expertise about “one big thing” is questioned. Foxes use it as an opportunity to update their take on reality. 

Foxes have another advantage over hedgehogs. They tend to be dilettantes, spreading their interest over a wide range of topics without diving too deeply into any of them. This keeps their network diverse and expansive, giving them the opportunity to synthesize their sense of reality from the broadest range of signals possible. 

In a world that depends on being nimble enough to shift directions depending on the input you receive, this stacks the odds in favor of the fox.

Still, it’s against human nature to be so cavalier about our future. We like certainty. We crave predictability. We are big fans of transparent causes and effects. If those things are clouded by complexity and uncertainty, we start constructing our own narratives. Hence the current spike of conspiracy theories, as I noted previously. This is especially true when the stakes are as high as they are now. 

I don’t blame those having a very hard time looking past COVID — or any other imminent disaster. But someone should be. 

It’s time to start honing those fox instincts. 

Tired of Reality? Take 2 Full-Strength Schitt’s Creeks

“Schitt’s Creek” stormed the Emmys by winning awards in every comedy series category — a new record. It was co-creators Dan and Eugene Levy’s gift to the world: a warm bowl of cultural soup, brimming with life-affirming values, acceptance and big-hearted Canadian corniness.

It was the perfect entertainment solution to an imperfect time. It was good for what ails us.

It’s not the first time we’ve turned to entertainment for comfort. In fact, if there is anything as predictable as death and taxes, it’s that during times of trial, we need to be entertained.

There is a direct correlation between feel-good fantasy and feeling-shitty reality. The worse things get, the more we want to escape it.

But the ways we choose to be entertained have changed. And maybe — just maybe — the media channels we’re looking to for our entertainment are adding to the problem. 

The Immersiveness of Media

A medium’s ability to distract us from reality depends on how much it removes us from that reality.

Our media channels have historically been quite separate from the real world. Each channel offered its own opportunity to escape. But as the technology we rely on to be entertained has become more capable of doing multiple things, that escape from the real world has become more difficult.

Books, for example, require a cognitive commitment unlike any other form of entertainment. When we read a book, we — in effect — enter into a working partnership with the author. Our brains have to pick up where theirs left off, and together we build a fictional world to which we can escape.

As the science of interpreting our brain’s behavior has advanced, we have discovered that our brains actually change while we read.

Maryanne Wolf explains in her book, “Proust and the Squid: The Story and Science of the Reading Brain”: “Human beings invented reading only a few thousand years ago. And with this invention, we rearranged the very organization of our brain, which in turn expanded the ways we were able to think, which altered the intellectual evolution of our species. . . . Our ancestors’ invention could come about only because of the human brain’s extraordinary ability to make new connections among its existing structures, a process made possible by the brain’s ability to be reshaped by experience.”

Even movies, which dramatically lowered the bar for the cognitive commitment they ask of us by supplying content specifically designed for two of our senses, still remove us from reality by immersing us in a dedicated, single-purpose environment. The distraction of the real world is locked outside the theater doors.

But today’s entertainment media platforms not only live in the real world, they are the very same platforms we use to function in said world. They are our laptops, our tablets, our phones and our connected TVs.

It’s hard to ignore that world when the flotsam and jetsam of reality is constantly bumping into us. And that brings us to the problem of the multitasking myth.

Multitasking Anxiety

The problem is not so much that we can’t escape from the real world for a brief respite in a fictional one. It’s that we don’t want to.

Even if we’re watching our entertainment in our home theater room on a big screen, the odds are very good that we have a small screen in our hands at the same time. We mistakenly believe we can successfully multitask, and our mental health is paying the price for that mistake.

Research has found that trying to multitask brings on a toxic mix of social anxiety, depression, a lessening of our ability to focus attention, and a sociopsychological impairment that impacts our ability to have rewarding relationships. 

When we use the same technology to be entertained that we use to stay on top of our social networks, we fall prey to the fear of missing out.

It’s called Internet Communication Disorder, and it’s an addictive need to continually scroll through Facebook, Twitter, WhatsApp and our other social media platforms. It’s these same platforms that are feeding us a constant stream of the very things we’re looking to escape from. 

It may be that laughter is the best medicine, but the efficacy of that medicine is wholly dependent on where we get our laughs.

The ability of entertainment to smooth the jagged edges of reality depends on our being able to shift our minds off the track that leads to chronic anxiety and depression — and successfully escape into a fictional kinder, gentler, funnier world.

For entertainment to be a beneficial distraction, we first have to mentally disengage from the real world, and then fully engage in the fictional one.

That doesn’t work nearly as well when our entertainment delivery channel also happens to be the same addictive channel that is constantly tempting us to tiptoe through the anxiety-strewn landscape that is our social media feed. 

In other words, before going to “Schitt’s Creek,” unpack your other shit and leave it behind. I guarantee it will be waiting for you when you get back.

Why The World is Conspiring Against Us

With all the other things 2020 will go down in history for, it has also proven to be a high-water mark for conspiracy theories. And that shouldn’t surprise us. Research has shown that when the going gets tough, the paranoid get weirder. Add to this the craziness-multiplier effect of social media, and it’s no wonder that 2020 has given us a bumper crop of batshit crazy.

As chronicled for you, my dear reader, I kicked over my own little hornet’s nest of conspiracy craziness a few weeks ago. I started by probing a little COVID anti-vaxxing lunacy right here in my home and native land, Canada. The next thing I knew, the QAnoners were lurching out of the woodwork like the coming of the zombie apocalypse.

I have since run for cover.

But as I was running, I noticed two things. One, most of the people sharing the theories were from the right side of the political spectrum. And two, while they’ve probably always been inclined to indulge in conspiratorial thinking, it seems (anecdotally, anyway) that it’s getting worse.

So I decided to dig a little deeper to find the answers to two questions: Why them, and why now?

Let’s start with why them?

My Facebook experience started with the people I grew up with in a small town in Alberta. It’s hard to imagine a more conservative place. The primary industries are oil, gas and farming. Cowboys — real cowboys wearing real Levi jeans — still saunter down Main Street. This was the first place in Western Canada to elect a representative whose goal was to take Western Canada out of a liberal (and Eastern intellectual elitist)-dominated confederation. If you wanted to find the equivalent of Trumpism in Canada, you’d stand a damn good chance of finding it in this part of Alberta.

So I wondered: What is it about conservatives, especially those from the extreme right side of conservatism, that makes them more susceptible to spreading conspiracy theories?

It turns out it’s not just the extreme right that believes in conspiracies. According to one study, those on the extreme right or left are more apt to believe in conspiracies. It’s just that it happens more often on the right.

And that could be explained by looking at the types of personalities who tend to believe in conspiracies. According to a 2017 analysis of U.S. data by Daniel Freeman and Richard Bentall, over a quarter of the American population are convinced that “there is a conspiracy behind many things in the world.” 

Not surprisingly, when you dig down to the roots of these beliefs, it comes down to a crippling lack of trust, closely tying those ideas to paranoia. Freeman and Bentall noted, “Unfounded conspiracy beliefs and paranoid ideas are both forms of excessive mistrust that may be corrosive at both an individual and societal level.”

So, if one out of every four people in the U.S. (and apparently a notable percentage of Canadians) leans this way, what are these people like? It turns out there is a cluster of personality traits likely to lead to belief in conspiracy theories.

First, these people tend to be anxious about things in general. They have a lower level of education and are typically in the bottom half of income ranges. More than anything, they feel disenfranchised and that the control that once kept their world on track has been lost. 

Because of this, they feel victimized by a powerful elite. They have a high “BS receptivity.” And they believe that only they and a small minority of the like-minded know the real truth. In this way, they gain back some of the individual control they feel they’ve lost.

Given the above, you could perhaps understand why, during the Obama years, conspiracy theorists tended to lean to the right. But if anything, there are more conspiratorial conservatives than ever after almost four years of Trump. Those in power were put there by people who don’t trust those in power. So that brings us to the second question: Why now?

Obviously, it’s been a crappy year that has cranked up everybody’s anxiety level. But the conspiracy wave was already well-established when COVID-19 came along. And that wave started when Republicans (and hard right-wing politicians worldwide) decided to embrace populism as a strategy. 

The only way a populist politician can win is by dividing the populace. Populism is – by its nature – antagonistic. There needs to be an enemy, and that enemy is always on the other side of the political divide. As Ezra Klein points out in his book “Why We’re Polarized,” population density and the U.S. Electoral College system make populism a pretty effective strategy for the right.

This is why Republicans are actually stoking the conspiracy fires, including outright endorsement of the QAnon-sense. Amazing as it seems, Republicans are like Rocky Balboa: Even when they win, they seem able to continue being the underdog. 

The core that has been whipped up by populism keeps shadowboxing with their avowed enemy: the liberal elite. This political weaponization of conspiracy theories continues to find a willing audience who eagerly amplify it through social media. There is some evidence to show that extreme conservatives are more willing to embrace conspiracies than extreme liberals, but the biggest problem is that there is a highly effective conspiracy machine continually pumping out right-targeted theories.

It seems there were plenty of conspiracy theories making the rounds well before now. The shitstorm that became known as the year 2020 is simply adding fuel to an already raging fire.

The Fickle Fate of Memes

“In the future, everyone will be world-famous for 15 minutes.”

Attributed to Andy Warhol

If your name is Karen, I’m sorry. The internet has not been kind to you over the past two years. You’re probably to the point where you hesitate before you tell people your name. And it’s not your fault that your name has become meme-famous for being synonymous with bitchy white privilege.

The odds are that you’re a nice person. I know several Karens and not one of them is a “Karen.” On the other hand, I do know a few “Karens” (as my Facebook adventure from last week makes clear) and not one of them is named Karen.

But that’s the way memes roll. You’re not at the wheel. The trolling masses have claimed your fate and you just have to go along for the ride. That’s true for Karen, where there doesn’t seem to be an actual “Karen” to which the meme can be attributed. But it’s also true when the meme starts with an actual person – like Rebecca Black.

Remember Rebecca Black? No?  I’ll jog your memory –

Yesterday was Thursday, Thursday
Today it is Friday, Friday (partyin’)
We-we-we so excited
We so excited
We gonna have a ball today

Rebecca Black

Yes, that Rebecca Black – star of “Friday”, which for many years was the most hated video in YouTube history (it still ranks at number 15 according to Wikipedia).

Admit it, when you remembered Rebecca Black, you did not do so fondly. But you know nothing about Rebecca Black. Memes seldom come bundled with a back story. So here are a few facts about “Friday” you didn’t know.

  • Black didn’t write the song. It was written by two LA music producers
  • Black was 13 at the time the video was shot
  • She had no input into the production or the heavy use of Autotune on her vocals
  • She didn’t see the video or hear the final version of the song before it was posted to YouTube

Although Black was put front and center into the onslaught of negativity the video produced, she had very little to do with the finished product. She was just a 13-year-old girl who was hoping to become a professional singer. And suddenly, she was one of the most hated and ridiculed people in the world. The trolls came out in force. And, unsurprisingly, they were merciless. But then mainstream media jumped on the bandwagon. Billboard and Time magazines, CNN, Stephen Colbert, Jimmy Fallon, Jimmy Kimmel and more all heaped ridicule on Black.

That’s a lot for any 13-year-old to handle. To understand the impact a meme can have, take 11 minutes to watch Vice’s video about Black. She seems to have emerged from the experience as a pretty well-adjusted 22-year-old who is still hoping to turn the fame she found into a positive. She is – more than anything – just trying to regain control of her own story.

The fame Rebecca Black found may have turned out to be of the caustic kind when she found it, but at least she was looking for it. Ghyslain Raza never asked for it and never wanted it. He became a meme by accident.

Ghyslain who? Allow your memory to be jogged once again. You probably know Raza better as the Star Wars Kid.

In 2002, Ghyslain Raza was a shy 14-year-old from Quebec who liked to make videos. One of those videos was shot in the school AV room while Raza was “goofing around,” wielding a makeshift light saber he made from a golf ball retriever. That video fell into the hands of a classmate, who – with all the restraint middle schoolers are known for – promptly posted it online. Soon, a torrent of cyber bullying was unleashed on Raza as views climbed into the tens of millions.

The online comments were hurtful enough. More than a few commenters suggested that Raza commit suicide. Some offered to help. But it was no better for Raza in his real life. He had to change schools when what few friends he had evaporated. At the new school, it got worse: “In the common room, students climbed onto tabletops to insult me.”

Imagine yourself, for a moment, being 14 and dealing with this. Hell, imagine it at the age you are now. Life would be hell. It certainly was for Raza. In an interview with a Canadian news magazine, he said, “No matter how hard I tried to ignore people telling me to commit suicide, I couldn’t help but feel worthless, like my life wasn’t worth living.”

Both Black and Raza survived their ordeals. Aleksey Vayner wasn’t so lucky. The over-the-top video resume he made in 2006, “Impossible Is Nothing,” also became a meme when it was posted online without his permission. Actor Michael Cera was one of the many who did a parody. Like Black and Raza, Vayner battled to get his life back. He lost that battle in 2013, when he died from a heart attack that a relative said was brought on by an overdose of medication.

In our culture, online seems to equal open season. Everyone – even celebrities who should know better – seems to think it’s okay to parody, ridicule, bully or even threaten death. What we conveniently forget is that there is a very real person with very real feelings on the other side of the meme. No one deserves that kind of fame.

Even if their name is Karen.

The Day My Facebook Bubble Popped

I learned this past week just how ideologically homogenous my Facebook bubble usually is. Politically, I lean left of center. Almost all the people in my bubble are the same.

Said bubble has been built from the people I have met in the past 40 years or so. Most of these people are in marketing, digital media or tech. I seldom see anything in my feed I don’t agree with — at least to some extent.

But before all that, I grew up in a small town in a very right-wing part of Alberta, Canada. Last summer, I went to my 40-year high-school reunion. Many of my fellow graduates stayed close to our hometown for those 40 years. Some are farmers. Many work in the oil and gas industry. Most of them would fall somewhere to the right of where I sit in my beliefs and political leanings.

At the reunion, we did what people do at such things — we reconnected. Which in today’s world meant we friended each other on Facebook. What I didn’t realize at the time is that I had started a sort of sociological experiment. I had poked a conservative pin into my liberal social media bubble.

Soon, I started to see posts that were definitely coming from outside my typical bubble. But most of them fell into the “we can agree to disagree” camp of political debate. My new Facebook friends and I might not see eye-to-eye on certain things, but hell — you are good people, I’m good people, we can all live together in this big ideological tent.

On May 1, 2020, things began to change. That was when Canadian Prime Minister Justin Trudeau announced that 1,500 models of “assault-style” weapons would be classified as prohibited, effective immediately. This came after Gabriel Wortman killed 22 people in Nova Scotia in Canada’s deadliest shooting spree. Now, suddenly, posts I didn’t politically agree with were hitting a very sensitive raw nerve. Still, I kept my mouth shut. I believed arguing on Facebook was pointless.

Through everything that’s happened in the four months since (it seems like four decades), I have resisted commenting when I see posts I don’t agree with. I know how pointless it is. I realize that I am never going to change anyone’s mind through a comment on a Facebook post.

I understand this is just an expression of free speech, and we are all constitutionally entitled to exercise it. I stuck with the Facebook rule I imposed for myself — keep scrolling and bite your tongue. Don’t engage.

I broke that rule last week. One particular post did it. It asked, in effect: with a COVID-19 survival rate of almost 100%, why do we need a vaccine? I knew better, but I couldn’t help it.

I engaged. It was limited engagement to begin with. I posted a quick comment suggesting that with 800,000 (and counting) already gone, saving hundreds of thousands of lives might be a pretty good reason. Right or left, I couldn’t fathom anyone arguing with that.

I was wrong. Oh my God, was I wrong. My little comment unleashed a social media shit storm. Anti-vaxxing screeds, mind-control plots via China, government conspiracies to artificially overcount the death toll, and rants about the sheer stupidity of people wearing face masks proliferated in the comment string for the next five days. I watched the comment string grow in stunned disbelief. I had never seen this side of Facebook before.

Or had I? Perhaps the left-leaning posts I am used to are just as conspiratorial, but I don’t realize it because I happen to agree with them. I hope not, but perspective does strange things to our grasp of the things we believe to be true. Are we all — right or left — just exercising our right to free speech through a new platform? And — if we are — who am I to object?

Free speech is held up by Mark Zuckerberg and others as hallowed ground in the social-media universe. In a speech last fall at Georgetown University, Zuckerberg said: “The ability to speak freely has been central in the fight for democracy worldwide.”

It’s hard to argue with that. The ability to publicly disagree with the government or any other holder of power over you is much better than any alternative. And the drafters of the U.S. Bill of Rights agreed. Freedom of speech was enshrined in the First Amendment. But the authors of that amendment — perhaps presciently — never defined exactly what constituted free speech. Maybe they knew it would be a moving target.

Over the history of the First Amendment, it has been left to the courts to decide what the exceptions would be.

In general, the courts have tightened the definitions around one area — what types of expression constitute a “clear and present danger” to others. Currently, unless you’re specifically asking someone to break the law in the very near future, you’re protected under the First Amendment.

But is there a bigger picture here — one very specific to social media? Yes, legally in the U.S. (or Canada), you can post almost anything on Facebook.

Certainly, taking a stand against face masks and vaccines would qualify as free speech. But it’s not only the law that keeps society functioning. Most of the credit for that falls to social norms.

Social norms are the unwritten laws that govern much of our behavior. They are the “soft guard rails” of society that nudge us back on track when we veer off-course. They rely on us conforming to behaviors accepted by the majority.

If you agree with social norms, there is little nudging required. But if you happen to disagree with them, your willingness to follow them depends on how many others also disagree with them.

Famed sociologist Mark Granovetter showed in his paper “Threshold Models of Collective Behavior” that there can be tipping points in groups. If there are enough people who disagree with a social norm, it will create a cascade that can lead to a revolt against the norm.
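
Granovetter’s model is simple enough to sketch in a few lines of code. Here is a minimal, illustrative simulation in Python; the population size, the uniform thresholds and the size of the “feed discount” are my own assumptions for demonstration, not Granovetter’s numbers. Each person defies a norm once the share of people they see already defying it meets their personal threshold.

```python
# Minimal sketch of Granovetter-style threshold cascades (illustrative only).
# Each person has a threshold: the fraction of the population they need to
# see defying a norm before they will defy it themselves.

import random

def run_cascade(thresholds):
    """Let anyone whose threshold is met join in, until nothing changes."""
    n = len(thresholds)
    active = 0
    while True:
        new_active = sum(1 for t in thresholds if t <= active / n)
        if new_active == active:
            return active  # stable: no remaining threshold is met
        active = new_active

random.seed(42)
population = [random.random() for _ in range(1000)]  # thresholds in [0, 1)

# A geographically mixed, ideologically diverse group: almost no one sees
# enough defiance around them to act first, so the norm holds.
print(run_cascade(population))  # ~0 defectors

# A like-minded feed works like a discount on every threshold: it makes it
# look as if far more people already agree with you than actually do.
feed = [max(0.0, t - 0.3) for t in population]
print(run_cascade(feed))  # 1000 defectors: the whole population tips
```

The numbers are toys, but the shape of the result is Granovetter’s point: nothing about any individual has to change for a norm to collapse, only how much agreement each person believes they are seeing.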

Prior to social media, the thresholds for this type of behavior were quite high. Even if some of us were quick to act anti-socially, we were generally acting alone.

Most of us felt we needed a substantial number of like-minded people before we were willing to upend a social norm. And when our groups were determined geographically and comprised of ideologically diverse members, this was generally sufficient to keep things on track.

But your social-media feed dramatically lowers this threshold.

Suddenly, all you see are supporting posts of like-minded people. It seems that everyone agrees with you. Emboldened, you are more likely to go against social norms.

The problem here is that social norms are generally there because they are in the best interests of the majority of the people in society. If you go against them by refusing a vaccine or a face mask, thereby allowing a disease to spread, you endanger others. Perhaps it doesn’t meet the legal definition of “imminent lawlessness,” but it does present a “clear and present danger.”

That’s a long explanation of why I broke my rule about arguing on Facebook.

Did I change anyone’s mind? No. But I did notice that the person who made the original post has changed their settings, so I don’t see the political ones anymore. I just see posts about grandkids and puppies.

Maybe it’s better that way.

Do We Still Need Cities?

In 2011, Harvard economist Edward Glaeser called the city “man’s greatest invention” in his book “Triumph of the City,” noting that “there is a near-perfect correlation between urbanization and prosperity across nations.”

Why is this so? It’s because historically we needed a critical mass of connection in order to accelerate human achievement. Cities bring large numbers of people into closer, more frequent and productive contact than other places. This direct, face-to-face contact is critical for facilitating the exchange of knowledge and ideas that lead to the next new business venture, medical discovery or social innovation.

This has been true throughout our history. While cities can be messy and crowded, they also spin off an amazing amount of ingenuity and creativity, driving us all forward.

But the very same things that make cities hotbeds of productive activity also make them a human petri dish in the midst of a pandemic.

Example: New York

If the advantages that Glaeser lists are true for cities in general, it’s doubly true for New York, which just might be the greatest city in the world. Manhattan’s population density is 66,940 people per square mile, the highest of any area in the U.S. It’s also diverse, with 36% of its population foreign-born. And it attracts talent in all types of fields from around the world.

Unfortunately, all these things also set New York up to be particularly hard hit by COVID-19. To date, according to Google’s tracker, it has 236,000 confirmed cases of COVID-19 and a mortality rate of 10%. That case rate would put it ahead of all but 18 countries in the world. What has made New York great has also made it tragically vulnerable to a pandemic.

New York is famous for its gritty resilience. But at least one New Yorker thinks this might be the last straw for the Big Apple. In an essay entitled “New York City is dead forever,” self-published and then reprinted by the New York Post, comedy club owner James Altucher talks about how everyone he knows is high-tailing it out of town for safer, less crowded destinations, leaving a ghost town in their wake.

He doesn’t believe they’re coming back. The connections that once relied on physical proximity can now be replicated by technology. Not perfectly, perhaps, but well enough. Certainly, well enough to tip the balance away from the compromises you have to be prepared to swallow to live in a city like New York: higher costs of living, exorbitant real estate, higher crime rates and the other grittier, less-glittery sides of living in a crowded, dense metropolis.


Example: Silicon Valley

So, perhaps tech is partly (or largely) to blame for the disruption of the interconnectedness of cities. But, ironically, thanks to COVID-19, the same thing is happening to the birthplace of tech: Silicon Valley and the Bay Area of Northern California.

Barb is a friend of mine who was born in Canada but has lived much of her life in Palo Alto, California — a stone’s throw from the campus of Stanford University. She recently beat a temporary retreat back to her home and native land north of the 49th Parallel. When Barb explained to her Palo Alto friends and neighbors why Canada seemed to be a safer place right now, she put it like this:

“My county — Santa Clara — with a population of less than 2 million people, has had almost as many COVID cases in the last three weeks as the entire country of Canada.”

She’s been spending her time visiting her Canadian-based son and exploring the natural nooks and crannies of British Columbia while doing some birdwatching along the way.  COVID-19 is just one of the factors that has caused her to start seriously thinking about life choices she couldn’t have imagined just a few short years ago. As Barb said to me as we chatted, “I have a flight home booked — but as it gets closer to that date, it’s becoming harder and harder to think about going back.”  

These are just two examples of the reordering of what will become the new normal. Many of us have retreated in search of a little social distance from what our lives were. Increasingly, we are relying on tech to bridge the distances that we are imposing between ourselves and others. Breathing room — in its most literal sense — has become our most immediate priority.

This won’t change anytime soon. We can expect this migration to continue for at least the next year. It could be — and I suspect it will be — much longer. Perhaps James Altucher is right. Could this pandemic – aided and abetted by tech – finally be what kills mankind’s greatest invention? As he writes in his essay,

“Everyone has choices now. You can live in the music capital of Nashville, you can live in the ‘next Silicon Valley’ of Austin. You can live in your hometown in the middle of wherever. And you can be just as productive, make the same salary, have higher quality of life with a cheaper cost.”

If Altucher is right, there’s another thing we need to think about. According to Glaeser, cities are not only great for driving forward innovation. They also put some much-needed distance between us and nature:

“We humans are a destructive species. We tend to destroy stuff when we’re around it. And if you love nature, stay away from it.”

As we look to escape one crisis, we might be diving headlong into the next.

Our Brain And Its Junk News Habit

Today, I’m going to return to the Reuters Digital News Report and look at the relationship between us, news and social media. But what I’m going to talk about is probably not what you think I’m going to talk about.

Forget all the many, many problems that come with relying on social media to be informed. Forget about filter bubbles and echo chambers. Forget about misleading or outright false stories. Forget about algorithmic targeting. Forget about the gaping vulnerabilities that leave social media open to nefarious manipulation. Forget all that (but just for the moment, because those are all horrible and very real problems that we need to focus on).

Today, I want to talk about one specific problem that comes when we get our news through social media. When we do that, our brains don’t work the way they should if we want to be well informed.

First, let’s talk about the scope of the issue. According to the Reuters study, more people in the U.S. — 72% — turn to online sources for news than to any other source. Television comes in second at 59%. If we single out social media, it comes in third at 48%. Trailing the pack is print media at just 20%.

Reuters Digital News Report 2020 – Sources of News in the U.S.

If we plot this on a chart over the last seven years, print and social media basically swapped spots, with their respective lines crossing each other in 2014; one trending up and one trending down. In 2013, 47% of us turned to print as a primary news source and just 27% of us went to social media.

If we further look at those under 35, accessing news through social media jumps to the number-one spot by a fairly wide margin. And because they’re young, we’re not talking Facebook here. Those aged 18 to 24 are getting their news through Instagram, Snapchat and TikTok.

The point, if it’s not clear by now, is that many of us get our news through a social media channel — and the younger we are, the more that’s true. The paradox is that the vast majority of us — over 70% — don’t trust the news we see on our social media feeds. If we were to pick an information source we trusted, we would never go to social media.

This brings up an interesting juxtaposition in how we’re being informed about the world: almost all of us are getting our news through social media, but almost none of us are looking for it when we do.

According to the Reuters report, 72% of us (all ages, all markets) get our news through the “side door.” This means news is delivered to us — primarily through social media and search — without us intentionally going directly to the source of the information. For those aged 18 to 24, “side door” access jumps to 84% and, of that, access through social media jumps to 38%.

Our loyalty to the brand and quality of an information provider is slipping through our fingers, and we don’t seem to care. We say we want objective, unbiased, quality news sources, but in practice we lap up whatever dubious crap is spoon-fed to us by Facebook or Instagram. It’s the difference between what we tell our doctor we intend to eat and what we actually eat when we get home to the leftover pizza and the pint of Häagen-Dazs in our fridge.

The difference between looking for and passively receiving information is key to understanding how our brain works. Let’s talk a little bit about “top-down” and “bottom-up” activation and the “priming” of our brain.

When our brain has a goal — like looking for COVID-19 information — it behaves significantly differently than when it is just bored and wanting to be entertained.

The goal sets a “top-down” intent. It’s like an executive order to the various bits and pieces of our brain to get their shit together and start working as a team. Suddenly the entire brain focuses on the task at hand, and things like reliability of information become much more important to us. If we’re going to go directly to an information source we trust, this is when we’ll do it.

If the brain isn’t actively engaged in a goal, then information has to initiate a “bottom-up” activation. And that is an entirely different animal.

We never go to social media looking for a specific piece of news. That’s not how social media works. We go to our preferred social channels either out of sheer boredom or a need for social affirmation, hoping there’s something in the highly addictive, endlessly scrolling format that will catch our attention.

For a news piece to do that, it has to somehow find a “hook” in our brain. Often, that “hook” is an existing belief. The parts of our brain that act as gatekeepers against unreliable information are bypassed because no one bothered to wake them up.

There is a further brain-related problem with relying on social media, and that’s the “priming” issue. This is where one stimulus sets a subconscious “lens” that will impact subsequent stimuli. Priming sets the brain on a track we’re not aware of, which makes it difficult to control.

Social media is the perfect priming platform. One post sets the stage for the next, even if they’re completely unrelated.

Those are the first two factors that make social media an inherently dangerous platform to rely on for being informed.

The third is that social media makes information digestion much too easy. Our brain barely needs to work at all. And if it does need to work, we quickly click back and scroll down to the next post. Because we’re looking to be entertained, not informed, the brain is reluctant to do any unnecessary heavy lifting.   

This is a big reason why we may know the news we get through social media channels is probably not good for us, but we gulp it down anyway, destroying our appetite for more trustworthy information sources.

These three things create a perfect cognitive storm for huge portions of the population to be continually and willingly misinformed. That’s not even factoring in all the other problems with social media that I mentioned at the outset of this column. We need to rethink this — soon!

What Would Aaron Do?

I am a big Aaron Sorkin fan. And before you rain on my parade, I say that fully understanding that he epitomizes the liberal intellectual elitist, sanctimonious cabal that has helped cleave American culture in two. I get that. And I don’t care.

I get that his message is from the left side of the ideological divide. I get that he is preaching to the choir. And I get that I am part of the choir. Still, given the times, I felt that a little Sorkin sermon was just what I needed. So I started rewatching Sorkin’s HBO series “The Newsroom.”

If you aren’t part of this particular choir, let me bring you up to speed. The newsroom in this case is at the fictional cable network ACN. One of the primary characters is lead anchor Will McAvoy (played by Jeff Daniels), who has built his audience by being noncontroversial and affable — the Jay Leno of journalism.

This brings us to the entrance of the second main character: MacKenzie McHale, played by Emily Mortimer. Exhausted from years as an embedded journalist covering multiple conflicts in Afghanistan, Pakistan and Iraq, she comes on board as McAvoy’s new executive producer (and also happens to be his ex-girlfriend).

In typical Sorkin fashion, she goads everyone to do better. She wants to reimagine the news by “reclaiming journalism as an honorable profession,” with “civility, respect, and a return to what’s important; the death of bitchiness; the death of gossip and voyeurism; speaking truth to stupid.”

I made it to episode 3 before becoming profoundly sad and world-weary. Sorkin’s sermon from 2012 — just eight years ago — did not age well. It certainly didn’t foreshadow what was to come.

Instead of trying to be better, the news business — especially cable news — has gone in exactly the opposite direction, heading straight for Aaron Sorkin’s worst-case scenario. This scenario formed part of a Will McAvoy speech in that third episode: “I’m a leader in an industry that miscalled election results, hyped up terror scares, ginned up controversy, and failed to report on tectonic shifts in our country — from the collapse of the financial system to the truths about how strong we are to the dangers we actually face.”

That pretty much sums up where we are. But even Sorkin couldn’t anticipate what horrors social media would throw into the mix. The reality is actually worse than his worst-case scenario. 

Sorkin’s appeal for me was that he always showed what “better” could be. That was certainly true in his breakthrough political hit “The West Wing.” 

He brought the same message to the jaded world of journalism in “The Newsroom.” He was saying, “Yes, we are flawed people working in a flawed system set in a flawed nation. But it can be better…. Our future is in our hands. And whatever that future may be, we will be held accountable for it when it happens.”

This message is not new. It was the blood and bones of Abraham Lincoln’s annual address to Congress on December 1, 1862, just one month before the Emancipation Proclamation was signed into law. Lincoln was preparing the nation for the choice of a path which may have been unprecedented and unimaginably difficult, but would ultimately be proven to be the more moral one: “It is not ‘can any of us imagine better?’ but, ‘can we all do better?’ The dogmas of the quiet past, are inadequate to the stormy present. The occasion is piled high with difficulty, and we must rise with the occasion.”

“The Newsroom” was Sorkin’s last involvement with a continuing TV series. He was working on his directorial movie debut, “Molly’s Game,” when Trump got elected.

Since then, he has adapted Harper Lee’s “To Kill a Mockingbird” for Broadway, with “The Newsroom’s” Jeff Daniels as Atticus Finch.

Sorkin being Sorkin, he ran into a legal dispute with Lee’s estate when he updated the source material to be a little more open about the racial tension that underlies the story. Aaron Sorkin is not one to let sleeping dogmas lie. 

Aaron Sorkin also wrote a letter to his daughter and wife on the day after the 2016 election, a letter that perhaps says it all.

It began, “Well the world changed late last night in a way I couldn’t protect us from.”

He was saying that as a husband and father. But I think it was a message for us all — a message of frustration and sadness. He closed the letter by saying “I will not hand [my daughter] a country shaped by hateful and stupid men. Your tears last night woke me up, and I’ll never go to sleep on you again.”

Yes, Sorkin was preaching when he was scripting “The Newsroom.” But he was right. We should do better. 

In that spirit, I’ll continue to dissect the Reuters study on the current state of journalism I mentioned last week. And I’ll do this because I think we have to hold our information sources to “doing better.” We have to do a better job of supporting those journalists that are doing better. We have to be willing to reject the “dogmas of the quiet past.” 

One of those dogmas is news supported by advertising. The two are mutually incompatible. Ad-supported journalism is a popularity contest, with the end product being a huge audience custom-sliced, diced and delivered to advertisers — instead of a well-informed populace.

We have to do better than that.