Are We Killing Politeness?

One of the many casualties of our changing culture seems to be politeness. When the President of the United States is the poster child for rude behavior, it’s tough for politeness to survive. This is especially true in the no-holds-barred, digitally distanced world of social media.

I consider myself to be reasonably polite. Being so, I also expect this in others. Mild rudeness makes me anxious. Excessive rudeness makes me angry. This being the case, I am troubled by the apparent decline of civility. So today I wanted to take a look at politeness and why it might be slipping away from us.

First of all, we have to understand that politeness is not universal. What is considered polite in one culture is not in another.

Secondly, being polite is not the same as being friendly. Or empathetic. Or being respectful of others. Or being compassionate, according to this post from The Conversation. There is a question of degree and intent here. Being polite is a distinct behavior that encompasses both desirable and less desirable qualities. And that raises the question: What is the purpose of politeness? Is a less-polite world a good or a bad thing?

First, let’s look at the origin of the word. It comes from the Latin “politus,” meaning “polished — made smooth.” Just in case you’re wondering, “politics” does not come from the same root. That comes from the Greek word for “citizen” — “polites.”

One last etymological nugget. The closest comparison to polite may be “nice,” which originates from the Latin “nescius,” meaning “ignorant”. Take that for what it’s worth.

This idea of politeness as a type of social “polish” really comes from Europe — and especially Britain. There, politeness was linked with class hierarchies. Being polite was a sign of good breeding — a dividing line between the high-born and the riffraff. This class-bound definition came along with the transference of the concept to North America.

Canada is typically considered one of the most polite nations in the world. As a Canadian who has traveled a fair amount, I would say that’s probably true.

But again, there are variations in the concept of politeness and how it applies to both Canadians and Americans.

When we consider the British definition of politeness, we begin to see how Americans and Canadians might respond differently to it. To understand that is to understand much of what makes up our respective characters.

As a Canadian doing much of my business in the U.S. for many years, I was always struck by the difference in approaches I found north and south of the 49th parallel. Canadian businesses we met with were unfailingly polite, but seldom bought anything. Negotiating the prospect path in Canada was a long and often frustrating journey.

American businesses were much more likely to sign a contract. On the whole, I would also say they were friendlier in a more open and less-guarded way. I have to admit that in a business setting, I preferred the American approach.

According to anthropologists Penelope Brown and Stephen Levinson, who have extensively researched politeness, there is negative and positive politeness. Negative politeness is concerned with adhering to social norms, often by deferring to someone or something else.

This is Canadian politeness personified. Our entire history is one of deference to greater powers — first to our colonial masters, the British and French, and more recently to the cultural and economic master in such close proximity, the U.S.

For Canadians, deferral is survival. As former Prime Minister Pierre Trudeau once said about the U.S., “Living next to you is in some ways like sleeping with an elephant. No matter how friendly and even-tempered is the beast, if I can call it that, one is affected by every twitch and grunt.”

Negative politeness is a way to smooth out social friction, but there is good and bad here. Ideally it should establish a baseline of trust, respect and social capital. But ultimately, politeness is a consensus of compromise.  And that’s why Canadians are so good at it.  Negative politeness wants everything to be fair.

But then there is positive politeness, which is more American in tone and nature. This is a desire to help others, making it more closely linked to compassion. But in this noble motive there is also a unilateral defining of what is right and wrong. Positive politeness tries to make everything right, based on the protagonist’s definition of what that is.

The two sides of politeness actually come from different parts of the brain. Negative politeness comes from the part of the brain that governs aggression. It is all about applying brakes to our natural instincts. But positive politeness comes from the part of the brain that regulates social bonding and affiliation.

When you understand this, you understand the difference between Canadians and Americans in what we consider polite. For the former, our definition was handed down from its British, class-linked origins and has morphed into a culture of compromise and deferral.

The American definition comes from many generations of being the de facto moral leaders of the free world.

We (Canadians) want to be nice. You (Americans) want to be right. The two are not mutually exclusive, but they are also not the same thing. Not by a long shot.

What Trump has done (with a certain kind of perverse genius) is play on this national baseline of compassion. He has wantonly discarded any vestiges of politeness and split the nation on what it means to be right.

But when you eliminate politeness, you also eliminate that governor of our behavior. Reactions about what is right and wrong are now immediate, rough and unfiltered.

The polish that politeness brings — that deferral of spoken judgement for even a brief moment in order to foster cooperation — is gone. We have no opportunity to consider other perspectives. We have no motive to cooperate. This is abundantly apparent on every social media platform.

In game theory, politeness is a highly successful strategy commonly called “tit for tat.” It starts by assuming a default position of fairness from the other party, continues to cooperate if this proves to be true, and escalates to retaliation if it’s not. But this tactic evolved in a world of face-to-face encounters. Somehow, it seems less needed in a divided world where rudeness and immediate judgement are the norm.
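If you want to see the mechanics laid bare, here is a minimal sketch of tit for tat playing an iterated prisoner’s dilemma. The payoff numbers are the standard textbook illustration, and the two opponents are hypothetical stand-ins for “rude” and “polite”; nothing here is meant as a definitive model.

```python
# A minimal sketch of the "tit for tat" strategy described above,
# played in an iterated prisoner's dilemma. Payoff values are the
# standard textbook illustration; the opponents are hypothetical.

COOPERATE, DEFECT = "C", "D"

# (my move, their move) -> my payoff
PAYOFFS = {
    (COOPERATE, COOPERATE): 3,
    (COOPERATE, DEFECT): 0,
    (DEFECT, COOPERATE): 5,
    (DEFECT, DEFECT): 1,
}

def tit_for_tat(opponent_history):
    """Start from a default position of fairness, then mirror the
    opponent's last move: keep cooperating if they do, retaliate if not."""
    if not opponent_history:
        return COOPERATE
    return opponent_history[-1]

def play(opponent_strategy, rounds=10):
    """Play a number of rounds of tit for tat against a given opponent."""
    my_history, their_history, score = [], [], 0
    for _ in range(rounds):
        my_move = tit_for_tat(their_history)
        their_move = opponent_strategy(my_history)
        my_history.append(my_move)
        their_history.append(their_move)
        score += PAYOFFS[(my_move, their_move)]
    return score

rude = lambda history: DEFECT        # never cooperates
polite = lambda history: COOPERATE   # always cooperates
print("vs. a rude opponent:", play(rude))      # cooperates once, then retaliates
print("vs. a polite opponent:", play(polite))  # cooperation all the way
```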

Still, I will cling to my notions of politeness. Yes, sometimes it seems to get in the way of definitive action. But on the whole, I would rather live in a world that’s a little nicer and a little more polite, even if that seems foolish to some of you.

TV and My Generation

My Generation has been a dumpster fire of epic proportions. I am a baby boomer, born in 1961, at the tail end of the boom. And, according to Time magazine, we broke America.  We probably destroyed the planet. And, oh yeah, we’ve also screwed up the economy. I’d like to say it isn’t true, but I’m pretty sure it is. As a generation, we have an extensive rap sheet.

Statistically, baby boomers are one of the most politically polarized generations alive today. So, the vast chasm that exists between the right and the left may also be our fault. 

As I said, we’re a generational dumpster fire. 

A few columns back I said this: “We create the medium — which then becomes part of the environment we adapt to.”  I was referring to social media and its impact on today’s generations. 

But what about us? What about the generation that has wreaked all this havoc? If I am right and the media we make in turn makes us who we are, what the hell happened to our generation?

Television, that’s what. 

There have been innumerable treatises on how baby boomers got to be in the sorry state we’re in. Most blame the post-war affluence of America and the never-ending consumer orgy it sparked. 

But we were also the first generation to grow up in front of a television screen. Surely that must have had some impact. 

I suspect television was one of the factors that started driving the wedge between the right and left halves of our generation, leaving an unbridgeable gap in between. In fact, I think it may have been the prime suspect.

Let’s plot the trends of what was on TV against my most influential formative years, and — by extension — my generation. 

When I was 5 years old, in 1966, the most popular TV shows fell into two categories: westerns like “Bonanza” and “Gunsmoke,” or cornfed comedies like “The Andy Griffith Show,” “The Beverly Hillbillies,” “Green Acres” and “Petticoat Junction.” Social commentary and satire were virtually nonexistent on American prime-time TV. The values of America were tightly censored, wholesome and non-confrontational. The only person of color in the line-up was Bill Cosby on “I Spy.” Thanks to “Hogan’s Heroes,” even the Nazis were lovable doofuses. 

I suspect when certain people of my generation want to Make America Great Again, it is this America they’re talking about. It was a white, wholesome America that was seen through the universally rose-colored glasses given to us by the three networks. 

It was also completely fictional, ignoring inconveniences like the civil rights movement, Vietnam and rampant gender inequality. This America never existed. 

When we talk about the cultural environment in which my generation cut its teeth, this is what we refer to. There was no moral ambiguity. It was clear who the good guys were, because they all wore white hats.

This moral baseline was spoon-fed to us right when we were first making sense of our own realities. Unfortunately, it bore little to no resemblance to what was actually real.

The fact was, through the late ’60s, America was already increasingly polarized politically. Left and right were drifting apart. Even Bob Hope felt the earth splitting beneath his feet. In November, 1969, he asked all the elected leaders of the country, no matter their politics, to join him in a week of national unity. One of those leaders called it “a time of crisis, greater today perhaps than since the Civil War.” 

But rather than trying to heal the wounds, politicians capitalized on them, further splitting the country apart by affixing labels like Nixon’s “The Silent Majority.” 

Now, let’s move ahead to my teen years. From our mid-teens to our mid-twenties, we create our social identities. Our values and morals take on some complexity. The foundations for our lifelong belief structures are formed during these years. 

In 1976, when I was 15, the TV line-up had become a lot more controversial. We had many shows regularly tackling social commentary: “All in the Family,” “M*A*S*H,” “Sanford and Son,” “Welcome Back, Kotter,” “Barney Miller” and “Good Times.” Of course, we still had heaps of wholesome, thanks to “Happy Days,” “Marcus Welby, M.D.” and “The Waltons.”

Just when my generation was forming the values that would define us, our prime-time line-up was splitting left and right. You had the social moralizing of left-leaning show runners like Norman Lear (“All in the Family”) and Larry Gelbart (“M*A*S*H”) vs the God and Country values of “The Waltons” and “Little House on the Prairie.” 

I don’t know what happened in your hometown, but in mine, we started to be identified by the shows we watched (or, often, what our parents let us watch). You had the “All in the Family” Group and “The Waltons” Group. In the middle, we could generally agree on “Charlie’s Angels” and “The Six Million Dollar Man.” The cracks in the ideologies of my generation were starting to show.

I suspect as time went forward, the two halves of my generation started looking to television with two different intents: either to inform ourselves of the world that is, warts and all — or to escape to a world that never was. As our programming choices expanded, those two halves got further and further apart, and the middle ground disappeared. 

There are other factors, I’m sure. But speaking for myself, I spent an unhealthy amount of time watching TV when I was young. It couldn’t help but partially form the person I am today. And if that is true for me, I suspect it is also true for the rest of my generation.

A World Flattened by Social Media

“Life is What Happens to You While You’re Busy Making Other Plans”

John Lennon

The magic of our lives is in the nuance, the unexpected and – sometimes – the mundane. It depends on bandwidth – a full spectrum of experience and stimuli that extends beyond the best attempts of our imagination to put boundaries around it. As Mr. Lennon knew, life is lived in a continual parade of moments that keeps marching past us, whether we’ve planned them or not.

Of course, our current ability to make life plans is not what it once was. In fact, most aspects of our former lives have gone into a forced hibernation. Suddenly, our calendars are completely clear and we have a lot of unexpected time on our hands. So many of us have been spending more of that time on social media. I don’t know about you, but I’m finding that a poor substitute for the real world.

I’ve noticed a few of my friends have recently posted that they’re taking a break from Facebook. That’s not unprecedented. But I think this time might be different. Speaking for myself, I have recently been experiencing a strange combination of anxiety and ennui when I spend any time on Facebook.

First, there are the various posts of political and moral outrage. I agree with almost all of them, in direction if not necessarily in degree.

And then there are the various posts of inspirational quotes and assorted pictures of loaves of bread, pets, gardens, favorite albums, our latest hobby, walks in the woods and kids doing adorable things. It is the assorted bric-a-brac of our new normal under COVID.

I like and/or agree with almost all these things. Facebook’s targeting algorithm has me pretty much pegged. But if the sum total of my Facebook feed defined the actual world I had to live in, I would get pretty bored with it in the first 15 minutes.

It would be like seeing the world only in blue and orange. I like blue. I’m okay with orange. But I don’t want to see the world in only those two colours. And that’s what Facebook does.

This is not a slight against Facebook. None of us (with the possible exception of Mark Zuckerberg) should expect it to be a substitute for the real world. But now that a lot of us have been restricted from experiencing big chunks of the real world and have substituted time with social media for it, we should realize the limitations of what it can provide.

Facebook and other social media platforms give us a world without nuance, without bandwidth, without serendipity and without context. Further, it is a world that has been algorithmically altered and filtered specifically for a data-defined avatar of who we really are. We’re not even getting the full bandwidth of what is on the platform. We’re getting what happens to squeeze past the content filters that act as our own personalized gatekeepers.

What the past 3 months have taught me is that when we rely on social media for experience, information or perspective, we have to take it for what it is. As a source of information, it is at best highly restricted and biased. As a source of social connection and experience, it is mercilessly flattened and stripped of all nuance. As a substitute for the real world, it comes up woefully short.

Perhaps the biggest restriction with social media is that everything we see is planned and premeditated, either by humans or by an algorithm. Content is posted with clear intent. And the content we actually see has been targeted to fit within some data-driven pigeonhole that an algorithm has decided represents us. What we’re missing is exactly what John Lennon was referring to when he talked about what life is: the unplanned, the unexpected, the unintended.

We’re missing an entire spectrum of color beyond blue and orange.

Crisis? What Crisis?

You would think that a global pandemic would hold our attention for a while.

Nope.

We’re tired of it. We’re moving on.  We’re off to the next thing.

Granted, in this case the next thing deserves our focus. It is abysmal that it still exists, so it should be focused on – probably for the rest of our lives and beyond, until it ceases to be a thing. But it won’t be. Soon we’ll be talking about something else.

And that’s the point of this post – our collective inability to remain focused on anything without being distracted by the next breaking story in our news feed. How did we come to this?

I blame memes.

To a certain extent, our culture is the product of who we are, and who we are is a product of our culture. Each is shaped by the other, going forward in a constantly improvised pas de deux. Humans create the medium – which then becomes part of the environment we adapt to.

Books and the printed word changed who we were for over five centuries. Cinema has been helping to define us for well over a century. And radio and television have been moulding us for the past century. Our creations have helped create who we are.

This has never been truer than with social media. Unlike other media which took discrete chunks of our time and attention, social media is ubiquitous and pervasive. According to a recent survey, we spend on average 2 hours and 23 minutes per day on social media. That is about 13% of our waking hours.  Social media has become intertwined with our lives to the point that we had to start qualifying what happens where with labels like “IRL” (In Real Life).
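A quick back-of-the-envelope check of that percentage (the roughly 18 waking hours a day is my assumption, not the survey’s; assume 16 waking hours instead and the share comes out closer to 15%):

```python
# Back-of-the-envelope check of the "about 13% of our waking hours" figure.
# The ~18 waking hours per day is an assumption made for this check only.

social_minutes = 2 * 60 + 23      # 2 hours and 23 minutes = 143 minutes
waking_minutes = 18 * 60          # assumed ~18 waking hours per day

print(f"{social_minutes / waking_minutes:.1%}")   # -> 13.2%
```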

There is another difference between social media and what has come before it. Almost every previous entertainment medium that has demanded our attention has been built on the foundation of a long form narrative arc. Interacting with each medium has been a process – a commitment to invest a certain amount of time to go on a journey with the storyteller. The construction of a story depends on patterns that are instantly recognized by us. Once we identify them, we are invested in discovering the outcome. We understand that our part of the bargain is to exchange our time and attention. The payoff is the joy that comes from us making sense of a new world or situation, even if it is imaginary.  

But social media depends on a different exchange. Rather than tapping into our inherent love of the structure of a story, it depends on something called variable intermittent rewards. Essentially, it’s the same hook that casinos use to keep people at a slot machine or table. Not only is it highly addictive, it also pushes us to continually scroll to the next thing. It completely bypasses the thinking part of our brains and connects directly to the reward center buried in our limbic system. Rather than asking for our time and attention, social media dangles a never-ending array of bright, shiny memes that asks nothing from us: no thinking, almost no attention and a few seconds of our time at most. For a lazy brain, this is the bargain of a lifetime.
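To make the slot-machine comparison concrete, here is a minimal sketch of a feed where the reward arrives on an unpredictable schedule. The one-in-eight hit rate is an arbitrary assumption for the demo, not a figure from any platform.

```python
# A rough illustration of a variable intermittent reward schedule:
# each "scroll" pays off unpredictably, like pulls on a slot machine.

import random

def scroll_feed(scrolls=30, hit_rate=1 / 8, seed=42):
    """Simulate scrolling a feed where rewards arrive at random."""
    random.seed(seed)
    rewards = 0
    for i in range(1, scrolls + 1):
        if random.random() < hit_rate:          # unpredictable payoff
            rewards += 1
            print(f"scroll {i:2d}: something great! keep going...")
    print(f"{rewards} rewards in {scrolls} scrolls -- just one more?")

scroll_feed()
```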

It’s probably not a coincidence that the media that are most dependent on advertising are also the media that avoid locking our attention on a single topic for an extended period. This makes social media the perfect match for interruptive ad forms. They are simply slotted into the never-ending scroll of memes.

Social media has only been around for a little over 2 decades. It has been a significant part of our lives for half that time. If even a little bit of what I suspect is happening is indeed taking place, that scares the hell out of me. It would mean that no other medium has changed us so much and so quickly.

That is something worth paying attention to. 

How Social Media is Rewiring our Morality

Just a few short months ago, I never dreamed that one of the many fault lines in our society would be who wore a face mask and who didn’t. But on one day last week, most of the stories on CNN.com were about just that topic.

For reasons I’ll explain at the end of this post, the debate has some interesting moral and sociological implications. But before we get to that, let’s address this question: What is morality anyway?

Who’s On First?

In the simplest form possible, there is one foundational evolutionary spectrum underlying what we consider our own morality: Are we more inclined to worry about ourselves or to worry about others? Each of us plots our own morals somewhere on this spectrum.

At one end we have the individualist, the one who continually puts “me first.” Typically, the morals of those focused only on themselves concern individual rights, freedoms and beliefs specific to them. That concern does not extend to anyone considered outside their own “in” group.

As we move across the spectrum, we next find the familial moralist: Those who worry first about their own kin. Morality is always based on “family first.”

Next come those who are more altruistic, as long as that altruism is directed at those who share common ground with them. You could call this the “we first” group.

Finally, we have the true altruist, who believes in a type of universal altruism and that a rising tide truly lifts all boats.  

This concept of altruism was always a bit of a puzzle for early evolutionists. In sociological parlance, it’s called proactive prosociality — doing something nice for someone who is not closely related to you without being asked. It seems at odds with the concept of the Selfish Gene, first introduced by evolutionary biologist Richard Dawkins in his book of the same name in 1976.

But as Dawkins has clarified over and over again since the publication of the book, selfish genes and prosociality are not mutually exclusive. They are, in fact, symbiotic.

Moral Collaboration

We have spent about 95% of our entire time as a species as hunter-gatherers. If we have evolved a mechanism of morality, it would make sense for it to be most functional in that type of environment.

Hunter-gatherer societies need to collaborate. This is where the seeds of reciprocal altruism can be found. A group of people who work together to ensure continued food supplies will outlive and out-reproduce a group of people who don’t.  From a selfish gene perspective, collaboration will beat stubborn individualism.

But this type of collaboration comes with an important caveat: It only applies to individuals that live together in the same communal group.

Social conformity acts as a manual override on our own moral beliefs. Even in situations where we may initially have a belief of what is right and wrong, most of us will end up going with what the crowd is doing.

It’s an evolutionary version of the wisdom of crowds. But our evolved social conformity safety net comes with an important caveat: it assumes that everyone in the group is  in the same physical location and dealing with the same challenge.  

There is also a threshold effect that determines how likely we are to conform. How we act in any given situation will depend on a number of factors: how strong our existing beliefs are, the situation we’re in, and how the crowd is acting. This makes sense. Our conformity is inversely related to our level of perceived knowledge. The more we think we know, the less likely it is that we’ll conform to what the crowd is doing.

We should expect that a reasonably “rugged” evolutionary environment where survival is a continual struggle would tend to produce an optimal moral framework somewhere between familial and community altruism, where the group benefits from collaboration but does not let its guard down against outside threats.

But something interesting happens when the element of chronic struggle is removed, as it is in our culture. It appears that our morality tends to polarize to opposite ends of the spectrum.

Morality Rewired

What happens when our morality becomes our personal brand, part of who we believe we are? When that happens, our sense of morality migrates from the evolutionary core of our limbic brain to our cortex, the home of our personal brand. And our morals morph into a sort of tribal identity badge.

In this case, social media can short-circuit the evolutionary mechanisms of morality.

For example, there is a proven correlation between prosociality and the concept of “watching eyes.” We are more likely to be good people when we have an audience.

But social media twists the concept of audience and can nudge our behavior from the prosocial to the more insular and individualistic end of the spectrum.

The success of social conformity and the wisdom of crowds depends on a certain heterogeneity in the ideological makeup of the crowd. The filter bubble of social media strips this from our perceived audience, as I have written. It reinforces our moral beliefs by surrounding us with an audience that also shares those beliefs. The confidence that comes from this tends to push us away from the middle ground of conformed morality toward outlier territory. Perhaps this is why we’re seeing the polarization of morality that is all too evident today.

As I mentioned at the beginning, there may never have been  a more observable indicator of our own brand of morality than the current face-mask debate.

In an article on Businessinsider.com, Daniel Ackerman compared it to the crusade against seat belts in the 1970s. Certainly when it comes to our perceived individual rights and not wanting to be told what to do, there are similarities. But there is one crucial difference. You wear seat belts to save your own life. You wear a face mask to save other lives.

We’ve been told repeatedly that the main purpose of face masks is to stop you spreading the virus to others, not the other way around. That makes the decision of whether you wear a face mask or not the ultimate indicator of your openness to reciprocal altruism.

The cultural crucible in which our morality is formed has changed. Our own belief structure of right and wrong is becoming more inflexible. And I have to believe that social media may be the culprit.

A.I. and Our Current Rugged Landscape

In evolution, there’s something called the adaptive landscape. It’s a complex concept, but in the smallest nutshell possible, it refers to how fit species are for a particular environment. In a relatively static landscape, status quos tend to be maintained. It’s business as usual. 

But a rugged adaptive landscape — one beset by disruption and adversity — drives evolutionary change through speciation, the introduction of new and distinct species.

The concept is not unique to evolution. Adapting to adversity is a feature of all complex, dynamic systems. Our economy has its own version: economist Joseph Schumpeter called it the gale of creative destruction.

The same is true for cultural evolution. When shit gets real, the status quo crumbles like a sandcastle at high tide. When it comes to life today and everything we know about it, we are definitely in a rugged landscape. COVID-19 might be driving us to our new future faster than we ever suspected. The question is, what does that future look like?

Homo Deus

In his follow-up to his best-seller “Sapiens: A Brief History of Humankind,” author Yuval Noah Harari takes a shot at predicting just that. “Homo Deus: A Brief History of Tomorrow” looks at what our future might be. Written well before the pandemic (in 2015), the book deals frankly with the impending irrelevance of humanity.

The issue, according to Harari, is the decoupling of intelligence and consciousness. Once we break the link between the two, the human vessels that have traditionally carried intelligence become superfluous. 

In his book, Harari foresees two possible paths: techno-humanism and Dataism. 

Techno-humanism

In this version of our future, we humans remain essential, but not in our current form. Thanks to technology, we get an upgrade and become “super-human.”

Dataism

Alternatively, why do we need humans at all? Once intelligence becomes decoupled from human consciousness, will it simply decide that our corporeal forms are a charming but antiquated oddity and just start with a clean slate?

Our Current Landscape

Speaking of clean slates, many have been talking about the opportunity COVID-19 has presented to us to start anew. As I was writing this column, I received a press release from MIT promoting a new book, “Building the New Economy,” edited by Alex Pentland. I haven’t read it yet, but based on the first two lines in the release, it certainly seems to be following this type of thinking: “With each major crisis, be it war, pandemic, or major new technology, there has been a need to reinvent the relationships between individuals, businesses, and governments. Today’s pandemic, joined with the tsunami of data, crypto and AI technologies, is such a crisis.”

We are intrigued by the idea of using the technologies we have available to us to build a societal framework less susceptible to inevitable Black Swans. But is this just an invitation to pry open Pandora’s Box and allow the future Yuval Noah Harari is warning us about?

The Debate 

Harari isn’t the only one seeing the impending doom of the human race. Elon Musk has been warning us about it for years. As we race to embrace artificial intelligence, Musk sees in it the biggest threat to human existence we have ever faced.

“I am really quite close, I am very close, to the cutting edge in AI and it scares the hell out of me,” warns Musk. “It’s capable of vastly more than almost anyone knows and the rate of improvement is exponential.”

There are those that pooh-pooh Musk’s alarmism, calling it much ado about nothing. Noted Harvard cognitive psychologist and author Steven Pinker, whose rose-colored vision of humanity’s future reliably trends up and to the right, dismissed Musk’s warnings with this: “If Elon Musk was really serious about the AI threat, he’d stop building those self-driving cars, which are the first kind of advanced AI that we’re going to see.”

In turn, Musk puts Pinker’s Pollyanna perspective down to human hubris: “This tends to plague smart people. They define themselves by their intelligence and they don’t like the idea that a machine could be way smarter than them, so they discount the idea — which is fundamentally flawed.”

From Today Forward

This brings us back to our current adaptive landscape. It’s rugged. The peaks and valleys of our day-to-day reality are more rugged than they have ever been — at least in our lifetimes.

We need help. And when you’re dealing with a massive threat that involves probability modeling and statistical inference, more advanced artificial intelligence is a natural place to look. 

Would we trade more invasive monitoring of our own bio-status and aggregation of that data to prevent more deaths? In a heartbeat.

Would we put our trust in algorithms that can instantly crunch vast amounts of data our own brains couldn’t possibly comprehend? We already have.

Would we even adopt connected devices constantly streaming the bits of data that define our existence to some corporate third party or government agency in return for a promise of better odds that we can extend that existence? Sign us up.

We are willingly tossing the keys to our future to the Googles, Apples, Amazons and Facebooks of the world. As much as the present may be frightening, we should consider the steps we’re taking carefully.

If we continue rushing down the path towards Yuval Noah Harari’s Dataism, we should be prepared for what we find there: “This cosmic data-processing system would be like God. It will be everywhere and will control everything, and humans are destined to merge into it.”

The Mother of all Mood Swings

How are you doing? 

Yes, you. 

I know how I’m doing — today, anyway. It varies day to day. It depends on the news. It depends on the weather. It depends on Trump’s Twitter stream.

Generally, I’m trying to process the abnormal with the tools I have. I don’t know precisely how you’re doing, but I suspect you’re going through your own processing with your own tools.

I do know one thing. The tools I have are pretty much the same tools you have, at least when we look at them in the broad strokes. It’s one of the surprising things about humans. We all go through some variation of the same process when we deal with life’s big events. 

Take grief and other traumatic life changes. We’re pretty predictable in how we deal with them. So predictable, in fact, that there’s a psychological model with its own acronym for it: DABDA. It’s known as the five stages of grief: denial, anger, bargaining, depression and acceptance. It was first introduced in 1969 by Swiss-American psychiatrist Elisabeth Kübler-Ross.

Noted American neurobiologist and author Robert Sapolsky marvels at the universality of our processing of grief in his book “The Trouble with Testosterone”: “Poems, paintings, symphonies by the most creative artists who ever lived, have been born out of mourning… We cry, we rage, we demand that the oceans’ waves stop, that the planets halt their movements in the sky, all because the earth will no longer be graced by the one who sang lullabies as no one else could; yet that, too, is reducible to DABDA. Why should grief be so stereotypical?”

But it’s not just bad stuff we process this way. If you look at how we process any big change, you’ll find there are pretty predictable stages we humans go through.

So why are we so predictable in how we deal with change? In general, these are all variations of the sensemaking cycle, which is how we parse the world around us. We start with a frame — an understanding of what we believe to be true — and we constantly compare this to new information we get from our environment. 

Because we are cognitively energy-efficient, we are reluctant to throw out old frames and adopt new ones, especially when those new ones are being forced upon us. It’s just the way we’re wired. 

But life change is usually a solo journey, and we rely on anchors to help us along the way. We rely on our psychoscapes, the cognitive environments in which our minds typically operate. Friends, families, favorite activities, social diversions: these are the things that we can rely on for an emotional boost, even if only temporarily.

But what if everyone is experiencing trauma at the same time? What if our normal psychoscape is no longer there? What then?

Then we enter the SNAFU zone.

SNAFU is an acronym coined in World War II:  “situation normal, all f*cked up.”  It was used to refer to a situation that is bad, but is also a normal state of affairs. 

We are talking a lot about the new normal. But here’s the thing: The new situation normal is going to be a shit show, guaranteed to be all f*cked up. And it’s going to be that way because everyone  — and I mean everyone — is going to be going through the Mother of all Mood Swings. 

First of all, although the stages of managing change may be somewhat universal, the path we take through them is anything but. Some will get stuck at the denial and anger stage and storm the state legislature with assault weapons demanding a haircut. Some are already at acceptance, trying to navigate through a world that is officially SNAFU. We are all processing the same catalyst of change, but we’re at different places in that process. 

Secondly, the psychological anchors we depend on may not be there for us. When we are going through collective stress, we tend to rely on community. We revert to our evolutionary roots of being natural herders. Without exception, the way humans have always dealt with massive waves of change is to come together with others. And this is where a pandemic that requires social distancing throws a king-sized wrench in the works. We can’t even get a hug to help us through a bad day.

As the levels of our collective stress climb, there are bound to be a lot of WTF moments. Nerves will fray and tempers will flare. We will be walking on eggshells. There will be little patience for perspectives that differ from our own. Societal divides will deepen and widen. The whole world will become moodier than a hormonal teenager. 

Finally, we have all of the above playing out in a media landscape that was already fractured to an unprecedented level going into this. All the many things that are FU in this particular SNAFU will be posted, tweeted, shared and reshared. There will be no escape from it. 

Unlike the hormonal teenager, we can’t send COVID-19 to its room.

Media’s Mea Culpa Moment

It’s hard to see when you’re stuck inside. And I’m not talking about self-isolating during a pandemic. I’m talking about our perspective of the media landscape.

The Problem with Politics

Currently, the concept of “Us vs Them” is embedded into our modern idea of politics. Populist politics, by its very nature, needs an enemy to blame. It forces you to pick sides. It creates a culture of antagonism, eroding social capital and dismantling any bipartisan trust. We are far down this path. Perhaps too far to turn back. But we have to realize that no nation or region in modern history has ever prospered in the long term by wantonly destroying social capital. There are many examples of how regionalism, xenophobia and populism have caused nations to regress. There is no example of these things leading to prosperity and long-term success. Not one. Yet this is the path we seem to have chosen.

If you look at the media, it’s politicians who are to blame for all our problems, whether they’re on the right or the left. Based on most mainstream media, with its inherent left-wing bias, there is a personification of the problem, primarily in the President: “If Trump wasn’t there, things would be better.” But the problem would still persist. Much as we left-leaning individuals found Obama a more palatable choice for president, the problem was here then as well. That’s how we got to where we are today.

The sad truth is, Trump didn’t cause the problem. He just capitalized on it. So we have to look elsewhere for where the problem originated. And that leads us to an uncomfortable reality: we are the problem. And by “we,” I mean the media, particularly in the U.S. But it’s hard to see that when you’re looking from the inside. So last week I changed my perspective.

Because of COVID-19, we should all be focused on the same story, perhaps for the first time in our lives. This gives us an unprecedented opportunity to compare the media landscapes against what should be a fairly objective baseline.

The Canadian Litmus Test

I’m Canadian — and for Americans, I know that living next to Canada is like having “The Simpsons'” Ned Flanders for a neighbor. We seem nice and polite, but you can’t help feeling that we’re constantly judging you. 

But Canada does offer Americans the chance to compare cultures that have much in common but with some key critical differences. It was this comparison that geographer, historian and anthropologist Jared Diamond employed in his latest book, “Upheaval: Turning Points for Nations in Crisis.”

“Many of Canada’s social and political practices are drastically different from those of the U.S., such as with regards to national health plans, immigration, education, prisons, and balance between community and individual interests,” he writes. “Some problems that Americans regard as frustratingly insoluble are solved by Canadians in ways that earn widespread public support.”

As a case in point, Canada has handled COVID-19 in a notably different way. Our pandemic response has been remarkably non-partisan. For example, we have the unusual spectacle of our most Trump-like politician, Ontario Conservative Premier Doug Ford, stepping up as a compassionate leader who is working effectively with Liberal Prime Minister Justin Trudeau and his own opposition.

The Myth of Impartial Reporting

This is not the case in the U.S. Because of its political divides, America can’t even agree on what should be a simple presentation of fact on the story that affects us all equally.

A recent Pew study found that where you turn for your news will significantly impact your understanding of things like when a vaccine will be ready or whether the coronavirus came about naturally.

To check this out, I did a comparison of the three most popular U.S. news sites on April 29.

Let’s start with CNN.  Of the 28 news items featured on the home page “above the fold,” 16 had an overt left bias. The most prominent was  inflammatory, dealing with Trump’s handling of the pandemic and his blowing up at press criticism. A Biden story on the Tara Reade accusations was buried in small print links near the bottom.

Now let’s go to the other side of the spectrum: Fox News, which also featured 28 news items “above the fold.” Of these, 14 had an overt right bias. Again, the headline was inflammatory, calling out Biden on the Tara Reade allegations. There was no mention of any Trump temper tantrums on the home page.

Finally, MSNBC’s headline story was actually focused on COVID-19 and the Remdesivir trial results and had no political bias. The site only had nine news items above the fold. Four of these had a left-leaning bias. 

The home pages bore almost no resemblance to each other. You would be hard-pressed to understand that each of these sites represented the news from the same country on the same day.

Now, let’s compare with Canada’s top two news sites, CBC and Global News. 

About 60% of the stories covered were the same on both sites and given roughly the same priority. The same lead story was featured on both — about a missing Canadian military helicopter. On CBC, only one story appeared to have any political bias at all, and it was definitely not explicit; none of Global’s did.

That’s in comparison to the American news sites, where over half the stories featured — and all the lead ones — were designed and written to provoke anger, pitting “us” against “them.”

Once mainstream media normalizes this antagonistic approach, it then gets shunted over to social media, where it’s stripped of context, amplified and shared. Mainstream media sets the mood of the nation, and that mood is anger. Social media then whips it into a frenzy. 

Both left- and right-wing media outlets are equally guilty. CNN’s overriding editorial tone is, “Can you believe how stupid they are?” Fox’s is, “They think you’re stupid and they’re trying to pull a fast one on you.” No wonder there is no common ground where public discourse across the political divide can begin.

Before COVID-19, perhaps we could look at this with a certain amount of resignation and even bemusement. If you’re “us” there is a certain satisfaction in vilifying “them.” But today, the stakes are too high. People are dying because of it. Somehow, the media has to turn America’s ideological landscape from a war zone into a safe space.

Our Complicated Relationship with Heroes

It’s not really surprising that we think more about heroes in times of adversity. Many of our most famous superheroes were born in the crucible of crisis: Batman, Superman, Wonder Woman and Captain America were all created during the Great Depression or the early years of World War II.

Today, we are again craving heroes. They are fabricated out of less fantastic stuff: taxi drivers who give free rides to the airport for patients, nurses who staff the front lines of our hospitals, chefs who provide free food to essential workers and a centenarian (as of tomorrow) who is raising millions for his national health care system by walking around his garden every day.

These are ordinary people who are doing extraordinary things. They are being raised to the rank of hero thanks to the surging tides of social media.

Again, this isn’t surprising. We are still in the early stages of what, for most of us, will likely be the defining crisis of our lifetimes. We desperately need some good news.

In fact, everybody’s favorite paper salesman/CIA operative/husband of Mary Poppins — John Krasinski — has curated a weekly webcast collection of feel-good salutes to local heroes called “Some Good News.” As of the writing of this post, it had collectively racked up close to 50 million views.

Krasinski has himself become a hero by doing things like throwing a surprise virtual prom for all the grads who were deprived of theirs by the pandemic, or letting a group of ER nurses take the field at an eerily empty Fenway Park.

Having heroes should be a good thing. They should inspire us to be better people  — to become heroes ourselves. Right?

Well…

It’s complicated.

On the surface of it, hero worship is probably a good thing, especially if our heroes are doing things we all could do, if we were so inclined.  “If a 99.9-year-old man can raise millions for a national health service, there must be something I can do.”

On that very theme, the Heroic Imagination Project was formed to help us all be heroes. Headed up by famed psychologist Dr. Philip Zimbardo, HIP came out of his infamous Stanford Prison Experiment. “If,” reasoned Zimbardo, “we all have the capacity to be evil, given the right circumstances, we should also all have the capacity to be heroes, again under the right circumstances.”

But there are a few hurdles between us and heroism. One of them, ironically, comes part and parcel with the very idea of hero worship.

In an extensive analysis of how superheroes reflect the American mythology of their own times, Dublin writer Sally Rooney shows how a country uses its heroes to reassure itself of its own goodness: “The superhero makes sense in times of crisis. Reducing the vast complex of nationhood into the body of an individual means periods of geopolitical turmoil can be repackaged as moments of psychological stress. In the mirror of the superhero, America is reassured of its good qualities. Physical strength is good, as is the ability to make wisecracks under pressure. Masculinity is good, and women are okay as long as they can do very high kicks while making wisecracks. Once America is on the scene, order can be restored.”

So, we use heroes as a moral baseline to make us feel better about our collective selves. They can help us reaffirm our faith in our national ideologies. A picture of a nurse in scrubs silently staring down a protester demanding a haircut makes us feel that things are still OK in the heartland of the nation. It’s a reverse adaptation of the Lake Wobegon effect: “If this person represents the best of what we (as Americans) are, then the average can’t be all that bad.”

Unfortunately, this leads right into the second hurdle, the Bystander Effect: “If something happens that demands heroic action and there are a lot of people around, surely there’s a hero in the crowd that will step forward before I have to.” Being a hero demands a certain amount of sacrifice. As long as someone else is willing to make that sacrifice, we don’t have to — but we can still feel good about ourselves by giving it a like,  or, if we’re truly motivated, sharing it on our feed.

As the greatest real-time sociological experiment in our lifetime continues to play out, we might have yet another example of an unintended consequence brought on by social media. Based on our Facebook feeds, it appears that we have more heroes than ever. That’s great, but will it encourage us or keep us from stepping up and becoming heroes ourselves?

A New Definition of Social

I am an introvert. My wife is an extrovert. Both of us have been self-isolating for about 5 weeks now. I don’t know if our experiences are representative of introverts and extroverts as groups, but my sample size has – by necessity – been reduced to an “n” of 2. Our respective concepts of what it means to be social have been disruptively redefined, but in very different ways.

The Extro-Version

You’ve probably heard of Dunbar’s Number. It was proposed by anthropologist Robin Dunbar. It’s the “suggested cognitive limit to the number of people with whom one can maintain stable social relationships.” The number, according to Dunbar, is about 150. But that number is not an absolute. It’s a theoretical limit. Some of us can juggle way more social connections than others.

My wife’s EQ (emotional quotient) is off the charts. She has a need to stay emotionally connected to a staggering number of people. Even in normal times, she probably “checks in” with dozens of people every week. Before COVID-19, this was done face-to-face whenever possible.

Now, her empathetic heart feels an even greater need to make sure everyone is doing okay. But she has to do it through socially distanced channels. She uses text messaging a lot. But she also makes at least a few phone calls every day for those in her network who are not part of the SMS or social media universe.

She has begun using Zoom to coordinate virtual get-togethers of a number of her friends. Many in this circle are also extroverts. A fair number of them are – like my wife – Italian. You can hear them recharging their social batteries as the energy and volume escalates. It’s not cappuccino and biscotti but they are making do with what they’ve got.

Whatever the channel, it has been essential for my wife to maintain this continuum of connection.

The Intro-Version

There are memes circulating that paint the false picture that the time has finally come for us introverts. “I’ve been practicing for this my entire life,” says one. They consistently say that life in lockdown is much harder for extroverts than introverts. They even hint that we should be in introvert’s heaven. They are wrong. I am not having the time of my life.

I’m not alone. Other introverts are having trouble adjusting to a social agenda being forced upon them by their self-isolated extrovert friends and colleagues. We introverts seldom get to write the rules of social acceptability, even in a global pandemic.

If you type “Are introverts more likely” into Google, it will suggest the following ways to complete that sentence: “to be depressed”, “to be single”, “to have anxiety”, “to be alcoholic”, and “to be psychopaths”. The world is not built for introverts.

To understand introversion vs. extroversion is to understand social energy. Unlike my wife, for whom social interactions are a source of renewal, for me they are a drain on energy. If I’m going to make the effort, it had better be worth my while. A non-introvert can’t understand that. It’s often interpreted as aloofness, boredom or just being rude. It’s none of these. It’s just our batteries being run down.

Speaking for myself, I don’t think most introverts are antisocial. We’re just “differently” social. We need connections the same as extroverts. But those connections are of a certain kind. It’s true that introverts are not good at small talk. But under the right circumstances, we do love to talk. Those circumstances are just more challenging in the current situation.

Take Zoom, for instance. My wife, the extrovert, and I, the introvert, have done some Zoom meetings side by side. I have noticed a distinct difference in how we Zoom. But before I talk about that, let me set up a comparison with a more typical example of an introvert’s version of hell: the dreaded neighborhood house party.

As an introvert in this scenario, I would be constantly reading body language and non-verbal cues to see if there was an opportunity to reluctantly crowbar my way into a conversation. I would only do so if the topic interested me. Even then, I would be subconsciously monitoring my audience to see if they looked bored. On the slightest sign of disinterest, I would awkwardly wind down the conversation and retreat to my corner.

It’s not that I don’t like to talk. But I much prefer sidebar one-on-one conversations. I don’t do well in environments where there is too much going on. In those scenarios, introverts tend to clam up and just listen.

Now, consider a Zoom “Happy Hour” with a number of other people. All of that non-verbal bandwidth we introverts rely on to pick and choose where we expend our limited social energy is gone. Although Zoom adds a video feed, it’s a very low-fidelity substitute for an in-the-flesh interaction.

With all this mental picking and choosing happening in the background, you can understand why introverts are slow to jump into the conversational queue and, when we finally do, we find that someone else (probably an extrovert) has started talking first. I’m constantly being asked, “Did you say something Gord?”, at which point everyone stops talking and looks at my little Zoom cubicle, waiting for me to talk. That, my friends, is an introvert’s nightmare.

Finally, I Get the Last Word

Interestingly, neither my wife nor I are using Facebook much for connection. She has joined a few Facebook groups, one of which is a fan club for our provincial health officer, Dr. Bonnie Henry. Dr. Henry has become the most beloved person in B.C.

And I’m doing what I always tell everyone else not to do: follow my Facebook newsfeed and go into self-isolated paroxysms of rage about the Pan-dumb-ic and the battle between science and stupidity.

There is one social sacrifice that both my wife and I agree on. The thing we miss most is the ability to hug those we love.