The Cost of Not Being Curious

The world is experiencing a wave of Ostrichitis of pandemic proportions.

Now, maybe you haven’t heard of Ostrichitis. But I’m willing to bet you’re showing at least some of the symptoms:

  • Avoiding newscasts, especially those that feature objective and unbiased reporting
  • Quickly scrolling past any online news items in your feed that look like they may be uncomfortable to read
  • Dismissing out of hand information coming from unfamiliar sources

These are the signs of Ostrichitis – or the Ostrich Effect – and I have all of them. This is actually a psychological effect, more pointedly called willful ignorance, which I wrote about a few years ago. And from where I’m observing the world, we all seem to have it to one extent or another.

I don’t think this avoidance of information comes as a shock to anyone. The world is a crappy place right now. And we all seem to have gained comfort from adopting the folk wisdom that “no news is good news.” Processing bad news is hard work, and we just don’t have the cognitive resources to crunch through endless cycles of catastrophic news. If the bad news affirms our existing beliefs, it makes us even madder than we already were. If it runs counter to our beliefs, it forces us to spin up our sensemaking mechanisms and reframe our view of reality. Either way, there are way more fun things to do.

A recent study from the University of Chicago attempted to pinpoint when children start avoiding bad news. The research team found that while young children don’t tend to put boundaries around their curiosity, as they age they start avoiding information that challenges their beliefs or threatens their well-being. The threshold seems to be about 6 years old. Before that, children actively seek out information of all kinds (as any parent barraged by never-ending “Whys” can tell you). After that, children start strategizing about the types of information they pay attention to.

Now, like everything about humans, curiosity tends to be an individual thing. Some of us are highly curious, and some of us religiously avoid seeking new information. But even if we are a curious sort, we may pick and choose what we’re curious about. We may find “safe zones” where we let our curiosity out to play. If things look too menacing, we may protect ourselves by curbing our curiosity.

The unfortunate part of this is that curiosity, in all its forms, is almost always a good thing for humans (even if it can prove fatal to cats).

The more curious we are, the better tied we are to reality. The lens we use to parse the world is something called a sense-making loop. I’ve often referred to this in the past. It’s a processing loop that compares what we experience with what we believe, referred to as our “frame”. For the curious, this frame is often updated to match what we experience. For the incurious, the frame is held on to stubbornly, often by ignoring new information or bending information to conform to their beliefs. A curious brain is a brain primed to grow and adapt. An incurious brain is one that is stagnant and inflexible. That’s why the father of modern-day psychology, William James, called curiosity “the impulse towards better cognition.”

When we think about the world we want, curiosity is a key factor in defining it. Curiosity keeps us moving forward. The lack of curiosity locks us in place or even pushes us backwards, causing the world to regress to a more savage and brutal place. Writers of dystopian fiction knew this. That’s why authors including H.G. Wells, Aldous Huxley, Ray Bradbury and George Orwell all made a lack of curiosity a key part of their bleak future worlds. Our current lack of curiosity is driving our world in the same dangerous direction.

For all these reasons, it’s essential that we stay curious, even if it’s becoming increasingly uncomfortable.

Lilith Fair: A Quarter Century and A Different World Ago

Lilith Fair: Building a Mystery, a new documentary released on Hulu (CBC Gem in Canada), is much more than a chronicle of a music festival. It’s a very timely statement on both the strength and fragility of community.

Lilith Fair was the festival launched in 1997 by Canadian singer/songwriter Sarah McLachlan. It was conceived as a feminine finger in the eye of a determinedly misogynistic music industry. In the late ’90s, despite a boom in talented female singer-songwriters (Tracy Chapman, Jewel, Paula Cole, Sheryl Crow, Natalie Merchant, Shawn Colvin, Lisa Loeb, Suzanne Vega and others too numerous to mention), radio stations wouldn’t run two songs by women back-to-back. Promoters also wouldn’t book two women on the same concert ticket. The feeling, based on nothing other than male intuition, was that it would be too much “femininity” for the audience to handle.

McLachlan, in her charmingly polite Canadian way, said “Fudge you!” and launched her own festival. The first one, in 1997, played almost 40 concerts over 51 days across North America. The line-up was exclusively female – 70 singers in all playing on three stages. Almost every concert sold out. Apparently, there was an audience for female talent. Lilith Fair would be repeated in 1998 and 1999, with both tours being smashing successes.

The world needed Lilith Fair in the late 90s. It wasn’t only the music industry that was misogynistic and homophobic. It was our society. The women who played Lilith Fair found a community of support unlike anything they had ever experienced in their careers. Performers who had been feeling isolated for years suddenly found support and – more than anything – understanding.

It was women who made the rules and ran the Lilith Fair show. It was okay to perform when you were 8 months pregnant. It was okay to hold your baby onstage as you performed the group encore. It was okay to bring the whole family on tour and let the kids play backstage while you did your set. These were things that were – up until then – totally foreign in the music industry. It was the very definition of community – diverse people having something in common and joining together to deal from a position of strength.

But it didn’t happen overnight. It took a while – and a lot of bumping into each other backstage – for the community to gel. It also needed a catalyst, which turned out to be Amy Ray and Emily Saliers – officially known as the Indigo Girls. It was their outgoing friendliness that initially broke the ice, “because we were so gay and so puppy dog-like.”

This sense of community extended beyond the stage to the thousands who attended: men and women, old and young, straight and gay. It didn’t matter – Lilith Fair was a place where you would be accepted and understood. As documentary producer Dan Levy (of Schitt’s Creek fame) – who was 12 years old when he attended and was yet to come out – said, “Being there was one of the earliest memories I’ve had of safety.”

The unity and inclusiveness of Lilith Fair stood in stark contrast to another festival of the same era – Woodstock 99. There, toxic masculinity from acts like Limp Bizkit singer Fred Durst and Kid Rock swung the vibe of the event heavily towards anarchy and chaos rather than community.

But while Lilith Fair showed the importance of community, it also showed how fragile community could be. The festival became the butt of jokes on late-night television (including one particularly cringe-worthy one by Jay Leno about Paula Cole’s body hair) and a target for those who sought to diminish its accomplishments and importance. Finally, at the end of the 1999 tour, McLachlan had had enough. The last concert was played in the rain at Edmonton, Alberta on August 31st.

McLachlan did try to revive Lilith Fair in 2010, but it was a complete failure. Whatever lightning in a bottle she had captured the first time was gone. The world had passed it by. The documentary didn’t dwell on this other than offering a few reasons why this might be. Perhaps Lilith Fair wasn’t needed anymore. Maybe it had done its job. After all, women – including Taylor Swift, Madonna, Pink and Lady Gaga – had mounted some of the top tours of that time.

Or maybe it had nothing to do with the industry. Maybe it had everything to do with us, the audience.

The world of 1999 was a very different place than the world of 2010. Community was in the midst of being redefined from those sharing a common physical location to those sharing a common ideology in online forums. And that type of community didn’t require a coming together. If anything, those types of communities kept us apart, staring at a screen – alone in our little silos.

According to the American Time Use Survey, time spent socializing in person has been on a steady decline since 2000. This is especially true for those under the age of 25, the prime market for music festivals. When we do venture forth to see a concert, we are looking for spectacle, not community. The world is moving too fast for the slow, sweet magic that made Lilith Fair so special to coalesce.

At the end of the documentary, Sarah McLachlan made it clear that she’ll never attempt to bring Lilith Fair back to life. It was a phenomenon of that time. And that is sad – sad indeed.

When Did the Future Become So Scary?

The TWA hotel at JFK airport in New York gives one an acute case of temporal dissonance. It’s a step backwards in time to the “Golden Age of Travel” – the 1960s. But even though you’re transported back 60 years, it seems like you’re looking into the future. The original space – the TWA Flight Center – was designed by Eero Saarinen and opened in 1962. This was a time when America was in love with the idea of the future. Science and technology were going to be our saving grace. The future was going to be a utopian place filled with flying jet cars, benign robots and gleaming, sexy white curves everywhere. The TWA Flight Center was dedicated to that future.

It was part of our love affair with science and technology during the 60s. Corporate America was falling over itself to bring the space-age fueled future to life as soon as possible. Disney first envisioned the community of tomorrow that would become Epcot. Global Expos had pavilions dedicated to what the future would bring. There were four World Fairs over 12 years, from 1958 to 1970, each celebrating a bright, shiny white future. There wouldn’t be another for 22 years.

This fascination with the future was mirrored in our entertainment. Star Trek (pilot in 1964, series start in 1966) invited all of us to boldly go where no man had gone before, namely a future set roughly three centuries from then. For those of us of a younger age, the Jetsons (original series from 1962 to 63) indoctrinated an entire generation into this religion of future worship. Yes, tomorrow would be wonderful – just you wait and see!

That was then – this is now. And now is a helluva lot different.

Almost no one – especially in the entertainment industry – is envisioning the future as anything other than an apocalyptic hellhole. We’ve done an about-face and are grasping desperately for the past. The future went from being utopian to dystopian, seemingly in the blink of an eye. What happened?

It’s hard to nail down exactly when we went from eagerly awaiting the future to dreading it, but it appears to be sometime during the last two decades of the 20th Century. By the time the clock ticked over to the next millennium, our love affair was over. As Chuck Palahniuk, author of the 1999 novel Invisible Monsters, quipped, “When did the future go from being a promise to a threat?”

Our dread about the future might just be a fear of change. As the future we imagined in the 1960’s started playing out in real time, perhaps we realized our vision was a little too simplistic. The future came with unintended consequences, including massive societal shifts. It’s like we collectively told ourselves, “Once burned, twice shy.” Maybe it was the uncertainty of the future that scared the bejeezus out of us.

But it could also be how we got our information about the impact of science and technology on our lives. I don’t think it’s a coincidence that our fear of the future coincided with the decline of journalism. Sensationalism and endless punditry replaced real reporting just about the time we started this about-face. When negative things happened, they were amplified. Fear was the natural result. We felt out of control, and we kept telling ourselves that things never used to be this way.

The sum total of all this was the spread of a recognized psychological affliction called Anticipatory Anxiety – the conviction that the future is going to bring bad things down upon us. This went from being a localized phenomenon (“my job interview tomorrow is not going to go well”) to a widespread angst (“the world is going to hell in a handbasket”). Call it Existential Anticipatory Anxiety.

Futurists are – by nature – optimists. They believe things will be better tomorrow than they are today. In the Sixties, we all leaned into the future. The opposite of this is something called Rosy Retrospection, and it often comes bundled with Anticipatory Anxiety. It is a known cognitive bias that comes with a selective memory of the past, tossing out the bad and keeping only the good parts of yesterday. It makes us yearn to return to the past, when everything was better.

That’s where we are today. It explains the worldwide swing to the right. MAGA is really a 4-letter encapsulation of Rosy Retrospection – Make America Great Again! Whether you believe that or not, it’s a message that is very much in sync with our current feelings about the future and the past.

As writer and right-leaning political commentator William F. Buckley said, “A conservative is someone who stands athwart history, yelling Stop!”

It’s Tough to Consume Conscientiously

It’s getting harder to be both a good person and a wise consumer.

My parents never had this problem when I was a kid. My dad was a Ford man. Although he hasn’t driven for 10 years, he still is. If you grew up in the country, your choices were simple – you needed a pickup truck. And in the 1960s and 70s, there were only three choices: Ford, GMC or Dodge. For dad, the choice was Ford – always.

Back then, brand relationships were pretty simple. We benefited from the bliss of ignorance. Did the Ford Motor Company do horrible things during that time? Absolutely. As just one example, they made a cost-benefit calculation and decided to keep the Pinto on the road even though they knew it tended to blow up when hit from the rear. There is a corporate memo saying – in black and white – that it would be cheaper to settle the legal claims of those who died than to fix the problem. The company was charged with reckless homicide. It doesn’t get less ethical than that.

But that didn’t matter to Dad. He either didn’t know or didn’t care. The Pinto Problem, along with the rest of the shady stuff done by the Ford Motor Company, including bribes, kickbacks and improper use of corporate funds by Henry Ford II, was not part of Dad’s consumer decision process. He still bought Ford. And he still considered himself a good person. The two things had little to do with each other.

Things are harder now for consumers. We definitely have more choice, and those choices are harder, because we know more. Even buying eggs becomes an ethical struggle. Do we save a few bucks, or do we make a chicken’s life a little less horrible?

Let me give you the latest example from my life. Next year, we are planning to take our grandchildren to a Disney theme park. If our family has a beloved brand, it would be Disney. The company has been part of my kids’ lives in one form or another since they were born, and we all want it to be part of their kids’ lives as well.

Without getting into the whole debate, I personally have some moral conflicts with some of Disney’s recent corporate decisions. I’m not alone. A Facebook group for those planning a visit to this particular park has recently seen posts from those agonizing over the same issue. Does taking the family to the park make us complicit in Disney’s actions that we may not agree with? Do we care enough to pull the plug on a long-planned park visit?

This gets to the crux of the issue facing consumers now – how do we balance our beliefs about what is wrong and right with our desire to consume? Which do we care more about? The answer, as it turns out, seems to almost always be to click the buy button as we hold our noses.

One way to make that easier is to tell ourselves that one less visit to a Disney park will make virtually no impact on the corporate bottom line. Depriving ourselves of a long-planned family experience will make no difference. And – individually – this is true. But it’s exactly this type of consumer apathy which, when aggregated, allows corporations to get away with being bad moral characters.

Even if we want to be more ethically deliberate in our consumer decisions, it’s hard to know where to draw the line. Where are we getting our information about corporate behavior from? Can it be trusted? Is this a case of one regrettable action, or is there a pattern of unethical conduct? These decisions are always complex, and coming to any decision that involves complexity is always tricky.

To go back to a simpler time, my grandmother had a saying that she applied liberally to any given situation, “What does all this have to do with the price of tea in China?” Maybe she knew what was coming.

The Credibility Crisis

We in the western world are getting used to playing fast and loose with the truth. There is so much that is false around us – in our politics, in our media, in our day-to-day conversations – that it’s just too exhausting to hold everything to a standard of truth. Even the skeptical amongst us no longer have the cognitive bandwidth to keep searching for credible proof.

This is by design. Somewhere in the past four decades, politicians and society’s power brokers have discovered that by pandering to beliefs rather than trading in facts, you can bend the truth to your will. Those who seek power and influence have struck paydirt in falsehoods.

In a cover story last summer in the Atlantic, journalist Anne Applebaum explains the method in the madness: “This tactic—the so-called fire hose of falsehoods—ultimately produces not outrage but nihilism. Given so many explanations, how can you know what actually happened? What if you just can’t know? If you don’t know what happened, you’re not likely to join a great movement for democracy, or to listen when anyone speaks about positive political change. Instead, you are not going to participate in any politics at all.”

As Applebaum points out, we have become a society of nihilists. We are too tired to look for evidence of meaning. There is simply too much garbage to shovel through to find it. We are pummeled by wave after wave of misinformation, struggling to keep our heads above the rising waters by clinging to the life preserver of our own beliefs. In the process, we run the risk of those beliefs becoming further and further disconnected from reality, whatever that might be. The cogs of our sensemaking machinery have become clogged with crap.

This reverses a consistent societal trend towards the truth that has been happening for the past several centuries. Since the Enlightenment of the 18th century, we have held reason and science as the compass points of our True North. These twin ideals were buttressed by our institutions, including our media outlets. Their goal was to spread knowledge. It is no coincidence that journalism flourished during the Enlightenment. Freedom of the press was constitutionally enshrined to ensure the press had both the right and the obligation to speak the truth.

That was then. This is now. In the U.S., institutions – including media, universities and even museums – are being overtly threatened if they don’t participate in the wilful obfuscation of objectivity that is coming from the White House. NPR and PBS, two of the most reliable news sources according to the Ad Fontes media bias chart, have been defunded by the federal government. Social media feeds are awash with AI slop. In a sea of misinformation, the truth becomes impossible to find. And – for our own sanity – we have had to learn to stop caring about that.

But here’s the thing about the truth. It gives us an unarguable common ground. It is consistent and independent from individual belief and perspective. As longtime senator Daniel Patrick Moynihan famously said, “Everyone is entitled to his own opinion, but not to his own facts.” 

When you trade in falsehoods, the ground is constantly shifting below your feet. The story changes to match the current situation and the desired outcome. There are no bearings to navigate by. Everyone has their own compass, and they’re all pointing in different directions.

The path the world is currently going down is troubling in a number of ways, but perhaps the most troubling is that it simply isn’t sustainable. Sooner or later in this sea of deliberate chaos, credibility is going to be required to convince enough people to do something they may not want to do. And if you have consistently traded away your credibility by battling the truth, good luck getting anyone to believe you.

Face Time in the Real World is Important

For all the advances made in neuroscience, we still don’t fully understand how our brains respond to other people. What we do know is that it’s complex.

Join the Chorus

Recent studies, including this one from the University of Rochester, are showing that when we see someone we recognize, the brain responds with a chorus of neuronal activity. Neurons from different parts of the brain fire in unison, creating a congruent response that may simultaneously pull from memory, from emotion, from the rational regions of our prefrontal cortex and from other deep-seated areas of our brain. The firing of any one neuron may be relatively subtle, but together this chorus of neurons can create a powerful response to a person. This cognitive choir represents our total comprehension of an individual.

Non-Verbal Communication

“You’ll have your looks, your pretty face. – And don’t underestimate the importance of body language!” – Ursula, The Little Mermaid

Given that we respond to people with different parts of the brain, it makes sense that we use parts of the brain we’re not consciously aware of when communicating with someone else. In 1967, psychologist Albert Mehrabian attempted to pin this down with some actual numbers, publishing a paper in which he put forth what became known as Mehrabian’s Rule: 7% of communication is verbal, 38% is tone of voice and 55% is body language.

Like many oft-quoted rules, this one is typically misquoted. It’s not that words are unimportant when we communicate something. Words convey the message. But it’s the non-verbal part that determines how we interpret the message – and whether we trust it or not.

Folk wisdom has told us, “Your mouth is telling me one thing, but your eyes are telling me another.” In this case, folk wisdom is right. We evolved to respond to another person with our whole bodies, with our brains playing the part of conductor. Maybe the numbers don’t exactly add up to Mehrabian’s neat and tidy ratio, but the importance of non-verbal communication is undeniable. We intuitively pick up incredibly subtle hints: a slight tremor in the voice, a bead of sweat on the forehead, a slight turn down of one corner of the mouth, perhaps a foot tapping or a finger trembling, a split-second darting of the eye. All this is subconsciously monitored, fed to the brain and orchestrated into a judgment about a person and what they’re trying to tell us. This is how we evolved to judge whether we should build trust or lose it.

Face to Face vs Face to Screen

Now, we get to the question you knew was coming, “What happens when we have to make these decisions about someone else through a screen rather than face to face?”

Given that we don’t fully understand how the brain responds to people yet, it’s hard to say how much of our ability to judge whether we should convey trust or withhold it is impaired by screen-to-screen communication. My guess is that the impairment is significant, probably well over 50%. It’s difficult to test this in a laboratory setting, given that it generally requires some type of neuroimaging, such as an fMRI scanner. In order to present a stimulus for the brain to respond to when the subject is strapped in, a screen is really the only option. But common sense tells me – given the sophisticated and orchestrated nature of our brain’s social responses – that a lot is lost in translation from a real-world encounter to a screen recording.

New Faces vs Old Ones

If we think of how our brains respond to faces, we realize that in today’s world, a lot of our social judgements are increasingly made without face-to-face encounters. In a case where we know someone, we will pull forward a snapshot of our entire history with that person. The current communication is just another data point in a rich collection of interpersonal experience. One would think that would substantially increase our odds of making a valid judgement.

But what if we must make a judgement on someone we’ve never met before, and have only seen through a screen; be it a TikTok post, an Instagram Reel, a YouTube video or a Facebook Post? What if we have to decide whether to believe an influencer when making an important life decision? Are we willing to rely on a fraction of our brain’s capacity when deciding whether to place trust in someone we’ve never met?

Bread and Circuses: A Return to the Roman Empire?

Reality sucks. Seriously. I don’t know about you, but increasingly, I’m avoiding the news because I’m having a lot of trouble processing what’s happening in the world. So when I look to escape, I often turn to entertainment. And I don’t have to turn very far. Never has entertainment been more accessible to us. We carry entertainment in our pocket. A 24-hour smorgasbord of entertainment media is never more than a click away. That should give us pause, because there is a very blurred line between simply seeking entertainment to unwind and becoming addicted to it.

Some years ago I did an extensive series of posts on the Psychology of Entertainment. Recently, a podcast producer from Seattle ran across the series when he was producing a podcast on the same topic and reached out to me for an interview. We talked at length about the ubiquitous nature of entertainment and the role it plays in our society. In the interview, I said, “Entertainment is now the window we see ourselves through. It’s how we define ourselves.”

That got me to thinking. If we define ourselves through entertainment, what does that do to our view of the world? In my own research for this column, I ran across another post on how we can become addicted to entertainment. And we do so because reality stresses us out: “Addictive behavior, especially when not to a substance, is usually triggered by emotional stress. We get lonely, angry, frustrated, weary. We feel ‘weighed down’, helpless, and weak.”

Check. That’s me. All I want to do is escape reality. The post goes on to say, “Escapism only becomes a problem when we begin to replace reality with whatever we’re escaping to.”

I believe we’re at that point. We are cutting ties to reality and replacing them with a manufactured reality coming from the entertainment industry. In 1985 – forty years ago – author and educator Neil Postman warned us in his book Amusing Ourselves to Death that we were heading in this direction. The calendar had just ticked past the year 1984, and the world collectively sighed in relief that the dystopian vision of George Orwell’s novel of the same name hadn’t materialized. Postman warned that it wasn’t Orwell’s future we should be worried about. It was Aldous Huxley’s forecast in Brave New World that seemed to be materializing:

“As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny ‘failed to take into account man’s almost infinite appetite for distractions.’ … Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.”

Postman was worried then – 40 years ago – that the news was more entertainment than information. Today, we long for even the kind of journalism that Postman was already warning us about. He would be aghast to see what passes for news now. 

While things unknown to Postman (social media, fake news, even the internet) are throwing a new wrinkle into our downslide into an entertainment-induced coma, it’s not exactly new. This has happened at least once before in history, but you have to go back almost 2000 years to find an example. Writing in Imperial Rome, the poet Juvenal used a phrase that summed it up – panem et circenses – “bread and circuses”:

“Already long ago, from when we sold our vote to no man, the People have abdicated our duties; for the People who once upon a time handed out military command, high civil office, legions — everything, now restrains itself and anxiously hopes for just two things: bread and circuses.”

Juvenal was referring to the strategy of the Roman emperors of providing free wheat, circus games and other entertainment to gain political power. In an academic article from 2000, historian Paul Erdkamp said the ploy was a “briberous and corrupting attempt of the Roman emperors to cover up the fact that they were selfish and incompetent tyrants.”

Perhaps history is repeating itself.

One thing we touched on in the podcast was a noticeable change in the entertainment industry itself. Scarlett Johansson noticed the 2025 Academy Awards ceremony was a much more muted affair than in years past. There was hardly any political messaging or sermons about how entertainment provided a beacon of hope and justice. In an interview with Vanity Fair, Johansson mused that perhaps it’s because almost all the major studios are now owned by Big-Tech billionaires: “These are people that are funding studios. It’s all these big tech guys that are funding our industry, and funding the Oscars, and so there you go. I guess we’re being muzzled in all these different ways, because the truth is that these big tech companies are completely enmeshed in all aspects of our lives.”

If we have willingly swapped entertainment for reality, and that entertainment is being produced by corporations who profit from addicting as many eyeballs as possible, prospects for the future do not look good.

We should be taking a lesson from what happened to Imperial Rome.

Our Memories Are Our Compass

“You can’t really know where you’re going until you know where you’ve been”

Maya Angelou

Today is Canada Day – the Canadian version of the Fourth of July. In the past decade or so, it’s been a day fraught with some existential angst, as we try to reconcile our feelings of pride with our often-glaring imperfections as Canadians. In a country known for its readiness to apologize, this is perhaps the most Canadian of Canadian holidays – a day made for wondering if we should be saying “we’re sorry.”

 This year, it will be interesting to see how Canada celebrates. As I’ve mentioned before, what is happening south of the border has caused Canadians to have a renewed burst of patriotism and pride. We may not be united on much, but we universally know we don’t want to be the 51st state. No offence (heaven forbid) but we’re good as is, President Trump. Really.

A few days ago, I happened across a little video posted to celebrate Canada. It was a montage of “Heritage Minutes” – little vignettes of our Canadian past produced since 1990 by Historica Canada. The montage was set to a song by another Canadian icon: “It’s a Good Life If You Don’t Weaken” by the Tragically Hip. The 4-minute-and-29-second video checked all the boxes guaranteed to generate the warm fuzzies for Canadians: Anne of Green Gables (check), the invention of basketball and the telephone (check), the discovery of insulin (check), the origins of Superman (check), the naming of Winnie the Pooh (check), our contributions in two World Wars (check and check). It was Canadiana distilled; more than maple syrup – which is more of an Eastern Canadian thing. More than poutine, which most Canadians had never heard of until 20 years ago. Maybe on a par with hockey.

But the montage also reminded me of some not-so-glorious Canadian moments. We were imperfect in our abhorrent treatment of immigrants in the past – especially the Chinese and Japanese. And in our ignoring – and worse, our attempts to eradicate – the incredibly rich and diverse Indigenous history and culture because it was inconvenient to our dreams of nation building.

Canada’s history is distinct from that of the U.S.A. In the last half of the 19th century and the beginning of the 20th, when immigration started in earnest, we were very much a British colony. Anyone who was not British was treated as either a necessary evil (providing the manual labor required to build a new country) or as a persona non grata. As for those who preceded us – the Indigenous population of Canada – the British saw them as an inconvenience and a potential threat, to be either tamed or systematically eradicated.

This – too – is part of Canada’s history. And we have to acknowledge that, because doing so gives us a compass to navigate both the present and the future. That montage reminds us that immigration built this country. And Canada’s thousands of years of Indigenous past needs to be recognized so the entire history of our nation can be honestly reconciled. We need to fix our bearings so they read true before we move forward.

Canadians today need to decide what we aspire to be as a nation in the future. And to do that, we need to remember where we’ve been. Do we ignore the fact that we are a nation of immigrants and are so much the richer for it? Do we conveniently forget that there were people here thousands of years before the first European set foot on Canadian soil? We need to fully understand what made Canada what it is – both good and bad – an imperfect country that still happens to be a pretty great place to live.

In the song that the montage is set to, the Tragically Hip’s Gord Downie sings:

In the forest of whispering speakers
Let’s swear that we will
Get with the times
In a current health to stay

But maybe we can do better than just maintain the status quo. If we remember where we’ve been, maybe we can do better in the future than where we are now.

Happy Canada Day!

The Presidential Post-a-Palooza Problem

As of June 3rd of this year, President Donald Trump had posted 2,262 times to Truth Social in the 132 days since his inauguration. That’s 17 posts per day – or night. According to a recent article in the Washington Post, the president’s social media use far exceeds his posting in his first term: “His posting now overshadows even the most explosive Twitter days of his first presidency: He tweeted 14 times on his biggest-posting day in early 2017, the data show — a tenth of the 138 posts his Truth Social account sent on a single day this March.”

According to the White House, this is a good thing: “President Trump is the most transparent president in history and is meeting the American people where they are to directly communicate his policies, message, and important announcements,” said White House Assistant Press Secretary Taylor Rogers.

Transparent? I suppose – as in Saran Wrap transparency – only a few microns thick and unable to stand on its own. But the biggest problem with Trump’s brand of social media transparency is that it is a pinball type of presidentialism – continually launching projectiles just to see what they bump into.

Here’s how this scenario often plays out. Trump sends out many of his missives in the middle of the night. They are posted to Truth Social, the media platform he owns and on which he is contractually obligated to post first. For comparison, X has almost 600 million users; Truth Social has about 1 percent of that – roughly 6 million. And that is hardly a diverse sampling. LA Times reporter Lorraine Ali dared to spend 24 hours on Truth Social last year, “so you don’t have to.” She found Truth Social to be like “a MAGA town hall in a ventless conference room, where an endless line of folks step up to the mic to share how the world is out to get them.”

Ali went on, “The Truth Social feed I experienced was a mix of swaggering gun talk, typo-filled Bible scripture, violent Biden bashing, nonsensical conspiracy theories and more misguided memes about Jan. 6 “hostages,” trans satanists and murderous migrants than anyone should be subjected to in one day. Or ever.”

This is the audience that is the first stop for Trump’s midnight social media musings. Truth Social is not the place for thoughtful policy statements or carefully crafted communication. Rather, it is a place that laps up posts like the beaut that Trump launched on Memorial Day, which started with: “Happy Memorial Day to all, including the scum that spent the last four years trying to destroy our country through warped radical left minds, who allowed 21,000,000 million people to illegally enter our country, many of them being criminals and the mentally insane.”

He then shortly followed that up by reposting this: “There is no #JoeBiden – executed in 2020. #Biden clones doubles & robotic engineered soulless mindless entities are what you see. >#Democrats don’t know the difference.”

From Truth Social, his posts rapidly move to more mainstream platforms. The Post article plotted the typical course of a Trump “Truth”:

“ ‘His messaging moves in real time from Truth to X, and it spreads just as far if not farther on X than it did when he was tweeting himself on the platform,’ said Darren Linvill, a professor and co-director of the Media Forensics Hub at Clemson University who studies social media.

What’s more, Truth Social’s almost exclusively congenial audience insulates the president from negative responses. ‘His current social media behavior suggests that with time he has been pulled even farther into his own echo chambers,’ Linvill said. ‘Truth Social gives him complete and constant positive feedback.’”

By the time dawn breaks over the White House, these missives have been bouncing around the echo chambers of social media for at least a couple of hours. Trump has received endorsement from the Truth Social crowd, and the posts are out in the world, demanding to be dealt with. This is not even government by fiat. It’s as if you woke from a fever dream at 3 in the morning and decided that the two-headed dragon that was eating your Froot Loops needed to be taken out by an all-out military operation. And you were the President. And you could make it happen. And the two-headed dragon was Iran – or Greenland – or Canada.

It is a quantum leap beyond insane that this is how government policy is currently being determined. Even more unbelievable is the fact that this has now been normalized by the same White House spokesperson Taylor Rogers, who said “President Trump was elected in a historic landslide victory and even won the popular vote — no further validation is needed.”

OMG – yes, Taylor, further validation is needed. Desperately! These posts are determining the future of the world and everyone who lives in it. They should be given great thought. Or – at least – more thought than that generated by the mind-altering after-effects of a Big Mac eaten at 1:30 in the morning.

The Tesla Cybertruck’s Branding Blow-Up

The inexact science of branding is nowhere more evident than in the case of the Tesla Cybertruck, which looks like it might usurp the Edsel’s title as the biggest automotive flop in history.

First, a little of the Tesla backstory. No, it wasn’t founded by Elon Musk. It was founded in 2003 by Martin Eberhard and Marc Tarpenning. Musk came in a year later as a money man. Soon, he had forced Eberhard and Tarpenning out of the company. But their DNA remained, notably in the design and engineering of the hugely popular Tesla Model S, Model X and Model 3. These designs drove Tesla to capture over 50% of the electric car market, and they are straight-line extensions of the original technology developed by Eberhard, Tarpenning and their initial team.

Musk is often lauded as an eccentric genius in the mold of Steve Jobs, with his fingers in every aspect of Tesla. While he was certainly influential, it’s not in the way most people think. The Model S, Model X and Model 3 soon became plagued by production issues, failed software updates, product-quality red flags and a continual failure to meet Musk’s wildly optimistic and often delusional predictions, both in terms of sales and promised updates. Those things all happened on Musk’s watch. Even with all this, Tesla was the darling of investors and media, driving it to become the most valuable car company in the world.

Then came the Cybertruck.

Introduced in 2019, the Cybertruck did have Musk’s fingerprints all over it. The WTF design, the sheer impracticality of a truck in name only, a sticker price nearly double what Musk originally promised, and a host of quality issues – including body panels that have a tendency to fall off – have caused sales to not even come close to projections.

In its first year of sales (2024), the Cybertruck sold 40,000 units, about 16% of what Musk predicted annual sales could be. That makes it a bigger fail than the Edsel, which sold 63,000 units against a target of 200,000 sales in its introductory year – 1958. The Edsel did worse in 1959 and was yanked from the market in 1960. The Cybertruck is sinking even faster. In the first quarter of this year, only 6,406 Cybertrucks were sold, half the number sold in the same quarter a year ago. There are over 10,000 Cybertrucks on Tesla lots in the U.S., waiting for buyers that have yet to show up.

But it’s not just that the Cybertruck is a flawed product. Musk has destroyed Tesla’s brand in a way that can only be marvelled at. His erratic actions have managed to generate feelings of visceral hate in a huge segment of the market and that hate has found a visible target in the Cybertruck. It has become the symbol of Elon Musk’s increasingly evident meltdown.

I remember my first reaction when I heard that Musk had jumped on the MAGA bandwagon. “How the hell,” I thought, “does that square with the Tesla brand?” That brand, pre-Musk-meltdown and pre-Cybertruck, was a car for the environmentally conscious who had a healthy bank account – excitingly leading edge but not dangerously so. Driving a Tesla made a statement that didn’t seem to be in the MAGA lexicon at all. It was all very confusing.

But I think it’s starting to make a little more sense. That brand was built by vehicles that Musk had limited influence over. Sure, he took full credit for the brand, but just like the company he took over, its initial form and future direction were determined by others.

The Cybertruck was a different story. That was very much Musk’s baby. And just like his biological ones (14 and counting), it shows all the hallmarks of Musk’s “bull in a china shop” approach to life. He lurches from project to project, completely tone-deaf to the implications of his actions. He is convinced that his genius is infallible. If the Tesla brand is a reflection of Musk, then the Cybertruck gives us a much truer picture. It shows what Tesla would have been if there had never been a Martin Eberhard and Marc Tarpenning, and Musk had been the original founder.

To say that the Cybertruck is “off brand” for Tesla is like saying that the Titanic had a tiny mishap. But it’s not that Musk made a mistake in his brand stewardship. It’s that he finally had the chance to build a brand that he believed in.