What Authoritarianism Gets Wrong

Like the rest of the world, my attention and intentions got hijacked over the weekend by what is happening in Minneapolis. I did not intend to write this post, but I feel I must.

What is happening right now is – plain and simple – authoritarianism. Some – like Jonathan Rauch in The Atlantic – have used the word fascism. Whatever label you put on it, the same flawed logic lies behind it: the belief that might makes right. It’s the same calculus of cruelty and coercion the schoolyard bully uses: I’m bigger than you, so do what I want you to do.

Here’s the problem with that formula. Resolve, resistance and resiliency aren’t things that can be consistently quantified. They are not static. The bewildering thing about humans when we’re faced with a crisis is this: the harder you push, the harder we’ll push back.

This is the reality of the red line. We accept adversity only up to a certain point. Past that point, individual concerns give way to those of the greater good. We join together into a coalition, dismantling the smaller walls that used to separate us so we can unite and fight a greater enemy that threatens us all. Rather than beating us down, adversity raises us up.

We have always done this. Journalist Sebastian Junger documents one example in his excellent book Tribe: On Homecoming and Belonging. During the London Blitz, Hitler believed he could bomb Londoners into submission. For 56 days he tried, dropping over 12,000 tonnes of bombs on the city, sure that it would break the will of Londoners. On one day alone, in September 1940, over 700 tonnes of high explosives and 86,000 incendiaries fell, killing 1,436 people. But the resolve of Londoners never wavered. In fact, it grew with adversity. They kept calm and carried on.

I’ve seen it firsthand in my own community. Our city, Kelowna, B.C., has been threatened with wildfires a number of times. In 2003, our city of 150,000 lost over 200 homes in one night and one third of the city was evacuated.

I have never seen this city come together like it did then. Neighbours helped neighbours. Those of us who weren’t evacuated opened our homes to those that were. In many cases, spare bedrooms and pull-out couches were occupied by total strangers. Crisis centers were swamped with offers of food, clothing, blankets and volunteer assistance.

This is how we’re wired. We band together in times of trouble. We are tribal creatures. As Junger found in his research, psychological health actually seems to improve in times of crisis. He cites a 1961 paper by American sociologist Charles Fritz, which opens with this question: “Why do large-scale disasters produce such mentally healthy conditions?” Junger writes, “Fritz’s theory was that modern society has gravely disrupted the social bonds that have always characterized the human experience, and that disasters thrust people back into a more ancient, organic way of relating. Disasters, he proposed, create a ‘community of sufferers’ that allows individuals to experience an immensely reassuring connection to others.”

Humans evolved to join together to overcome obstacles. Our modern world doesn’t often ask that of us. But right now, in Minneapolis, that’s exactly what’s happening as thousands of ordinary people are coordinating protection patrols to document authoritarianism. They are using the encrypted Signal platform to communicate and direct observers to emerging trouble areas. They have established their own protocols of behaviour. It is, in the words of Robert F. Worth, again writing in the Atlantic, “a meticulous urban choreography of civic protest.”

At least two Minnesotans have paid the highest price anyone can: their own lives.

This is the wrench that humans throw into the crushing cogs of authoritarian behaviour: the more you crack down on us, the stronger we will become as we join together to push back against you.

Of all people, Americans should know this. I can think of one more example that is particularly relevant. It happened 250 years ago, when American colonists joined together to protest the authority of the British Crown.

We shouldn’t forget that.

Living with Chronic Disappointment

I was reading recently that 70% of American expats who move to their dream destinations move back to the US within 5 years. Their fantasy of a sun-drenched, easier life in places like southern Portugal, Spain or Italy didn’t quite come true when their expectations ran into reality. The Algarve villa, Costa del Sol hacienda or Sicilian villaggio that seemed so wonderful when you went there for a three-week vacation is a different ball of wax entirely when you pull up stakes and attempt to plant them again in foreign soil. There is a reason why everything seems so laid back in these Mediterranean destinations – it’s because it’s really hard to get anything done there, especially if you’re a foreigner carrying the extra baggage of North American entitlement.

Our unfulfilled expectations are becoming more and more of a problem. We tend to over-forecast the positives and under-forecast the negatives when we think about the future. And the trend seems to be towards more of this, not less.

I have always tried to live by the Kellogg’s Variety Pack Philosophy – everything in life is a mix; some things are great, some things you just have to put up with. Remember those trays of little individual-sized cereal boxes? We used to get them when we went camping. For every little box of Frosted Flakes or Froot Loops, there would be a box of Pep or Bran Flakes. But we (and by we, I mean my 10-year-old self) cannot live on Froot Loops alone. Someone needs to eat the Pep. The sooner we learn that, the less disappointing life becomes.

This philosophy applies to most things in life – the people on your cruise, the cousins you’re going to run into at your family reunion, the things you do in your job, the experiences you’re going to have on your next vacation – even how happy you will be today. Not everything can be wonderful. But not everything will be horrible either.

There’s nothing new about this, but for some reason, our expectations seem to be set at an impossibly high level for more and more things lately. All we want is a life full of Froot Loops – or sunsets on the Costa del Sol, sipping sangria – and when the world can’t possibly deliver what we expect, we end up living with chronic disappointment.

Now, obviously, we’re not so fragile that we’ll collapse in a sobbing heap if it rains on our birthday or we’re 8th in line at the grocery store checkout. We are made of sterner stuff than that. But I’ve also seen a noticeable trend towards less tolerance.

For example, how often do you hear the word “toxic” now? Toxic used to be exclusively applied to things that were – well – toxic: industrial waste, hazardous chemicals, weapons of mass destruction. I think we can all agree that those things are 100% bad. But in the last ten years, toxic started being applied to the general stuff of our lives – people, jobs, behaviors, experiences and situations. And when we give things the label “toxic” we write those things off as a whole. We cease trying to look for the positive in any of it. Our patience with the real world runs out.

As it turns out, even disappointment is not an entirely bad thing. It does serve an evolutionary purpose. Part of our brain’s ability to learn and adapt is due to something called Reward Prediction Error – which measures the difference between expected and actual rewards. Using dopamine as the driver, the brain gets a pleasant jolt with unexpected rewards, a neutral response for expected rewards and if we end up with less than we expected, the dopamine factory shuts down and we get mopey. Suddenly, everything takes on a negative tinge.
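If you like to see the mechanics spelled out, here is a minimal, purely illustrative sketch of that textbook reward-prediction-error idea – not anything from a specific study – assuming a single scalar expectation that gets nudged toward each outcome (the learning rate and the numbers are made up for the example):

# Toy model only: reward prediction error (RPE) as the gap between
# what we expected and what we actually got, with the expectation
# updated a little toward each new outcome (a Rescorla-Wagner-style rule).
def update_expectation(expected, actual, learning_rate=0.1):
    prediction_error = actual - expected   # positive = pleasant surprise, negative = disappointment
    new_expectation = expected + learning_rate * prediction_error
    return prediction_error, new_expectation

expectation = 0.0
for outcome in [1.0, 1.0, 1.0, 0.2]:       # hype keeps delivering, then reality under-delivers
    error, expectation = update_expectation(expectation, outcome)
    print(f"outcome={outcome:.1f}  error={error:+.2f}  expectation is now {expectation:.2f}")

The last pass through the loop is the point: once the expectation has been ratcheted up by a string of good outcomes, anything less than the hype produces a negative error – the chemistry of the chronic disappointment described here.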

This mechanism works well when disappointment is just part of our adaptive landscape, a temporary signal that tells us to steer towards something that offers a better chance of reward. But in a world where all our media is telling us to expect something better, bigger and more exciting, because that seems to be what everyone else is enjoying, real life will never live up to our expectations. We are doomed to be chronically disappointed.

When that happens, our brains start to rewire their dopamine circuits, trying to protect themselves by recalibrating away from anticipation, moving from hope to pessimism. We settle for dopamine-neutral responses, trying to avoid the dopamine lows. We expect the bad and stop looking for the good. Our world seems filled with toxicity.

Here’s the problem with that. When we enter that state of mind, we prejudge a lot of the world as being toxic. Remember, the biggest dopamine jolt comes with unexpected rewards. If we look at the whole world with cynical eyes, we shut ourselves down to those surprise positive experiences that get the dopamine flowing again.

And that might be the biggest disappointment of all, because the joy of life is almost never planned. It just happens.

We’re Constantly Rewriting Our History

“No man ever steps into the same river twice, because it is not the same river, and he is not the same man.” – Heraclitus

Time is a funny thing. It is fluid and flowing and ever changing. It’s no surprise, then, that the Greek philosopher Heraclitus tried to describe it by using the analogy of a river. He then doubled down on the theme of change by saying it wasn’t only the river that was constantly changing. It was also the person stepping into the river. With time, nothing ever stays static. Any attempt to capture the present we inhabit is simply a snapshot in time, taken from one of a million different vantage points.

This is also true when we look backwards. Like time itself, our history does not stay static. It is constantly being rewritten, depending on when and where we are and what our view of our own reality is. The past is constantly in flux – eternally in the process of being rewritten using the lens of today’s culture and political reality to interpret what happened yesterday.

This is happening everywhere.

Right now, in the occupied parts of Ukraine, school history curriculums are being rewritten en masse to conform to a Kremlin-approved version of the past dictated by Moscow’s Ministry of Enlightenment. References to Ukraine and Kyiv are being edited out. There are numerous mentions of Putin as the savior of the area’s true Russian heritage. Teachers who try to remain pro-Ukrainian are threatened with deportation, forced into hiding or sent for “re-training.”

Here in Canada, the country’s history being taught in schools today bears scant resemblance to the history I learned as a child some six decades ago. The colonial heroes of the past (almost all of English, Scottish or French descent) are being re-examined in the light of our efforts to reconcile ourselves to our true history. What we know now is that many of the historic heroes we used to name universities after and erect statues to honor were astoundingly racist and complicit in a planned program of cultural eradication against our indigenous population.

And in the US, the MAGA-fication of cultural and heritage institutions is proceeding at a breakneck pace. Trump has tacked his name onto the Kennedy Centre. The White House is in the process of being “bedazzled” into a grotesque version of its former stately self, cloaked in a design sensibility more suitable for a 17th century French Sun King.

Perhaps the most overt example of rewriting history came with an executive order issued last year with the title “Restoring Truth and Sanity to American History.” This little Orwellian gem gives J.D. Vance (who sits on the Smithsonian’s Board of Regents) the power to eliminate “improper, divisive or anti-American ideology” from the museums and related centers. The inconvenient bits of history that this order aims to sweep under the carpet include slavery and the U.S.’s own sordid history of colonialism. These things have been determined to be “un-American.”

Compare all of this to the mission statement of the Smithsonian, which is to “increase and diffuse knowledge, providing Americans and the world with the tools and information they need to forge Our Shared Future.”

I wholeheartedly agree with that mission. I have said that we need to know our past to know what we aspire to be in the future. But that comes with a caveat: you have to embrace the past – as near as you’re able – for what it truly was, warts and all. Historians have an obligation not to whitewash the past. But we must also realize that actions we abhor today took place within a social context that made them more permissible – or even lauded – at the time. It is a historian’s job to record the past faithfully but also to interpret it given the societal and cultural context of the present.

This is the balancing act that historians have to engage in if we’re truly going to use the past as something we can learn from.

Singing in Unison

It’s the year-end so it’s time to reflect and also to look forward, carrying what we’ve learned in the past into an uncertain future.

Let me share one thing I’ve learned: we have to get serious about how we create community. And by community, I will use a very specific definition. In fact, perhaps it would be more accurate to replace “community” with “choir.”

Let me explain my thought with a story.

In the late 1980s, Harvard professor Bob Putnam was in Italy doing research. He was studying Italy’s regional decentralization of power, which began in 1970. For a political scientist like Putnam, this was an opportunity that didn’t come often. Italy had passed power down to its 20 regional governments and had also established a single framework for administration and governance. This framework was the constant. The variables were the people, the societal environment and the nature of the regions themselves. If you’re familiar with Italy, you know that there are vast differences between these regions, especially from the north to the south.

Putnam looked at how effective each regional government was – was democracy working in the region? Even though the administrators were all referring to the same playbook, the results were all over the map – literally. Generally speaking, governance in Northern and Central Italy was much more effective than in the South.

For Putnam, the next big question was – why? What was it about some regions that made democracy work better than in others? He looked for correlations in the reams of data he had collected. Was it education? Wealth? Occupational breakdowns? In examining each factor, he found some correlation, but they all fell short of the perfect positive relationship he was looking for.

Finally, he took a break from the analysis and drove into the country with his wife, Rosemary.  Stopping in one town, he heard music coming from a small church, so the two stepped inside. There, a volunteer choir was singing.  It may sound cliché, but in that moment, Bob Putnam had an epiphany. Perhaps the answer lay in people coming together, engaging in civic activities and creating what is called “social capital” by working together as a group.

Maybe democracy works best in places where people actually want to sing together.

Bob Putnam went back to the numbers and, sure enough, there was an almost perfect correlation. The regions that had the most clubs, civic groups, social organizations and – yes – choral societies also had the highest degree of democratic effectiveness. This set Putnam on a path that would lead to the publication of this work in 1993 under the title Making Democracy Work, along with his subsequent 2000 best seller, Bowling Alone. (If you’d like to know more about Bob Putnam, check out the excellent Netflix documentary, Join or Die.)

Putnam showed it’s better to belong – but that only explains part of the appeal of a choir. There has to be something special about singing together.

Singing as a group is one of those cultural universals; people do it everywhere in the world. And we’ve been doing it forever, since before we started recording our history. Modern science has now started to discover why. Singing as a group causes the brain to release oxytocin – christened the “moral molecule” by neuro-economist Paul J. Zak – by the bucketload. Zak explains the impact of this chemical: “When oxytocin is raised, people are more generous, they’re more compassionate and, in particular, they’re empathetic – they connect better to people emotionally.”

The result of an oxytocin high is the creation of the building blocks of trust and social capital. People who sing together treat each other better. Our brains start tuning into other brains through something called neural synchrony. We connect with other people in a profoundly and beautifully irrational way that burrows down through our consciousness to a deeply primal level.

But there is also something else going on here that, while not unique to singing together, finds a perfect home in your typical community choir.

Sociologist Émile Durkheim found that groups that do the same thing at the same time experience something called “collective effervescence.” This is the feeling of being “carried away” and being part of a whole that is greater than the sum of its parts. You find it in religious ceremonies, football stadiums, rock concerts and – yes – you’ll find it in choirs.

So, if singing together is so wonderful, why are we doing it less and less? When was the last time you sang – really sang, not just moved your lips – with others? For myself, it’s beyond the limits of my own memory. Maybe it was when I was still a kid. And I suspect the reason I haven’t sung out loud since is that someone, somewhere along the line, told me I can’t sing.

But that’s not the point. Singing shouldn’t be competitive. It should be spiritual. We shouldn’t judge ourselves against singers we see on media. This never used to be the case.  It’s just one more example of how we can never be good enough – at anything – if we use media for our mirror.

So, in 2026, I’m going to try singing more. Care to join me?

The Long-Term Fallout from MAGA: One Canadian’s Perspective

The other day, an American friend asked how Canada was currently feeling about Trump and the whole MAGA thing. You may remember, some months back, a number of broadsides towards Canada from the president that seemingly came from nowhere – Trump threatening/cajoling us to become the 51st state, on-again, off-again tariffs, continued assertions that the US does not need Canada for anything, undisguised threats towards us from Pete Hoekstra, the American Ambassador to Canada.

We took it personally. “Elbows up” became the Canadian rallying cry – a reference to protecting yourself in our beloved national sport – fighting along the boards balanced on frozen water while wearing sharp blades on your feet. Liquor stores had shelf after empty shelf that once were laden with California reds and Kentucky bourbon. Canadian trips to Disneyland and Las Vegas plummeted. Grocery stores started labeling products that (supposedly – which is another story) came from Canada. Canadian consumers and businesses scrambled to find Canadian substitutes for traditional American suppliers.

That was then. What about now?

Trump and the MAGA train have moved on to an endless list of other scandals and dumpster fires. I haven’t heard a whisper of the 51st state for a long time. While our trade war continues on, fueled by shots across the bow from both sides, I think it’s fair to say that we are now just lumped with every other country reeling from the daily bat-shit crazy barrage coming from Washington. Canadians are used to being ignored, for good or bad, so we’re back to situation normal – all F*$%ed up.

But have Canadians moved on? Have we dropped said elbows? The honest answer is – it’s complicated.

Predictably, the patriotic fervor we had early this year has cooled off. California reds are back on the shelves. More Canadians are planning to visit Hawaii and Florida this winter. “Grown in the U.S.A.” stickers are back where they belong, in the produce bins at our grocery stores. When it comes to our American habit, it’s like the line from Brokeback Mountain: “We wish we knew how to quit you.”

Like all relationships, the one between the US and Canada is complex. It’s unrealistic to expect a heavily intertwined relationship like ours to disappear overnight. There are probably no two countries in the world more involved with each other’s business than we are. And that cuts both ways, despite what Mr. Trump says. We have been married to each other for a very long time. Even if we want to go through with it, a divorce is going to take some time.

The numbers from the first six months of our “Buy Canadian” campaign are in, and they are less than inspiring. According to StatsCan, 70% of Canadian businesses saw no increase in sales at all. Even with those that did, the impact was minimal and any gain was usually offset by other sales challenges.  

But if you dig a little deeper, there are signs that there might be more long-term damage done here than first meets the eye. In Canadian grocery stores over the past six months, sales of “Made in Canada” products are up 10% while U.S. made goods are down 9%. Those aren’t huge swings, but they have been sustained over 6 months, and in the words of one Canadian analyst speaking on CBC Radio, when something lasts for 6 months, “you’re moving from fad territory to trend territory.”

The dilemma facing Canadians is something called the “Attitude-Behavior Gap” – the difference between what we want to do and what we are actually doing. Canadians – 85% of us anyway – want to buy Canadian rather than American, but it’s really hard to do that. Canadian goods are harder to find and typically cost more. It’s the reality of having a trading partner that outnumbers you in both market size and output by a factor of 10 to 1. If we want to have a Caesar salad in December, we’re going to have to buy lettuce grown in the U.S.

But we are talking relationships here, so let’s look at that 85% “Buy Canadian” intention number again. It means that – 6 months after we were insulted – we still feel that a fundamental trust was irrevocably broken. We’re being pragmatic about it, but our intention is clear: we’re looking for alternatives to our past default behavior – buying American. When those alternatives make economic and behavioral sense to us, we’ll find other partners. That is what is happening in Canada right now.

Should Americans care? I believe so. Because I’m sure we’re not the only ones. The world is currently reeling from the sharp American pivot away from being a globally trusted partner. The short-term reality is that we will put up with it for now and pander to the Presidential powers that be, because we have to.

But we’re looking for options. Our dance card is suddenly wide open.

The Cost of Not Being Curious

The world is having a pandemic-proportioned wave of Ostrichitis.

Now, maybe you haven’t heard of Ostrichitis. But I’m willing to bet you’re showing at least some of the symptoms:

  • Avoiding newscasts, especially those that feature objective and unbiased reporting
  • Quickly scrolling past any online news items in your feed that look like they may be uncomfortable to read
  • Dismissing out of hand information coming from unfamiliar sources

These are the signs of Ostrichitis – or the Ostrich Effect – and I have all of them. This is actually a psychological effect, more pointedly called willful ignorance, which I wrote about a few years ago. And from where I’m observing the world, we all seem to have it to one extent or another.

I don’t think this avoidance of information comes as a shock to anyone. The world is a crappy place right now. And we all seem to have gained comfort from adopting the folk wisdom that “no news is good news.” Processing bad news is hard work, and we just don’t have the cognitive resources to crunch through endless cycles of catastrophic news. If the bad news affirms our existing beliefs, it makes us even madder than we already were. If it runs counter to our beliefs, it forces us to spin up our sensemaking mechanisms and reframe our view of reality. Either way, there are way more fun things to do.

A recent study from the University of Chicago attempted to pinpoint when children start avoiding bad news. The research team found that while young children don’t tend to put boundaries around their curiosity, as they age they start avoiding information that challenges their beliefs or their own well-being. The threshold seems to be about 6 years old. Before that, children actively seek information of all kinds (as any parent barraged by never-ending “Whys” can tell you). After that, children start strategizing about the types of information they pay attention to.

Now, like everything about humans, curiosity tends to be an individual thing. Some of us are highly curious, and some of us religiously avoid seeking new information. But even if we are the curious sort, we may pick and choose what we’re curious about. We may find “safe zones” where we let our curiosity out to play. If things look too menacing, we may protect ourselves by curbing our curiosity.

The unfortunate part of this is that curiosity, in all its forms, is almost always a good thing for humans (even if it can prove fatal to cats).

The more curious we are, the better tied we are to reality. The lens we use to parse the world is something called a sense-making loop. I’ve often referred to this in the past. It’s a processing loop that compares what we experience with what we believe, referred to as our “frame”. For the curious, this frame is often updated to match what we experience. For the incurious, the frame is held on to stubbornly, often by ignoring new information or bending information to conform to their beliefs. A curious brain is a brain primed to grow and adapt. An incurious brain is one that is stagnant and inflexible. That’s why the father of modern-day psychology, William James, called curiosity “the impulse towards better cognition.”

When we think about the world we want, curiosity is a key factor in defining it. Curiosity keeps us moving forward. The lack of curiosity locks us in place or even pushes us backwards, causing the world to regress to a more savage and brutal place. Writers of dystopian fiction knew this. That’s why authors including H.G. Wells, Aldous Huxley, Ray Bradbury and George Orwell all made a lack of curiosity a key part of their bleak future worlds. Our current lack of curiosity is driving our world in the same dangerous direction.

For all these reasons, it’s essential that we stay curious, even if it’s becoming increasingly uncomfortable.

Being in the Room Where It Happens

I spent the past weekend attending a conference that I had helped to plan. As is now often the case, this was a hybrid conference; you could choose to attend in person or online via Zoom. Although it involved a long plane ride, I chose to attend in person. It could be because – as a planner – I wanted to see how the event played out. Also, it’s been a long time since I attended a conference away from my home. Or – maybe – it was just FOMO.

Whatever the reason, I’m glad I was there, in the room.

This was a very small conference planned on a shoestring budget. We didn’t have money for extensive IT support or AV equipment. We were dependent solely on a laptop and whatever sound equipment our host was able to supply. We knew going into the conference that this would make for a less-than-ideal experience for those attending virtually. But – even accounting for that – I found there was a huge gap in the quality of that experience between those that were there and those that were attending online. And, over the duration of the 3-day conference, I observed why that might be so.

This conference was a 50/50 mix of those who already knew each other and those who were meeting each other for the first time. Even those who were familiar with each other tended to connect more often via a virtual meeting platform than in a physical meeting space. I know that despite the convenience and efficiency of being able to meet online, something is lost in the process. After spending the past two days carefully observing what was happening in the room we were all in, I have a better understanding of what that loss might be – the vague and inexact art of creating a real bond with another person.

In that room, the bonding didn’t happen at the speaking podium and very seldom happened during the sessions we so carefully planned. It seeped in on the sidelines, over warmed-over coffee from conference centre urns, overripe bananas and the detritus of the picked-over pastry tray. The bonding came from all of us sharing and digesting a common experience. You could feel a palpable energy in the room. You could pick up the emotion, read the body language and tune in to the full bandwidth of communication that goes far beyond what could be transmitted between an onboard microphone and a webcam.

But it wasn’t just the sharing of the experience that created the bonds. It was the digesting of those experiences after the fact. We humans are herding animals, and that extends to how we come to consensus about things we go through together. We do so through communication with others – not just with words and gestures, but also through the full bandwidth of our evolved mechanisms for coming to a collective understanding. It wasn’t just that a camera and microphone couldn’t transmit that effectively; it was that it happened where there was no camera or mic.

As researchers have discovered, there is a lived reality and a remembered reality and often, they don’t look very much alike. The difference between the effectiveness of an in-person experience and one accessed through an online platform shouldn’t come as a surprise to us. This is due to how our evolved sense-making mechanisms operate. We make sense of reality both internally, through a comparison with our existing cognitive models and externally, through interacting with others around us who have shared that same reality. This communal give-and-take colors what we take with us, in the form of both memories and an updated model of what we know and believe. When it comes to how humans are built, collective sense making is a feature, not a bug.

I came away from that conference with much more than the content that was shared at the speaker dais. I also came away with a handful of new relationships, built on sharing an experience and, through that, laying down the first foundations of trust and familiarity. I would not hesitate to reach out to any of these new friends if I had a question about something or a project I felt they could collaborate on.

I think that’s true largely because I was in the room where it happened.

Lilith Fair: A Quarter Century and A Different World Ago

Lilith Fair: Building a Mystery, a new documentary released on Hulu (CBC Gem in Canada), is much more than a chronicle of a music festival. It’s a very timely statement on both the strength and fragility of community.

Lilith Fair was the festival launched in 1997 by Canadian singer-songwriter Sarah McLachlan. It was conceived as a feminine finger in the eye of a determinedly misogynistic music industry. At the end of the ’90s, despite a boom in talented female singer-songwriters (Tracy Chapman, Jewel, Paula Cole, Sheryl Crow, Natalie Merchant, Shawn Colvin, Lisa Loeb, Suzanne Vega and others too numerous to mention), radio stations wouldn’t run two songs by women back-to-back. Promoters also wouldn’t book two women on the same concert ticket. The feeling, based on nothing other than male intuition, was that it would be too much “femininity” for the audience to handle.

McLachlan, in her charmingly polite Canadian way, said “Fudge you!” and launched her own festival. The first one, in 1997, played almost 40 concerts over 51 days across North America. The line-up was exclusively female – 70 singers in all playing on three stages. Almost every concert sold out. Apparently, there was an audience for female talent. Lilith Fair would be repeated in 1998 and 1999, with both tours being smashing successes.

The world needed Lilith Fair in the late ’90s. It wasn’t only the music industry that was misogynistic and homophobic. It was our society. The women who played Lilith Fair found a community of support unlike anything they had ever experienced in their careers. Performers who had been feeling isolated for years suddenly found support and – more than anything – understanding.

It was women who made the rules and ran the Lilith Fair show. It was okay to perform when you were 8 months pregnant. It was okay to hold your baby onstage as you performed the group encore. It was okay to bring the whole family on tour and let the kids play backstage while you did your set. These were things that were – up until then – totally foreign in the music industry. It was the very definition of community – diverse people having something in common and joining together to deal from a position of strength.

But it didn’t happen overnight. It took a while – and a lot of bumping into each other backstage – for the community to gel. It also needed a catalyst, which turned out to be Amy Ray and Emily Saliers – officially known as the Indigo Girls. It was their outgoing friendliness that initially broke the ice, “because we were so gay and so puppy dog-like.”

This sense of community extended beyond the stage to the thousands who attended: men and women, old and young, straight and gay. It didn’t matter – Lilith Fair was a place where you would be accepted and understood. As documentary producer Dan Levy (of Schitt’s Creek fame) – who was 12 years old when he attended and was yet to come out – said, “Being there was one of the earliest memories I’ve had of safety.”

The unity and inclusiveness of Lilith Fair stood in stark contrast to another festival of the same era – Woodstock 99. There, toxic masculinity from acts like Limp Bizkit’s Fred Durst and Kid Rock swung the vibe of the event heavily towards anarchy and chaos rather than community.

But while Lilith Fair showed the importance of community, it also showed how fragile community can be. The festival became the butt of jokes on late night television (including a particularly cringe-worthy one by Jay Leno about Paula Cole’s body hair) and a target for those who sought to diminish its accomplishments and importance. Finally, at the end of the 1999 tour, McLachlan had had enough. The last concert was played in the rain in Edmonton, Alberta on August 31st.

McLachlan did try to revive Lilith Fair in 2010, but it was a complete failure. Whatever lightning in a bottle she had captured the first time was gone. The world had passed it by. The documentary didn’t dwell on this, other than offering a few reasons why it might be so. Perhaps Lilith Fair wasn’t needed anymore. Maybe it had done its job. After all, women had mounted some of the top tours of that time, including Taylor Swift, Madonna, Pink and Lady Gaga.

Or maybe it had nothing to do with the industry. Maybe it had everything to do with us, the audience.

The world of 1999 was a very different place than the world of 2010. Community was in the midst of being redefined from those sharing a common physical location to those sharing a common ideology in online forums. And that type of community didn’t require a coming together. If anything, those types of communities kept us apart, staring at a screen – alone in our little silos.

According to the American Time Use Survey, time spent socializing in person has been on a steady decline since 2000. This is especially true for those under the age of 25, the prime market for music festivals. When we do venture forth to see a concert, we’re looking for spectacle, not community. This world moves too fast for the slow, sweet magic that made Lilith Fair so special to coalesce.

At the end of the documentary, Sarah McLachlan made it clear that she’ll never attempt to bring Lilith Fair back to life. It was a phenomenon of that time. And that is sad – sad indeed.

When Did the Future Become So Scary?

The TWA Hotel at JFK airport in New York gives one an acute case of temporal dissonance. It’s a step backwards in time to the “Golden Age of Travel” – the 1960s. But even though you’re transported back 60 years, it seems like you’re looking into the future. The original space – the TWA Flight Center, designed by Eero Saarinen – opened in 1962. This was a time when America was in love with the idea of the future. Science and technology were going to be our saving grace. The future was going to be a utopian place filled with flying jet cars, benign robots and gleaming, sexy white curves everywhere. The TWA Flight Center was dedicated to that future.

It was part of our love affair with science and technology during the 60s. Corporate America was falling over itself to bring the space-age fueled future to life as soon as possible. Disney first envisioned the community of tomorrow that would become Epcot. Global Expos had pavilions dedicated to what the future would bring. There were four World Fairs over 12 years, from 1958 to 1970, each celebrating a bright, shiny white future. There wouldn’t be another for 22 years.

This fascination with the future was mirrored in our entertainment. Star Trek (pilot in 1964, series start in 1966) invited all of us to boldly go where no man had gone before, namely a future set roughly three centuries from then. For those of us of a younger age, The Jetsons (original series from 1962 to ’63) indoctrinated an entire generation into this religion of future worship. Yes, tomorrow would be wonderful – just you wait and see!

That was then – this is now. And now is a helluva lot different.

Almost no one – especially in the entertainment industry – is envisioning the future as anything other than an apocalyptic hellhole. We’ve done an about-face and are grasping desperately for the past. The future went from being utopian to dystopian, seemingly in the blink of an eye. What happened?

It’s hard to nail down exactly when we went from eagerly awaiting the future to dreading it, but it appears to be sometime during the last two decades of the 20th Century. By the time the clock ticked over to the next millennium, our love affair was over. As Chuck Palahniuk, author of the 1999 novel Invisible Monsters, quipped, “When did the future go from being a promise to a threat?”

Our dread about the future might just be a fear of change. As the future we imagined in the 1960’s started playing out in real time, perhaps we realized our vision was a little too simplistic. The future came with unintended consequences, including massive societal shifts. It’s like we collectively told ourselves, “Once burned, twice shy.” Maybe it was the uncertainty of the future that scared the bejeezus out of us.

But it could also be how we got our information about the impact of science and technology on our lives. I don’t think it’s a coincidence that our fear of the future coincided with the decline of journalism. Sensationalism and endless punditry replaced real reporting just about the time we started this about-face. When negative things happened, they were amplified. Fear was the natural result. We felt out of control, and we kept telling ourselves that things never used to be this way.

The sum total of all this was the spread of a recognized psychological affliction called Anticipatory Anxiety – the certainty that the future is going to bring bad things down upon us. This went from being a localized phenomenon (“my job interview tomorrow is not going to go well”) to a widespread angst (“the world is going to hell in a handbasket”). Call it Existential Anticipatory Anxiety.

Futurists are – by nature – optimists. They believe things will be better tomorrow than they are today. In the Sixties, we all leaned into the future. The opposite of this is something called Rosy Retrospection, and it often comes bundled with Anticipatory Anxiety. It is a known cognitive bias that comes with a selective memory of the past, tossing out the bad and keeping only the good parts of yesterday. It makes us yearn to return to the past, when everything was better.

That’s where we are today. It explains the worldwide swing to the right. MAGA is really a 4-letter encapsulation of Rosy Retrospection – Make America Great Again! Whether you believe that or not, it’s a message that is very much in sync with our current feelings about the future and the past.

As writer and right-leaning political commentator William F. Buckley said, “A conservative is someone who stands athwart history, yelling Stop!”

It’s Tough to Consume Conscientiously

It’s getting harder to be both a good person and a wise consumer.

My parents never had this problem when I was a kid. My dad was a Ford man. Although he hasn’t driven for 10 years, he still is. If you grew up in the country, your choices were simple – you needed a pickup truck. And in the 1960s and 70s, there were only three choices: Ford, GMC or Dodge. For dad, the choice was Ford – always.

Back then, brand relationships were pretty simple. We benefited from the bliss of ignorance. Did the Ford Motor Company do horrible things during that time? Absolutely. As just one example, they made a cost-benefit calculation and decided to keep the Pinto on the road even though they knew it tended to blow up when hit from the rear. There is a corporate memo saying – in black and white – that it would be cheaper to settle the legal claims of those who died than to fix the problem. The company was charged with reckless homicide. It doesn’t get less ethical than that.

But that didn’t matter to Dad. He either didn’t know or didn’t care. The Pinto Problem, along with the rest of the shady stuff done by the Ford Motor Company, including bribes, kickbacks and improper use of corporate funds by Henry Ford II, was not part of Dad’s consumer decision process. He still bought Ford. And he still considered himself a good person. The two things had little to do with each other.

Things are harder now for consumers. We definitely have more choice, and those choices are harder, because we know more.  Even buying eggs becomes an ethical struggle. Do we save a few bucks, or do we make some chicken’s life a little less horrible?

Let me give you the latest example from my life. Next year, we are planning to take our grandchildren to a Disney theme park. If our family has a beloved brand, it would be Disney. The company has been part of my kids’ lives in one form or another since they were born, and we all want it to be part of their kids’ lives as well.

Without getting into the whole debate, I personally have some moral conflicts with some of Disney’s recent corporate decisions. I’m not alone. A Facebook group for those planning a visit to this particular park has recently seen posts from those agonizing over the same issue. Does taking the family to the park make us complicit in Disney’s actions that we may not agree with? Do we care enough to pull the plug on a long-planned park visit?

This gets to the crux of the issue facing consumers now – how do we balance our beliefs about what is wrong and right with our desire to consume? Which do we care more about? The answer, as it turns out, seems to almost always be to click the buy button as we hold our noses.

One way to make that easier is to tell ourselves that one less visit to a Disney park will make virtually no impact on the corporate bottom line. Depriving ourselves of a long-planned family experience will make no difference. And – individually – this is true. But it’s exactly this type of consumer apathy that, when aggregated, allows corporations to get away with being bad moral actors.

Even if we want to be more ethically deliberate in our consumer decisions, it’s hard to know where to draw the line. Where are we getting our information about corporate behavior from? Can it be trusted? Is this a case of one regrettable action, or is there a pattern of unethical conduct? These decisions are always complex, and coming to any decision that involves complexity is always tricky.

To go back to a simpler time, my grandmother had a saying that she applied liberally to any given situation, “What does all this have to do with the price of tea in China?” Maybe she knew what was coming.