2024: A Media Insider Review

(This is my annual look back at what the MediaPost Media Insiders were talking about in the last year.)

Last year at this time I took a look back at what we Media Insiders had written about over the previous 12 months. Given that 2024 was such a tumultuous year, I thought it would be interesting to do it again and see if that was mirrored in our posts.

Spoiler alert: It was.

If MediaPost had such a thing as an elders’ council, the Media Insiders would be it. We have all been writing for MediaPost for a long, long time. As I mentioned, my last post was my 1000th for MediaPost. Cory Treffiletti has actually surpassed my total, with 1,154 posts. Dave Morgan has written 700. Kaila Colbin has 586 posts to her credit. Steven Rosenbaum has penned 371, and Maarten Albarda has 367. Collectively, that is well over 4,000 posts.

I believe we bring a unique perspective to the world of media and marketing and — I hope — a little gravitas. We have collectively been around several blocks numerous times and have been doing this pretty much as long as there has been a digital marketing industry. We have seen a lot of things come and go. Given all that, it’s probably worth paying at least a little bit of attention to what is on our collective minds. So here, in a Media Insider meta-analysis, is 2024 in review.

I tried to group our posts into four broad thematic buckets and tally up the posts that fell in each. Let’s do them in reverse order.

Media

Technically, we’re supposed to write on media, which, I admit, is a very vaguely defined category. It could probably be applied to almost everything we wrote, in one way or another. But if we’re going to be sticklers about it, very few of our posts were actually about media. I only counted 12, the majority of them about TV or movies. There were a couple of posts about music as well.

If you define media as a “box,” we were definitely thinking outside of it.

It Takes a Village

This next category is more in the “Big Picture” category we Media Insiders seem to gravitate toward. It goes to how we humans define community, gather in groups and find our own places in the world. In 2024 we wrote 59 posts that I placed in this category.

Almost half of these posts looked at the role of markets in our world and how the rules of engagement for consumers in those markets are evolving. We also looked at how we seek information, communicate with each other and process the world through our own eyes.

The Business of Marketing

All of us Media Insiders either are or were marketers, so it makes sense that marketing is still top of mind for us. We wrote 80 posts about the business of marketing. The three most popular topics were — in order — buying media, the evolving role of the agency, and marketing metrics. We also wrote about advertising technology platforms, branding and revenue models. Even my old wheelhouse of search was touched on a few times last year.

Existential Threats

The most popular topic was not surprising, given that it does reflect the troubled nature of the world we live in. Fully 40% of the posts we wrote — 99 in total — were about something that threatens our future as humans.

The number-one topic, as it was last year, was artificial intelligence. There is a caveat here. Not all the posts were about AI as a threat. Some looked at the potential benefits. But the vast majority of our posts were rather doomy and gloomy in their outlook.

While AI topped the list of things we wrote about in 2024, it was followed closely by two other topics that also gave us grief: the death knell of democracy, and the scourge of social media.

The angst about the decay of democracy is not surprising, given that the U.S. has just gone through a WTF election cycle. It’s also clear that we collectively feel that social media must be reined in. Not one of our 28 posts on social media had anything positive to say.

As if those three threats weren’t enough, we also touched briefly on climate change, the wars raging in Ukraine and the Middle East, and the disappearance of personal privacy.

Looking Forward

What about 2025? Will we be any more positive in the coming year? I doubt it. But it’s interesting to note that the three biggest worries we had last year were all monsters of our own making. AI, the erosion of democracy, and the toxic nature of social media are all squarely within our purview. Even if these things were not created by media and marketing, they certainly share the same ecosystem. And, as I said in my 1000th post, if we built these things, we can also fix them.

My 1000th Post – and My 20-Year Journey

Note: This week marks the 1000th post I’ve written for MediaPost. On this blog, all of those posts are here, plus a number that I’ve written for other publications and exclusively for Out of My Gord. But the sentiments here apply to all those posts. If you’re wondering, I’ve written 1,233 posts in total.

According to the MediaPost search tool, this is my 1000th post for this publication. There are a few duplicates in there, but I’m not going to quibble. No matter how you count them up, that’s a lot of posts.

My first post was written on August 19th, 2004. Back then I wrote exclusively for the emerging search industry. Google was only 6 years old. It had just gone public, with investors hoping to cash in on this new thing called paid search. Social media was even greener. There was no Facebook. Something called Myspace had launched the year before.

In the 20 years I’ve written for MediaPost, I’ve bounced from masthead to masthead. My editorial bent evolved from being search-industry specific until I eventually found my sweet spot at the intersection of human behavior and technology.

It’s been a long and usually interesting journey. When I started, I was the parent of two young children who I dragged along to industry events, using the summer search conference in San Jose as an opportunity to take a family camping vacation. I am now a grandfather, and I haven’t been to a digital conference for almost 10 years (the last being the conferences I used to host and program for the good folks here at MediaPost).

When I started writing these posts, I was both a humanist and a technophile. I believed that people were inherently good, and that technology would be the tool we would use to be better. The Internet was just starting to figure out how to make money, but it was still idealistic enough that people like me believed it would be mostly a good thing. Google still had the phrase “Don’t be Evil” as part of its code of conduct.

Knowing this post was coming up, I’ve spent the past few months wondering what I’d write when the time came. I didn’t want it to be yet another look back at the past 20 years. What history I have included is there to provide some context.

No, I wanted this to be about what this journey has been like for me. An editorial deadline forces you to come up with something to write about every week or two. It compels you to pay attention. It also forces you to think. The person I am now – what I believe and how I think about both people and technology – has been shaped in no small part by writing these 1000 posts over the past 20 years.

So, if I started as a humanist and technophile, what am I now, 20 years later? That is a very tough question to answer. I am much more pessimistic now. And this post has forced me to examine the causes of my pessimism.

I realized I am still a humanist. I still believe that if I’m face to face with a stranger, I’ll always place my bet on them helping me if I need it. I have faith that it will pay off more often than it won’t. If anything, we humans may be just a tiny little bit better than we were 20 years ago: a little more compassionate, a little more accepting, a little more kind.

So, if humans haven’t changed, what has? Why do I have less faith in the future than I did 20 years ago? Something has certainly changed. But what was it, I wondered?

Coincidentally, as I was thinking about this, I was also reading the late Philip Zimbardo’s book, The Lucifer Effect: Understanding How Good People Turn Evil. Zimbardo was the researcher who oversaw the Stanford Prison Experiment, where ordinary young men were randomly assigned roles as guards or inmates in a makeshift prison set up in a Stanford University basement. To make a long story short – ordinary people started doing such terrible things that the experiment had to be cut short after just six days.

Zimbardo reminded me that people are usually not dispositionally good or evil; rather, we can find ourselves in situations that push us in either direction. We all have the capacity to be good or evil. Our behavior depends on the environment we function in. To use an analogy Zimbardo himself used, it may not be the apples that are bad. It could be the barrel.

So I realized it isn’t people who have changed in the last 20 years, but the environment we live in. And a big part of that environment is the media landscape we have built in those two decades. That landscape looks nothing like it did back in 2004. With the help of technology, we have built an information landscape that doesn’t really play to the strengths of humanity. It almost always shows us the worst side of ourselves. Journalism has been replaced by punditry. Dialogue and debate have been pushed out of the way by demagoguery and divisiveness.

So yes, I’m more pessimistic now than I was when I started this journey 20 years ago. But there is a glimmer of hope here. If people had truly changed, there would not be a lot we could do about that. But if it’s the media landscape that’s changed, that’s a different story. Because we built it, we can also fix it.

It’s something I’ll be thinking about as I start a new year.

The Strange Social Media Surge for Luigi Mangione

Luigi Mangione is now famous. Just one week ago, we had never heard of him. But now, he has become so famous, I don’t even have to recount the reason for his fame.

But, to me, what’s more interesting than Mangione’s sudden fame is how we feel about him. According to the Network Contagion Research Institute, there is a lot of online support for Luigi Mangione. An online funding campaign has raised over $130,000 for his legal defense fund. The hashtags #FreeLuigi and #TeamLuigi and other pro-Luigi memes have taken over every social media channel. Amazon, Etsy and eBay are scrambling to keep Luigi-inspired merchandise out of their online stores. His X (formerly Twitter) account has ballooned from 1,500 followers to almost half a million.

It’s an odd reaction for someone who is accused of gunning down a prominent American businessman in cold blood.

The outpouring of support for Luigi Mangione is so consequential, it’s threatening to lay a very heavy thumb on the scales of justice. There is so much public support for him that prosecutors are worried it could lead to jury nullification. It may be impossible to find unbiased and impartial jurors who would find Mangione guilty, even if his guilt is proven beyond a reasonable doubt.

Now, I certainly don’t want to comment on Mr. Mangione’s guilt, innocence or whether he’s appropriate material from which to craft a folk hero. Nor do I want to talk about the topic of American healthcare and the corporate ethics of United Healthcare or any other medical insurance provider. I won’t even dive into the admittedly juicy and ironic twist that our latest anti-capitalist hero of the common people is a young, white, male, good-looking, wealthy and privately educated scion who probably leans right in his political beliefs.

No, I will leave all of that well enough alone. What I do want to talk about is how this has played out through social media and why it’s different from anything we’ve seen before.

We behave and post differently depending on what social platform we’re on at the time. In sociology and psychology, this is called “modality.”  How we act depends on what role we’re playing and what mode we’re in. The people we are, the things we do, the things we say and the way we behave are very different when we’re being a parent at home, an employee at work or a friend having a few drinks after work with our buddies. Each mode comes with different scripts and we usually know what is appropriate to say in each setting.

It was sociologist Erving Goffman who likened it to being on stage in his 1956 book, The Presentation of Self in Everyday Life. The roles we choose to play depend on the audience we’re playing to. We try to stay consistent with the expectations we think the audience has of us. Goffman said, “We are all just actors trying to control and manage our public image, we act based on how others might see us.”

Now, let’s take this to the world of social media. What we post depends on how it plays to the audience of the platform we’re on. We may have a TikTok persona, a Facebook persona and an X persona. But all of those are considered mainstream platforms, especially when compared to platforms like 4Chan, Parler or Reddit. If we’re on any of those platforms, we are probably taking on a very different role and reading from a different script.

Think of it this way. Posting something on Facebook is a little like getting up and announcing something at a townhall meeting that’s being held at your kid’s school. You assume that the audience will be somewhat heterogeneous in terms of tastes and ideologies, and you consider your comments accordingly.

But posting something on 4Chan is like the conversation that might happen with your four closest bros (4Chan’s own demographics show its audience is 70% male) after way too many beers at a bar. Fear of stepping over the line is non-existent. Racial slurs, misogynistic comments and conspiracy theories abound in this setting.

The thing that’s different with the Mangione example is that comments we would only expect to see on the fringes of social media are showing up in the metaphorical Town Square of Facebook and Instagram (I no longer put X in this category, thanks to Mr. Musk’s flirting with the fringe). In the report from the Network Contagion Research Institute, the authors said, “While this phenomenon was once largely confined to niche online subcultures, we are now witnessing similar dynamics emerging on mainstream platforms, amplifying the risk of further escalation.”

As the report states, the fear is that by moving discussions of this sort into a mainstream channel, we legitimize them. We have moved the frame of what’s acceptable to say (my oft-referenced example of the Overton window) into uncharted territory in a new and much more public arena. This could create an information cascade, which can encourage copycats and other criminal behavior.

This is a social phenomenon that will have implications for our future. The degrees of separation between the wild, wacky outer fringes of social media and the mainstream information sources through which we view the world are disappearing, one by one. With the Luigi Mangione example, we just realized how much things have changed.

Democracy Dies in the Middle

As I write this, I don’t know what the outcome of the election will be. But I do know this. There has never been a U.S. Presidential election campaign quite like this one. If you were scripting a Netflix series, you couldn’t have made up a timeline like this (and this is only a sampling):

January 26 – A jury ordered Donald Trump to pay E. Jean Carroll $83 million in additional emotional, reputation-related, and punitive damages. The original award was $5 million.

April 15 – The trial of New York vs Donald Trump begins. Trump is charged with 34 felony counts.

May 30 – Trump is found guilty on all 34 counts in his New York trial, making him the first U.S. president to be convicted of a felony.

June 27 – Biden and Trump hold their first campaign debate, hosted by CNN. Biden’s performance is so bad, it’s met with calls for him to suspend his campaign.

July 1 – The U.S. Supreme Court delivers a 6–3 decision in Trump v. United States, ruling that Trump had absolute immunity for acts he committed as president within his core constitutional purview. This effectively puts further legal action against Trump on hold until after the election.

July 13 – Trump is shot in the ear in an assassination attempt at a campaign rally held in Butler, Pennsylvania. One bystander and the shooter are killed and two others are injured.

July 21 – Biden announces his withdrawal from the race, necessitating the start of an “emergency transition process” for the Democratic nomination. On the same day, Kamala Harris announces her candidacy for president.

September 6 – Former vice president Dick Cheney and former Congresswoman Liz Cheney announce their endorsements of Harris. That’s the former Republican Vice President and the former chair of the House Republican Conference, endorsing a Democrat.

September 15: A shooting takes place at the Trump International Golf Club in West Palm Beach, Florida, while Donald Trump is golfing. Trump was unharmed in the incident and was evacuated by Secret Service personnel.

With all of that, it’s rather amazing that – according to a recent Pew Research Center report – Americans don’t seem to be any more interested in the campaign than in previous election years. The number of people closely following election news is running about the same as in 2020, and behind what it was in 2016.

This could be attributed in part to a certain ennui on the part of Democrats. In the spring, their level of interest was running behind Republicans. It was only when Joe Biden dropped out in July that Democrats started tuning in in greater numbers. As of September, they were following just as closely as Republicans.

I also find it interesting to see where they’re turning for their election coverage. For those 50 and older, it is overwhelmingly television. News websites and apps come in a distant second.

But for those under 50, social media is the preferred source, with news websites and television tied in second place. This is particularly true for those under 30, half of whom turn to social media. The 30-to-49 cohort is the most media-diverse, with their sources pretty much evenly split between TV, websites and social media.

If we look at how political affiliation impacts where people turn to be informed, there were no great surprises. Democrats favour the three networks (CBS, NBC and ABC), with CNN just behind. Republicans turn first to Fox News, then the three networks, then conservative talk radio.

The thing to note here is that Republicans tend to stick to news platforms known for having a right-wing perspective, while Democrats are more open to what could arguably be considered more objective sources.

It is interesting to note that this flips a bit with younger Republicans, who are more open to mainstream media like the three networks or papers like the New York Times. Sixty percent of Republicans aged 18 to 29 cited the three networks as a source of election information, and 45% mentioned the New York Times.

But we also have to remember that all younger people, Republican or Democrat, are more apt to rely on social media to learn about the election. And there we have a problem. Recently, George Washington University political scientist Dave Karpf was interviewed on CBC Radio about how Big Tech is influencing this election. What was interesting about Karpf’s comments is how social media is now just as polarized as our society. X has become a cesspool of right-leaning misinformation, led by Trump supporter Elon Musk, and Facebook has tried to depoliticize its content after coming under repeated fire for influencing previous campaigns.

So, the two platforms that Karpf said were the most stabilizing in past elections have effectively lost their status as common ground for messaging to the right and the left. Karpf explains, “Part of what we’re seeing with this election cycle is a gap where nothing has really filled into those voids and left campaigns wondering what they can do. They’re trying things out on TikTok, they’re trying things out wherever they can, but we lack that stability. It is, in a sense, the first post social media election.”

This creates a troubling gap. If those under the age of 30 turn first to social media to be informed, what are they finding there? Not much, according to Karpf. And what they are finding is terribly biased, to the point of lacking any real objectivity.

In 2017, the Washington Post added this line under its masthead: “Democracy Dies in Darkness.” Now, in this polarized mediascape, I think it’s more accurate to say “Democracy Dies in the Middle.” There’s a right-wing reality and a left-wing reality. The truth is somewhere in the middle. But it’s getting pretty hard to find it.

Not Everything is Political. Hurricanes, for Example.

During the two recent “once in a lifetime” hurricanes that happened to strike the southern US within two weeks of each other, people apparently thought the storms were a political plot and that meteorologists were in on the conspiracy.

Michigan meteorologist Katie Nickolaou received death threats through social media.

“I have had a bunch of people saying I created and steered the hurricane, there are people assuming we control the weather. I have had to point out that a hurricane has the energy of 10,000 nuclear bombs and we can’t hope to control that. But it’s taken a turn to more violent rhetoric, especially with people saying those who created Milton should be killed.”

Many weather scientists were simply stunned at the level of stupidity and misinformation hurled their way. After someone suggested that someone should “stop the breathing” of those that “made” the hurricanes, Nickolaou responded with this post, “Murdering meteorologists won’t stop hurricanes. I can’t believe I just had to type that.”

Washington, D.C. based meteorologist Matthew Cappucci also received threats: “Seemingly overnight, ideas that once would have been ridiculed as very fringe, outlandish viewpoints are suddenly becoming mainstream, and it’s making my job much more difficult.” 

Marjorie Taylor Greene, U.S. Representative for Georgia’s 14th congressional district, jumped forcefully into the fray by suggesting the hurricanes were politically motivated. She posted on X: “This is a map of hurricane affected areas with an overlay of electoral map by political party shows how hurricane devastation could affect the election.”

And just in case you’re giving her the benefit of the doubt by saying she might just be pointing out a correlation, not a cause, she doubled down with this post on X: “Yes they can control the weather, it’s ridiculous for anyone to lie and say it can’t be done.” 

You may say that when it comes to MTG, we must consider the source and sigh, “You can’t cure stupid.” But Marjorie Taylor Greene easily won a democratic election with almost 66% of the vote, which means the majority of people in her district believed in her enough to elect her as their representative. Her opponent, Marcus Flowers, is a 10-year veteran of the US Army who served 20 years as a contractor or official for the State Department and Department of Defense. He’s no slouch. But in Georgia’s 14th Congressional district, two out of three voters decided a better choice would be the woman who believes that the Nazi Secret Police were called the Gazpacho.

I’ve talked about this before – ad nauseam, actually. But this reaches a new level of stupidity…and stupidity on this scale is f*&king frightening. It is the biggest threat we as humans face.

That’s right, I said the “biggest” threat.  Bigger than climate change. Bigger than AI. Bigger than the new and very scary alliance emerging between Russia, Iran, North Korea and China. Bigger than the fact that Vladimir Putin, Donald Trump and Elon Musk seem to be planning a BFF pajama party in the very near future.

All of those things can be tackled if we choose to. But if we are functionally immobilized by choosing to be represented by stupidity, we are willfully ignoring our way to a point where these existential problems – and many others we’re not aware of yet – can no longer be dealt with.

Brian Cox, a professor of particle physics at the University of Manchester and host of science TV shows including Universe and The Planets, is also warning us about rampant stupidity. “We may laugh at people who think the Earth is flat or whatever, the darker side is that, if we become unmoored from fact, we have a very serious problem when we attempt to solve big challenges, such as AI regulation, climate or avoiding global war. These are things that require contact with reality.” 

At issue here is that people are choosing politics over science. And there is nothing that tethers politics to reality. Politics are built on beliefs. Science strives to be built on provable facts. If we choose politics over science, we are embracing wilful ignorance. And that will kill us.

Hurricanes offer us the best possible example of why that is so. Let’s say you, along with Marjorie Taylor Greene, believe that hurricanes are created by meteorologists and mad weather scientists. So, when those nasty meteorologists try to warn you that the storm of the century is headed directly towards you, you respond in one of two ways: You don’t believe them and/or you get mad and condemn them as part of a conspiracy on social media. Neither of those things will save you. Only accepting science as a reliable prediction of the impending reality will give you the best chance of survival, because it allows you to take action.

Maybe we can’t cure stupid. But we’d better try, because it’s going to be the death of us.

The Political Brinkmanship of Spam

I am never a fan of spam. But this is particularly true when there is an upcoming election. The level of spam I have been wading through seems to have doubled lately. We just had a provincial election here in British Columbia, and all parties pulled out all the stops, which included, but was not limited to: email, social media posts, robotexts and robocalls.

In Canada and the US, political campaigns are not subject to phone and text spam control laws such as our Canadian Do Not Call List legislation. There seems to be a little more restriction on email spam. A report from Nationalsecuritynews.com this past May warned that Americans would be subjected to over 16 billion political robocalls. That is a ton of spam.

During this past campaign here in B.C., I noticed that I do not respond to all spam with equal abhorrence. Ironically, the spam channels with the loosest restrictions are the ones that frustrate me the most.

There are places – like email – where I expect spam. It’s part of the rules of engagement. But there are other places where spam sneaks through and seems a greater intrusion on me. In these channels, I tend to have a more visceral reaction to spam. I get both frustrated and angry when I have to respond to an unwanted text or phone call. But with email spam, I just filter and delete without feeling like I was duped.

Why don’t we deal with all spam – no matter the channel – the same? Why do some forms of spam make us more irritated than others? It’s almost like we’ve developed a spam algorithm that dictates how irritated we get when we deal with spam.

According to an article in Scientific American, the answer might be in how the brain marshals its own resources.

When it comes to capacity, the brain is remarkably protective. It usually defaults to the most efficient path. It likes to glide on autopilot, relying on instinct, habit and beliefs. All these things use much less cognitive energy than deliberate thinking. That’s probably why “mindfulness” is the most often quoted but least often used meme in the world today.

The resource we’re working with here is attention. Limited by the capacity of our working memory, attention is a spotlight we must use sparingly. Our working memory is only capable of handling a few discrete pieces of information at a time. Recent research suggests the limit may be around 3 to 5 “chunks” of information, and that research was done on young adults. Like most things with our brains, the capacity probably diminishes with age. Therefore, the brain is very stingy with attention. 

I think spam that somehow gets past our first line of defence – the feeling that we’re in control of filtering – makes us angry. We have been tricked into paying attention to something unexpected. It becomes a control issue. In an information environment where we feel we have more control, we probably have less of a visceral response to spam. This would be true for email, where a quick scan of the items in our inbox is probably enough to filter out the spam. The amount of attention that gets hijacked by spam is minimal.

But when spam launches a sneak attack and demands a swing of attention that is beyond our control, that’s a different matter. We operate with a different mental modality when we answer a phone or respond to a text. Unlike email, we expect those channels to be relatively spam-free, or at least they are until an election campaign comes around. We go in with our spam defences down and then our brain is tricked into spending energy to focus on spurious messaging.

How does the brain conserve energy? It uses emotions. We get irritated when something commandeers our attention. The more unexpected the diversion, the greater the irritation.  Conversely, there is the equivalent of junk food for the brain – input that requires almost no thought but turns on the dopamine tap and becomes addictive. Social media is notorious for this.

This battle for our attention has been escalating for the past two decades. As we try to protect ourselves from spam with more powerful filters, those who spread spam try to find new ways to get past those filters. The reason political messaging was exempted from spam control legislation was that democracies need a well-informed electorate, and during election campaigns political parties should be able to send out accurate information about their platforms and positions.

That was the theory, anyway.

No News is Not Good News

Kelowna, the city I live in – with a population of about 250,000 – just ran its last locally produced TV news show. That marks the end of a 67-year streak. Our local station, CHBC, first signed on the air on September 21, 1957.

That streak was not without some hiccups. There have been a number of ownership changes. The trend in those transitions was away from local ownership towards huge nation-spanning media conglomerates. In 2009, when the station became part of the Global network, the intention was to shut down the local station and run everything out of CHAN, the Vancouver Global operation. We kicked up a Kelowna fuss and convinced Global to at least keep a local news presence in the community. But – as it turned out – that was just buying us some time. Fifteen years later, the plug was finally pulled.

In that time, my city has also essentially lost its daily newspaper, which is a mere ghost of its former self: an anemic online version and a printed paper that is little more than a wrapper for a bunch of grocery flyers. The tri-weekly paper has suffered a similar fate. Radio stations have gutted their local news teams. The biggest news team in the region works for a local news portal. They are young and eager, but few of them are trained journalists.

CHBC started as an extension of local radio. At the time it was launched, only 500 households in the city had a TV set. Broadcasting was “over the air” and I live in a very mountainous location, so it was impossible to watch TV prior to the station signing on. 

Given that the first TV stations only signed on in Canada in 1952 (CBFT in Montreal and CBLT in Toronto), it’s rather amazing to think that my little town (population 10,000 at the time) had its own station just five years later. Part of the reason for the rapid rollout of TV in Canada was to prevent cultural colonization by the rapidly expanding American TV industry. Our federal government pushed hard to have Canadian programming available from coast to coast.

For the decades that followed, it was local news that defined communities. Local was granular and immediately relevant in a way network news couldn’t be. It gave you what you needed to know to participate knowledgeably in local democracy.

For that alone, CHBC News will be missed here in Kelowna.

This story probably resonates with all of you. The death of local journalism is not unique to my city. I have just learned that I will probably be living in a news desert soon. The importance of local news is enshrined in the very definition of a news desert:

“a community, either rural or urban, with limited access to the sort of credible and comprehensive news and information that feeds democracy at the grassroots level.”

The death of local news was recently discussed at the Canadian Association of Journalists Annual conference in Toronto. There, April Lindgren, a professor at Toronto Metropolitan University’s School of Journalism and the principal investigator of the Local News Research Project, said this:

“I think one of the things … people don’t think about in terms of the mechanics of the role of local news in a community is the role that it plays in equipping people to participate in decision-making.”

We need local news. A recent study by Resonate found that Americans trust local news more than any other source. And not by a little margin. By a lot. The next closest answer was a full 15 percentage points behind.

But there are two existential problems pushing local news to the brink of an extinction event. First, most local news outlets were swallowed up by corporate mass media conglomerates over the past three or four decades. Second, the business model for local news has disappeared. Local advertising dollars have migrated to other platforms. So the fate of local news has become a P&L decision.

That’s what it was for CHBC. It’s owned by Corus Entertainment. Corus owns the Global network (15 stations), 39 radio stations, 33 specialty TV channels and a bunch of other media miscellanea.

Oh, did I mention that Corus is also bleeding cash at a fatal rate? On the heels of an announced $770 million (CDN) loss, it cut 25% of its workforce. That was the death knell for CHBC. It didn’t have a hope in hell.

Local news doesn’t have to die. It just has to find another way to live. Like so much of our media environment, basing survival on advertising revenue is a sure recipe for disaster. That’s why the Local News Research Project is floating ideas like supporting local news with philanthropy. I’m not sure that’s a viable or scalable answer.

I think a better idea might be to move local news to protected species status. If we recognize its importance to democracy, especially at local levels, then perhaps tax dollars should go to ensuring its survival.

The scenario of government-supported local journalism brings up a philosophical debate I have ignited in the past, when I talked about public broadcasting. It split my readers along national lines, with those from the US giving the idea a thumbs down, and those from Australia, New Zealand and Canada receiving it more favorably.

Let’s see what happens this time.

The Olympics Are Finished — But We’ll Always Have Paris!

I have to confess: The Olympics sucked me in again.

Prior to the kickoff in Paris, I was unusually ambivalent about the Olympics. Given the debacle that was the spectator-less Tokyo Olympics, it was like the world had agreed not to expect too much from these games. Were the Olympics still relevant? Do we need them anymore?

I caught the opening ceremonies and was still skeptical. It was very Parisienne – absolutely breathtaking, with a healthy dose of “WTF.” Still, I was withholding judgment.

But by day three, I was hooked. I had signed up for the daily Olympic news feed. I was watching Canada’s medal count. I was embarrassed – along with the rest of the nation – by our women’s soccer team’s drone spying scandal. I became an instant expert in all those obscure sports that pique our interest on a quadrennial cycle. I could go on at length about the nuances of speed climbing, slalom canoe or B-Boy breaking.

The Olympics had done it again. Paris did not disappoint.

So, this last Sunday night, I watched the closing ceremony with all the feels you get when you have to say goodbye to those new friends you made as you board the bus taking you home from summer camp. Into this bittersweet reverie of video flashbacks and commentators gushing about this international kumbaya moment, my wife had the nerve to kill my vibe by commenting that “there must be a better use for all the billions this game cost.”

It’s hard to argue against that. The estimated total cost of the games was 9 billion euros, or almost $10 billion U.S. You don’t need to be particularly jaded to realize that the Olympics are really a spectacle for rich nations. Sure, any nation can send a team, but if you combine the 40 smallest teams – coming from places like Sudan, Chad, Namibia, Lesotho and Belize – you’d have a total of 120 athletes. That would be about the same size as the Olympic team from Denmark, the 25th largest team that attended.

The Olympics are supposed to offer an opportunity to those of all nations, but the bigger your GDP (gross domestic product), the more likely you are to end up with a medal around your neck.

So I come back to the question: Do we still need the Olympics, if only to break the relentless downward spiral of our horrific news cycle for 16 brief days?

Before we get too gooey about the symbolism of the Olympics, we should take a look back at its history.

Baron Pierre de Coubertin, who revived the modern Olympics, did so because he was fascinated by the culture and ideals of ancient Greece. The original Olympic Games were essentially a chance for city-states to “one-up” their rivals. A temporary truce was in place during the games, but behind the athletic competitions there was a flurry of alliances and back-room deals made to gain advantage once Greece went back to its warlike ways after the games.

The idea that the modern games are a symbol of equality and fraternity was – at best – tangential to Coubertin’s original plan. He wanted to encourage amateur competition and athletic prowess because he believed better athletes made better soldiers. The Games were also an attempt to keep amateur sports in the hands of the upper classes, out of the grimy grips of the working class.

Let’s also not forget that women were not allowed to participate in the games until the second Olympiad — the original Paris Olympics in 1900. There were five female athletes and almost 1,000 men participating. And even then, Coubertin was not in favor of it. He later said women competing in sports was “impractical, uninteresting, unaesthetic, and we are not afraid to add: incorrect.”

Even the much-commented-on Olympic tradition of athletes at the Opening Ceremonies coming in divided by nation, but at the closing, all athletes coming in as one, without national divides, was never part of the original plan. That was added by the Aussies in the 1956 Melbourne Games, which would be called the “Friendly Games.” It was put forward by John Ian Wing, an Australian teenager who wrote an anonymous letter to the IOC suggesting the idea. He didn’t put his name on it because he was afraid of the backlash his family (who were Chinese) might receive.

Let’s get back to today. Paris excelled at pulling off a delicate balancing act. The hope to make these the “Games wide open” was realized at the opening ceremonies, the marathons and the men’s and women’s road races. For the road races alone, over a million spectators lined the streets of Paris.

The organizing committee managed to balance the French flair for spectacle with a tastefulness that was generally successful. They gave the modern Olympics at least four more years of life.

It remains to be seen whether the inevitable bombast that comes when the Games move to Los Angeles in 2028 will continue the trend — or put the final nail in the coffin.

Why The World No Longer Makes Sense

Does it seem that the world no longer makes sense? That may not just be you. The world may in fact no longer be making sense.

In the late 1960s, psychologist Karl Weick introduced the world to the concept of sensemaking, but we were making sense of things long before that. It’s the mental process we go through to try to reconcile who we believe we are with the world in which we find ourselves. It’s how we give meaning to our lives.

Weick identified seven properties critical to the process of sensemaking. I won’t mention them all, but here are three worth keeping in mind:

  1. Who we believe we are forms the foundation we use to make sense of the world.
  2. Sensemaking needs retrospection. We need time to mull over new information we receive and form it into a narrative that makes sense to us.
  3. Sensemaking is a social activity. We look for narratives that seem plausible, and when we find them, we share them with others.

I think you see where I’m going with this. Simply put, our ability to make sense of the world is in jeopardy, both for internal and external reasons.

External to us, the quality of the narratives that are available to us to help us make sense of the world has nosedived in the past two decades. Prior to social media and the implosion of journalism, there was a baseline of objectivity in the narratives we were exposed to. One would hope that there was a kernel of truth buried somewhere in what we heard, read or saw on major news providers.

But that’s not the case today. Sensationalism has taken over journalism, driven by the need to turn a profit by showing ads to an increasingly polarized audience. In the process, it has dragged the narratives we need to make sense of the world to the extremes that lie on either end of common sense.

This wouldn’t be quite as catastrophic for sensemaking if we were more skeptical. The sensemaking cycle does allow us to judge the quality of new information for ourselves, deciding whether it fits with our frame of what we believe the world to be, or if we need to update that frame. But all that validation requires time and cognitive effort. And that’s the second place where sensemaking is in jeopardy: we don’t have the time or energy to be skeptical anymore. The world moves too quickly to be mulled over.

In essence, our sensemaking is us creating a model of the world that we can use without requiring us to think too much. It’s our own proxy for reality. And, as a model, it is subject to all the limitations that come with modeling. As the British statistician George E.P. Box said, “All models are wrong, but some are useful.”

What Box didn’t say is, the more wrong our model is, the less likely it is to be useful. And that’s the looming issue with sensemaking. The model we use to determine what is real is becoming less and less tethered to actual reality.

It was exactly that problem that prompted Daniel Schmachtenberger and others to set up the Consilience Project. The idea of the Project is this: the more diversity in perspectives you can include in your model, the more likely the model is to be accurate. That’s what “consilience” means: pulling perspectives from different disciplines together to get a more accurate picture of complex issues. It literally means the “jumping together” of knowledge.

The Consilience Project is trying to reverse the erosion of modern sensemaking – both internal and external – that comes from the overt polarization and narrowing of perspective that currently typify the information sources we feed into our own sensemaking models. As Schmachtenberger says, “If there are whole chunks of populations that you only have pejorative strawman versions of, where you can’t explain why they think what they think without making them dumb or bad, you should be dubious of your own modeling.”

That, in a nutshell, explains the current media landscape. No wonder nothing makes sense anymore.

Can Media Move the Overton Window?

I fear that somewhere along the line, mainstream media has forgotten its obligation to society.

It was 63 years ago (on May 9, 1961) that newly appointed Federal Communications Commission Chair Newton Minow gave his famous speech, “Television and the Public Interest,” to the convention of the National Association of Broadcasters.

In that speech, he issued a challenge: “I invite each of you to sit down in front of your own television set when your station goes on the air and stay there, for a day, without a book, without a magazine, without a newspaper, without a profit and loss sheet or a rating book to distract you. Keep your eyes glued to that set until the station signs off. I can assure you that what you will observe is a vast wasteland.”

Minow was saying that media has an obligation to set the cultural and informational boundaries for society. The higher you set them, the more we will strive to reach them. That point was a callback to the Fairness Doctrine, established by the FCC in 1949, which required “holders of broadcast licenses to present controversial issues of public importance and to do so in a manner that fairly reflected differing viewpoints.” The Fairness Doctrine was abolished by the FCC in 1987.

What Minow realized, presciently, was that mainstream media is critically important in building the frame for what would come to be called, three decades later, the Overton Window. First identified by policy analyst Joseph Overton at the Mackinac Center for Public Policy, the term was named after Overton posthumously by his colleague Joseph Lehman.

The term is typically used to describe the range of topics suitable for public discourse in the political arena. But, as Lehman explained in an interview, the boundaries are not set by politicians: “The most common misconception is that lawmakers themselves are in the business of shifting the Overton Window. That is absolutely false. Lawmakers are actually in the business of detecting where the window is, and then moving to be in accordance with it.”

I think the concept of the Overton Window is more broadly applicable than just within politics. In almost any aspect of our society where there are ideas shaped and defined by public discourse, there is a frame that sets the boundaries for what the majority of society understands to be acceptable — and this frame is in constant motion.

Again, according to Lehman, “It just explains how ideas come in and out of fashion, the same way that gravity explains why something falls to the earth. I can use gravity to drop an anvil on your head, but that would be wrong. I could also use gravity to throw you a life preserver; that would be good.”

Typically, the frame drifts over time to the right or left of the ideological spectrum. What came as a bit of a shock in November of 2016 was just how quickly the frame pivoted and started heading to the hard right. What was unimaginable just a few years earlier suddenly seemed open to being discussed in the public forum.

Social media was held to blame. In a New York Times op-ed written just after Trump was elected president (a result that stunned mainstream media), columnist Farhad Manjoo said, “The election of Donald J. Trump is perhaps the starkest illustration yet that across the planet, social networks are helping to fundamentally rewire human society.”

In other words, social media can now shift the Overton Window — suddenly, and in unexpected directions. This is demonstrably true, and the nuances of this realization go far beyond the limits of this one post to discuss.

But we can’t be too quick to lay all the blame for the erratic movements of the Overton Window on social media’s doorstep.

I think social media, if anything, has expanded the window in both directions — right and left. It has redefined the concept of public discourse, moving both ends out from the middle. But it’s still the middle that determines the overall position of the window. And that middle is determined, in large part, by mainstream media.

It’s a mistake to suppose that social media has completely supplanted mainstream media. I think all of us understand that the two work together. We use what is discussed in mainstream media to get our bearings for what we discuss on social media. We may move right or left, but most of us realize there is still a boundary to what is acceptable to say.

The red flags go up when this process runs in reverse and mainstream media starts using social media to get its bearings. If the mainstream chases outliers on the right or left, you get dangerous feedback loops in which the Overton Window has difficulty defining its middle and risks being torn in two – one window for the right and one for the left, each moving further and further apart.

Those who work in the media have a responsibility to society. It can’t be abdicated for the pursuit of profit or by saying they’re just following their audience. Media determines the boundaries of public discourse. It sets the tone.

Newton Minow was warning us about this six decades ago.