My Many Problems with the Metaverse

I recently had dinner with a comedian who had just done his first gig in the Metaverse. It was in a new Meta-Comedy Club. He was excited and showed me a recording of the gig.

I have to admit, my inner geek thought it was very cool: disembodied hands clapping with avataresque names floating above, bursts of virtual confetti for the biggest laughs, and even a virtual hook that instantly snagged meta-hecklers, banishing them to meta-purgatory until they promised to behave. The comedian said he wanted to record a comedy meta-album in the meta-club to release to his meta-followers.

It was all very meta.

As mentioned, as a geek I’m intrigued by the Metaverse. But as a human who ponders our future (probably more than is healthy) – I have grave concerns on a number of fronts. I have mentioned most of these individually in previous posts, but I thought it might be useful to round them up:

Removed from Reality

My first issue is that the Metaverse just isn’t real. It’s a manufactured reality. This is at the heart of all the other issues to come.

We might think we’re clever, and that we can manufacture a better world than the one nature has given us, but my response to that would be Orgel’s Second Rule, named for chemist Leslie Orgel and popularized by Francis Crick, co-discoverer of the structure of DNA: “Evolution is cleverer than you are.”

For millions of years, we have evolved to be a good fit in our natural environment. There are thousands of generations of trial and error baked into our DNA that make us effective in our reality. Most of that natural adaptation lies hidden from us, ticking away below the surface of both our bodies and brains, silently correcting course to keep us aligned and functioning well in our world.

But we, in our never-ending human hubris, somehow believe we can engineer an environment better than reality in less than a single generation. If we take Second Life as the first iteration of the metaverse, we’re barely two decades into the engineering of a meta-reality.

If I were placing bets on who is the better environmental designer for us – humans or evolution – my money would be on evolution, every time.

Whose Law Is It Anyway?

One of the biggest selling features of the Metaverse is that it frees us from the restrictions of geography. Physical distance has no meaning when we go meta.

But this also has issues. Societies need laws and our laws have evolved to be grounded within the boundaries of geographical jurisdictions. What happens when those geographical jurisdictions become meaningless? Right now, there are no laws specifically regulating the Metaverse. And even if there are laws in the future, in what jurisdiction would they be enforced?

This is a troubling loophole – and by hole I mean a massive gaping metaverse-sized void. You know who is attracted by a lack of laws? Those who have no regard for the law. If you don’t think that criminals are currently eyeing the metaverse looking for opportunity, I have a beautiful virtual time-share condo in the heart of meta-Boca Raton that I’d love to sell you.

Data Is the Matter of the Metaverse

Another “selling feature” for the metaverse is the ability to append metadata to our own experiences, enriching them with access to information and opportunities that would be impossible in the real world. In the metaverse, the world is at our fingertips – or in our virtual headset – as the case may be. We can stroll through worlds, real or imagined, and the sum of all our accumulated knowledge is just one user-prompt away.

But here’s the thing about this admittedly intriguing notion: it makes data a commodity and commodities are built to be exchanged based on market value. In order to get something of value, you have to exchange something of value. And for the builders of the metaverse, that value lies in your personal data. The last shreds of personal privacy protection will be gone, forever!

A For-Profit Reality

This brings us to my biggest problem with the Metaverse – the motivation for building it. It is being built not by philanthropists or philosophers, academics or even bureaucrats. The metaverse is being built by corporations, who have to hit quarterly profit projections. They are building it to make a buck, or, more correctly, several billion bucks.

These are the same people who have made social media addictive by taking the dirtiest secrets of Las Vegas casinos and using them to enslave us through our smartphones. They have toppled legitimate governments for the sake of advertising revenue. They have destroyed our concept of truth, bashed apart the soft guardrails of society and are currently dismantling democracy. There is no noble purpose for a corporation – their only purpose is profit.

Do you really want to put your future reality in those hands?

50 Shades of Greying

Here is what I know: Lisa LaFlamme – the main anchor of CTV News, one of Canada’s national nightly newscasts – was fired.

What I don’t know is why. There are multiple versions of why floating around. The one that seems to have served as a rallying point for those looking to support Ms. LaFlamme is that she was fired because she was getting old. During COVID she decided to let her hair go to its natural grey. That, according to the popular version, prompted network brass to pull the pin on her contract.

I suspect the real reason was not quite that cut and dried. The owners of the network, Bell Media, have been relentlessly trimming payrolls across their various news organizations over the past several years. I know of one such story through a personal connection. The way that scenario played out sounded very similar to what happened to Lisa LaFlamme – minus the accusations of ageism and gender double standards. In that case, it was largely a matter of dollars and cents. TV news is struggling financially. Long-time on-air talent have negotiated salaries over their careers that are no longer sustainable. Something had to give.

These are probably just casualties of a dying industry. A hundred years ago it would have been blacksmiths and gas lamplighters being let go by the thousands. The difference is that the average blacksmith or lamplighter didn’t have a following of millions of people. They also didn’t have social media. They certainly didn’t have corporate PR departments desperately searching for the latest social media “woke” bandwagon to vault upon.

What is interesting is how these things play out through various media channels. In Ms. LaFlamme’s case, it was a perfect storm that lambasted Bell Media (which owns the CTV Network). As the ageism rumours began to emerge, anti-ageism social media campaigns were run by Dove, Wendy’s and even Sports Illustrated. LaFlamme wasn’t mentioned by name in most of these, but the connection was clear. Going grey was something to be celebrated, not a cause for contract cancellation. Grey-flecked gravitas should be gender-neutral. “Who the f*&k were these Millennial corporate pin-heads that couldn’t stand a little grey on the nightly news!”

It makes excellent fodder for the meme-factory, but I suspect the reality wasn’t quite that simple. Ms. LaFlamme has never publicly revealed what she believes the actual reason for her dismissal was. She never mentioned ageism. She simply said she was “blindsided” by the news. The reasoning behind the parting of ways with Bell Media has largely been left to conjecture.

A few other things to note: LaFlamme received the news on June 29 but didn’t share it until six weeks later (August 15), in a video posted to her own social media feed. Bell Media offered her the opportunity to have an on-air send-off, but she declined. Finally, she also declined several offers from Bell to continue with the network in other roles. She chose instead to deliver her parting shot in the war zone of social media.

To be fair to both sides, if we’re to catalog all the various rumors floating about, there are also those saying that the decision was brought about – in part – by an allegedly toxic work environment in the news department that started at the top, with LaFlamme.

Now, if the reason for the termination actually was ageism, that’s abhorrent. Ms. LaFlamme is a few years younger than I am. I would hate to think that people of our age, who should still be at the height of their careers, would be discriminated against simply because of age.

The same is true if the reason was sexism. There should be no distinction between the appropriate age of a male or female national anchor.

But if it’s more complex, which I’m pretty sure it is, it shows how our world doesn’t really deal very well with complexity anymore. The consideration required to understand complex situations doesn’t fit well within the attention constraints of social media. It’s a lot easier just to sub in a socially charged hot-button meme and wait for the inevitable opinion camps to form. Sure, they’ll be one-dimensional and about as thoughtful as a sledgehammer, but those types of posts are a much better bet to go viral.

Whatever happened in the CTV national newsroom, I do know this: business decisions in the media business will have to follow a very different playbook from this point forward. Bell Media fumbled the ball badly on this one and has been scrambling ever since to save face. It appears that Lisa LaFlamme – and her ragtag band of social media supporters – outplayed them at every turn.

By the way, LaFlamme just nabbed a temporary gig as a “special correspondent” for CityTV, Bell Media’s competitor, covering the funeral of Queen Elizabeth II and the proclamation of King Charles III.  She’s being consummately professional and comforting, garnering a ton of social media support as she eases Canada through the grieving process (our emotional tie to the Crown is another very complex relationship that would require several posts to unpack).  

Well played, Lisa LaFlamme – well played.

Dealing with Daily Doom

“We are Doomed”

The tweet came yesterday from a celebrity I follow. And you know what? I didn’t even bother to look to find out in which particular way we were doomed. That’s probably because my social media feeds are filled by daily predictions of doom. The end being nigh has ceased to be news. It’s become routine. That is sad. But more than that, it’s dangerous.

This is why Joe Mandese and I have agreed to disagree about the role media can play in messaging around climate change, or – for that matter – any of the existential threats now facing us. Alarmist messaging could be the problem, not the solution.

Mandese ended his post with this:

“What the ad industry really needs to do is organize a massive global campaign to change the way people think, feel and behave about the climate — moving from a not-so-alarmist “change” to an “our house is on fire” crisis.”

Joe Mandese – Mediapost

But here’s the thing. Cranking up the crisis intensity on our messaging might have the opposite effect. It may paralyze us.

Something called “doom scrolling” is now very much a thing. And if you’re looking for Doomsday scenarios, the best place to start is the r/collapse subreddit.

In a 30-second glimpse during the writing of this column, I discovered that democracy is dying, America is on the brink of civil war, Russia is turning off the tap on European oil supplies, we are being greenwashed into complacency, the Amazon Rainforest may never recover from its current environmental destruction and the “Doomsday” glacier is melting faster than expected. That was all above the fold. I didn’t even have to scroll for this buffet of all-you-can-eat disaster. These were just the appetizers.

There is a reason why social media feeds are full of doom. We are hardwired to pay close attention to threats, and that makes apocalyptic prophesying very profitable for social media platforms. As British academic Julia Bell wrote in her 2020 book Radical Attention:

“Behind the screen are impassive algorithms designed to ensure that the most outrageous information gets to our attention first. Because when we are enraged, we are engaged, and the longer we are engaged the more money the platform can make from us.”

Julia Bell – Radical Attention

But just what does a daily diet of doom do for our mental health? Does constantly making us aware of the impending end of our species goad us into action? Does it actually accomplish anything?

Not so much. In fact, it can do the opposite.

Mental health professionals are now treating a host of new climate change related conditions, including eco-grief, eco-anxiety and eco-depression. But, perhaps most alarmingly, they are now encountering something called eco-paralysis.

In an October 2020 Time.com piece on doom scrolling, psychologist Patrick Kennedy-Williams, who specializes in treating climate-related anxieties, was quoted: “There’s something inherently disenfranchising about someone’s ability to act on something if they’re exposed to it via social media, because it’s inherently global. There are not necessarily ways that they can interact with the issue.”

So, cranking up the intensity of the messaging on existential threats such as climate change may have the opposite effect, scaring us into doing nothing. This is because of something called the Yerkes-Dodson Law.


This “Law”, discovered by psychologists Robert Yerkes and John Dodson in 1908, isn’t so much a law as a psychological model. It’s a typical bell curve. On the front end, we find that our performance in responding to a situation increases along with our attention and interest in that situation. But the line does not go straight up. At some point, it peaks and then goes downhill. Intent gives way to anxiety. The more anxious we become, the more our performance is impaired.
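The shape of that curve is easy to sketch. Here is a toy illustration in Python – the Gaussian form and parameter values are my own assumptions for illustration; the original 1908 work only established the general inverted-U relationship:

```python
import math

def performance(arousal, peak=0.5, width=0.25):
    """Toy Yerkes-Dodson curve: performance rises with arousal up to
    an optimum, then falls as anxiety takes over. The Gaussian shape
    and the parameter values here are illustrative assumptions."""
    return math.exp(-((arousal - peak) ** 2) / (2 * width ** 2))

# Rising limb: more attention and interest, better performance.
assert performance(0.3) > performance(0.1)
# Falling limb: past the peak, more anxiety means worse performance.
assert performance(0.9) < performance(0.5)
```

The point of the model isn't the exact formula – it's that the relationship between alarm and action is not monotonic. Past the peak, louder messaging buys less response, not more.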

When we fret about the future, we are actually grieving the loss of our present. In this process, we must make our way through the five stages of grief introduced by psychiatrist Elisabeth Kübler-Ross in 1969 through her work with terminally ill patients: Denial, Anger, Bargaining, Depression and Acceptance.

One would think that triggering awareness would help accelerate us through the stages. But there are a few key differences. With a diagnosis of terminal illness, there is typically one hammer-blow event when you become aware of the situation. From there, dealing with it begins. And – even then – it’s not a linear journey. As anyone who has ever grieved will tell you, which stage you’re in depends on the day. You can slip from Acceptance to Anger in a heartbeat.

With climate change, awareness doesn’t come just once. The messaging never ends. It’s a constant cycle of crisis, trapping us in a loop that cycles between denial, depression and despair.

An excellent post on Climateandmind.org on climate grief talks about this cycle and how we get trapped within it. Some of us get stuck in a stage and never move on. Even climate scientist and activist Susanne Moser admits to being trapped in something she calls Functional Denial,

“It’s that simultaneity of being fully aware and conscious and not denying the gravity of what we’re creating (with Climate Change), and also having to get up in the morning and provide for my family and fulfill my obligations in my work.”

Susanne Moser

It’s exactly this sense of frustration I voiced in my previous post. But the answer is not making me more aware. Like Moser, I’m fully aware of the gravity of the various threats we’re facing. It’s not attention I lack, it’s agency.

I think the time to hope for a more intense form of messaging to prod the deniers into acceptance is long past. If they haven’t changed their minds yet, they ain’t goin’ to!

I also believe the messaging we need won’t come through social media. There’s just too much froth and too much profit in that froth.

What we need – from media platforms we trust – is a frank appraisal of the worst-case scenario of our future. We need to accept that and move on to deal with what is to come. We need to encourage resilience and adaptability. We need hope that while what is to come is most certainly going to be catastrophic, it doesn’t have to be apocalyptic.

We need to know we can survive and start thinking about what that survival might look like.

Does Social Media “Dumb Down” the Wisdom of Crowds?

We assume that democracy is the gold standard of sustainable political social contracts. And it’s hard to argue against that. As Winston Churchill said, “democracy is the worst form of government – except for all the others that have been tried.”

Democracy may not be perfect, but it works. Or, at least, it seems to work better than all the other options. Essentially, democracy depends on probability – on being right more often than we’re wrong.

At the very heart of democracy is the principle of majority rule. And that is based on something called the jury theorem, put forward by the Marquis de Condorcet in his 1785 work, Essay on the Application of Analysis to the Probability of Majority Decisions. Essentially, it says that the probability of making the right decision increases when you aggregate the decisions of as many people as possible. This was the basis of James Surowiecki’s 2004 book, The Wisdom of Crowds.
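Condorcet's claim is easy to verify with a little arithmetic. This sketch (the function name and parameter choices are mine) computes the probability that a majority of n independent voters, each correct with probability p, reaches the right decision:

```python
from math import comb

def majority_correct(n, p):
    """Probability that a majority of n independent voters, each
    correct with probability p, reaches the right decision.
    n is assumed odd so there are no ties."""
    k = n // 2 + 1  # smallest winning majority
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Individuals only slightly better than a coin flip (p = 0.6):
for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 3))
```

With voters only slightly better than a coin flip, a crowd of 101 is right far more often than any single voter. Tellingly, the theorem cuts both ways: if each voter is more likely wrong than right, bigger crowds get worse.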

But here’s the thing about the wisdom of crowds – it only applies when those individual decisions are reached independently. Once we start influencing each other’s decision, that wisdom disappears. And that makes social psychologist Solomon Asch’s famous conformity experiments of 1951 a disturbingly significant fly in the ointment of democracy.

You’re probably all aware of the seminal study, but I’ll recap anyway. Asch gathered groups of people and showed them two cards: one with a single reference line, and one with three comparison lines of obviously different lengths. Then he asked participants which of the three lines was closest in length to the reference line. The answer was obvious – even a toddler can get this test right pretty much every time.

But unknown to the test subject, all the rest of the participants were “stooges” – actors paid to sometimes give an obviously incorrect answer. And when this happened, Asch was amazed to find that the test subjects often went against the evidence of their own eyes just to conform with the group. When wrong answers were given, subjects went along with the incorrect majority on roughly a third of those trials, 75% of subjects conformed at least once, and only 25% never conformed, sticking to the evidence in front of them every time.

The results baffled Asch. The most interesting question to him was why this was happening. Were people making a decision to go against their better judgment – choosing to go with the crowd rather than what they were seeing with their own eyes? Or was something happening below the level of consciousness? This was something Solomon Asch wondered about right until his death in 1996. Unfortunately, he never had the means to explore the question further.

But, in 2005, a group of researchers at Emory University, led by Gregory Berns, did have a way. Asch’s experiment was restaged, only this time participants were in an fMRI machine so Berns and his researchers could peek at what was actually happening in their brains. The results were staggering.

They found that conformity actually changes the way our brain works. It’s not that we change what we say to conform with what others are saying, despite what we see with our own eyes. What we see is changed by what others are saying.

If, Berns and his researchers reasoned, you were consciously deciding to go against the evidence of your own eyes just to conform with the group, you should see activity in the frontal areas of the brain engaged in monitoring conflicts, planning and other higher-order mental activities.

But that isn’t what they found. In the participants who went along with the group’s obviously incorrect answers, activity showed up only in the posterior parts of the brain – those that control spatial awareness and visual perception. There was no indication of an internal mental conflict. The brain was actually changing how it processed the information it was receiving from the eyes.

This is stunning. It means that conformity isn’t a conscious decision. Our desire to conform is wired so deeply in our brains, it actually changes how we perceive the world. We never have the chance to be objectively right, because we never realize we’re wrong.

But what about those who resisted conformity and stuck to the evidence they were seeing with their own eyes? Here again, the results were fascinating. In these cases, the researchers saw a spike of activity in the right amygdala and right caudate nucleus – areas involved in processing strong emotions, including fear, anger and anxiety. Those who stuck to the evidence of their own eyes had to overcome emotional hurdles to do so. In the published paper, the authors called this the “pain of independence.”

This study highlights a massively important limitation in the social contract of democracy. As technology increasingly imposes social conformity on our culture, we lose the ability to collectively make the right decision. Essentially, it shows that this effect not only erases the wisdom of crowds but actively works against it, by exacting an emotional price for being an independent thinker.

Memories Made by Media

If you said the year 1967 to me, the memory that would pop into my head would be of Haight-Ashbury (ground zero for the counterculture movement), hippies and the summer of love. In fact, that same memory would effectively stand in for the period 1967 to 1969. In my mind, those three years were variations on the theme of Woodstock, the iconic music festival of 1969.

But none of those are my memories. I was alive then, but my own memories of that time are indistinct and fuzzy. I was only six that year and lived in Alberta, some 1,300 miles from the intersection of Haight and Ashbury streets, so I have discarded my own personal memories as representative of the era. The ones I have were all created by images that came to me via media.

The Swapping of Memories

This is an example of the two types of memories we have – personal or “lived” memories and collective memories. Collective memories are the memories we get from outside, either from other people or, in my example, from media. As we age, there tends to be a flow back and forth between these two types of memories, with one type coloring the other.

One group of academics proposed an hourglass model as a working metaphor for this continuous exchange of memories, with some flowing one way and others the other. Often, we’re not even aware of which type of memory we’re recalling, personal or collective. Our memories are notoriously bad at reflecting reality.

What is true, however, is that our personal memories and our collective memories tend to get all mixed up. The lower our confidence in our personal memories, the more we tend to rely on collective memories. For periods before we were born, we rely solely on images we borrow.

Iconic Memories

What is true for all memories, ours or the ones we borrow from others, is that we put them through a process called “leveling and sharpening.” This is a type of memory consolidation where we throw out details that are not important to us – leveling – and exaggerate other details to make the memory more interesting – sharpening.

Take my borrowed memories of 1967, for example. There was a lot more happening in the world than whatever was happening in San Francisco during the Summer of Love, but I haven’t retained any of it in my representative memory of that year. For example, there was a military coup in Greece, the first successful human heart transplant, the creation of the Corporation for Public Broadcasting, a series of deadly tornadoes in Chicago and Typhoon Emma left 140,000 people homeless in the Philippines. But none of that made it into my memory of 1967.

We could call the memories we do keep “iconic” – which simply means we choose symbols to represent a much bigger and more complex reality, like everything that happened in a 365-day stretch five and a half decades ago.

Mass Manufactured Memories

Something else happens when we swap our own personal memories for collective memories – we find much more commonality in our memories. The more removed we become from our own lived experiences, the more our memories become common property.

If I asked you to say the first thing that comes to mind about 2002, you would probably look back through your own personal memory store to see if there was anything there. Chances are it would be a significant event from your own life, and this would make it unique to you. If we had a group of 50 people in a room and I asked that question, I would probably end up with 50 different answers.

But if I asked that same group what first comes to mind when I say the year 1967, we would find much more common ground. And that ground would probably be defined by how each of us identifies ourselves. Some of you might have the same iconic memory that I do – Haight-Ashbury and the Summer of Love. Others might pick the Vietnam War as the iconic memory of that year. But I would venture to guess that in our group of 50, we would end up with only a handful of answers.

When Memories are Made of Media

I am taking this walk down Memory Lane because I want to highlight how much we rely on the media to supply our collective memories. This dependency is critical, because once media images are processed by us and become part of our collective memories, they hold tremendous sway over our beliefs. These memories become the foundation for how we make sense of the world.

This is true for all media, including social media. A study in 2018 (Birkner & Donk) found that “alternative realities” can be formed through social media to run counter to collective memories formed from mainstream media. Often, these collective memories formed through social media are polarized by nature and are adopted by outlier fringes to justify extreme beliefs and viewpoints. This shows that collective memories are not frozen in time but are malleable – continually being rewritten by different media platforms.

Like most things mediated by technology, collective memories are splintering into smaller and smaller groupings, just like the media that are instrumental in their formation.

Sensationalizing Scam Culture

We seem to be fascinated by bad behavior. Our popular culture is all agog with grifters and assholes. As TV Blog’s Adam Buckman wrote in March: “Two brand-new limited series premiering this week appear to be part of a growing trend in which some of recent history’s most notorious innovators and disruptors are getting the scripted-TV treatment.”

The two series Buckman was talking about were “Super Pumped: The Battle for Uber,” about Uber CEO Travis Kalanick, and “The Dropout,” about Theranos founder Elizabeth Holmes.

But those are just two examples from a bumper crop of shows about bad behavior. My streaming services are stuffed with stories of scammers. In addition to the two series Buckman mentioned, I just finished Shonda Rhimes’ Netflix series “Inventing Anna,” about Anna Sorokin, who posed as an heiress named Anna Delvey.

All these treatments tread a tight wire of moral judgement, where the examples are presented as antisocial, but in a wink-and-a-nod kind of way, where we not so secretly admire these behaviors. Much as the actions are harmful to the well-being of the collective “we,” they do appeal to the selfishness and ambition of “me.”

Most of the examples given are rags-to-riches-to-retribution stories (Holmes was an exception, with her upper-middle-class background). The sky-high ambitions of Kalanick, Holmes and Sorokin were all eventually brought back down to earth. Sorokin and Holmes both ended up in prison, and Kalanick was ousted from the company he founded.

But with the subtlest of twists, they didn’t have to end this way. They could have been the story of almost any corporate America hustler who triumphed. With a little more substance and a little less scam, you could swap Elizabeth Holmes for Steve Jobs. They even dressed the same.

Obviously, scamming seems to sell. These people fascinate us. Part of the appeal is no doubt due to a class-conflict narrative: the scrappy hustler climbing the social ranks by whatever means possible. We love to watch “one of us” pull the wool over the eyes of the social elite.

In the case of Anna Sorokin, Laura Craik dissects our fascination in a piece published in the UK’s Evening Standard:

“The reason people are so obsessed with Sorokin is simple: she had the balls to pull off on a grand scale what so many people try and fail to pull off on a small one. To use a phrase popular on social media, Sorokin succeeded in living her best life — right down to the clothes she wore in court, chosen by a stylist. Like Jay Gatsby, she was a deeply flawed embodiment of The American Dream: a person from humble beginnings who rose to achieve wealth and social status. Only her wealth was borrowed and her social status was conferred via a chimera of untruths.”

Laura Craik – UK Evening Standard

This type of behavior is nothing new. It’s always been a part of us. In 1513, a Florentine bureaucrat named Niccolo Machiavelli gave it a name — actually, his name. In writing “The Prince,” he condoned bad behavior as long as the end goal was to elevate oneself. In a Machiavellian world, it’s always open season on suckers: “One who deceives will always find those who allow themselves to be deceived.”

For the past five centuries, Machiavellianism has been synonymous with evil. It is a recognized character flaw, described as “a personality trait that denotes cunningness, the ability to be manipulative, and a drive to use whatever means necessary to gain power. Machiavellianism is one of the traits that forms the Dark Triad, along with narcissism and psychopathy.”

Now, however, that stigma seems to be disappearing. In a culture obsessed with success, Machiavellianism becomes a justifiable means to an end, so much so that we’ve given this culture its own hashtag: #scamculture: “A scam culture is one in which scamming has not only lost its stigma but is also valorized. We rebrand scamming as ‘hustle,’ or the willingness to commodify all social ties, and this is because the ‘legitimate’ economy and the political system simply do not work for millions of Americans.”

It’s a culture that’s very much at home in Silicon Valley. The tech world is steeped in Machiavellianism. Its tenets are accepted — even encouraged — business practices in the Valley. “Fake it til you make it” is tech’s modus operandi. The example of Niccolo Machiavelli has gone from being a cautionary tale to a how-to manual.

But these predatory practices come at a price. Doing business this way destroys trust. And trust is still, by far, the best strategy for our mutual benefit. In behavioral economics, there’s something called “tit for tat,” which according to Wikipedia “posits that a person is more successful if they cooperate with another person. Implementing a tit-for-tat strategy occurs when one agent cooperates with another agent in the very first interaction and then mimics their subsequent moves. This strategy is based on the concepts of retaliation and altruism.”

In countless game theory simulations, tit for tat has proven to be the most successful strategy for long-term success. It assumes a default position of trust, only moving to retaliation if required.
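Those simulations are easy to reproduce. Here is a minimal iterated prisoner's dilemma sketch – the strategy and function names are mine, and the payoffs are the standard textbook values:

```python
def play(strategy_a, strategy_b, rounds=10):
    """Iterated prisoner's dilemma with standard payoffs
    (C/C -> 3/3, D/D -> 1/1, D vs C -> 5/0). A strategy maps the
    opponent's previous move (None on the first round) to 'C' or 'D'."""
    payoff = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
    last_a = last_b = None
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strategy_a(last_b), strategy_b(last_a)
        pa, pb = payoff[(a, b)]
        score_a += pa
        score_b += pb
        last_a, last_b = a, b
    return score_a, score_b

tit_for_tat = lambda prev: 'C' if prev is None else prev  # cooperate first, then mirror
always_defect = lambda prev: 'D'
always_cooperate = lambda prev: 'C'

# Tit for tat sustains mutual cooperation against a cooperator...
print(play(tit_for_tat, always_cooperate))  # (30, 30)
# ...and limits its losses against a pure defector to the first round.
print(play(tit_for_tat, always_defect))     # (9, 14)
```

The design choice is the point: tit for tat opens with trust, punishes betrayal exactly once per betrayal, and forgives the moment the other side cooperates again. Retaliation and altruism, in ten lines.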

Our society needs trust to function properly. In a New York Times op-ed entitled “Why We Need to Address Scam Culture,” Tressie McMillan Cottom writes,  

“Scams weaken our trust in social institutions, but their going mainstream — divorced from empathy for the victims or stigma for the perpetrators — means that we have accepted scams as institutions themselves.”


The reason that trust is more effective than scamming is that predatory practices are self-limiting. You can only be a predator if you have enough prey. In a purely Machiavellian world, trust disappears — and there are no easy marks to prey upon.

Making Time for Quadrant Two

Several years ago, I read Stephen Covey’s “The 7 Habits of Highly Effective People.” It had a lasting impact on me. Throughout my life, I have found myself relearning those lessons over and over again.

One of them was the four quadrants of time management. How we spend our time in these quadrants determines how effective we are.

Imagine a box split into four quarters. In the upper left quadrant, we’ll put a label: “Important and Urgent.” Next to it, in the upper right, we’ll put a label saying “Important But Not Urgent.” The label for the lower left is “Urgent But Not Important.” And the last quadrant — in the lower right — is labeled “Neither Important nor Urgent.”

The upper left quadrant — “Important and Urgent” — is our firefighting quadrant. It’s the stuff that is critical and can’t be put off, the emergencies in our life.

We’ll skip over quadrant two — “Important But Not Urgent” — for a moment and come back to it.

In quadrant three — “Urgent But Not Important” — are the interruptions that other people bring to us. These are the times we should say, “That sounds like a you problem, not a me problem.”

Quadrant four is where we unwind and relax, occupying our minds with nothing at all in order to give our brains and bodies a chance to recharge. Bingeing Netflix, scrolling through Facebook or playing a game on our phones all fall into this quadrant.

And finally, let’s go back to quadrant two: “Important But Not Urgent.” This is the key quadrant. It’s here where long-term planning and strategy live. This is where we can see the big picture.

The secret of effective time management is finding ways to shift time spent from all the other quadrants into quadrant two. It’s managing and delegating emergencies from quadrant one, so we spend less time fire-fighting. It’s prioritizing our time above the emergencies of others, so we minimize interruptions in quadrant three. And it’s keeping just enough time in quadrant four to minimize stress and keep from being overwhelmed.

The lesson of the four quadrants came back to me when I was listening to an interview with Dr. Sandro Galea, epidemiologist and author of “The Contagion Next Time.” Dr. Galea was talking about how our health care system responded to the COVID pandemic. The entire system was suddenly forced into quadrant one. It was in crisis mode, trying desperately to keep from crashing. Galea reminded us that we were forced into this mode, despite there being hundreds of lengthy reports from previous pandemics — notably the SARS crisis — containing thousands of suggestions that could have helped to partially mitigate the impact of COVID.

Few of those suggestions were ever implemented. Our health care system, Galea noted, tends to continually lurch back and forth within quadrant one, veering from crisis to crisis. When a crisis is over, rather than go to quadrant two and make the changes necessary to avoid similar catastrophes in the future, we put the inevitable reports on a shelf where they’re ignored until it is — once again — too late.

For me, that paralleled a theme I have talked about often in the past — how we tend to avoid grappling with complexity. Quadrant two stuff is, inevitably, complex in nature. The quadrant is jammed with what we call wicked problems. In a previous column, I described these as, “complex, dynamic problems that defy black-and-white solutions. These are questions that can’t be answered by yes or no — the answer always seems to be maybe.  There is no linear path to solve them. You just keep going in loops, hopefully getting closer to an answer but never quite arriving at one. Usually, the optimal solution to a wicked problem is ‘good enough — for now.’”

That’s quadrant two in a nutshell. Quadrant-one problems must be triaged into a sort of false clarity. You have to deal with the critical stuff first. The nuances and complexity are, by necessity, ignored. That all gets pushed to quadrant two, where we say we will deal with it “someday.”

Of course, someday never comes. We either stay in quadrant one, are hijacked into quadrant three, or collapse through sheer burn-out into quadrant four. The stuff that waits for us in quadrant two is just too daunting to even consider tackling.

This has direct implications for technology and every aspect of the online world. Our industry, because of its hyper-compressed timelines and the huge dollars at stake, seems firmly lodged in the urgency of quadrant one. Everything on our to-do list tends to be a fire we have to put out. And that’s true even if we only consider the things we intentionally plan for. When we factor in the unplanned emergencies, quadrant one is a time-sucking vortex that leaves nothing for any of the other quadrants.

But there is a seemingly infinite number of quadrant two things we should be thinking about. Take social media and privacy, for example. When an online platform has a massive data breach, that is a classic quadrant one catastrophe. It’s all hands on deck to deal with the crisis. But all the complex questions around what our privacy might look like in a data-inundated world fall into quadrant two. As such, they are things we don’t think much about. They’re important, but they’re not urgent.

Quadrant two thinking is systemic thinking, long-term and far-reaching. It allows us to build the foundations that help to mitigate crises and minimize unintended consequences.

In a world that seems to rush from fire to fire, it is this type of thinking that could save our asses.

The News Cycle, Our Attention Span and that Oscar Slap

If your social media feed is like mine, it was burning up this Monday with the slap heard around the world. Was Will Smith displaying toxic masculinity? Was “it was a joke” a sufficient defence for Chris Rock’s staggering inability to read the room? Was Smith’s acceptance speech legendary or just really, really lame?

More than a few people just sighed and chalked it up as another scandal for the beleaguered awards show. This was one post I saw from a friend on Facebook: “People smiling and applauding as if an assault never happened is probably Hollywood in a nutshell.”

Whatever your opinion, the world was fascinated by what happened. The slap trended number one on Twitter through Sunday night and Monday morning. On CNN, the top trending stories on Monday morning were all about the “slap.” You would have thought that there was nothing happening in the world that was more important than one person slapping another. Not the world teetering on the edge of a potential world war. Not a global economy that can’t seem to get itself in gear. Not a worldwide pandemic that just won’t go away and has just pushed Shanghai – a city of 30 million – back into a total lockdown.

And the spectre of an onrushing climatic disaster? Nary a peep in Monday’s news cycle.

We commonly acknowledge – when we do take the time to stop and think about it – that our news cycles have about the same attention span as a 4-year-old on Christmas morning. No matter what we have in our hands, there’s always something brighter and shinier waiting for us under the tree. We typically attribute this to the declining state of journalism. But we – the consumers of news – are the ones that continually ignore the stories that matter in favour of gossipy tidbits.

This is just the latest example of that. It is nothing more than human nature. But there is a troubling trend here that is being accelerated by the impact of social media. This is definitely something we should pay attention to.

The Confounding Nature of Complexity

Just last week, I talked about something psychologists call a locus of control. Essentially it is defined by the amount of control you feel you have over your life. In times of stress, unpredictability or upheaval, our own perceived span of control tends to narrow to the things we have confidence we can manage. Our ability to cope draws inward, essentially circling the wagons around the last vestiges of our capability to direct our own circumstances. 

I believe the same is true of our ability to focus attention. The more complex the world gets, the more we tend to focus on things we can easily wrap our minds around. It has been shown repeatedly that anxiety impairs the brain’s ability to focus on tasks. A study from Finland’s Åbo Akademi University found that anxiety eats away at our working memory, leaving us with a reduced capacity to integrate concepts and work things out. Complex, unpredictable situations naturally raise our level of anxiety, leading us to retreat to things we don’t have to work too hard to understand.

The irony here is the more we are aware of complex and threatening news stories, the more we go right past them to things like the Smith-Rock story. It’s like catnip to a brain that’s trying to retreat from the real news because we can’t cope with it.

This isn’t necessarily the fault of journalism, it’s more a limitation of our own brains. On Monday morning, CNN offered plenty of coverage dealing with the new airstrikes in Ukraine, Biden’s inflammatory remarks about Putin, Trump’s attempts to block Congress from counting votes and the restriction of LGBTQ awareness in the classrooms of Florida. But none of those stories were trending. What was trending were three stories about Rock and Smith, one about the Oscar winners and another about a 1600-pound shark. That’s what we were collectively reading.

False Familiarity

It’s not just the complexity of the news that made the Rock/Smith story so compelling. Our built-in social instincts also made it irresistible.

Evolution has equipped us with highly attuned social antennae. Humans are herders, and when you travel in a herd, your ability to survive is highly dependent on picking up signals from your fellow herders. We have highly evolved instincts to help us determine who we can trust and who we should protect ourselves from. We are quick to judge others, and even quicker to gossip about behavior that steps over those invisible boundaries we call social norms.

For generations, these instincts were essential when we had to keep tabs on the people closest to us. But with the rise of celebrity culture in the last century, we now apply those same instincts to people we only think we know. We pass judgement on the faces we see on TV and in social media. We have a voracious appetite for gossip about the super-rich and the super-famous.

Those foibles may be ours and ours alone, but they’re not helped by the fact that certain celebrities – namely one Mr. Smith – feel compelled to share way too much about themselves with the public at large. Witness his long and tear-laden acceptance speech. Even though I have only a passing interest in the comings and goings of Will and Jada, I know more about their sex lives than those of my closest friends. The social norm that restricts bedroom talk amongst our friends and family is not there with the celebrities we follow. We salivate over salacious details.

No Foul, No Harm?

That’s the one-two punch (sorry, I had to go there) that made the little Oscar ruckus such a hot news item. But what’s the harm? It’s just a momentary distraction from the never-ending shit-storm that defines our daily existence, right?

Not quite.

The more we continually take the path of least resistance in our pursuit of information, the harder it becomes for us to process the complex concepts that make up our reality. When that happens, we tend to attribute too much importance and meaning to these easily digestible nuggets of gossip. As we try to understand complex situations (which covers pretty much everything of importance in our world today), we start relying too much on cognitive shortcuts like availability bias and representativeness bias. In the first case, we apply whatever information we have at hand to every situation; in the second, we substitute stereotypes and easy labels for any real attempt to understand an individual or group.

Ironically, it’s exactly this tendency towards cognitive laziness that was skewered in one of Sunday night’s nominated features, Adam McKay’s “Don’t Look Up.”

Of course, it was ignored. As Will Smith said, sometimes, “art imitates life.”

Why Are Podcasts so Popular?

Everybody I know is listening to podcasts. According to eMarketer, the number of monthly U.S. podcast listeners will increase by over 10% this year, to a total of 117.8 million. And this growth is driven by younger consumers: apparently, more than 60% of U.S. adults ages 18 to 34 will listen to podcasts.

That squares with my anecdotal evidence. Both my daughters are podcast fans. But the popularity of podcasts declines with age. Again, according to eMarketer, less than one-fifth of adults in the U.S. over 65 listen to podcasts.

I must admit, I’m not a regular podcast listener. Nor are most of my friends. I’m not sure why. You’d think we’d be the ideal target. Many of us listen to public radio, so the format of a podcast should be a logical extension of that. But maybe it’s because we’ve already made our choice, and we’re fine with listening to old-fashioned radio.

In theory, I should love podcasts. At the beginning of my career, I was a radio copywriter. I even wrote a few radio plays in my 20s. As a creator, I am very intrigued by the format of a podcast. I’m even considering experimenting in this medium for my own content. I just don’t listen to them that often.

What’s also perplexing about the recent popularity of podcasts is that they’re nothing new. Podcasts have been around forever, at least in Internet terms.

A Brief History of Podcasting

The idea of bite-sized broadcasts goes back to the 1980s and ’90s, but it was the Internet that opened up digital delivery of audio files to the average listener. This content found a new home in 2001, when Apple introduced the iPod. For the next 10-plus years, podcasts were generally just another delivery option for existing content.

But in 2014, “This American Life” launched season one of its true-crime “Serial” podcast. Suddenly, something gelled in the medium, and the audiences started to grow. The true crime bandwagon gathered speed. Both producers and audiences found their groove; the content became more compelling, and more people started listening.

In 2013, just over 10% of the U.S. population listened to podcasts monthly. This year, podcasting will become a $1 billion industry and over 50% of Americans listen regularly.

So why did podcasting, a medium with relatively few technical bells and whistles, suddenly become so hot?

A Story Well Told

The first clue to the popularity of podcasts is that many of them (certainly the most popular ones) focus on storytelling. And we are innately connected to the power of a good story.

The one genre of podcast that has proven most popular is the true crime series. Humans have a need to resolve mysteries. These podcasts have become very good at creating a curiosity gap that itches to be closed. They hit many of our hard-wired hot buttons.

Still, there are many, many ways to tell a murder mystery. So, beyond a compelling story, what else is it about podcasts that make them so addictive?

The Beauty of Brain Bonding

When you think of how our brain interprets messages, an audio-based one seems to thread the needle between the effort of imagination and the joy of focused relaxation. It opens the door to our theater of the mind, allowing us to fill in the sensory gaps needed to bring the story alive.

As I mentioned in last week’s post, the brain works by retrieving and synthesizing memories and experiences when prompted by a stimulus. It’s a process that makes the stories a little more personal for us, a little more intimate; these are stories self-tailored for us by our own experiences and beliefs.

But there are other audio-only formats available. This clue gets us closer to understanding the popularity of podcasts, but still leaves us a bit short. For the final answer, we have to explore one more aspect of them.

An Intimate Invitation

When you google “why are podcasts popular?” you’ll often see that their appeal lies in their convenience. You can listen to them at your own pace, in your own place and on your own timeline. They are not as restrictive as a radio broadcast.

You could take that at face value, but I think there’s more than meets the ear here. Something about the portability and convenience of a podcast sets it up to be possibly the most intimate of media.

When we listen to a podcast, we do so in an environment of our own choosing. Perhaps it’s in our vehicle during our daily commute. Maybe it’s just sitting in our favorite recliner by a fireplace.

Whatever the surroundings, we can make sure it’s a safe space that allows us to connect with the content at a very intimate level. We generally listen to them with our earbuds in, so the juicy details don’t leak out to the world at large.

And the best podcast producers have realized this. This is not a broadcast, it’s a one-sided conversation with your smartest friend talking about the most interesting thing they know.

Whatever lies behind their popularity, it’s a safe bet that half the people you know listen to podcasts on a regular basis.

I’ll have to give them another try.

Whatever Happened to the Google of 2001?

Having lived through it, I can say that the decade from 2000 to 2010 was an exceptional time in corporate history. I was reminded of this as I was reading media critic and journalist Ken Auletta’s book, “Googled: The End of the World as We Know It.” Auletta, along with many others, sensed a seismic disruption in the way media worked. A ton of books came out on this topic in the same time frame, and Google was the company most often singled out as the cause of the disruption.

Auletta’s book was published in 2009, near the end of this decade, and it’s interesting reading it in light of the decade plus that has passed since. There was a sort of breathless urgency in the telling of the story, a sense that this was ground zero of a shift that would be historic in scope. The very choice of Auletta’s title reinforces this: “The End of the World as We Know It.”

So, with 10 years plus of hindsight, was he right? Did the world we knew end?

Well, yes. And Google certainly contributed to this. But it probably didn’t change in quite the way Auletta hinted at. If anything, Facebook ended up having a more dramatic impact on how we think of media, but not in a good way.

At the time, we all watched Google take its first steps as a corporation with a mixture of incredulous awe and not a small amount of schadenfreude. Larry Page and Sergey Brin were determined to do it their own way.

We in the search marketing industry had front row seats to this. We attended social mixers on the Google campus. We rubbed elbows at industry events with Page, Brin, Eric Schmidt, Marissa Mayer, Matt Cutts, Tim Armstrong, Craig Silverstein, Sheryl Sandberg and many others profiled in the book. What they were trying to do seemed a little insane, but we all hoped it would work out.

We wanted a disruptive and successful company to not be evil. We welcomed its determination — even if it seemed naïve — to completely upend the worlds of media and advertising. We even admired Google’s total disregard for marketing as a corporate priority.

But there was no small amount of hubris at the Googleplex — and for this reason, we also hedged our hopeful bets with just enough cynicism to be able to say “we told you so” if it all came crashing down.

In that decade, everything seemed so audacious and brashly hopeful. It seemed like ideological optimism might — just might — rewrite the corporate rulebook. If a revolution did take place, we wanted to be close enough to golf clap the revolutionaries onward without getting directly in the line of fire ourselves.

Of course, we know now that what took place wasn’t nearly that dramatic. Google became a business: a very successful business with shareholders, a grown-up CEO and a board of directors, but still a business not all that dissimilar to other Fortune 100 examples. Yes, Google did change the world, but the world also changed Google. What we got was more evolution than revolution.

The optimism of 2000 to 2010 would be ground down in the next 10 years by the same forces that have been driving corporate America for the past 200 years: the need to expand markets, maximize profits and keep shareholders happy. The brash ideologies of founders would eventually morph to accommodate ad-supported revenue models.

As we now know, the world was changed by the introduction of ways to make advertising even more pervasively influential and potentially harmful. The technological promise of 20 years ago has been subverted to screw with the very fabric of our culture.

I didn’t see that coming back in 2001. I probably should have known better.