Using Science for Selling: Sometimes Yes, Sometimes No

A recent study out of Ohio State University seems like one of those that the world really didn’t need. The researchers were exploring whether introducing science into the marketing message would help sell chocolate chip cookies.

And for those of us who make a living in marketing, this is one of those findings that might make us say, “Duh, you needed research to tell us that? Of course you don’t use science to sell chocolate chip cookies!”

But bear with me, because if we keep asking why enough, we can come up with some answers that might surprise us.

So, what did the researchers learn? I quote,

“Specifically, since hedonic attributes are associated with warmth, the coldness associated with science is conceptually disfluent with the anticipated warmth of hedonic products and attributes, reducing product valuation.”

Ohio State Study

In other words – simpler ones, and fewer of them – science doesn’t help sell cookies. And it’s because our brains think differently about some things than others.

For example, a study published in the journal Computers in Human Behavior (Casado-Aranda, Sanchez-Fernandez and Garcia) found that when we’re exposed to “hedonic” ads – ads that appeal to pleasurable sensations – the parts of our brain that retrieve memories kick in. This isn’t true when we see utilitarian ads. Predictably, we approach those ads as a problem to be solved and engage the parts of our brain that control working memory and the ability to focus our attention.

Essentially, these two advertising approaches take two different paths in our awareness: one takes the “thinking” path and one takes the “feeling” path. Or, as Nobel laureate Daniel Kahneman would say, one takes the “thinking slow” path and one takes the “thinking fast” path.

Yet another study begins to show why this may be so. Let’s go back to chocolate chip cookies for a moment. When you smell a fresh-baked cookie, it’s not just the sensory appeal “in the moment” that makes the cookie irresistible. It’s also the memories it brings back for you. We know that smell is a particularly effective way to trigger this connection with the past. Certain smells – like that of cookies just out of the oven – can be the shortest path between today and some childhood memory. These are called associative memories. And they’re a big part of “feeling” something rather than just “thinking” about it.

At the University of California, Irvine, neuroscientists discovered a very specific type of neuron in our memory centers that oversees the creation of new associative memories. They’re called “fan cells,” and it seems these neurons are responsible for creating the link between new input and those emotion-inducing memories we may have tucked away from our past. And – critically – it seems that dopamine is the key to linking the two. When our brain “smells” a potential reward, these fan cells kick into gear and bathe the brain in the “warm fuzzies.” Lead researcher Kei Igarashi said,

“We never expected that dopamine is involved in the memory circuit. However, when the evidence accumulated, it gradually became clear that dopamine is involved. These experiments were like a detective story for us, and we are excited about the results.”

Kei Igarashi – University of California, Irvine

Not surprisingly – as our first study found – introducing science into this whole process can be a bit of a buzzkill. It would be like inviting Bill Nye the Science Guy to teach you about quantum physics during your Saturday morning cuddle time.

All of this probably seems overwhelmingly academic to you. Selling something like chocolate chip cookies shouldn’t take three different scientific studies and several people strapped inside an fMRI machine to explain. We should be able to rely on our guts, and our guts know that science has no place in a campaign built on an emotional appeal.

But there is a point to all this. Different marketing approaches are handled by different parts of the brain, and knowing that allows us to reinforce our marketing intuition with a better understanding of why we humans do the things we do.

Utilitarian appeals activate the parts of the brain that are front and center: the data-crunching, evaluating and rational parts of our cognitive machinery.

Hedonic appeals probe the subterranean depths of our brains, unpacking memories and prodding emotions below the threshold of conscious awareness. We respond viscerally – which literally means “from our guts.”

If we’re talking about selling chocolate chip cookies, we have moved about as far towards the hedonic end of the scale as we can. At the other end we would find something like motor oil – where scientific messaging such as “advanced formulation” or “proven engine protection” would be more persuasive. But almost all other products fall somewhere in between. They are a mix of hedonic and utilitarian factors. And we haven’t even factored in the most significant of all consumer considerations – risk and how to avoid it. Think how complex things would get in our brains if we were buying a new car!

Buying chocolate chip cookies might seem like a no-brainer – because – well – it almost is. Beyond dosing our neural pathways with dopamine, our brains barely kick in when considering whether to grab a bag of Chips Ahoy on our next trip to the store. In fact, the last thing you want your brain to do when you’re craving chewy chocolate is kick in. Then you would start considering things like caloric intake and how you should be cutting down on processed sugar. Chocolate chip cookies might be a no-brainer, but almost nothing else in the consumer world is that simple.

Marketing is relying more and more on data. But data is typically restricted to answering “who”, “what”, “when” and “where” questions. It’s studies like the ones I shared here that start to pick apart the “why” of marketing.

And when things get complex, asking “why” is exactly what we need to do.

Sensationalizing Scam Culture

We seem to be fascinated by bad behavior. Our popular culture is all agog with grifters and assholes. As TV Blog’s Adam Buckman wrote in March: “Two brand-new limited series premiering this week appear to be part of a growing trend in which some of recent history’s most notorious innovators and disruptors are getting the scripted-TV treatment.”

The two series Buckman was talking about were “Super Pumped: The Battle for Uber,” about Uber CEO Travis Kalanick, and “The Dropout,” about Theranos founder Elizabeth Holmes.

But those are just two examples from a bumper crop of shows about bad behavior. My streaming services are stuffed with stories of scammers. In addition to the two series Buckman mentioned, I just finished Shonda Rhimes’ Netflix series “Inventing Anna,” about Anna Sorokin, who posed as an heiress named Anna Delvey.

All these treatments tread a tight wire of moral judgement, where the examples are presented as antisocial, but in a wink-and-a-nod kind of way, where we not-so-secretly admire these behaviors. Much as the actions are harmful to the well-being of the collective “we,” they do appeal to the selfishness and ambition of “me.”

Most of the examples given are rags-to-riches-to-retribution stories (Holmes was an exception, with her upper-middle-class background). The sky-high ambitions of Kalanick, Holmes and Sorokin were all eventually brought back down to earth. Sorokin and Holmes both ended up in prison, and Kalanick was ousted from the company he founded.

But with the subtlest of twists, these stories needn’t have ended this way. They could have been the story of almost any corporate America hustler who triumphed. With a little more substance and a little less scam, you could swap Elizabeth Holmes for Steve Jobs. They even dressed the same.

Obviously, scamming seems to sell. These people fascinate us. Part of the appeal is no doubt due to a class-conflict narrative: the scrappy hustler climbing the social ranks by whatever means possible. We love to watch “one of us” pull the wool over the eyes of the social elite.

In the case of Anna Sorokin, Laura Craik dissects our fascination in a piece published in the UK’s Evening Standard:

“The reason people are so obsessed with Sorokin is simple: she had the balls to pull off on a grand scale what so many people try and fail to pull off on a small one. To use a phrase popular on social media, Sorokin succeeded in living her best life — right down to the clothes she wore in court, chosen by a stylist. Like Jay Gatsby, she was a deeply flawed embodiment of The American Dream: a person from humble beginnings who rose to achieve wealth and social status. Only her wealth was borrowed and her social status was conferred via a chimera of untruths.”

Laura Craik – UK Evening Standard

This type of behavior is nothing new. It’s always been a part of us. In 1513, a Florentine bureaucrat named Niccolo Machiavelli gave it a name — actually, his name. In writing “The Prince,” he condoned bad behavior as long as the end goal was to elevate oneself. In a Machiavellian world, it’s always open season on suckers: “One who deceives will always find those who allow themselves to be deceived.”

For the past five centuries, Machiavellianism has been synonymous with evil. It is a recognized character flaw, described as “a personality trait that denotes cunningness, the ability to be manipulative, and a drive to use whatever means necessary to gain power. Machiavellianism is one of the traits that forms the Dark Triad, along with narcissism and psychopathy.”

Now, however, that stigma seems to be disappearing. In a culture obsessed with success, Machiavellianism becomes a justifiable means to an end, so much so that we’ve given this culture its own hashtag: #scamculture: “A scam culture is one in which scamming has not only lost its stigma but is also valorized. We rebrand scamming as ‘hustle,’ or the willingness to commodify all social ties, and this is because the ‘legitimate’ economy and the political system simply do not work for millions of Americans.”

It’s a culture that’s very much at home in Silicon Valley. The tech world is steeped in Machiavellianism. Its tenets are accepted — even encouraged — business practices in the Valley. “Fake it til you make it” is tech’s modus operandi. The example of Niccolo Machiavelli has gone from being a cautionary tale to a how-to manual.

But these predatory practices come at a price. Doing business this way destroys trust. And trust is still, by far, the best strategy for our mutual benefit. In behavioral economics, there’s something called “tit for tat,” which according to Wikipedia “posits that a person is more successful if they cooperate with another person. Implementing a tit-for-tat strategy occurs when one agent cooperates with another agent in the very first interaction and then mimics their subsequent moves. This strategy is based on the concepts of retaliation and altruism.”

In countless game theory simulations, tit for tat has proven to be the most successful strategy for long-term success. It assumes a default position of trust, only moving to retaliation if required.
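To make that concrete, here is a minimal sketch (in Python, not from the column) of an iterated prisoner’s dilemma pitting tit for tat against an always-defect “scammer.” It implements the Wikipedia description quoted above (cooperate on the first move, then mimic the opponent’s previous move), and the payoff values are standard textbook numbers chosen purely for illustration.

```python
# Payoffs per round (illustrative textbook values):
# both cooperate = 3 each, both defect = 1 each,
# lone defector = 5, exploited cooperator = 0.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def tit_for_tat(my_history, their_history):
    # Trust by default; retaliate only after being defected against.
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    # The scammer: defect every round, no matter what.
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (600, 600): mutual trust prospers
print(play(tit_for_tat, always_defect))  # (199, 204): one windfall, then stalemate
```

Two trusting players end up far ahead of anything the predator achieves: defection pays off exactly once, after which cooperation, and the profit that comes with it, is gone for good.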

Our society needs trust to function properly. In a New York Times op-ed entitled “Why We Need to Address Scam Culture,” Tressie McMillan Cottom writes,  

“Scams weaken our trust in social institutions, but their going mainstream — divorced from empathy for the victims or stigma for the perpetrators — means that we have accepted scams as institutions themselves.”

Tressie McMillan Cottom – NY Times

The reason that trust is more effective than scamming is that predatory practices are self-limiting. You can only be a predator if you have enough prey. In a purely Machiavellian world, trust disappears — and there are no easy marks to prey upon.

I am Generation Jones

I was born in 1961. I always thought that technically made me a baby boomer. But I recently discovered that I am, in fact, part of Generation Jones.

If you haven’t heard of that term (as I had not, until I read a post on it a few weeks ago), Generation Jones refers to people born from 1955 to 1964 — a cusp generation squeezed between the massive boomer bloc and Gen X.

That squares with me. I always somehow knew I wasn’t really a boomer, but I also knew I wasn’t Gen X. And now I know why. I, along with Barack Obama and Wayne Gretzky, was squarely in the middle of Generation Jones.

I always felt the long shadow of World War II defined baby boomers, but it didn’t define me. My childhood felt eons removed from the war. Most of the more traumatic wounds had healed by the time I was riding my trike through the relatively quiet suburban streets of Calgary, Alberta.

I didn’t appreciate the OK Boomer memes, not because I was the butt of them, but more because I didn’t really feel they applied to me. They didn’t hit me where I live. It was like I was winged by a shot meant for someone else.

OK Boomer digs didn’t really apply to my friends and contemporaries either, all of whom are also part of Generation Jones. For the most part, we’re trying to do our best dealing with climate change, racial inequality, more fluid gender identification and political polarization. We get it. Is there entitlement? Yeah, more than a little. But we’re trying.

And I also wasn’t part of Gen X. I wasn’t a latchkey kid. My parents didn’t obsess over the almighty dollar, so I didn’t feel a need to push back against it. My friends and I worked a zillion hours, because we were — admittedly — still materialistic. But it was a different kind of materialism, one edged with more than a little anxiety.

I hit the workforce in the early ‘80s, right in the middle of a worldwide recession. Generation Jones certainly wanted to get ahead, but we also wanted to keep our jobs, because if we lost them, there was no guarantee we’d find another.

When boomers were entering the workforce, through the 1970s, Canada’s unemployment rate hovered in the 6% to 8% range (U.S. numbers varied but roughly followed the same pattern). In 1982, the year I tried to start my career, it suddenly shot up to 13%. Through the ‘80s, as Gen X started to get their first jobs, it declined again to the 8% range. Generation Jones started looking for work just when a job was historically the hardest to find.

It wasn’t just the jobless rate. Interest rates also skyrocketed to historic levels in the early ‘80s. Again using data from the Bank of Canada: its benchmark rate peaked at an astronomical 20.78% the same month I turned 20, in 1981. Not only couldn’t we find a job, we couldn’t have afforded credit even if we could have gotten one.

So yes, we were trying to keep up with the Joneses — this is where the name for our generation comes from, coined by social commentator Jonathan Pontell — but it wasn’t all about getting ahead. A lot of it was just trying to keep our heads above water.

We were a generation moving into adulthood at the beginning of HIV/AIDS, Reaganomics, globalization and the mass deindustrialization of North America. All the social revolutions of the ‘60s and ‘70s had crystallized to the point where they now had real-world consequences. We were figuring out a world that seemed to be pivoting sharply.

As I said, I always felt that I was somewhat accidentally lodged between baby boomer and Gen X, wading my way through the transition.

Part of that transition involved the explosion of technology that became much more personal at the beginning of the 1980s.  To paraphrase Shakespeare in “Twelfth Night”: Some are born with technology, some achieve technology, and some have technology thrust upon them.

Generation Jones is in the last group.

True boomers could make the decision to ignore technology and drift through life adopting only what they absolutely had to. Gen X grew up with the rudiments of technology, making it more familiar territory for them. The leading edge of that generation started entering the workforce in the mid-‘80s. Computers were becoming more common. The Motorola “brick” cellphone had debuted. Technology was becoming ubiquitous – impossible to ignore.

But we were caught in between. We had to make a decision: Do we embrace technology, or do we fight against it? A lot of that decision depended on what we wanted to do for a living. Through the ‘80s, one by one, industries were being transformed by computers and digitalization.

Often, we of Generation Jones got into our first jobs working on the technology of yesterday — and very early in our careers, we were forced to adopt the technologies of tomorrow.

I started as a radio copywriter in 1982, and my first ads were written on an IBM Selectric and produced by cutting and patching two-track audio tape together on a reel-to-reel machine with razor blades and splicing tape. Just a few years later, I was writing on an Apple IIe, and ads were starting to be recorded digitally. That shift in technology happened just as our generation was beginning our careers. Some of us went willingly, some of us went kicking and screaming.

This straddling of two very different worlds seems to personify my generation. I think, with the hindsight of history, we will identify the early ‘80s as a period of significant transition in almost every aspect of our culture. Obviously, all generations had to navigate that transition, but for Generation Jones, it just happened to coincide with what is typically the biggest transition for anyone in any generation: the passing from childhood to adulthood. It is during this time that we take the experiences of growing up and crystallize them into the foundations of who we will be for the rest of our lives.

For Generation Jones, those foundations had to be built on the fly, as the ground kept moving beneath our feet.

Making Time for Quadrant Two

Several years ago, I read Stephen Covey’s “The 7 Habits of Highly Effective People.” It had a lasting impact on me. Through my life, I have found myself relearning those lessons over and over again.

One of them was the four quadrants of time management. How we spend our time in these quadrants determines how effective we are.

Imagine a box split into four quarters. In the upper left box, we’ll put a label: “Important and Urgent.” Next to it, in the upper right, we’ll put a label saying “Important But Not Urgent.” The label for the lower left is “Urgent But Not Important.” And the last quadrant — in the lower right — is labeled “Neither Important nor Urgent.”

The upper left quadrant — “Important and Urgent” — is our firefighting quadrant. It’s the stuff that is critical and can’t be put off, the emergencies in our life.

We’ll skip over quadrant two — “Important But Not Urgent” — for a moment and come back to it.

In quadrant three — “Urgent But Not Important” — are the interruptions that other people bring to us. These are the times we should say, “That sounds like a you problem, not a me problem.”

Quadrant four is where we unwind and relax, occupying our minds with nothing at all in order to give our brains and body a chance to recharge. Bingeing Netflix, scrolling through Facebook or playing a game on our phones all fall into this quadrant.

And finally, let’s go back to quadrant two: “Important But Not Urgent.” This is the key quadrant. It’s here where long-term planning and strategy live. This is where we can see the big picture.

The secret of effective time management is finding ways to shift time spent from all the other quadrants into quadrant two. It’s managing and delegating emergencies from quadrant one, so we spend less time firefighting. It’s prioritizing our time above the emergencies of others, so we minimize interruptions in quadrant three. And it’s keeping just enough time in quadrant four to minimize stress and keep from being overwhelmed.

The lesson of the four quadrants came back to me when I was listening to an interview with Dr. Sandro Galea, epidemiologist and author of “The Contagion Next Time.” Dr. Galea was talking about how our health care system responded to the COVID pandemic. The entire system was suddenly forced into quadrant one. It was in crisis mode, trying desperately to keep from crashing. Galea reminded us that we were forced into this mode despite there being hundreds of lengthy reports from previous pandemics — notably the SARS crisis — containing thousands of suggestions that could have helped to partially mitigate the impact of COVID.

Few of those suggestions were ever implemented. Our health care system, Galea noted, tends to continually lurch back and forth within quadrant one, veering from crisis to crisis. When a crisis is over, rather than go to quadrant two and make the changes necessary to avoid similar catastrophes in the future, we put the inevitable reports on a shelf where they’re ignored until it is — once again — too late.

For me, that paralleled a theme I have talked about often in the past — how we tend to avoid grappling with complexity. Quadrant two stuff is, inevitably, complex in nature. The quadrant is jammed with what we call wicked problems. In a previous column, I described these as, “complex, dynamic problems that defy black-and-white solutions. These are questions that can’t be answered by yes or no — the answer always seems to be maybe.  There is no linear path to solve them. You just keep going in loops, hopefully getting closer to an answer but never quite arriving at one. Usually, the optimal solution to a wicked problem is ‘good enough — for now.’”

That’s quadrant two in a nutshell. Quadrant-one problems must be triaged into a sort of false clarity. You have to deal with the critical stuff first. The nuances and complexity are, by necessity, ignored. That all gets pushed to quadrant two, where we say we will deal with it “someday.”

Of course, someday never comes. We either stay in quadrant one, are hijacked into quadrant three, or collapse through sheer burnout into quadrant four. The stuff that waits for us in quadrant two is just too daunting to even consider tackling.

This has direct implications for technology and every aspect of the online world. Our industry, because of its hyper-compressed timelines and the huge dollars at stake, seems firmly lodged in the urgency of quadrant one. Everything on our to-do list tends to be a fire we have to put out. And that’s true even if we only consider the things we intentionally plan for. When we factor in the unplanned emergencies, quadrant one is a time-sucking vortex that leaves nothing for any of the other quadrants.

But there is a seemingly infinite number of quadrant two things we should be thinking about. Take social media and privacy, for example. When an online platform has a massive data breach, that is a classic quadrant one catastrophe. It’s all hands on deck to deal with the crisis. But all the complex questions around what our privacy might look like in a data-inundated world fall into quadrant two. As such, they are things we don’t think much about. They’re important, but not urgent.

Quadrant two thinking is systemic thinking, long-term and far-reaching. It allows us to build the foundations that help mitigate crises and minimize unintended consequences.

In a world that seems to rush from fire to fire, it is this type of thinking that could save our asses.

The News Cycle, Our Attention Span and That Oscar Slap

If your social media feed is like mine, it was burning up this Monday with the slap heard around the world. Was Will Smith displaying toxic masculinity? Was “it was a joke” sufficient defence for Chris Rock’s staggering lack of ability to read the room? Was Smith’s acceptance speech legendary or just really, really lame?

More than a few people just sighed and chalked it up as another scandal for the beleaguered awards show. This was one post I saw from a friend on Facebook: “People smiling and applauding as if an assault never happened is probably Hollywood in a nutshell.”

Whatever your opinion, the world was fascinated by what happened. The slap trended number one on Twitter through Sunday night and Monday morning. On CNN, the top trending stories on Monday morning were all about the “slap.” You would have thought that there was nothing happening in the world more important than one person slapping another. Not the world teetering on the edge of a potential world war. Not a global economy that can’t seem to get itself in gear. Not a worldwide pandemic that just won’t go away and has just pushed Shanghai – a city of 30 million – back into a total lockdown.

And the spectre of an onrushing climate disaster? Nary a peep in Monday’s news cycle.

We commonly acknowledge – when we do take the time to stop and think about it – that our news cycles have about the same attention span as a 4-year-old on Christmas morning. No matter what we have in our hands, there’s always something brighter and shinier waiting for us under the tree. We typically attribute this to the declining state of journalism. But we – the consumers of news – are the ones who continually ignore the stories that matter in favour of gossipy tidbits.

This is just the latest example of that. It is nothing more than human nature. But there is a troubling trend here that is being accelerated by the impact of social media. This is definitely something we should pay attention to.

The Confounding Nature of Complexity

Just last week, I talked about something psychologists call a locus of control. Essentially, it is defined by the amount of control you feel you have over your life. In times of stress, unpredictability or upheaval, our own perceived span of control tends to narrow to the things we have confidence we can manage. Our ability to cope draws inward, essentially circling the wagons around the last vestiges of our capability to direct our own circumstances.

I believe the same is true of our ability to focus attention. The more complex the world gets, the more we tend to focus on things we can easily wrap our minds around. It has been shown repeatedly that anxiety impairs the brain’s ability to focus. A study from Finland’s Åbo Akademi University showed that anxiety eats away at our working memory, leaving us with a reduced capacity to integrate concepts and work things out. Complex, unpredictable situations naturally raise our level of anxiety, leading us to retreat to things we don’t have to work too hard to understand.

The irony here is that the more we are aware of complex and threatening news stories, the more we go right past them to things like the Smith-Rock story. It’s like catnip to a brain that’s trying to retreat from the real news it can’t cope with.

This isn’t necessarily the fault of journalism, it’s more a limitation of our own brains. On Monday morning, CNN offered plenty of coverage dealing with the new airstrikes in Ukraine, Biden’s inflammatory remarks about Putin, Trump’s attempts to block Congress from counting votes and the restriction of LGBTQ awareness in the classrooms of Florida. But none of those stories were trending. What was trending were three stories about Rock and Smith, one about the Oscar winners and another about a 1600-pound shark. That’s what we were collectively reading.

False Familiarity

It’s not just the complexity of the news that made the Rock/Smith story so compelling. Our built-in social instincts also made it irresistible.

Evolution has equipped us with highly attuned social antennae. Humans travel in herds, and when you travel in a herd, your ability to survive depends heavily on picking up signals from your fellow herders. We have highly evolved instincts to help us determine whom we can trust and whom we should protect ourselves from. We are quick to judge others, and even quicker to gossip about behavior that steps over those invisible boundaries we call social norms.

For generations, these instincts were essential when we had to keep tabs on the people closest to us. But with the rise of celebrity culture in the last century, we now apply those same instincts to people we only think we know. We pass judgement on the faces we see on TV and in social media. We have a voracious appetite for gossip about the super-rich and the super-famous.

Those foibles may be ours and ours alone, but they’re not helped by the fact that certain celebrities – namely one Mr. Smith – feel compelled to share way too much about themselves with the public at large. Witness his long and tear-laden acceptance speech. Even though I have only a passing interest in the comings and goings of Will and Jada, I know more about their sex lives than those of my closest friends. The social norm that restricts bedroom talk amongst our friends and family is not there with the celebrities we follow. We salivate over salacious details.

No Foul, No Harm?

That’s the one-two punch (sorry, I had to go there) that made the little Oscar ruckus such a hot news item. But what’s the harm? It’s just a momentary distraction from the never-ending shit-storm that defines our daily existence, right?

Not quite.

The more we continually take the path of least resistance in our pursuit of information, the harder it becomes for us to process the complex concepts that make up our reality. When that happens, we tend to attribute too much importance and meaning to these easily digestible nuggets of gossip. As we try to understand complex situations (which covers pretty much everything of importance in our world today), we start relying too much on cognitive shortcuts like availability bias and representativeness bias. In the first case, we apply whatever information we have at hand to every situation; in the second, we resort to substituting stereotypes and easy labels in place of trying to understand the reality of an individual or group.

Ironically, it’s exactly this tendency towards cognitive laziness that was skewered in one of Sunday night’s nominated features, Adam McKay’s “Don’t Look Up.”

Of course, it was ignored. As Will Smith said, sometimes, “art imitates life.”

Pursuing a Plastic Perfection

“Within every dystopia, there’s a little utopia”

— novelist Margaret Atwood

We’re a little obsessed with perfection. For myself, this has taken the form of a lifelong crush on Mary Poppins (Julie Andrews from the 1964 movie), who is “practically perfect in every way.”

We’ve been seeking perfection for some time now. The idea of creating Utopia, a place where everything is perfect, has been with us since the Garden of Eden. As humans have trodden down our timeline, we have been desperately seeking mythical Utopias, then religious ones, which then led to ideological ones.

Sometime at the beginning of the last century, we started turning to technology and science for perfection. Then, in the middle of the 20th century, we abruptly swung the other way, veering towards Dystopia while fearing that technology would take us to the dark side, a la George Orwell’s “1984” and Aldous Huxley’s “Brave New World.”

Lately, other than futurist Ray Kurzweil and the starry-eyed engineers of Silicon Valley, I think it’s fair to say that most of us have accepted that technology is probably a mixed bag at best: some good and some bad. Hopefully, when the intended consequences are tallied with the unintended ones, we net out a little to the positive. But we can all agree that it’s a long way from perfection.

This quest for perfection is taking some bizarre twists. Ultimately, it comes down to what we feel we can control, focusing our lives on the thinnest of experiences: that handful of seconds when someone pays attention to our social media posts.

It’s a common psychological reaction: the more we feel that our fate is beyond our control, the more we obsess about those things we feel we can control. And on social media, if we can’t control our world, our country, our town or even our own lives, perhaps our locus of control becomes narrowed to the point where the only thing left is our own appearance.

This effect is exacerbated by our cultural obsession with physical attractiveness. Beauty may only be skin deep, but in our world, it seems to count for everything that matters. Especially on Snapchat and Instagram.

And where there’s a need, there is a technological way. Filters and retouching apps that offer digitally altered perfection have proliferated. One is Facetune 2, which takes your selfie and adjusts lighting, removes blemishes, whitens teeth and nudges you closer and closer to perfection.

In one blog post, the Facetune team, inspired by Paris Hilton, encourages you to start “sliving.” Not sure what the hell “sliving” is? Apparently, it’s a combination of “slaying it” and “living your best life.” It’s an updated version of “that’s hot” for a new audience.

Of course, it doesn’t hurt if you happen to look like Ms. Hilton or Kim Kardashian. The post assures us that it’s not all about appearance. Apparently, “owning it” and “being kind to yourself” are also among the steps to better “sliving.” But as you read down the post, it does ultimately come back to how you look, reinforced with this pearl of wisdom: “a true sliv’ is also going to look their absolute best when it counts.”

And if that sounds about as deep as Saran Wrap, what do you expect when you turn to Paris Hilton for your philosophy of life? Plato she’s not.

Other social filter apps go even farther, essentially altering your picture until it’s no longer recognizable. Bulges are gone, to be replaced by chiseled torsos and optimally rounded butts. Cheeks are digitally sucked in and noses are planed to perfection. Eyes sparkle and teeth gleam. The end product? Sure, it looks amazing. It’s just not you anymore.

With all this pressure to have a perfect appearance, it’s little wonder that it’s royally messing with our heads (what’s inside the head, not the outside). Hence the new disorder of Snapchat dysmorphia — when people, many of them young girls, book a consultation with a plastic surgeon wanting to look exactly like their filtered Snapchat selfies. I wish it were harder to believe.

According to one academic article, one in 50 Americans suffers from body dysmorphic disorder, where sufferers

“are preoccupied with at least one nonexistent or slight defect in physical appearance. This can lead them to think about the defect for at least one hour a day, therefore impacting their social, occupational, and other levels of functioning. The individual also should have repetitive and compulsive behaviors due to concerns arising from their appearances. This includes mirror checking and reassurance seeking among others.”

There’s nothing wrong with wanting perfection. As the old saying goes, it might be the enemy of good, but it can be a catalyst for better. We just have to go on knowing that perfection is never going to be attainable.

But social media is selling us a bogus bill of goods: the idea that perfect is possible and that everyone but us has figured it out.

The Canary in the Casino

It may not seem like it if you’ve watched the news lately, but there are signs we’re balanced on the edge of a gigantic party. We’re all ready to treat ourselves with a little hedonistic indulging.

As I mentioned in a previous column (rerun last week), physician, epidemiologist and sociologist Nicholas Christakis predicted this behavior, a sort of global “letting loose,” but didn’t expect it to start until sometime in 2024:

“What typically happens is people get less religious. They will relentlessly seek out social interactions in nightclubs and restaurants and sporting events and political rallies. There’ll be some sexual licentiousness. People will start spending their money after having saved it. There’ll be joie de vivre and a kind of risk-taking, a kind of efflorescence of the arts, I think.”

So, there is light at the end of the pandemic tunnel — but, according to a report just out from the American Gaming Association, some of us can’t wait a couple of years. First out of the gate were gamblers. Well before we started emerging from the pandemic, they were already rolling the dice and starting the party.

According to the report from the AGA, U.S. commercial gaming revenue hit a record $53 billion in 2021. That was more than 21% higher than the previous record, set in 2019, and a huge rebound of 77% from 2020 numbers, when COVID forced casinos to shut down for months at a time.

You might think online gaming accounts for the jump, but you’d be wrong. In-casino gambling underpins this huge spike, accounting for $45.6 billion of the $53 billion total. People were saying to hell with health mandates and streaming into casinos across the country, with most of the top markets seeing significant gains over pre-COVID 2019.
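As a quick back-of-the-envelope check (a sketch in Python; only the 2021 figures, the $53 billion total and the $45.6 billion in-casino share, come from the report as cited above, while the 2019 and 2020 numbers are implied by the stated percentages):

```python
# 2021 figures as cited from the AGA report.
total_2021 = 53.0        # $ billions, record year
in_casino_2021 = 45.6    # $ billions from brick-and-mortar casinos

# Working backwards from the percentages quoted above.
implied_2019 = total_2021 / 1.21   # 21% above the old record -> ~$43.8B
implied_2020 = total_2021 / 1.77   # a 77% rebound -> ~$29.9B

# How little of the total online and other non-casino gaming accounts for.
online_share = 1 - in_casino_2021 / total_2021   # ~14%

print(f"Implied 2019 record: ${implied_2019:.1f}B")
print(f"Implied 2020 revenue: ${implied_2020:.1f}B")
print(f"Non-casino share of 2021: {online_share:.0%}")
```

In other words, roughly 86% of a record year came from people physically walking onto casino floors.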

While some of us might not be ready to ditch the masks and belly up to the bar, I suspect these gamblers are an early indicator of things to come. Call them a canary in a coal mine, if you will.

Because I can’t resist interesting historical tidbits, I thought I’d share how canaries ended up in coal mines in the first place. Early in the last century, canaries were used as an early warning system for poison gas in England. John Scott Haldane, who was researching the effects of carbon monoxide on humans, suggested using canaries as a “sentinel species”: an animal more sensitive to the impact of poisonous gases. They were kept in cages throughout the mines — and if a canary died, the miners were warned to evacuate.

But why canaries?

Canaries, like most birds, need tremendous amounts of oxygen to fly and to avoid altitude sickness. They actually take in oxygen twice on each breath, once while inhaling and again while exhaling. This, combined with their relatively small size, makes them hyper-sensitive to the impact of a poisonous gas. Also, canaries were easy to come by in England and convenient to transport. So they were recruited to help keep humans alive.

This makes them analogous to gamblers in the following way: Gamblers, by their nature, are built to be more willing to take some risk in search of a reward. You could say they are hyper-sensitive to the rush that comes from rewarding themselves. As such, they are the early adopters in the onrushing desire to put bad news behind them and let loose with a little hedonistic hell-raising. They are not atypical; they’re just ahead of the curve in this one respect.

Sooner or later, the rest of us will follow. Look for similar huge rebounds in the travel and hospitality sectors, entertainment, events and other industries focused on providing pleasure. The world will become one giant spring break party.

Which is perhaps only fitting, coming after a two-year-long winter of our discontent.

Frustrated? Blame Canada.

Canada is looking a little strange these days. Our border crossings are clogged with trucks going nowhere. Main Street Canada is lined with people waving their fists and yelling things. You see a lot of Canadian flags and placards with various interpretations of what freedom means. Parliament Hill in Ottawa is surrounded by pissed-off people and honking horns. And, most un-Canadian of all, we’re making the nightly news around the globe. We’ve traded in polite for protest.

Now, I have nothing against protests. They’re an essential part of democracy. I completely agree with William Faulkner when he said,

“Never be afraid to raise your voice for honesty and truth and compassion against injustice and lying and greed. If people all over the world…would do this, it would change the earth.”

But it all depends on what you’re protesting against. In this case, that’s a difficult one to nail down. I can understand frustration with health mandates. In my opinion, protesting against that would be somewhat misguided, but at least I could sympathize. I’m tired of this whole COVID thing, too.

But this protest likes to talk a lot about Canada and a perceived loss of freedom. In fact, organizers named their nationwide protest the Freedom Rally. As someone who believes I live in a country that offers more freedom than almost anywhere on the planet, that confuses me.

I’m not the only one. Chris Pengilly, a retired physician from Vancouver Island, is also at a loss to understand what is happening to his country, as he noted on the site Writer’s Bloc.  He writes:

“I think the zenith of my despondency came in a recent news broadcast with a Canadian woman vocalizing, and with a placard, saying that she was going to ‘free Canada.’

Free Canada? Free Canada from what?

Any Canadian can start life by being supervised by a midwife or a doctor from conception to delivery – and this is all for free. Nobody needs fear of starting a family because of lack of money.

The children of such a union are educated for free from kindergarten to Grade 12. Canadians are at liberty to choose a private education or home schooling. Excellent public health and immunization is provided at no cost to the parents.

University or college does carry a fee, but is still partially subsidized by the governments.”

Pengilly goes on to list the many other benefits of being Canadian. Yet, apparently, all that is not enough. According to the protestors, there is something deeply flawed and “unfree” about our country.

The ironic thing about these protestors is that they all love to fly the Canadian flag. But the very things they’re protesting against are exactly what makes Canada, Canada.

Canada is a social democracy. The very foundation of a social democracy is, according to Wikipedia, “advocating economic and social interventions to promote social justice within the framework of a liberal-democratic polity and a capitalist-oriented mixed economy.”

By being a social democracy, we benefit from all those things that Pengilly listed — including a government benefit for those impacted by COVID that I’m sure many of those protestors took advantage of.

But with those benefits comes a responsibility to look after each other. That is what makes Canada Canada, at least for three-quarters of us. It’s not about individual rights and freedoms, it’s about collective rights and freedoms. It’s about building a country that strives for the greater good of all.

And you know what? It works.

It’s made Canada one of the best countries in the world to call home. That’s not just me saying that. That’s pretty much every objective analysis that has ranked such things, including the most recent one from U.S. News & World Report and the Wharton School, where Canada came out number one. Empirically, according to this analysis, there is no better place on the planet. The rest of the top five are also social democracies.

That’s the nuance that the protestors are missing. Not only are they missing it, they’re tearing apart the very thing that makes Canada the best place in the world in which to live.

Another ranking of a country’s success is the United Nations Human Development Report. In this ranking, Canada didn’t do quite as well. We came in 16th, just ahead of the U.S. at 17th. Number one was Norway. So where did we lose points?

We lost points due to what the UN calls “New Threats to Human Security in the Anthropocene”: a focus on independence at the expense of interdependence, or, more simply, putting “me above we.” The report warns, “Development approaches with a strong focus on economic growth over equitable human development have led to stark and growing inequalities and destabilized processes at the planetary scale.”

We’re also losing social trust by pitting left against right. The cracks through the center of our culture are destroying the stability required to be a world-class country.

Part of the problem is that our benefits are also our downfall. We have a culture and societal framework that allows itself to be taken advantage of. We have an economy and advantages that tend to lead to a sense of entitlement. We have democratic freedoms that allow a small percentage of an entitled population to protest with a voice that can be disproportionately loud. We have a media ecosystem — both social and traditional — that loves to act as the fan for the proverbial shit to hit. And we have political operatives (grifters?) who have learned both how to amplify this exaggerated discontent and how to use it to their own advantage.

Take the truckers’ protest, for example.

In this new reality of tangled supply chains and climate disruptions, truckers have emerged as folk heroes by putting their lives on the line to keep our shelves stocked. And to that, you’ll find no argument from me. But as James Menzies, editor of Today’s Trucking and a journalist who has covered the Canadian trucking industry for 18 years, pointed out, “The so-called Freedom convoy was never about truckers”:

“I feel bad for the truckers who thought this was about them. It never was. There was never any discussion around the real issues you face every day. Lack of safe parking. Poor road conditions. Access to clean restrooms. Unpaid detention time at shippers and receivers. You were taken advantage of, because you were frustrated and you have big, loud machines that can be quite disruptive. You became the rallying cry of an anti-government group whose ambitions went well beyond the reversal of the vaccine mandate.”

This is a protest with many layers. At the centre are organizers who are not saving Canada, but trying to build a Canada that never really was. It would be a place that looks a lot like the fictional Montana where John Dutton of the TV show “Yellowstone” lives: rock-hard, hard-right ideals thinly wrapped in a Canadian flag (perhaps it’s not a coincidence that a major protest is at the Coutts border crossing between Alberta and Montana).

It would be a place where you could say (and mean it), “This is America (or Canada). We don’t share land here.”

It’s just not a place where I want to live.

The Joe Rogan Experiment in Ethical Consumerism

We are watching an experiment in ethical consumerism take place in real time. I’m speaking of the Joe Rogan/Neil Young controversy that’s happening on Spotify. I’m sure you’ve heard of it, but if not, Canadian musical legend Neil Young had finally had enough of Joe Rogan’s spreading of COVID misinformation on his podcast, “The Joe Rogan Experience.” He gave Spotify an ultimatum: “You can have Rogan or Young. Not both.”

Spotify chose Rogan. Young pulled his library. Since then, a handful of other artists have followed Young, including former bandmates David Crosby, Stephen Stills and Graham Nash, along with fellow Canuck Hall of Famer Joni Mitchell.

But it has hardly been a stampede. One of the reasons is that — if you’re an artist — leaving Spotify is easier said than done. In an interview with Rolling Stone, Rosanne Cash said most artists don’t have the luxury of jilting Spotify: 

“It’s not viable for most artists. The public doesn’t understand the complexities. I’m not the sole rights holder to my work… It’s not only that a lot of people who aren’t rights holders can’t remove their work. A lot of people don’t want to. These are the digital platforms where they make a living, as paltry as it is. That’s the game. These platforms own, what, 40 percent of the market share?”

Cash also brings up a fundamental issue with capitalism: it follows profit, and it’s consumers who determine what’s profitable. Consumers make decisions based on self-interest: what’s in it for them. Corporations use that predictable behavior to make the biggest profit possible. That behavior has been perfectly predictable for hundreds of years. It’s the driving force behind Adam Smith’s invisible hand. It was also succinctly laid out by economist Milton Friedman in 1970:

“There is one and only one social responsibility of business – to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition without deception or fraud.”

We all want corporations to be warm and fuzzy — but it’s like wishing a shark were a teddy bear. It just ain’t gonna happen.

One who indulged in this wishful thinking was a somewhat less well-known Canadian artist who also pulled his music from Spotify: Ontario singer/songwriter Danny Michel. He told the CBC:

“But for me, what it was was seeing how Spotify chose to react to Neil Young’s request, which was, you know: You can have my music or Joe. And it seems like they just, you know, got out a calculator, did some math, and chose to let Neil Young go. And they said, clear and loud: We don’t need you. We don’t need your music.”

Well, yes, Danny, I’m pretty sure that’s exactly what Spotify did. It made a decision based on profit. For one thing, Joe Rogan is exclusive to Spotify. Neil Young isn’t. And Rogan produces a podcast, which can have sponsors. Neil Young’s catalog of songs can’t be brought to you by anyone.

That makes Rogan a much better bet for revenue generation. That’s why Spotify paid Rogan $100 million. Music journalist Ted Gioia made the business case for the Rogan deal pretty clear in a tweet:

“A musician would need to generate 23 billion streams on Spotify to earn what they’re paying Joe Rogan for his podcast rights (assuming a typical $.00437 payout per stream). In other words, Spotify values Rogan more than any musician in the history of the world.”
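Gioia’s arithmetic checks out, as a one-line sketch shows (using only the two figures cited above: the $100 million deal and his assumed per-stream payout):

```python
rogan_deal = 100_000_000       # dollars, as reported in the column
payout_per_stream = 0.00437    # dollars per stream, Gioia's assumed rate

print(f"{rogan_deal / payout_per_stream / 1e9:.1f} billion streams")  # ~22.9 billion
```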

I hate to admit that Milton Friedman is right, but he is. I’ve said it time and time again: to expect corporations to put ethics ahead of profits is to ignore the DNA of the corporation. Spotify is doing what corporations will always do: strive to be profitable. The decision between Rogan and Young was made with a calculator. And for Danny Michel to expect anything else from Spotify is simply naïve. If we’re going to play this ethical capitalism game, we must realize what the rules of engagement are.

But what about us? Are we any better than the corporations we keep putting our faith in?

We have talked about how we consumers want to trust the brands we deal with, but when a corporation drops the ethics ball, do we really care? We have been gnashing our teeth about Facebook’s many, many indiscretions for years now, but how many of us have quit Facebook? I know I haven’t.

I’ve seen some social media buzz about migrating from Spotify to another service. I personally have started down this road. Part of it is because I agree with Young’s stand. But I’ll be brutally honest here. The bigger reason is that I’m old and I want to be able to continue to listen to the Young, Mitchell and CSNY catalogs. As one of my contemporaries said in a recent post, “Neil Young and Joni Mitchell? Wish it were artists who are _younger_ than me.”

A lot of pressure is put on companies to be ethical, with no real monetary reasons why they should be. If we want ethics from our corporations, we have to make it important enough to us to impact our own buying decisions. And we aren’t doing that — not in any meaningful way.

I’ve used this example before, but it bears repeating. We all know how truly awful and unethical caged egg production is. The birds are kept in what is known as a battery cage, holding five to 10 birds, with each confined to a space of about 67 square inches. To help you visualize that, it’s just a bit bigger than a standard piece of paper folded in half. This is the hell we inflict on other animals solely for our own gain. No one can be for this. Yet 97% of us buy these eggs, just because they’re cheaper.

If we’re looking for ethics, we have to look in other places than brands. And — much as I wish it were different — we have to look beyond consumers as well. We have proven time and again that our convenience and our own self-interest will always come ahead of ethics. We might wish that were different, but our spending patterns say otherwise.

Don’t Be Too Quick To Dismiss The Metaverse

According to my fellow Media Insider Maarten Albarda, the metaverse is just another in a long line of bright shiny objects that — while promising to change the world of marketing — will probably end up on the giant waste heap of overhyped technologies.

And if we restrict Maarten’s caution to specifically the metaverse and its impact on marketing, perhaps he’s right. But I think this might be a case of not seeing the forest for the trees.

Maarten lists a number of other things that were supposed to revolutionize our lives: Clubhouse, AI, virtual reality, Second Life. All seemed to amount to much ado about nothing.

But as I said almost 10 years ago, when I first started talking about one of those overhyped examples, Google Glass — and what would eventually become the “metaverse” (rereading that piece, perhaps I’m better at predictions than I thought) — the overall direction of these technologies does mark a fundamental shift:

“Along the way, we build a “meta” profile of ourselves, which acts as both a filter and a key to the accumulated potential of the ‘cloud.’ It retrieves relevant information based on our current context and a deep understanding of our needs, it unlocks required functionality, and it archives our extended network of connections.”

As Wired founder and former executive editor Kevin Kelly has told us, technology knows what it wants. Eventually, it gets it. Sooner or later, all these things will bump up against a threshold that marks a fundamental shift in how we live.

You may call this the long awaited “singularity” or not. Regardless, it does represent a shift from technology being a tool we use consciously to enhance our experiences, to technology being so seamlessly entwined with our reality that it alters our experiences without us even being aware of it. We’re well down this path now, but the next decade will move us substantially further, beyond the point of no return.

And that will impact everything, including marketing.

What is interesting is the layer technology is building over the real world, hence the term “meta.” It’s a layer of data and artificial intelligence that will fundamentally alter our interactions with that world. It’s technology that we may not use intentionally — or, beyond the thin layer of whatever interface we use, may not even be aware of.

This is what makes it so different from what has come before. I can think of no technical advance in the past that is so consequential to us personally yet functions beyond the range of our conscious awareness or deliberate usage. The eventual game-changer might not be the metaverse. But a change is coming, and the metaverse is a signal of that.

Technology advancing is like the tide coming in. If you watch the individual waves coming in, they don’t seem to amount to much. One stretches a little higher than the last, followed by another that fizzles out at the shoreline. But cumulatively, they change the landscape — forever. This tide is shifting humankind’s relationship with technology. And there will be no going back.

Maybe Maarten is right. Maybe the metaverse will turn out to be a big nothingburger. But perhaps, just perhaps, the metaverse might be the Antonio Meucci of our time: an example where the technology was inevitable, but the timing wasn’t quite right.

Meucci was an Italian immigrant who started working on the design of a workable telephone in 1849, a full two decades before Alexander Graham Bell even started experimenting with the concept. Meucci filed a patent caveat in 1871, five years before Bell’s patent application was filed, but was destitute and didn’t have the money to renew it. His wave of technological disruption may have hit the shore a little too early, but that didn’t diminish the significance of the telephone, which today is generally considered one of the most important inventions of all time in terms of its impact on humanity.

Whatever is coming, and whether or not the metaverse represents the sea-change catalyst that alters everything, I fully expect that at some point in the very near future, we will pinpoint this time as the dawn of a technological shift that made the introduction of the telephone seem trivial in comparison.