The Inevitability of the Pendulum Effect

In the real world, things never go in straight lines or predictable curves. The things we call trends are actually a sawtooth profile of change, reaction and upheaval. If you trace the path, you’ll see evidence of the Law of the Pendulum.

In the physical world, the Law is defined as: “the movement in one direction that causes an equal movement in a different direction.”

In the world of human behavior, it’s defined as: “the theory holding that trends in culture, politics, etc., tend to swing back and forth between opposite extremes.”

Politically and socially, we’re in the middle of a swing to the right. But it will inevitably be countered by a swing to the left. We could call it Newton’s Third Law of Social Motion: for every action, there is an equal and opposite reaction.

Except that’s not exactly true. If it were, the swings would cancel each other out and we’d end up in the same place we started. And we know that’s not the case. Let me give you one example that struck me recently.

This past week, I visited a local branch of my bank. The entire staff were wearing Pride T-shirts in support of their employer’s corporate sponsorship of Pride Week. That’s hardly cause for surprise in the world of 2019. No one batted an eye. But I couldn’t help thinking that it’s parsecs removed from the world I grew up in, in the late ’60s and early ’70s.

I won’t jump into the debate over the authenticity of corporate political correctness, but there’s no denying that, when it comes to sexual preference, the world is a more tolerant place than it was 50 years ago. The pendulum has swung back and forth, but the net movement has been towards – to use Steven Pinker’s term – the better angels of our nature.

When talking about the Pendulum Effect, we also have to keep an eye on the Overton window, something I discussed in a previous column. The Overton window defines the frame of what the majority of us – as a society – find acceptable. As the pendulum swings back and forth between extremes, somewhere in the middle is a collective view that most of us can live with. But the Overton window is always moving. And I believe that the window today frames a view of a more tolerant, more empathetic world than the world of 50 years ago – or almost any time in our past. That’s not true every day. Lately, it might not even be true most days. But this is probably a temporary thing. The pendulum will swing back eventually, and we’ll be in a better place.

My question is: why? Why – when we even out the swings – are we becoming better people? So far, this column has had little to do with media, digital or otherwise. But I think the variable here is information. Stewart Brand, founder of the Whole Earth Catalog, once said, “Information wants to be free.” I think information also wants to set us free – free from the limitations of our gene-bound prejudice and pettiness. Wherever you find the pendulum swinging backwards, you’ll find a dearth of information. We need information to be thoughtful. And we need thoughtfulness to create a more just, more tolerant, more empathetic society.

We – in our industry – deal with information as our stock in trade. It is our job to ensure that information spreads as far as possible. It’s the one thing that will ensure that the pendulum swings in the right direction. Eventually. 

Data Does NOT Equal People

We marketers love data. We treat it like a holy grail: a thing to be worshipped. But we’re praying at the wrong altar. Or, at the very least, we’re praying at a misleading altar.

Data is the digital residue of behavior. It is the contrail of customer intent – a thin, wispy proxy for the rich bandwidth of the real world. Data does have a purpose, but it should be just one tool in a marketer’s toolbox. Unfortunately, we tend to use it as a Swiss Army knife, thinking it’s the only tool we need.

The problem is that data is seductive. It’s pliable and reliable, and it lures us into manipulation because manipulating it is so easy: it can be twisted and molded with algorithms and spreadsheets.

But it’s also sterile. There’s a reason people don’t fit nicely into spreadsheets: a spreadsheet simply doesn’t have enough dimensions to accommodate the nuances of real human behavior.

Data is great for answering the questions “what,” “who,” “when” and “where.” But those answers are all glimpses of what has already happened. Stopping there is like navigating by the rear-view mirror.

Data seldom yields the answer to “why.” But it’s “why” that makes the magic happen – that gives us the empathetic understanding needed to reliably predict future behaviors.

Uncovering the what, who, when and where makes us good marketers. But it’s “why” that makes us great. It’s knowing why that allows us to connect the distal dots, hacking out the hypotheses that can take us forward in the leaps required by truly great marketing. As Tom Goodwin, the author of “Digital Darwinism,” said in a recent post, “What digital has done well is have enough of a data trail to claim, not create, success.”

We as marketers have to resist stopping at the data. We have to keep pursuing why.

Here’s one example from my own experience. Some years ago, my agency did an eye-tracking study that looked at gender differences in how we navigate websites.

For me, the most interesting finding to fall out of the data was that females spent a lot more time than males looking at a website’s “hero” shot, especially if it was a picture that had faces in it. Males quickly scanned the picture, but then immediately moved their eyes up to the navigation menu and started scanning the options there. Females lingered on the graphic and then moved on to scan text immediately adjacent to it.

Now, I could have stopped at “who” and “what,” which in itself would have been a pretty interesting finding. But I wanted to know “why.” And that’s where things started to get messy.

To start to understand why, you have to rely on feelings and intuition. You also have to accept that you probably won’t arrive at a definitive answer. “Why” lives in the realm of “wicked” problems, which I defined in a previous column as “questions that can’t be answered by yes or no — the answer always seems to be maybe. There is no linear path to solve them. You just keep going in loops, hopefully getting closer to an answer but never quite arriving at one. Usually, the optimal solution to a wicked problem is ‘good enough – for now.’”

The answer to why males scan a website differently than females is buried in a maze of evolutionary biology, social norms and cognitive heuristics. It probably has something to do with wayfinding strategies and hardwired biases. It won’t just “fall out” of data because it’s not in the data to begin with.

Even half-right “why” answers often take months or even years of diligent pursuit to reveal themselves. Given that, I understand why it’s easier to just focus on the data. It will get you to “good,” and maybe that’s enough.

Unless, of course, you’re aiming to “put a dent in the universe,” as Steve Jobs famously said. Then you have to shoot for great.

The Marie Kondo Effect: Our Quest for Control

There’s a reason why organizational guru Marie Kondo has become a cultural phenomenon. When the world seems increasingly bizarre and unpredictable, we look for things we can still control.

Based on my news feed, it appears that may be limited to our garage and our sock drawer.

In 1954, American psychologist Julian Rotter introduced something he called the locus of control. To lift the Wikipedia definition, it’s “the degree to which people believe that they have control over the outcome of events in their lives, as opposed to external forces beyond their control.”

Control is important to humans, even if it’s just an illusion. Our perception of being in control makes us happier.

Kondo has tapped into a fundamental human principle: Choosing to organize is choosing joy. There is a mountain of academic research to back that up.

But you really don’t have to look any further than the street you live on. That old Italian guy who’s up at 6:30 every morning washing his driveway? That’s Mario flexing his own locus of control. The more bizarre the world appears to become, the more we narrow the focus of our locus to things we know we can control. And if that’s 1,000 square feet of asphalt, so be it.

It’s not just my paisano Mario who needs to stake his claim to control where he can find it. This narrowing of the locus of control commonly goes hand in hand with aging. Typically, as inevitable cognitive and physical decline catches up with us, we reduce our boundaries of influence to what we can handle. With my dad, it was recycling. He’d spend a good chunk of his time going through cans, jars and cardboard boxes, meticulously sorting them into their respective bins.

We need to feel that we can still exercise control — somehow, somewhere.

This need for control and some semblance of connectable cause and effect always takes a beating during times of upheaval. Theologian Reinhold Niebuhr’s famous Serenity Prayer, which he began using in sermons during the tumultuous 1930s and ’40s, became a lifeline in times of turmoil:

“God, give me grace to accept with serenity the things that cannot be changed, courage to change the things which should be changed, and the wisdom to know the difference.”

Unfortunately for us, we don’t have a track record of doing so well on the first part of Niebuhr’s prayer. We don’t “accept with serenity” — we usually freak out with anxiety and stress. We do adapt, though, by focusing as best we can on the things that can be changed. When external disruption is the norm, our locus of control shrinks inward.

This brings up another facet of our need for control: the source of disruption. Disruption that happens to us personally — divorce, a health crisis, career upheaval, loss of a loved one — tends to at least fall somewhat within our locus of control. We have some options in how we respond and deal with these types of disruption.

But disruption that plays out globally is a different matter. How much control do we have over the rise of populist politics, climate change or microplastics in the ocean? The levers of control we can pull are minuscule compared to the scope of these issues.

That’s the problem with our densely connected, intensely networked world. We are hyper-aware of everything that’s wrong anywhere in the world. We are bombarded with it every minute. Every newsfeed, every CNN alert, every Facebook post seems to make us aware of yet one more potential catastrophe that we have absolutely no control over.

It’s no wonder that sometimes we just need to retreat and clean out our Tupperware drawer. In today’s world, you have to find joy where you can.

A Few Thoughts on Trump, Wikipedia and the Perfect Pour

If you’re looking for a sign of the times, there might be none more representative than Donald Trump’s Wikipedia page. According to a recent article in Slate, it’s one of the most popular pages on the internet. It’s also one of the most frequently updated: the article states that the page has had more than 28,000 edits since its launch in 2004.

The trick – of course – is taking something, or someone, as polarizing as Trump and trying to adhere to Wikipedia’s mission “to accurately convey reliable information in a dispassionate, neutral tone.” Slate’s behind-the-scenes look at the ongoing editorial battle to come within spitting distance of this goal is fascinating reading. How do you stay accurate and reliable when trying to navigate the real-time storm of bombast and hyperbole that typically surrounds the 45th president of the United States? How timely can you be? How timely should you be? One Wikipedia editor noted: “This is an encyclopedia. We are not in competition with newspapers for readership, so there is no rush to print.”

But we actually are in a rush. We expect online to equal real time. We have no patience for outdated information – or outdated anything, for that matter. And that introduces a conundrum when we refer to the current POTUS. Say what you want about Trump: he does generate a lot of froth. And froth needs time to settle. Just ask the brewers of Guinness.

Something called “The Settle” is step 4 of the perfect Guinness pour. According to the brewers, the precise time for “The Settle” is 119.53 seconds. I’m not sure what happens if you miscalculate and only allow – say – 119.47 seconds. I’m not aware of any grievous injuries caused by a mistimed settle. But I digress. The point is that The Settle is required to avoid drinking nothing but foam. See how I brought that around to my original point?

You may debate the validity of The Settle when it comes to a glass of stout, but I believe the idea has merit when it comes to dealing with the deluge of information with which we’re bombarded daily. According to Guinness, the whole point of The Settle is to get the right balance of aromatic “head” and malty liquid when you actually take a drink. Balance is important in beer. It’s also important in information. We need less froth and more substance in our daily media diet.

Why is more time important in our consumption of information? Because it gives emotions time to dissipate. Emotions mixed in with information are like gas mixed in with beer: you want a little, but not a lot. You want emotions to color rational thought, not dominate it. And when information is digested too soon, the balance between emotion and logic is all out of whack.

Emotional thought has to be on a hair trigger. It’s how we’re built. Emotions get us out of sticky situations. But they also tend to flood out reason and logic. Emotions and logic live in two very different parts of the brain. In a complex age where we need to be more thoughtful, emotional reactions are counterproductive. Yet our current media environment is built to cater exclusively to our emotional side. There is no time for “The Settle.” We jump from frothy sip to frothy sip, without ever taking the time to get to the substance of the story.

Again, to use the example of Trump’s Wikipedia page: after his 2018 Helsinki summit with Vladimir Putin, plenty of media-generated froth tried to force its way into his entry, ranging from calling the summit “a serious mistake” to “treasonous” and a “disgraceful performance.” But with the benefit of a little time, one Wikipedia editor noted, “Let’s not play the ‘promote the most ridiculous comments’ game that the media appears to be playing. Approximately nothing new happened, but there are plenty of ‘former government officials’ willing to give hyperbolic quotes on Twitter.”

It’s amazing what a little time can do for perspective. Let’s start with – say – 119.53 seconds.

Personal Endeavour in the Age of Instant Judgement

No one likes to be judged — not even gymnasts and figure skaters. But at least in those sports, the judges supposedly know what they’re judging. So, in the spirit of instant feedback, let me rephrase: no one likes to be judged by a peanut gallery*. Or, to use a more era-appropriate moniker, by a troll’s chorus.

Because of this, I feel sorry for David Benioff and D.B. Weiss, the showrunners of “Game of Thrones.” Those poor bastards couldn’t be any more doomed if they had been invited to a wedding of the red variety.

At least they were aware of their fate. In an interview with Entertainment Weekly, they disclosed their plans for the airing of the final episode. “We’ll be in an undisclosed location, turning off our phones and opening various bottles,” Weiss admitted. “At some point, if and when it’s safe to come out again, somebody like [HBO’s ‘GOT’ publicist] will give us a breakdown of what was out there without us having to actually experience it.” Added Benioff: “I plan to be very drunk and very far from the internet.”

Like it or not, we now live in an era of instant judgement, from everyone. It’s the evil twin of social virality. It means we have to grow thicker skins than your average full-grown dragon**. And because I’m obsessively fixated on unintended consequences, this got me to thinking. How might all this judgement impact our motivation to do stuff?

First of all, let’s look at the good that comes from this social media froth kicked up by fervent fans. There is a sense of ownership and emotional investment in shows like “Game of Thrones” that’s reached a pitch never seen before — and I truly believe we’re getting better TV because of it.

If you look at any of the lists of the best TV shows of all time, they are decidedly back-end loaded. “Game of Thrones,” even at its worst, is better than almost any television of the ’80s or ’90s. And it’s not only because of the advances in special effects and CGI wizardry. There is a plethora of thoughtful, exquisitely scripted and superbly acted shows that have nary an enchantress, dragon or apocalypse of the walking dead in sight. There is no CGI in “Better Call Saul,” “Master of None” or “Atlanta.”

But what about the dark side of social fandom?

I suspect instant judgement might make it harder for certain people to actually do anything that ends up in the public arena. Personal endeavours of all types require failure and subsequent growth as ingredients of success. And fans are getting less and less tolerant of failure. That makes the entry stakes pretty high for anyone producing work that will be out there, available for anyone to pass judgement on.

We might get a self-selection bias in arenas like the arts, politics and sports. Those averse to criticism that cuts too deep will avoid making themselves vulnerable. Or — upon first encountering negative feedback — they may just throw in the towel and opt for something less public.

The contributors to our culture may just become hard-nosed and impervious to outside opinion — kind of like Cersei Lannister. Or, even worse, they may be so worried about what fans think that they oscillate, trying to keep all factions happy. That would be the Jon Snows of the world.

Either way, we lose the contributions of those with fragile egos and vulnerable hearts. If we applied that same filter retroactively to our historic collective culture, we’d lose most of what we now treasure.

In the end, perhaps David Benioff got it right. Just be “very drunk and very far from the internet.”

* Irrelevant Fact #1: The term peanut gallery comes from vaudeville, where the least expensive seats were occupied by the rowdiest members of the audience. The cheapest snack was peanuts, which the audience would throw at the performers.

** Irrelevant Fact #2: Dragons have thick skin because they don’t shed their skins. It just keeps getting thicker and more armor-like. The older the dragon, the thicker the skin.

The Importance of Playing Make-Believe

One of my favourite sounds in the world is children playing. Although our children are well past that age, we have stayed in a neighbourhood where new families move in all the time. One of the things that has always amazed me is a child’s ability to make believe. I used to do it too, but I don’t anymore. At least, not the same way I used to.

Just take a minute to think about the term itself: make-believe. The very words connote the creation of an imaginary world that you and your playmates can share, even if only for a brief and fleeting moment. Out of the ether, you can create an ephemeral reality where you can play God. A few adults can still do it. George R.R. Martin pulled it off. So did J.K. Rowling. But for most of us, our days of make-believe are well behind us.

I worry about the state of play. I am concerned that rather than making believe themselves, children today are playing in the manufactured and highly commercialized imaginations of profit-hungry corporations. There is no making — there is only consuming. And that could have some serious consequences.

Although we don’t use imagination the way we once did, it is the foundation for the most important cognitive tasks we do. It was Albert Einstein who said, “Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.”

It is imagination that connects the dots, explores the “what-ifs” and peeks beyond the bounds of the known. It is what separates us from machines.

In that, Einstein presciently nailed the importance of imagination. Only here does the mysterious alchemy of the human mind somehow magically weave fully formed worlds out of nothingness and snippets of reality. We may not play princess anymore, but our ability to imagine underpins everything of substance that we think about.

The importance of playing make-believe goes beyond cognition. Imagination is also essential to our ability to empathize. We need it to put ourselves in the place of others. Our “theory of mind” is just one more of the many facets of imagination.

This thing we take for granted has been linked to a massive range of essential cognitive developments. In addition to the above examples, pretending gives children a safe place to begin to define their own place in society. It helps them explore interpersonal relationships. It creates the framework for them to assimilate information from the world into their own representation of reality.

We are not the only animals that play when we’re young. It’s true for many mammals, and scientists have discovered it’s also essential in species as diverse as crocodiles, turtles, octopuses and even wasps.

For other species, though, play seems mainly intended to help the young come to terms with surviving in the physical world. We’re alone in our need for elaborate play involving imagination and cognitive games.

With typical human hubris, we adults have been on a century-long mission to structure the act of play. In doing so, we have been imposing our own rules, frameworks and expectations on something we should be keeping as is. Much of the value of play comes from its very lack of structure. Playing isn’t as effective when it’s done under adult supervision. Kids have to be kids.

Play definitely loses much of its value when it becomes passive consumption of content imagined and presented by others through digital entertainment channels. Childhood is meant to give us a blank canvas to colour with our imagination.

As we grow, the real world encroaches on this canvas. But the delivery of child-targeted content through technology is also shrinking the boundaries of our own imagination.

Still, despite corporate interests that run counter to playing in its purest sense, I suspect that children may be more resilient than I fear. After all, I can still hear the children playing next door. And their imaginations still awe and inspire me.

Selfies: A Different Take on Reality

It was a perfect evening in Sydney Harbour. I was there for a conference, and the organizers had arranged an event for the speakers at Milsons Point – under the impressive span of the Harbour Bridge. It was dusk, and the view of downtown Sydney spread out in front of us with awesome breadth and scope. It was one of those moments that takes your breath away. That minute seemed eternal.

After some time, I turned around. Another attendee was intently focused on taking a selfie and posting it to social media, his back turned to the view behind him. At first, I thought I should do the same. Then I changed my mind. I’d rely on my memory and actually try to stay in the moment. My phone stayed in my pocket.

In the age of selfies, it turns out that my mini-existential crisis is becoming more common. According to a new study published in the Journal of Consumer Research, something called “self-presentational concern” can creep into these lifetime moments and suck the awe right out of them. One of the study’s authors, Alixandra Barasch, explains: “When people take photos to share, they remember their experience more from a third-person perspective, suggesting that taking photos to share makes people consider how the event (and the photos) would be evaluated by an observer.”

Simply stated, selfies take us “out of the moment.” But this effect depends on why we’re taking the selfie in the first place. The experimenters didn’t find the effect when people took selfies with the intent of just remembering the moment. It showed up when the selfie was taken for the express purpose of sharing on social media. Suddenly, we are more worried about how we look than where we are and what we’re doing.

Dr. Terri Apter, a psychologist at Cambridge University, has been looking at the emergence of selfies as a form of “self-definition” for some time. “We all like the idea of being sort of in control of our image and getting attention, being noticed, being part of the culture,” she says. But when does this very human urge slip over the edge into a destructive spiral? Apter explains: “You can get that exaggerated or exacerbated by celebrity culture that says unless you’re being noticed, you’re no one.”

I suspect what we’re seeing now is a sort of selfie arms race. Can we upstage the rest of our social network by posting selfies in increasingly exotic locations, doing exceptional things and looking ever more “Mahvelous”? That’s a lot of pressure to put on something we do when we’re just supposed to be enjoying life.

A 2015 study explored the connection between personality traits and the posting of selfies. In particular, the authors looked at narcissism, psychopathy and self-objectification. They found that frequent posting of selfies, and being overly concerned with how you look in them, can be tied to both self-objectification and narcissism. That’s interesting, because those two things sit at opposite ends of the self-esteem spectrum: narcissists love themselves, while those who self-objectify tend to suffer from low self-esteem. In both cases, selfies are a way to advertise a personal brand to a wider audience.

There’s another danger of selfie-preoccupation that goes hand in hand with distancing yourself from the moment you’re in: you can fall victim to bad judgement. It happened to Barack Obama at Nelson Mandela’s memorial ceremony. In a moment when he should have been acting with appropriate gravitas, he decided to take a selfie with Danish Prime Minister Helle Thorning-Schmidt and then-British Prime Minister David Cameron. It was a stunningly classless moment from a usually classy guy. If you check a photo taken at the time, you can see that Michelle Obama was not amused. I agree.

Like many things tied to social media, selfies can represent a troubling trend in how we look at ourselves in a social context. These things seem to be pointing in the same direction: we’re spending more time worrying about an artificial reality of our own making and less time noticing reality as it actually exists.

We just have to put the phone down sometimes and admire the view across the harbour.