The Winona Ryder Effect

I was in the U.S. last week. It was my first visit in the Trump era.

It was weird. I was in California, so the full effect was muted, but I watched my tongue when meeting strangers. And that’s saying something, coming from a Canadian: watching your tongue is our national pastime. (As an aside, my U.S. host, Lance, told me about a recent post on a satire site: “Concerned, But Not Wanting To Offend, Canada Quietly Plants Privacy Hedge Along Entire U.S. Border.” That’s so us.) There was a feeling I had not felt before. As someone who has spent a lot of time in the U.S. over the past decade or two, I felt a little less comfortable. There was a disconnect that was new to me.

Little did I know (I’d turned off my mobile CNN alerts after January 20th because I was slipping into depression) that just after I whisked through Sea-Tac airport with all the privilege that being a white male affords you, Washington Governor Jay Inslee would hold a press conference denouncing the new Trump Muslim ban in no uncertain terms. On the other side of the TSA security gates, a thousand protesters were gathering. I didn’t learn about this until I got home.

Like I said, it was weird.

And then there were the SAG awards on Sunday night. What the hell was the deal with Winona Ryder?

When the Stranger Things cast got on stage to accept their ensemble acting award, spokesperson David Harbour unleashed a fiery anti-Trump speech. But despite his passion and volume, it was Winona Ryder, standing beside him, who lit up the share button. And she didn’t say a word. Instead, her face contorted through a series of twenty-some different expressions in under two minutes. She became, as one Twitter post said, a “human gif machine.”

Now, by her own admission, Winona is fragile. She has battled depression and anxiety for much of her professional life. Maybe she was having a minor breakdown in front of the world. Or maybe this was a premeditated and choreographed social media master stroke. Either way, it says something about us.

The Stranger Things cast hadn’t even left the stage before the Twitterverse started spreading the Ryder meme. If you look at Google Trends, there was a huge spike in searches for Winona Ryder starting right around 6:15 pm (PST) Sunday night. It peaked at 6:48 pm, at a volume about 20 times what queries for Ms. Ryder were drawing before the broadcast began.

It was David Harbour who delivered the speech Ryder was reacting to. The words were his, and while there was also a spike in searches for him coinciding with the speech, he didn’t come close to matching the viral popularity of the Ryder meme. At its peak, there were 5 searches for “Winona Ryder” for every search for “David Harbour.”
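Out of curiosity, here’s a rough sketch of how those two ratios could be checked against a Google Trends export. Everything in it is hypothetical: the file name “sag_trends.csv”, the minute-level columns for each search term, and the 5:00 pm PST broadcast start time are assumptions for illustration, not data I’ve verified.

```python
import pandas as pd

# Hypothetical Google Trends export: a "time" column plus one interest column
# per search term, at minute-level resolution.
df = pd.read_csv("sag_trends.csv", parse_dates=["time"], index_col="time")

# Assumed broadcast start time (SAG Awards aired in the early evening, Pacific time).
broadcast_start = pd.Timestamp("2017-01-29 17:00")

# Baseline: average interest in "Winona Ryder" before the broadcast began.
baseline = df.loc[:broadcast_start, "Winona Ryder"].mean()

# Peak interest for Ryder, and when it occurred.
peak_time = df["Winona Ryder"].idxmax()
peak = df.loc[peak_time, "Winona Ryder"]
print(f"Ryder peak at {peak_time}: {peak / baseline:.0f}x her pre-broadcast baseline")

# Ratio of Ryder to Harbour searches at the moment of Ryder's peak.
ratio = peak / df.loc[peak_time, "David Harbour"]
print(f"Ryder vs. Harbour at the peak: {ratio:.1f} to 1")
```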

Ryder’s mugging was – premeditated or not – extremely meme-worthy. It was visual, it was over the top and – most importantly – it was a blank canvas we could project our own views onto. Winona didn’t give us any words, so we could fill in our own. We could use it to provide a somewhat bizarre exclamation point to our own views, expressed through social media.

As I was watching this happen, I knew this was going to go viral. Maybe it’s because it takes something pretty surreal to make a dent in an increasingly surreal world that leaves us numb. When the noise that surrounds us seems increasingly unfathomable, we need something like this to prick our consciousness and make us sit up and take notice. Then we hunker down again before we’re pummelled with the next bit of reality.

Let me give you one example.

As I was watching the SAG awards Sunday night, I was unaware that gunmen had opened fire on Muslim worshippers praying in a mosque in Quebec City. I only found out when I flicked through the channels after the broadcast ended. As I write this, I know that six people are dead because someone hated Muslims that much. Canada has its own extreme racism, too.

I find it hard to think about that. It’s easier to think about Winona Ryder’s funny faces. That’s not very noble, I know, but sometimes you have to go with what you’re actually able to wrap your mind around.

The Vanishing Value of the Truth

You know, the very powerful and the very stupid have one thing in common. They don’t alter their views to fit the facts. They alter the facts to fit their views.

Doctor Who, 1977

We might be in a period of ethical crisis. Or not. It’s tough to say. It really depends on what you believe. And that, in a nutshell, is the whole problem.

Take this past weekend, for example. Brand-new White House Press Secretary Sean Spicer, in his very first address, lied about the size of the inauguration crowd. Afterwards, a very cantankerous Kellyanne Conway defended the lying when confronted by Chuck Todd on Meet the Press. She said they weren’t lies… they were “Alternate Facts”.

So, what exactly is an alternate fact? It’s something that is not a fact at all, but a narrative intended to be believed by a segment of the population, presumably to gain something from them.

To use a popular turn of phrase, it’s “Faking It til You Make It!”

And there you have the mantra of our society. We’re rewarding alternate facts on the theory that the end justifies the means. If we throw a blizzard of alternate facts out there that resonate with our audience’s beliefs, we’ll get what we want.

The Fake It Til You Make It syndrome is popping up everywhere. It’s always been a part of marketing and advertising. Arguably, the entire industry is based on alternate facts. But it’s also showing up in the development of new products and services, especially in the digital domain. While Eric Ries never espoused dishonesty in his book, The Lean Startup, the idea of a Minimum Viable Product certainly lends itself to the principle of “faking it until you make it.” Agile development, in its purest sense, is about user feedback and rapid iteration, but humans being humans, it’s tough to resist the temptation to oversell each iteration, treading dangerously close to pitching “vaporware.” Then we hope like hell that the next development cycle will bridge some of the gap between reality and the alternate facts we sold the prospective customer.

I think we have to accept that our world may not place much value on the truth any more. It’s a slide that started about 100 years ago.

The 7 Habits of Highly Effective People author Stephen Covey reviewed the history of success literature in the US from the 1700s forward. In the first 150 years of America’s history, all the success literature was about building character. Character was defined by words like integrity, kindness, virtue and honor. The most important thing was to be a good person.

Honesty was a fundamental underpinning of the Character Ethic. This coincided with the Enlightenment in Europe. Intellectually, this movement elevated truth above belief. Our modern concept of science gained its legs: “a branch of knowledge or study dealing with a body of facts or truths.” The concepts of honor and honesty were intertwined.

But Covey noticed that things changed after the First World War. Success literature became preoccupied with the concept of personality. It was important to be likeable, extroverted, and influential. The most important thing was to be successful. Somehow, being truthful got lost in the noise generated by the rush to get rich.

Here’s the interesting thing about personality and character. Psychologists have found that your personality is resistant to change. Personality tends to work below the conscious surface, and its scripts play out without a lot of mindful intervention. You can read all the self-help books in the world and you probably won’t change your personality very much. But character can be worked on. Building character is an exercise in mindfulness. You have to make a conscious choice to be honest.

The other interesting thing about personality and character is how other people see you. We are wired to pick up on other people’s personalities almost instantly. We start picking up the subconscious cues immediately after meeting someone. But it takes a long time to determine a person’s character. You have to go through character-testing experiences before you can know if they’re really a good person. Character cuts to the core, whereas personality is skin deep. But in this world of “labelability” (where we think we know people better than we actually do), we often substitute personality cues for character. If a person is outgoing, confident and fun, we believe them to be trustworthy, moral and honest.

This all adds up to some worrying consequences. If we have built a society where success is worth more than integrity, then our navigational bearings become dependent on context. Behavior becomes contingent on circumstances. Things that should be absolute become relative. Truth becomes what you believe is the most expedient and useful in a given situation.

Welcome to the world of alternate facts.

Yahoo and the Transitory World

The writing has been on the wall for some time. But where once it spelled out Yahoo, it now says Altaba.

The Yahooligans are no more, have ceased to be, bereft of life, they rest in peace. Marissa Mayer may be riding off into the Silicon Valley sunset with her golden parachute trailing behind. The parking lot attendants at 701 First Avenue, Sunnyvale, CA could soon be sandblasting her name off the CEO’s reserved parking spot. And, predictably, we Internet codgers are mourning the loss of yet another digital pioneer.

But here’s the thing. For the last 150 years, the point of a corporation has been not to be a permanent fixture. And, in this world, that’s truer than ever. So we’d better get used to stepping around a growing pile of corporate corpses.

The notion of a corporation has been around since Roman times. The name comes from the Latin corpus (body) and means “body of people.” The original idea was that a corporation would live on beyond the lifespan of any of its members. This has certainly been true of the Stora Kopparberg, a mining community in Sweden, the oldest corporation in the world. It started in 1347.

But things changed in 1855 with the passing of the Limited Liability Act in England. This flipped the idea of the perpetuity of a corporation on its head. This legislation allowed shareholders to walk away from the wreckage of a failed corporation without assuming any personal liability. It enabled serial entrepreneurialism and lowered the threshold of tolerable risk.

In short, corporate limited liability law made it okay for business people to try and possibly fail.

In the century and a half since the passing of the Limited Liability Act (and similar legislation in most US states), we somehow came to believe that corporations existed to build size and scale, as befits a market that’s preoccupied with mass. Economist Ronald Coase said the reason corporations exist is that, in imperfect markets, there is less friction doing things inside an organization than outside, making corporate structures more profitable than open markets. That was true in markets that built physical things from raw materials scattered around the world and then also had to distribute those things to distant markets.

But that’s not the world we live in. The world we live in is the world of rapid iteration and Eric Ries’ “Minimum Viable Product”. Increasingly, these products are not made of physical stuff but of digital bytes, where there is very little in the way of transaction costs.

I think we have to start thinking of the Minimum Viable Company – companies that can be assembled quickly around a market need, and just as quickly disassembled and repurposed. In today’s world, that’s the purpose of an organization, and it’s a transitory thing.

In their book Creative Destruction, Richard Foster and Sarah Kaplan envision a new corporate structure more like a venture capital fund. A corporation should be made up of a number of transitory operating units that explore market opportunities in a Darwinian fashion. Arguably, this is closer to the model adopted by Google with Alphabet and, ironically, the new corporate structure of Altaba.

But even here, corporate hubris tends to get in the way. At some point, inevitably, the powers that be begin to believe they’re smarter than the market and build an illusion of sustainability. As economist Joseph Schumpeter said, “The problem that is usually being visualized is how capitalism administers existing structures, whereas the relevant problem is how it creates and destroys them.” Corporations have a vested interest in the status quo. Cognitive biases being what they are, we’ll always favor what we have rather than what we should build. For this reason, I think Coase’s justification for the corporation might be on its last legs.

That was definitely true of Yahoo. It was a corporation that lived beyond its time. Sooner or later, that had to catch up with it.


What Comes After Generation Z?

We’re running out of alphabet.

The latest generation is Generation Z. They were born between 1995 and 2012 – according to one demographic primer. So, what do we call the generation born from 2013 on? Z+One? Do we go with an Excel naming scheme and call it Generation AA? Or should we just go back to all those unused letters of the alphabet? After all, we haven’t touched A to W yet. Thinking along those lines, Australian social researcher and author Mark McCrindle is lobbying for Generation Alpha. It’s a nice twist – we get to recycle the alphabet and give it a Greek flavor all at the same time.

Maybe the reason we short-sightedly started with the last three letters of the alphabet is that we’re pretty new at this. Before the twentieth century, we didn’t worry much about labeling every generation. And, to be honest, much of that labeling has happened retroactively. The Silent Generation (1925 – 1942) didn’t call themselves that right off the bat. Being Silent, they didn’t call themselves anything. The label wasn’t coined until 1951. And the G.I. Generation, who preceded them (1901 – 1924), didn’t receive their label until demographers William Strauss and Neil Howe affixed it in 1991.

But starting around the middle of the last century, we developed the need to pigeonhole our cohorts. Maybe it’s because things started moving so quickly about that time. In the first half of the century we had the twin demographic tent poles of the two World Wars. In between we had the Great Depression. After WWII we had the mother of all generational events: the Baby Boom. Each of these eras brought a very different environment, which would naturally affect those growing up in them. Since then, we’ve been scrambling madly to keep up with appropriate labels for each generation.

The standard approach up to now has been to wait for someone to write a book about a generation, which bestows the label, and then we all jump on the bandwagon. But this seems reactive and short-sighted. It also means that we get caught in our current situation, where we have a generation that remains unnamed while we’re waiting for the book to be written.

We seem hooked on these generation labels. I don’t think they’re going to go anywhere any time soon. Based on our current fascination with Millennials, we in the media are going to continue to lump every single sociological and technological trend into convenient generationally labeled behavioral buckets. So we should give this naming thing some thought.

Maybe we could take a page from the World Meteorological Organization’s book when it comes to naming hurricanes and tropical storms. They started doing this so the media would have a quick and commonly understood reference point when referring to a particular meteorological event. Don’t generations deserve the same foresight?

The World Meteorological Organization has a strict procedure: “For Atlantic hurricanes, there is a list of male and female names which are used on a six-year rotation. The only time that there is a change is if a storm is so deadly or costly that the future use of its name on a different storm would be inappropriate. In the event that more than twenty-one named tropical cyclones occur in a season, any additional storms will take names from the Greek alphabet.”
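Just to make that procedure concrete, here’s a toy sketch of the rotation rule: six rotating lists, retired names swapped out, and Greek letters as overflow. The lists and the retired-name mapping below are placeholders for illustration, not the WMO’s actual ones.

```python
# Toy model of the naming rule quoted above. ATLANTIC_LISTS and RETIRED are
# placeholders, not the WMO's real lists (which each hold 21 names).
ATLANTIC_LISTS = [
    ["Anna", "Bill", "Claudette"],   # list used when year % 6 == 0
    ["Alex", "Bonnie", "Colin"],     # ... == 1, and so on
    ["Arlene", "Bret", "Cindy"],
    ["Alberto", "Beryl", "Chris"],
    ["Ana", "Bob", "Carol"],
    ["Andrea", "Barry", "Chantal"],
]
GREEK = ["Alpha", "Beta", "Gamma", "Delta", "Epsilon"]
RETIRED = {"Bob": "Bert"}  # a retired name and its replacement (placeholder)

def storm_name(year: int, storm_number: int) -> str:
    """Name for the Nth named storm (1-based) of a given Atlantic season."""
    season = ATLANTIC_LISTS[year % 6]
    if storm_number <= len(season):
        name = season[storm_number - 1]
        return RETIRED.get(name, name)            # swap in replacement if retired
    return GREEK[storm_number - len(season) - 1]  # overflow: Greek alphabet

print(storm_name(2017, 2))   # second named storm of a season
print(storm_name(2017, 5))   # overflow into the Greek alphabet
```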

I like the idea of using male and female names. This got me thinking. Maybe we could combine the WMO’s approach with the wisdom of crowds. Perhaps the male and female names should be the most popular baby names of that generation. In case you’re wondering, here’s how that would work out (a rough sketch of the lookup follows the list):

Silent Generation (1925 – 1942): The Robert and Mary Generation
Baby Boomers I (1946 – 1954): The James and Mary Generation
Baby Boomers II (1955 – 1965): The Michael and Lisa Generation
Generation X (1966 – 1976): The Michael and Jennifer Generation
Millennials (1977 – 1994): The Michael and Jessica Generation
Generation Z (1995 – 2012): The Jacob and Emily Generation
Generation ??? (2013 – Today): The Emma and Noah Generation
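For what it’s worth, here’s a rough sketch of how that lookup might be automated. The generation ranges are the ones above; the file “names.csv” is a hypothetical extract of the U.S. Social Security Administration’s baby-name counts, so treat it as an illustration rather than the actual tally behind the list.

```python
import pandas as pd

# Generation labels and birth-year ranges, as listed above.
GENERATIONS = {
    "Silent Generation": (1925, 1942),
    "Baby Boomers I": (1946, 1954),
    "Baby Boomers II": (1955, 1965),
    "Generation X": (1966, 1976),
    "Millennials": (1977, 1994),
    "Generation Z": (1995, 2012),
    "Generation ???": (2013, 2017),
}

# Hypothetical extract of SSA baby-name data: columns year, name, sex ("M"/"F"), count.
names = pd.read_csv("names.csv")

for label, (start, end) in GENERATIONS.items():
    cohort = names[(names["year"] >= start) & (names["year"] <= end)]
    totals = cohort.groupby(["sex", "name"])["count"].sum()
    top_male = totals["M"].idxmax()    # most common boys' name over the span
    top_female = totals["F"].idxmax()  # most common girls' name over the span
    print(f"{label} ({start} – {end}): The {top_male} and {top_female} Generation")
```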

The sharp-sighted amongst you will have noticed two problems with this. First, some names are stubbornly popular (I’m talking about you, Michael and Mary) and span multiple generations. Second, this is a very US-centric approach. Maybe we need to mix it up globally. For instance, if we tap into the naming zeitgeist of South Korea, that would make the current generation the Seo-yeon and Min-jun Generation.

Of course, all this could be needless worrying. Perhaps those that affixed the Generation Z label knew something we didn’t.

Branding in the Post-Truth Age

If 2016 was nothing else, it was a watershed year for the concept of branding. In the previous 12 months, we saw a decoupling of the two elements we have always believed make up brands. As fellow Spinner Cory Treffiletti said recently:

“You have to satisfy the emotional quotient as well as the logical quotient for your brand. If not, then your brand isn’t balanced, and is likely to fall flat on its face.”

But another MediaPost article highlighted an interesting trend in branding:

“Brands will strive to be ‘meticulously un-designed’ in 2017, according to WPP brand agency Brand Union.”

This, I believe, speaks to where brands are going. And depending on which side of the agency desk you happen to be on, this could either be good news or downright disheartening.

Let’s start with the logical side of branding. In their book Absolute Value, Itamar Simonson and Emanuel Rosen sounded the death knell for brands as a proxy for consumer information. Their premise, which I agree with, is that in a market that is increasingly moving towards perfect information, brands have lost their position of trust. We would rather rely on information that comes from non-marketing sources.

But brands have been aspiring to transcend their logical side for at least five decades now. This is the emotional side of branding that Treffiletti speaks of. And here I have to disagree with Simonson and Rosen. This form of branding appears to be very much alive and well, thank you. In fact, in the past year, it has upped the game considerably.

Brands, at their most potent, embed themselves in our belief systems. It is here, close to our emotional hearts, that the Promised Land for brands lies. Read Montague’s famous Coke neuro-imaging experiment showed that for Coke drinkers, the brand became part of who they are. Research I was involved in showed that favored brands are positively responded to in a split second, far faster than the rational brain can act. We are hardwired to believe in brands, and the more loved the brand, the stronger the reaction. So let’s look at beliefs for a moment.

Not all beliefs are created equal. Our beliefs have an emotional valence – some beliefs are defended more strongly than others. There is a hierarchy of belief defense. At the highest level are our core beliefs: how we feel about things like politics and religion. Brands are trying to intrude on this core belief space. There has been no better example of this than the brand of Donald Trump.

Beliefs are funny things. From an evolutionary perspective, they’re valuable. They’re mental shortcuts that guide our actions without requiring us to think. They are a type of emotional auto-pilot. But they can also be quite dangerous for the same reason. We defend our beliefs against skeptics – and we defend our core beliefs most vigorously. Reason has nothing to do with it. It is this type of defense system that brands would love to build around themselves.

We like to believe our beliefs are unique to us – but in actual fact, beliefs also materialize out of our social connections. If enough people in our social network believe something is true, so will we. We will even create false memories and narratives to support the fiction. The evolutionary logic is quite simple. Tribes have better odds for survival than individuals, and our tribe will be more successful if we all think the same way about certain things. Beliefs create tribal cohesion.

So, the question is – how does a brand become a belief? It’s this question that possibly points the way in which brands will evolve in the Post-Truth future.

Up to now, brands have always been unilaterally “manufactured” – carefully crafted by agencies as a distillation of marketing messages and delivered to an audience. But now, brands are multilaterally “emergent” – formed through a network of socially connected interactions. All brands are now trying to ride the amplified waves of social media. This means they have to be “meme-worthy” – which really means they have to be both note- and share-worthy. To become more amplifiable, brands will become more “jagged,” trying to act as catalysts for going viral. Branding messages will naturally evolve towards outlier extremes in their quest to be noticed and interacted with. Brands are aspiring to become “brain-worms” – wait, that’s not quite right – brands are becoming “belief-worms,” slipping past the rational brain if at all possible to lodge themselves directly in our belief systems. Brands want to be emotional shorthand notations that resonate with our most deeply held core beliefs. We have constructed a narrative of who we are, and brands that fit that narrative are adopted and amplified.

It’s this version of branding that seems to be where we’re headed – a socially infectious virus that creates its own version of the truth and builds a bulwark of belief to defend itself. Increasingly, branding has nothing to do with rational thought or a quest for absolute value.