Advertising Meets Its Slippery Slope

We’ve now reached the crux of the matter when it comes to the ad biz.

For a couple of centuries now, we’ve been refining the process of advertising. The goal has always been to get people to buy stuff. But now a perfect storm of forces is converging that requires some deep navel-gazing on the part of us insiders.

It used to be that to get people to buy, all we had to do was inform. Pent-up consumer demand created by expanding markets and new product introductions would take care of the rest. We just had to connect the better mousetraps with the world, which would then duly beat a path to the respective door. Advertising equaled awareness.

But sometime in the waning days of the consumer orgy that followed World War Two, we changed our mandate. Not content with simply informing, we decided to become influencers. We slipped under the surface of the brain, moving from providing information for rational consideration to priming subconscious needs. We started messing with the wiring of our market’s emotional motivations.  We became persuaders.

Persuasion is like a mental iceberg – 90% of the bulk lies below the surface. Rationalization is typically the hastily added layer of ad hoc logic that comes after the decision is already made. This is true to varying degrees for almost any consumer category you can think of, including – unfortunately – our political choices.

This is why, a few columns ago, I said Facebook’s current model is unsustainable. It is based on advertising, and I think advertising may have become unsustainable. The truth is, advertisers have gotten so good at persuading us to do things that we are beginning to revolt. It’s getting just too creepy.

To understand how we got here, let’s break down persuasion. It requires the persuader to shift the beliefs of the persuadee. The bigger the shift required, the tougher the job of persuasion.  We tend to build irrational (aka emotional) bulwarks around our beliefs to preserve them. For this reason, it’s tremendously beneficial to the persuader to understand the belief structure of their target. If they can do this, they can focus on those whose belief structure is most conducive to the shift required.

When it comes to advertisers, the needle on our creative powers of persuasion hasn’t really moved that much in the last half century. There were very persuasive ads created in the 1960’s and there are still great ads being created. The disruption that has moved our industry to the brink of the slippery slope has all happened on the targeting end.

The world we used to live in was a bunch of walled and mostly unconnected physical gardens. Within each, we would have relevant beliefs but they would remain essentially private. You could probably predict with reasonable accuracy the religious beliefs of the members of a local church. But that wouldn’t help you if you were wondering whether the congregation leaned towards Ford or Chevy.  Our beliefs lived inside us, typically unspoken and unmonitored.

That all changed when we created digital mirrors of ourselves through Facebook, Twitter, Google and all the other usual suspects. John Battelle, author of The Search,  once called Google the Database of Intentions. It is certainly that. But our intent also provides an insight into our beliefs. And when it comes to Facebook, we literally map out our entire previously private belief structure for the world to see. That is why Big Data is so potentially invasive. We are opening ourselves up to subconscious manipulation of our beliefs by anyone with the right budget. We are kidding ourselves if we believe ourselves immune to the potential abuse that comes with that. Like I said, 90% of our beliefs are submerged in our subconscious.

We are just beginning to realize how effective the new tools of persuasion are. And as we do so, we are beginning to feel that this is all very unfair. No one likes being manipulated, even if they have willingly laid the groundwork for that manipulation. Our sense of retroactive justice kicks in. We post-rationalize and point fingers. We blame Facebook, or the government, or some hackers in Russia. But these are all just participants in a new ecosystem that we have helped build. The problem is not the players. The problem is the system.

It’s taken a long time, but advertising might just have gotten to the point where it works too well.


Short Sightedness, Sharks and Mental Myopia

2017 was an average year for shark attacks.

And this just in…

By the year 2050 half of the World will be Near Sighted.

What could these two headlines possibly have in common? Well, sit back – I’ll tell you.

First, let’s look at why 2017 was a decidedly non-eventful year – at least when it came to interactions between Selachimorpha (sharks) and Homo (us). Nothing unusual happened. That’s it. There was no sudden spike in Jaws-like incidents. Sharks didn’t suddenly disappear from the world’s oceans. Everything was just – average. Was that the only way 2017 was uneventful? No. There were others. But we didn’t notice, because we were focused on the ways the world seemed to be going to hell in a handbasket. If we look at 2017 as a bell curve, we were focused on the outliers, not the middle.

There’s no shame in that. That’s what we do. The usual doesn’t make the nightly news. It doesn’t even make our Facebook feed. But here’s the thing: we live most of our lives in the middle of the curve, not in the outlier extremes. The things that are most relevant to our lives fall squarely into the usual. But all the communication channels that have been built to channel information to us are focused on the unusual. And that’s because we insist not on being informed, but on being amused.

In 1985, Neil Postman wrote the book Amusing Ourselves to Death. In it, he charts how the introduction of electronic media – especially television – hastened our decline into a dystopian existence that shared more than a few parallels with Aldous Huxley’s Brave New World. His warning was pointed, to say the least: “There are two ways by which the spirit of a culture may be shrivelled,” Postman says. “In the first—the Orwellian—culture becomes a prison. In the second—the Huxleyan—culture becomes a burlesque.” It’s probably worth reminding ourselves what burlesque means: “a literary or dramatic work that seeks to ridicule by means of grotesque exaggeration or comic imitation.” If the transformation of our culture into burlesque seemed apparent in the 80’s, you’d pretty much have to say it’s a fait accompli 35 years later. Grotesque exaggeration is the new normal – not to mention the new president.

But this steering of our numbed senses towards the extremes has some consequences. As the world becomes more extreme, it requires more extreme events to catch our notice. We are spending more and more of our media consumption time amongst the outliers. And that brings up the second problem.

Extremes – by their nature – tend to be ideologically polarized as well. If we’re going to consider extremes that carry a politically charged message, we stick to the extremes that are well synced with our worldview. In cognitive terms, these ideas are “fluent” – they’re easier to process. The more polarized and extreme a message is, the more important it is that it be fluent for us. We are also more likely to filter out non-fluent messages – messages we don’t happen to agree with.

The third problem is that we are becoming short-sighted (see, I told you I’d get there eventually). So not only do we look for extremes, we are increasingly seeking out the trivial. We do so because being informed is increasingly scaring the bejeezus out of us. We don’t look too deep, nor do we look too far into the future – because the future is scary. There is the collapse of our climate, World War III with North Korea, four more years of Trump… this stuff is terrifying. Increasingly we spend our cognitive resources looking for things that are amusing and immediate. The information we seek has to provide immediate gratification. Yes, we are becoming physically short-sighted because we stare at screens too much, but we’re becoming mentally myopic as well.

If all this is disturbing, don’t worry. Just grab a Soma and enjoy a Feelie.

Is Google Slipping, Or Is It Just Our Imagination?

Recently, I’ve noticed a few articles speculating about whether Google might be slipping:

Last month, the American Customer Satisfaction Index notified us that our confidence in search is on the decline. Google’s score dropped 2% to 82. The culprit was the amount of advertising found on the search results page. To be fair, both Google and search in general have had lower scores. Back in 2015, Google scored a 77, its lowest score ever.

This erosion of customer satisfaction may be leading to a drop in advertising ROI. According to a recent report from Analytic Partners, the return on investment from paid search dropped 27% from 2010 to 2016. Search wasn’t alone. All digital ROI seems to be in decline. Analytic Partners’ VP of Marketing, Joe LaSala, predicts that ROI from digital will continue to decline until it converges with ROI from traditional media.

In April of this year, Forbes ran an article asking the question: “Is Google’s Search Quality Starting to Decline?” Contributors to this decline, according to the article, included the introduction of rich snippets and featured news, the inclusion of popularity as a ranking factor, and ongoing black-hat SEO manipulation.

But the biggest factor in the drop of Google’s perceived quality was actually in the perception itself. As the Forbes article’s author, Jayson DeMers, stated:

It’s important to realize just how sophisticated Google is, and how far it’s come from its early stages, as well as the impossibility of having a “perfect” search platform. Humans are flawed creatures, and our actions are what are dictating the shape of search.

Google is almost 20 years old. The domain Google.com was registered on September 15, 1997. Given that 20 years is an eternity in internet years, it’s actually amazing that it’s stood up as well as it has for the past two decades. Whether Google’s naysayers care to admit it or not, that’s due to Google’s almost religious devotion to the quality of its search results. That devotion extends to advertising. The balance between user experience and monetization is one Google has always paid a lot of attention to.

But it’s not the presence of ads that has led to this perceived decline of quality. It’s a change in our expectations of what a search experience should be. I would argue that for any given search, using objective measures of result relevance, the results Google shows today are far more relevant than the results it showed in 2008, the year it got its highest customer satisfaction score (86%). Since then, Google has made great strides in deciphering user intent and providing a results page that’s a good match for that intent. Sometimes it will get it wrong, but when it gets it right, it puts together a page that’s a huge improvement over the vanilla, one-size-fits-all results page of 2008.

The biggest thing that’s changed in the past 10 years is the context from which we’re launching those searches. In 2008, it was almost always the desktop. But today, chances are we’re searching from a mobile device – or our car – or our home through Amazon Echo. This has changed our expectations of search. We are task focused, rather than “browsing” for information. This creates an entirely different mental framework within which we receive the results. We apply a new yardstick of acceptable relevance. Here, we’re not looking for a list of 20 possible answers – we’re looking for one answer. And it had better be the right one. Context based search must be hyper-relevant.

Compounding this trend is the increasing number of circumstances where search is going “under the hood” – something I’ve been forecasting for a long time now. For example, if you use Siri to launch a search through your CarPlay connected device when you’re driving, the results are actually coming from Bing but they’re stripped of the context of the Bing search results page. Here, the presentation of search results is just one step in a multi-step task flow. It’s important that the result that is on top is the one you’re probably looking for.

Unfortunately for Google – and the other search providers – this expectation stays in place even when the context shifts. When we launch a search from our desktop, we are increasingly intolerant of results that are even a little off base from our intent. Ads become the most easily identified culprit. A results set that would have seemed almost frighteningly prescient even a few years ago now seems sub par. Google has come a long way in the past 20 years but it’s still losing ground to our expectations.


The Medium is the Message, Mr. President

Every day that Barack Obama was in the White House, he read 10 letters. Why letters? Because form matters. There’s still something about a letter. It’s so intimate. It uses a tactile medium. Emotions seem to flow more easily through cursive loops and the sound of pen on paper. Letters balance between raw and reflective. As such, they may be an unusually honest glimpse into the soul of the writer. Obama seemed to get that. An entire team of hundreds of people at the White House reviewed 10,000 letters a day and chose the 10 that made it to Obama, but the intent was to give an unfiltered snapshot of the nation at any given time. It was a mosaic of personal stories that – together – created a much bigger narrative.

Donald Trump doesn’t read letters. He doesn’t read much of anything. The daily presidential briefing has been dumbed down to media more fitting of the President’s 140-character attention span. Trump likes to be briefed with pictures and videos. His information medium of choice? Cable TV. He has turned Twitter into his official policy platform.

Today, technology has exponentially multiplied the number of communication media available to us. And in that multiplicity, Marshall McLuhan’s 50-year-old trope about the medium being the message seems truer than ever. The channels we choose – whether we’re on the sending or receiving end – carry their own inherent message. They say who we are, what we value, how we think. They intertwine with the message, determining how it will be interpreted.

I’m sad that letter writing is a dying art, but I’m also contributing to its demise. It’s been years since I’ve written a letter. I do write this column, which is another medium. But even here I’m mislabeling it. Technically this is a blog post. A column is a concept embedded in the medium of print – with its accompanying physical restriction of column inches. But I like to call it a column, because in my mind that carries its own message. A column comes with an implicit promise between you – the readers – and myself, the author. Columns are meant to be regularly recurring statements of opinion. I have to respect the fact that I remain accountable for this Tuesday slot that MediaPost has graciously given me. Week after week, I try to present something that I hope you’ll find interesting and useful enough to keep reading. I feel I owe that to you. To me, a “post” feels more ethereal – with less of an ongoing commitment between author and reader. It’s more akin to drive-by writing.

So that brings me to one of the most interesting things about letters and President Obama’s respect for them. They are meant to be a thoughtful medium between two people. The thoughts captured within are important enough to the writer that they’re put in print but they are intended just for the recipient. They are one of the most effective media ever created to ask for empathetic understanding from one person in particular. And that’s how Obama’s Office of Presidential Correspondence treated them. Each letter represented a person who felt strongly enough about something that they wanted to share it with the President personally. Obama used to read his ten letters at the end of the day, when he had time to digest and reflect. He often made notations in the margins asking pointed questions of his staff or requesting more investigation into the circumstances chronicled in a letter. He chose to set aside a good portion of each day to read letters because he believed in the message carried by the medium: Individuals – no matter who they are – deserve to be heard.

Our Brain on Reviews

An interesting new study about how our brain mathematically handles online reviews was just published, and I want to talk about it today. But before I get to that, I want to talk about foraging a bit.

The story of how science discovered our foraging behaviors serves as a mini lesson in how humans tick. The economists of the 1940’s and 50’s built the world of micro-economics on the foundation that humans were perfectly rational – we were homo economicus. When making personal economic choices in a world of limited resources, we maximized utility. The economists of the time assumed this was a uniquely human property, bequeathed to us by virtue of the reasoning power of our superior brains.

In the 60’s, behavior ecologists knocked our egos down a peg or two. It wasn’t just humans that could do this. Foxes could do it. Starlings could do it. Pretty much any species had the same ability to seemingly make optimal choices when faced with scarcity. It was how animals kept from starving to death. This was the birth of foraging theory. This wasn’t some exclusively human behavior directed from the heights of rationality downwards. It was an evolved behavior, built from the ground up. It’s just that humans had learned how to apply it to our abstract notion of economic utility.

Three decades later, two researchers at Xerox’s Palo Alto Research Center found another twist. Not only had our ability to forage been evolved all the way through our extensive family tree, but we seemed to borrow this strategy and apply it to entirely new situations. Peter Pirolli and Stuart Card found that when humans navigate content in online environments, the exact same patterns could be found. We foraged for information. Those same calculations determined whether we would stay in an information “patch” or move on to more promising territory.

This seemed to indicate three surprising discoveries about our behavior:

  • Much of what we think is rational behavior is actually driven by instincts that have evolved over millions of years.
  • We borrow strategies from one context and apply them in another. We use the same basic instincts to find the FAQ section of a website that we used to find sustenance on the savannah.
  • Our brains seem to use Bayesian logic to continuously calculate and update a model of the world. We rely on this model to survive in our environment, whatever and wherever that environment might be.
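That third point can be made concrete with a toy calculation. The sketch below is my own illustration (not from any study cited here) of the Bayesian logic described above: a belief held as a probability, nudged toward or away from certainty as each new piece of evidence arrives.

```python
# A minimal sketch of Bayesian belief updating. The scenario and numbers
# are hypothetical, chosen only to show the mechanism.

def bayes_update(prior: float, likelihood: float, likelihood_if_false: float) -> float:
    """Return P(belief | evidence), via Bayes' rule, given P(belief)
    and the probability of seeing this evidence whether the belief
    is true (likelihood) or false (likelihood_if_false)."""
    numerator = likelihood * prior
    evidence = numerator + likelihood_if_false * (1.0 - prior)
    return numerator / evidence

# Start fairly unsure that a "patch" (a berry bush, a website) is worthwhile...
belief = 0.5
# ...then fold in three pieces of promising evidence, one at a time.
for _ in range(3):
    belief = bayes_update(belief, likelihood=0.8, likelihood_if_false=0.3)
print(round(belief, 3))  # belief climbs well above the starting 0.5
```

Each update reuses the previous posterior as the new prior, which is exactly the "continuously calculate and update a model of the world" behavior the bullet describes.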

So that brings us to the study I mentioned at the beginning of this column. If we take the above into consideration, it should come as no surprise that our brain uses similar evolutionary strategies to process things like online reviews. But the way it does it is fascinating.

The amazing thing about the brain is how it seamlessly integrates and subconsciously synthesizes information and activity from different regions. For example, in foraging, the brain integrates information from the regions responsible for wayfinding – knowing our place in the world – with signals from the dorsal anterior cingulate cortex – an area responsible for reward monitoring and executive control. Essentially, the brain is constantly updating an algorithm about whether the effort required to travel to a new “patch” will be balanced by the reward we’ll find when we get there. You don’t consciously marshal the cognitive resources required to do this. The brain does it automatically. What’s more – the brain uses many of the same resources and algorithm whether we’re considering going to McDonald’s for a large order of fries or deciding what online destination would be the best bet for researching our upcoming trip to Portugal.
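The stay-or-go calculation described above can be sketched in a few lines. This is a deliberately toy version, loosely modelled on foraging theory's marginal value theorem rather than on anything measured in the brain: stay in a patch while its instantaneous payoff still beats the average rate available elsewhere, and move on once depletion drags it below that threshold. All the numbers are hypothetical.

```python
# A toy patch-leaving rule: stay while the current patch's diminishing
# payoff still exceeds the environment's average payoff rate.

def moments_to_stay(initial_gain: float, decay: float, avg_rate: float) -> int:
    """Count time steps spent in a patch whose payoff shrinks by
    `decay` each visit, before it drops below `avg_rate`."""
    t, gain = 0, initial_gain
    while gain > avg_rate:   # patch still beats the environment's average
        t += 1
        gain *= decay        # each visit depletes the patch
    return t

# A rich patch (or promising website) holds our attention longer than a poor one.
rich = moments_to_stay(initial_gain=10.0, decay=0.8, avg_rate=1.0)
poor = moments_to_stay(initial_gain=3.0, decay=0.8, avg_rate=1.0)
print(rich, poor)
```

The point of the sketch is the shape of the rule, not the numbers: the same comparison of local reward against global opportunity works whether the "patch" is a bag of fries or a travel-research website.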

In evaluating online reviews, we have a different challenge: how reliable are the reviews? The context may be new – our ancestors didn’t have TripAdvisor or AirBNB ratings for choosing the right cave to sleep in tonight – but the problem isn’t. What criteria should we use when we decide to integrate social information into our decision making process? If Thorlak the bear hunter tells me there’s a great cave a half-day’s march to the south, should I trust him? Experience has taught us a few handy rules of thumb when evaluating sources of social information: reliability of the source and the consensus of crowds. Has Thorlak ever lied to us before? Do others in the tribe agree with him? These are hardwired social heuristics. We apply them instantly and instinctively to new sources of information that come from our social network. We’ve been doing it for thousands of years. So it should come as no surprise that we borrow these strategies when dealing with online reviews.

In a neuro-scanning study from University College London, researchers found that reliability plays a significant role in how our brains treat social information. Once again, a well-evolved capability of the brain is recruited to help us in a new situation. The dorsomedial prefrontal cortex is the area of the brain that keeps track of our social connections. This “social monitoring” ability worked in concert with the ventromedial prefrontal cortex, an area that processes value estimates.

The researchers found that this part of our brain works like a Bayesian computer when considering incoming information. First we establish a “prior” that represents a model of what we believe to be true. Then we subject this prior to possible statistical updating based on new information – in this case, online reviews. If our confidence is high in this “prior” and the incoming information is weak, we tend to stick with our initial belief. But if our confidence is low and the incoming information is strong – i.e. a lot of positive reviews – then the brain overrides the prior and establishes a new belief, based primarily on the new information.
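One simple way to model that confidence-weighted trade-off is a precision-weighted average. The sketch below is my own illustration of the idea, not the study's actual model or notation; the hotel scenario and all the numbers are made up: a confident prior barely moves under a few reviews, while a weak prior is swamped by a flood of them.

```python
# A sketch of confidence-weighted belief updating: blend a prior estimate
# with review evidence, each weighted by how much we trust it.

def update_belief(prior_mean: float, prior_confidence: float,
                  review_mean: float, review_strength: float) -> float:
    """Weighted average of prior and evidence; higher confidence
    (or more/stronger reviews) pulls the result toward that side."""
    total = prior_confidence + review_strength
    return (prior_confidence * prior_mean + review_strength * review_mean) / total

# Confident prior (we "know" the hotel is mediocre, ~2.5 stars), a few 4.5-star reviews:
print(update_belief(2.5, prior_confidence=10, review_mean=4.5, review_strength=2))

# Weak prior, a flood of 4.5-star reviews:
print(update_belief(2.5, prior_confidence=1, review_mean=4.5, review_strength=20))
```

In the first case the estimate barely budges from 2.5; in the second it lands close to the reviews' 4.5 – the "override the prior" behavior the study describes.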

While this seems like common sense, the mechanisms at play are interesting. The brain effortlessly pattern matches new types of information and recruits the region that is most likely to have evolved to successfully interpret that information. In this case, the brain had decided that online reviews are most like information that comes from social sources. It combines the interpretation of this data with an algorithmic function that assigns value to the new information and calculates a new model – a new understanding of what we believe to be true. And it does all this “under the hood” – sitting just below the level of conscious thought.

Disruption in the Rear View Mirror

Oh, it’s so easy to be blasé. I always scan the MediaPost headlines each week to see if there’s anything to spin. I almost skipped right past a news post by Larissa Faw: “Zenith: Google Remains Top-Ranked Media Company By Ad Revenue.”

“Of course Google is the top-ranked media company,” I yawned, as I was just about to click on the next email in my inbox. Then it hit me. To quote Michael Bublé, “Holy Shitballs, Mom!”

Maybe that headline doesn’t seem extraordinary in the context of today, but I’ve been doing this stuff for almost 20 years now, and in that context – well, it’s huge! I remembered a column I wrote ages ago speculating that Google had barely scratched its potential. After a little digging, I found it. It was from October 2006, just over a decade ago. Google had just passed the $6 billion mark in annual revenue. Ironically, that seemed a bigger deal then than their current revenue of almost $80 billion seems today. In that column, I pushed to the extreme and speculated that Google could someday pass $200 billion in revenue. While we’re still only a third of the way there, the claim doesn’t seem nearly as ludicrous as it did back then.

But here’s the line that really made me realize how far we’ve come in the ten and a half years since I wrote that column: “Google and Facebook together accounted for 20% of global advertising expenditure across all media in 2016, up from 11% in 2012. They were also responsible for 64% of all the growth in global ad spend between 2012 and 2016.”

Two companies that didn’t exist 20 years ago now account for 20% of all global advertising expenditure. And the speed with which they’re gobbling up advertising budgets is accelerating. If you’re a dilettante student of disruption, as I am, those are pretty amazing numbers. In the day-to-day of MediaPost – and digital marketing in general – we tend to accept all this as normal. It’s like we’re surfing on top of a wave without realizing the wave is 300 freakin’ feet high. Sometimes you need to zoom out a little to realize how momentous the everyday is. And if you look at this on a scale of decades rather than days, you start to get a sense that the speed of change is massive.

To me, the most interesting thing is that both Google and Facebook have introduced a fundamentally new relationship between advertising and its audience. Google’s breakthrough is – of course – intent-based advertising. Facebook’s is based on socially mediated network effects. Both of these things required the overlay of digital connection. That – as they say – has made all the difference. And that is where the real disruption can be found. Our world has become a fundamentally different place.

Much as we remain focused on the world of advertising and marketing here in our little corner of the digital world, it behooves us to remember that advertising is simply a somewhat distorted reflection of the behaviors of the world in general. If things are being disrupted here, it is because things are being disrupted everywhere. As it regards us beings of flesh, bone and blood, that disruption has three distinct beachheads: the complicated relationship between our brains and the digital tools we have at our disposal, the way we connect with each other, and a dismantling of the restrictions of the physical world even as we build the scaffolding of a human-designed digital world. Any one of these has the potential to change our species forever. With all three bearing down on us, permanent change is a lead-pipe cinch.

Thirty years is a nanosecond in terms of human history. Even on the scale of my lifetime, it seems like yesterday. Reagan was president. We were terrorized by the Unabomber. News outlets were covering the Iran-Contra affair. U2 released The Joshua Tree. Platoon won the Best Picture Oscar. And if you wanted to advertise to a lot of people, you did so on a major TV network with the help of a Madison Avenue agency. Thirty years ago, none of what I’m talking about existed. Nothing. No Google. No Facebook. No internet – at least, not in a form any of us could appreciate.

As much as advertising has changed in the past 30 years, it has only done so because we – and the world we inhabit – have changed even more. And if that thought is a little scary, just think what the next 30 years might bring.

We’re Becoming Intellectually “Obese”

Humans are defined by scarcity. All our evolutionary adaptations tend to be built to ensure survival in harsh environments. This can sometimes backfire on us in times of abundance.

For example, humans are great at foraging. We have built-in algorithms that tell us which patches are most promising and when we should give up on the patch we’re in and move to another patch.

We’re also good at borrowing strategies that evolution designed for one purpose and applying them for another purpose. This is called exaptation. For example, we’ve exapted our food foraging strategies and applied them to searching for information in an online environment. We use these skills when we look at a website, conduct an online search or scan our email inbox. But as we forage for information – or food – we have to remember, this same strategy assumes scarcity, not abundance.

Take food for example. Nutritionally we have been hardwired by evolution to prefer high fat, high calorie foods. That’s because this wiring took place in an environment of scarcity, where you didn’t know where your next meal was coming from. High fat, high calorie and high salt foods were all “jackpots” if food was scarce. Eating these foods could mean the difference between life and death. So our brains evolved to send us a reward signal when we ate these foods. Subsequently, we naturally started to forage for these things.

This was all good when our home was the African savannah. Not so good when it’s Redondo Beach, there’s a fast food joint on every corner and the local Wal-Mart’s shelves are filled to overflowing with highly processed pre-made meals. We have “refined” food production to continually push our evolutionary buttons, gorging ourselves to the point of obesity. Foraging isn’t a problem here. Limiting ourselves is.

So, evolution has made humans good at foraging when things are scarce, but not so good at filtering in an environment of abundance. I suspect the same thing that happened with food is today happening with information.

Just like we are predisposed to look for food that is high in fats, salt and calories, we are drawn to information that:

  1. Leads to us having sex
  2. Leads to us having more than our neighbors
  3. Leads to us improving our position in the social hierarchy

All those things make sense in an evolutionary environment where there’s not enough to go around. But, in a society of abundance, they can cause big problems.

Just like food, for most of our history information was in short supply. We had to make decisions based on too little information, rather than too much. So most of our cognitive biases were developed to allow us to function in a setting where knowledge was in short supply and decisions had to be made quickly. In such an environment, these heuristic short cuts would usually end up working in our favor, giving us a higher probability of survival.

These evolutionary biases become dangerous as our information environment becomes more abundant. We weren’t built to rationally seek out and judiciously evaluate information. We were built to make decisions based on little or no knowledge. There is an override switch we can use if we wish, but it’s important to know that just like we’re inherently drawn to crappy food, we’re also subconsciously drawn to crappy information.

Whether or not you agree with the mainstream news sources, the fact is that there was a thoughtful editorial process intended to improve the quality of the information we were provided. Entire teams of people were employed to spend their days rationally thinking about gathering, presenting and validating the information that would be passed along to the public. In Nobel laureate Daniel Kahneman’s terminology, they were “thinking slow” about it. And because the transactional costs of getting that information to us were so high, there was a relatively strong signal-to-noise ratio.

That is no longer the case. Transactional costs have dropped to the point that it costs almost nothing to get information to us. This allows information providers to completely bypass any editorial loop and get it in front of us. Foraging for that information is not the problem. Filtering it is. As we forage through potential information “patches” – whether they be on Google, Facebook or Twitter – we tend to “think fast” – clicking on the links that are most tantalizing.

I would never have dreamed that having too much information could be a bad thing. But most of the cautionary columns I’ve written in the last few years seem to have the same root cause: we’re becoming intellectually “obese.” We’ve developed an insatiable appetite for fast, fried, sugar-frosted information.