Some Second Thoughts on Mindless Media

When I read Tom Goodwin’s Online Spin last week, I immediately jumped on his bandwagon. How could I not? He played the evolutionary psychology card and then trumped that by applying it to the consumption of media. This was right up my ideological alley.

Here’s a quick recap: Humans evolved to crave high-calorie foods because these were historically scarce. In the last century, however, processed food manufacturing has ensured that high-calorie foods are abundantly available. The result? We got fat. Really fat. Tom worries that the same thing is happening to our consumption of media. As traditional publishing channels break down, will we become a society of information snackers?

“We’re rewarding pieces that are most-clickable or most easily digested, and our news diet shifts from good-for-us to snackable.”

Goodwin also mourns the death of serendipitous discovery – which was traditionally brought to us by our loyalty to a channel and the editorial control exercised by that channel. If we were loyal to the New York Times, then we were introduced to content they thought we should see. But in the age of “filter bubbles” our content becomes increasingly homogenized based on algorithms, which are drawing an ever-narrowing circle bounded by our explicit requests and our implicit behavior patterns. We become further insulated from quality by mindless social media sharing – which tends to favor content pandering to the lowest common denominator.

But the more I thought about it, the more I wondered if this wasn’t a little paradoxical. Tom’s very thoughtful column, which hardly qualifies as intellectual fast food, didn’t come to us through traditional journalism. Tom, like myself, is not a professional journalist. And while MediaPost does provide some editorial curation, its purpose is to provide a fairly transparent connection between industry experts like Tom and other experts like you. Tom’s piece came to us through a much more transparent information marketplace – the very same marketplace that Tom worries is turning us into an audience of mindless media junkies. And I should add that Tom’s piece was shared through social circles over 200 times.

So where is the disconnect here? The problem is that when it comes to human behaviors, there are no universal truths. How we act in almost any given situation will eventually distribute itself across a bell curve. Let’s take obesity, for instance. If we talk trends, Tom is absolutely correct. The introduction of fast food in North America coincided with an explosion of obesity, which as a percentage of the US population rose from about 10% in the 1950s to almost 35% in 2013. But if we accept the premise that we all mindlessly crave calories, we should all be obese. Obesity rates should also keep climbing until they reach 100% of the population. Neither of those things is true. Obesity rates have plateaued in the last few years, and there are indications that they are starting to decline amongst children. Also, although fast food is now available around the world – obesity rates vary greatly. Japan has one of the highest concentrations of McDonald’s outlets per capita (25 per million) in the world but has an obesity rate of 3.2%, the lowest of all OECD countries. The US has a higher concentration of McDonald’s (45 per million) but has an obesity rate 10 times that of Japan. And my own country, Canada, almost matches the US McDonald for McDonald (41 per million) but has an obesity rate half that of the US (14.3%).

My point is not to debate whether we’re getting fatter. We are. But there’s more to it than just the prevalence of fast food. And these factors apply to our consumption of media as well. For example, there is a strong negative correlation between obesity levels and education. There is also a strong negative correlation between obesity and income. Cultural norms have a huge impact on the prevalence of obesity. There are no universal truths here. There are just a lot of nebulous factors at play. So, if we want to be honest when we draw behavioral comparisons, we have to be accepting of those factors.

Much as I believe evolution drives many of our behaviors, I also believe that more open markets are better than more restrictive ones. As the mentality of abundance takes hold, our behaviors take time to adjust. Yes, we do snack on crap. But we also have access to high quality choices we could never have dreamed of before. And the ratio of consumption between those two extremes will be different for all of us. Consider the explosion of TV programming that has happened over the last three decades. Yes, there is an over-abundance of mindless dreck, but there is also more quality programming than ever to choose from. The same is true of music and pretty much any other category where markets have opened up through technology.

The way to increase the quality of what we consume, whether it be food, information or entertainment, is not to limit the production and distribution of those consumables through more restrictive markets, but to improve education, access and create a culture of considered consumption. Some of us will choose crap. But some of us will choose the cream that rises to the top. The choice will be ours. The answer is not to take those choices away, but rather to create a culture that encourages wiser choices.

The Secret of Successful Marketing Lies in Split Seconds

The other day, I was having lunch in a deli. I was also watching the front door, which you had to push to get in. Almost everyone who came to the door pulled, even though there was a fairly big sign over the handle which said “Push.” The problem? The door had the wrong kind of handle. It was a pull handle, not a push. The door had been mounted backwards. In usability terms, the door handle presented a misleading affordance.

I suspect the door had been there for many years. I was at the deli for about 30 minutes. In that time, about 70% of the people (probably close to 50) pulled rather than pushed. Extrapolating this to the whole, that means over the years, thousands and thousands of people have had to try twice to enter this particular place of business. Yet, the only acknowledgement of this instance of customer pain was the sign that had been taped to the door – “Push” – and I suspect there was an implied “(You Idiot)” following that.

I suspect most marketing falls in the same category as that sign. It’s an attempt to fight the intuitive actions that customers take – those split-second actions that happen before our brain has a chance to kick in. And we have to counteract those split-second decisions because the path we have created for our customers was built without an understanding of those intuitive actions. When we realize that our path runs counter to our customers’ natural behaviors, do we rebuild the path? Does the deli owner pay a contractor to remount the door? No, we post a sign asking customers to push rather than pull. After all, all they have to do is think for a moment. It seems like a reasonable request.

But here’s the problem with that. You don’t want your customers to think. You want them to act. And you want them to act as quickly and naturally as possible. The battles of marketing are won in those split seconds before the brain kicks in.

Let me give you one example. A few years ago I did a study with Simon Fraser University in Canada. We wanted to know how the brain responded in those same split seconds to brands we like versus brands we have no particular affinity to. What we found was fascinating. In about 150 milliseconds (roughly a sixth of a second) our brain responds to a well-loved brand the same way we respond to a smiling face. This all happens before any rational part of the brain can kick in. This positive reaction sets the stage for a much different subsequent mental processing of the brand (which starts at about 450 milliseconds, or half a second). And the power of this alignment can be startling. As Dr. Read Montague discovered, it can literally alter your perception of the world.

If you can rebuild your path to purchase to align with your customer’s intuitive behaviors, you don’t need to put up “push” signs when they stray off course. You don’t have to make your customers think. Here’s why that is important. As long as we operate at the intuitive level, humans are a fairly predictable lot. Evolution has wired in a number of behaviors that are universal across the population. You would not be risking your vacation fund if you placed a bet that the majority of people would try to pull a door with a handle that suggested you should pull it, even if there was a sign that said “push.” As long as we operate on auto-pilot, we can plot a predicted behavioral course with a fair degree of confidence (assuming, of course, we’ve taken the time to understand those behaviors).

But the minute we start to think, all bets are off. The miracle of the human brain is that it has two loops of activity – one fast and one slow. The fast loop relies on instinct and evolved behavioral habits. It’s incredibly efficient but stubbornly rigid. The slow loop brings the full power of human rationality to bear on the problem. It’s what happens when we think. And once the prefrontal cortex kicks in, we are amazingly flexible, but we pay the price in efficiency. It takes time to think. It also brings a massive amount of variability into the equation. If we start thinking, behaviors become much more difficult to predict.

The longer you can keep your customers on the fast path, the closer you’ll be to a successful outcome. Plan that path carefully and remove any signs telling them to “push.”

The Messy Part of Marketing

Marketing is hard. It’s hard because marketing reflects real life. And real life is hard. But here’s the thing – it’s just going to get harder. It’s messy and squishy and filled with nasty little organic things like emotions and human beings.

For the past several weeks, I’ve been filing things away as possible topics for this column. For instance, I’ve got a pretty big file of contradicting research on what works in B2B marketing. Videos work. They don’t work. Referrals are the bomb. No, it’s content. Okay, maybe it’s both. Hmmm… pretty sure it’s not Facebook though.

The integration of marketing technology was another promising avenue. Companies are struggling with data. They’re drowning in data. They have no idea what to do with all the data that’s pouring in from smart watches and smart phones and smart bracelets and smart bangles and smart suppositories and – okay, maybe not suppositories, but that’s just because no one thought of it till I just mentioned it.

Then there’s the new Google tool that predicts the path to purchase. That sounds pretty cool. Marketers love things that predict things. That would make life easier. But life isn’t easy. So marketing isn’t easy. Marketing is all about trying to decipher the mangled mess of living just long enough to shoehorn in a message that maybe, just maybe, will catch the right person at the right time. And that mangled mess is just getting messier.

Personally, the thing that attracted me to marketing was its messiness. I love organic, gritty problems with no clear-cut solutions. Scientists call these ill-defined problems. And that’s why marketing is hard. It’s an ill-defined problem. It defies programmatic solutions. You can’t write an algorithm that will spit out perfect marketing. You can attack little slivers of marketing that lend themselves to clearer solutions, which is why you have the current explosion of ad-tech tools. But the challenge is trying to bring all these solutions together into some type of cohesive package that actually helps you relate to a living, breathing human.

One of the things that has always amazed me is how blissfully ignorant most marketers are about concepts that I think should be fundamental to understanding customer behaviors: things like bounded rationality, cognitive biases, decision theory and sense-making. Mention any of these things in a conference room full of marketers and watch eyes glaze over as fingers nervously thumb through the conference program, looking for any session that has “Top Ten” or “Surefire” in its title.

Take Information Foraging Theory, for instance. Anytime I speak about a topic that touches on how humans find information (which is almost always), I ask my audience of marketers if they’ve ever heard of I.F.T. Generally, not one hand goes up. Sometimes I think Jakob Nielsen and I are the only two people in the world who recognize I.F.T. for what it is: “the most important concept to emerge from Human-Computer Interaction research since 1993” (Jakob’s words). If you take the time to understand this one concept, I promise it will fundamentally and forever change how you look at web design, search marketing, creative and ad placement. Web marketers should be building a shrine to Peter Pirolli and Stuart Card. Their names should be on the tips of every marketer’s tongue. But I venture to guess that most of you reading this column had never heard of them until today.

None of these fundamental concepts about human behavior are easy to grasp. Like all great ideas, they are simple to state but difficult to understand. They cover a lot of territory – much of it ill defined. I’ve spent most of my professional life trying to spread awareness of things like Information Foraging Theory. Can I always predict human behavior? Not by a long shot. But I hope that by taking the time to learn more about the classic theories of how we humans tick, I have also learned a little more about marketing. It’s not easy. It’s not perfect. It’s a lot like being human. But I’ve always believed that to be an effective marketer, you first need to understand humans.

Justine Sacco, Twitter and the End of Irony

Justine Sacco is in the news again. Not that she wants to be. She’d like nothing more than to fade from the spotlight. As she recently said in an interview, “Someday you’ll Google me and my LinkedIn will be the first thing that pops up.” But today, over 15 months after she launched the tweet that just won’t go away, she’s still the poster child for career ruination via social media. The recent revival of Justine’s story comes ahead of the release of a new book by Jon Ronson, “So You’ve Been Publicly Shamed.”

If you’ve never heard of Justine Sacco, I’ll recap quickly. Just before boarding an 11-hour flight to South Africa, in what can only be called a monumental meltdown of discretion, she tweeted this: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” This touched off a social media feeding frenzy looking for Sacco’s blood. The world waited for her to land (#HasJustineLandedYet became the top trending hashtag) and meet her righteous retribution.

Oh, did I mention that Justine was IAC’s Corporate Head of Communications? Yeah, I know. WTF – right?

But the point here is not whether or not Justine Sacco was wrong. I think even she’ll admit that it was a momentary brain-dead blurb of 64-character stupidity. The point here is whether or not Sacco was a racist, cold-hearted bitch. And to that, the answer is no. Justine meant the comment to be ironic – a satirical poke at white privilege and comfort. She never intended for it to be taken seriously. And that was where the wheels came off.

Satire has been around for a long time. The Greeks and Romans invented it, but it was the British who perfected it. The satirical essay became an art form in the hands of Alexander Pope, John Gay and the greatest of the satirists, Jonathan Swift. Through them, irony was honed into a razor-sharp scythe for social change. Swift’s A Modest Proposal is perhaps the greatest satirical piece ever written. In it, he proposed a solution for the starving beggars of Ireland – they should sell their children, of which there was an abundant supply, to the upper classes as a food source.

Now, did the pamphlet-reading public of 1729 England call for Swift’s head? Did they think he was serious when he wrote:

“A young healthy child well nursed, is, at a year old, a most delicious nourishing and wholesome food, whether stewed, roasted, baked, or boiled; and I make no doubt that it will equally serve in a fricassee, or a ragout.”

Well, perhaps a few missed the irony, but for the vast majority of Swift’s audience, the pamphlet helped make his reputation, rather than ruin it. There was no “HasSwiftreturnedfromLilliputYet?” trend on Twitter. People got it.

There is no way Sacco’s work should be compared to Swift’s in terms of literary merit, but there are some other fundamental differences we should pay attention to.

First of all, Swift was known as a satirist. Satire was an established literary form in the Age of Enlightenment. The context was in place for the audience. They were able to manage the flip of perspective required to understand the irony. But before December 20, 2013, we had never heard of Justine Sacco. The tweet was stripped of any context. There was nothing to tell us that she wasn’t being serious. Twitter fragments our view of the world into tiny missives that float unconnected and unsupported.  Twitter, by its very nature, forces us to take its messages out of context. This is not the place to hope for a nuanced understanding.

Also, Sacco’s entire tweet totaled 64 characters. Swift’s essay comes in at 3405 words, or 19,373 characters. That’s about 300 times the literary volume of Sacco’s tweet. Swift had ample opportunity to expound on his irony and make sure readers got his point.  Even Swift’s title, at a hefty 169 characters, couldn’t have squeezed into the limits of a tweet.  Tweets beg to be taken at face value, because there’s no room to aim for anything other than that.

And that brings us to the biggest difference here – the death of thoughtfulness. You can’t get irony or satire unless you’re thoughtful. You have to spend some time thinking about what you’ve read. To use Daniel Kahneman’s terminology, you have to use System 2, which specializes in slow thinking. Sacco’s tweet takes about 2 seconds to read, from beginning to end. There is no time for thought there. But there is time for visceral reaction. That’s all System 1, and System 1 doesn’t understand irony.

At the average reading speed of 300 words a minute, you’d have to invest 11.3 minutes to get through Swift’s essay. That’s plenty of time for System 2 to digest what it’s read and to look for meaning beyond face value. You have to read it in a thoughtful manner. But it’s not only in our reading where we don’t have to be thoughtful. We can also abandon thoughtfulness in our response. We can retweet in a matter of seconds and add our own invectives. This starts a chain reaction of indignation that ignites a social media brush fire. Careful consideration is not part of the equation.

Sacco’s sin wasn’t that she was being racist. Her sin was trying to be ironic in a medium that couldn’t support it. By her own admission, she had been experimenting with Twitter to see if edgy tweets got retweeted more often. The answer, as it turned out, was yes, but the experiment damned near killed her. As a communication expert, she should have known better. Justine Sacco painfully discovered that in the split second sound-bite world of social media, thoughtful reading is extinct.  And with it, irony and satire have died as well.

Mourning Becomes Electric

Last Friday was a sad day. A very dear and lifelong friend of mine, my Uncle Al, passed away. And so I did what I’ve done before on these occasions. I expressed my feelings by writing about it. The post went live on my blog around 10:30 in the morning. By mid-afternoon, it had been shared and posted through Facebook, Twitter and many other online channels. Many were kind enough to send comments. The family, in the midst of their grief, forwarded my post to their family and friends. Soon, there was an extended network of mourning that sought to heal each other, all through channels that didn’t exist just a few years ago. Mourning had moved online.

As you probably know, I’m fascinated by how we express our innate human needs through digital technologies. And death, together with birth, is the most universal of human experiences. It was inevitable that we would use online channels to grieve. So I, as I always do, asked the question – why?

First of all – why do we mourn? Well, we mourn because we are social animals. We are probably the most social of animals. So we grieve to a corresponding degree. We miss the departed terribly. It is natural to try to fill the hole a death tears inside of us by reaching out to others who may share the same grief. James R. Averill believed we communally mourn because it cements the social bonds that make it more likely that we will survive as a species. When it comes to dealing with death, misery loves company.

Secondly, why do we grieve online? Well, here, I think it has something to do with Granovetter’s weak ties. Death is one of those life events where we reach beyond the strong ties that define our day-to-day social existence. Certainly we seek comfort from those closest to us, but death also triggers the formation of a virtual community – defined and united by their grieving for the one who has passed away. Our digital networks allow us to eliminate the six degrees of separation in one fell swoop. We can share our grief almost instantaneously and simultaneously with family, friends, acquaintances and even people we have never met.

There are two other aspects of grief that I believe lend themselves well to online channels: the need to chronicle and the comfort of emotional distance.

Part of the healing process is sharing memories of the departed loved one. And, for those like myself, just writing about our feelings helps overcome the pain. Online provides a perfect platform for chronicling. We can share our own thoughts and, in the expressing of them, start the healing process.

The comfort of physical distance seems a contradictory idea, but almost everyone I know who has gone through a deep loss has one common dread – dealing with a never-ending stream of condolences over the coming weeks and months, triggered by each new physical encounter.

When you’ve been in the middle of the storm, you are typically a few days ahead of everyone else in dealing with your grief. Your mind has been occupied with nothing else as you have sat vigil by the hospital bed. While the condolences are given with the best of intentions, you feel compelled to give a response. The problem is, each new expression of grief forces you to replay your loop of very painful memories. The amplitude of this pain increases when it’s a face-to-face encounter. Condolences that reach you through a more detached channel, such as online, can be dealt with at your discretion. You can wait until you marshal the emotional reserves necessary to respond. You can also respond to several people at a time. How many times have you heard this from a grieving loved one: “I just wish I could record my message and play it whenever I meet someone who wants to tell me how sorry they are for my loss”? It may seem callous, but no one wants to relive that pain over and over again. And let’s face it – almost no one knows the right things to say at a moment like this.

By the end of last Friday, my online social connections had helped me ease a very deep pain. I hope I was able to return the favor for others that were dealing with their own grief. There are many things about technology that I treat with suspicion, but in this case, turning online seemed like the most natural thing in the world.

How Activation Works in an Absolute Value Market

As I covered last week, if I mention a brand to you – like Nike, for instance – your brain immediately pulls back your own interpretation of the brand. What has happened, in a split second, is that the activation of that one node – let’s call it the Nike node – triggers the activation of several related nodes in your brain, which is quickly assembled into a representation of the brand Nike. This is called Spreading Activation.
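Spreading activation lends itself to a simple computational picture: a prime injects energy into one node, and that energy flows outward through weighted associations, decaying with each hop. Here’s a minimal Python sketch of that idea. To be clear, the node names, weights, decay rate and threshold are all illustrative assumptions of mine, not taken from any actual study of the Nike brand:

```python
from collections import deque

def spread_activation(graph, start, decay=0.5, threshold=0.1):
    """Breadth-first spread: each hop multiplies the source node's
    activation by the edge weight and a global decay factor,
    stopping once activation falls below the threshold."""
    activation = {start: 1.0}
    frontier = deque([start])
    while frontier:
        node = frontier.popleft()
        for neighbor, weight in graph.get(node, []):
            new_act = activation[node] * weight * decay
            if new_act > activation.get(neighbor, 0.0) and new_act > threshold:
                activation[neighbor] = new_act
                frontier.append(neighbor)
    return activation

# Hypothetical association network for the "nike" prime.
brand_graph = {
    "nike": [("swoosh", 0.9), ("running", 0.8), ("just_do_it", 0.7)],
    "running": [("fitness", 0.6)],
}

print(spread_activation(brand_graph, "nike"))
```

For this toy graph, priming “nike” activates “swoosh,” “running” and “just_do_it” directly, and “fitness” at the second hop – a rough analogue of how one cue assembles a whole brand representation in a split second.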

This activation is all internal. It’s where most of the efforts of advertising have been focused over the past several decades. Advertising’s job has been to build a positive network of associations so when that prime happens, you have a positive feeling towards the brand. Advertising has been focused on winning territory in this mental landscape.

Up to now, the boundaries of our own rationality have restricted us to this internal landscape when making consumer decisions. Access to reliable and objective information about possible purchases was limited. It required more effort on our part than we were willing to expend. So, for the vast majority of purchases, these internal representations were enough for us. They acted as a proxy for information that lay beyond our grasp.

But the world has changed. For almost any purchase category you can think of, there exists reliable, objective information that is easy to access and filter. We are no longer restricted to internal brand activations (relative values based on our own past experiences and beliefs). Now, with a few quick searches, we can access objective information, often based on the experiences of others. In their book of the same name, Itamar Simonson and Emanuel Rosen call these sources “Absolute Value.” For more and more purchases, we turn to external sources because we can. The effort invested is more than compensated for by the value returned. In the process, the value of traditional branding is being eroded. This is truer for some product categories than others. The higher the risk or the level of interest, the more the prospect will engage in an external activation. But across all product categories, there has been a significant shift from the internal to the external.

What this means for advertising is that we have to shift our focus from internal spreading activations to external spreading activations. Now, when we retrieve an internal representation of a product or brand, it typically acts as a starting point, not the end point. That starting point is then modified or discarded completely, depending on the external information we access. The first activated node is our own initial concept of the product, but the subsequent nodes are spread throughout the digitized information landscape.

In an internal spreading activation, the nodes activated and the connections between those nodes are all handled at a subconscious level. It’s beyond our control. But an external spreading activation is a different beast. It’s a deliberate information search conducted by the prospect. That means that the nodes accessed and the connections between those nodes become critically important. Advertisers have to understand what those external activation maps look like. They have to be intimately aware of the information nodes accessed and the connections used to get to those nodes. They also have to be familiar with the prospect’s information consumption preferences. At first glance, this seems to be an impossibly complex landscape to navigate. But in practice, we all tend to follow remarkably similar paths when establishing our external activation networks. Search is often the first connector we use. The nodes accessed and the information within those nodes follow predictable patterns for most product categories.

For the advertiser, it comes down to a question of where to most profitably invest your efforts. Traditional advertising was built on the foundation of controlling the internal activation. This was the psychology behind classic treatises such as Ries and Trout’s “Positioning: The Battle for Your Mind.” And, in most cases, that battle was won by whoever could assemble the best collection of smoke and mirrors. Advertising messaging had very little to do with facts and everything to do with persuasion.

But as Simonson and Rosen point out, the relative position of a brand in a prospect’s mind is becoming less and less relevant to the eventual purchase decision. Many purchases are now determined by what happens in the external activation. Factual, reliable information and easy access to that information becomes critical. Smoke and mirrors are relegated to advertising “noise” in this scenario. The marketer with a deep understanding of how the prospect searches for and determines what the “truth” is about a potential product will be the one who wins. And traditional marketing is becoming less and less important to that prospect.


The Spreading Activation Model of Marketing

“Beatle.”

I have just primed you. Before you even finished reading the word above, you had things popping into your mind. Perhaps it was a mental image of an individual Beatle – either John, Paul, George or Ringo. Perhaps it was a snippet of song. Perhaps it was grainy black and white footage of the Ed Sullivan show appearance. But as the concept “Beatle” entered your working memory, your brain was hard at work retrieving what you believed were relevant concepts from your long-term memory. (By the way, if your reaction was “What’s a Beatle?” – substitute “Imagine Dragons.”)

That’s a working example of spreading activation. The activation of your working memory pulls associated concepts from your long-term memory to create a mental construct – your internal definition of whatever that first label was.

Now, an important second step may or may not happen. First, you have to decide how long you’re going to let the “Beatle” prime occupy your working memory. If it’s of fleeting interest, you’ve probably already wiped the slate clean, ready for the next thing that catches your interest. But if that prime is strong enough to establish a firm grip on your attention, then you have a choice to make. Is your internal representation complete, or do you require more information? If you require more information, then you have to turn to external sources for that information.

Believe it or not, this column is not intended as a 101 primer in Cognitive Psych. But the mental gymnastics I describe are important when we think about marketing, as we go through exactly the same process when we think about potential purchases. If we can understand that process better, we gain some valuable hints about how to more effectively market in an exceedingly fluid technological environment.

Much of advertising is built on the first half of the process – building associative brand concepts and triggering the prime that retrieves those concepts into working memory. Most of what isn’t working about advertising lies on this side of the cognitive map. We’ve been overly focused on the internal activation, at the expense of the external. But thanks to an explosion of available (and objective) information, we’re less reliant on our internal knowledge when making purchase decisions. Itamar Simonson and Emanuel Rosen explain in their book “Absolute Value”: “A person’s decision to buy is affected by a mix of three related sources: the individual’s prior preferences, beliefs, and experiences (P); other people and information services (O); and marketers (M).”

Simonson and Rosen say that with near perfect information available for the consumer, we now rely more on (O) and less on (P) and (M). Let’s leave (M) and (O) aside for the moment and focus on the (P) in this equation. (P) represents our internal spreading activation. After we’re primed, we retrieve a representation of the product or service we’re thinking of. At this point, we make an internal calculation. We balance how confident we are that our internal representation is adequate to make a purchase against how much effort we have to expend to gather further information. This calculation is largely made subconsciously. It follows Herbert Simon’s principle of Bounded Rationality. It also depends on how much risk is involved in the purchase we’re contemplating. If all the factors dictate that we’re reasonably confident in our internal representation and the risk we’re assuming, we’ll pull out our wallets and buy. If, however, we aren’t confident, we’ll start seeking more information. And that’s where (O) and (M) come in.
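That internal balancing act can be caricatured in a few lines of code. The formula, variable names and thresholds below are my own illustrative assumptions – Simonson and Rosen describe the trade-off qualitatively, not mathematically:

```python
def seek_external_info(confidence, risk, search_effort, effort_tolerance=1.0):
    """Caricature of the bounded-rationality calculation: rely on the
    internal representation (P) alone unless the information we might
    gain is worth more than the effort of searching for it."""
    expected_benefit = (1 - confidence) * risk  # what more information could save us
    return expected_benefit * effort_tolerance > search_effort

# Low-risk, familiar purchase: the internal activation is enough.
print(seek_external_info(confidence=0.9, risk=0.2, search_effort=0.5))  # False

# High-risk, unfamiliar purchase: we turn to external sources (O).
print(seek_external_info(confidence=0.3, risk=0.9, search_effort=0.5))  # True
```

The point of the sketch is simply that low confidence and high risk push the calculation toward an external search – exactly the conditions under which (O) starts to dominate (P) and (M).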

Simonson and Rosen lay out a purchase behavior continuum, from (O) Dependent to (O) Independent. It’s at the (O) Dependent end, where internal confidence in retrieved beliefs and experience is low, that buying behaviors are changing dramatically. And it’s there where conventional approaches to advertising are falling far short of the mark. They are still stuck in the mythical times of Mad Men, where marketers relied on a “Prime, Retrieve (Internal beliefs), Purchase” path. Today, it’s much more likely that the Prime and Retrieve stages will be followed by an external spreading activation. We’ll pick up that thread in next week’s Online Spin.

 

Consuming in Context

It was interesting watching my family watch the Oscars Sunday night. Given that I’m the father of two millennials, who have paired with their own respective millennials, you can bet that it was a multi-screen affair. But to be fair, they weren’t the only ones splitting their attention between the TV and various mobile devices. I was also screen hopping.

As Dave Morgan pointed out last week, media usage no longer equates to media opportunity. That’s because the nature of our engagement has changed significantly in the last decade. Unfortunately, our ad models have been unable to keep up. What is interesting is the way our consumption has evolved. Not surprisingly, technology is allowing our entertainment consumption to evolve back to its roots. We are watching our various content streams in much the same way that we interact with our world. We are consuming in context.

The old way of watching TV was very linear in nature. It was also divorced from context. We suspended engagement with our worlds so that we could focus on the flickering screen in front of us. This, of course, allowed advertisers to buy our attention in 30-second blocks. It was the classic bait-and-switch technique: get our attention with something we care about, then slip in something the advertiser cares about.

The reason we were willing to suspend engagement with the world was that there was nothing in that world that was relevant to our current task at hand. If we were watching Three’s Company, or the Moon Landing, or a streaker running behind David Niven at the 1974 Oscar ceremony, there was nothing in our everyday world that related to any of those TV events. Nothing competed for the spotlight of our attention. We had no choice but to keep watching the TV to see what happened next.

But imagine if a nude man suddenly appeared behind Matthew McConaughey at the 2015 Oscars. We would immediately want to know more about the context of what just happened. Who was it? Why did it happen? What’s the backstory? The difference is now, we have channels at our disposal to try to find answers to those questions. Our world now includes an extended digital nervous system that allows us to gain context for the things that happen on our TV screens. And because TV no longer has exclusive control of our attention, we switch to the channel that is the best bet to find the answers we seek.

That’s how humans operate. Our lives are a constant quest to fill gaps in our knowledge and, by doing so, make sense of the world around us. When we become aware of one of these gaps, we immediately scan our environment for cues to where we might find answers. Then our senses focus on the most promising cues. We forage for information to satiate our curiosity. A single-minded focus on one particular cue, especially one over which we have no control, is not something we evolved to do. The way we watched TV in the ’60s and ’70s was not natural. It was something we did because we had no option.

Our current mode of splitting attention across several screens is much closer to how humans naturally operate. We continually scan our environment – which, in this case, includes various electronic interfaces to the extended virtual world – for things of interest to us. When we find one, our natural need to make sense of things sends us on a quest for context. As we consume, we look for this context. The diligence of that quest will depend on the degree of our engagement with the task at hand. If it is slight, we’ll soon move on to the next thing. If it’s deep, we’ll dig further.

On Sunday night, the Hotchkiss family quest for context continually skipped around: looking up what other movies J.K. Simmons had acted in, watching the trailer for Whiplash, reliving the infamous Adele Dazeem moment from last year, and seeing just how old Benedict Cumberbatch is (I have two daughters who are hopelessly in love with him, much to the chagrin of their boyfriends). As much as the advertisers on the 87th Oscars might wish otherwise, all of this was perfectly natural. Technology has finally evolved to give our brains choices in our consumption.

Are We Guilty of “Numbed” Marketing?

A few years ago, I was moderating a panel on mobile advertising. The room was full of marketers. After much discussion about targeting and the ability to track consumers both geographically and behaviorally, one audience member lamented, “Why don’t the carriers just share the subscriber information? They know who they are. They know addresses, family status, credit history, demographics – they have all that information. Then we could really pinpoint our market.”

I had to jump in. I asked this room full of marketers to indicate who would like to have access to that information by raising their hand. The entire room answered in the affirmative. Then I added a twist…

“Okay. Everyone in this room has a mobile phone. Who, as subscribers, would want your carrier sharing that information with anyone who wanted to target you? Keep your hands up.”

Hands wavered. You could almost hear the switch clicking in their brains. Every hand slowly went down.

That story came to mind last week when I read the following passage in Arthur J. Dyck’s book “Rethinking Rights and Responsibilities: The Moral Bonds of Community”:

“In his study, (Robert Jay) Lifton takes note of a phenomenon he calls “numbed warfare,” a mode of combat in which participants have psychological contacts only with their military cohorts and their own equipment…. Lifton describes research that found a striking correlation between altitude and potential for guilt:

‘B-52 pilots and crews bombing at high altitudes saw nothing of their victims and spoke exclusively of professional skill and performance…’

Lifton calls these B-52 pilots “numbed warriors.” What have been numbed are their empathic emotions: ‘lacking emotional relations with his victims, the numbed warrior receives from them very little of the kind of feedback that could permit at least one layer of his mind to perceive them as human.’”

That may seem like a horrific parallel to draw with marketing, but the similarities are striking. One of the ways warriors have always desensitized themselves is by thinking of the enemy in non-human terms, either as a faceless, monolithic group, or by assigning a dehumanizing (and usually derogatory) label to them. We marketers have been doing this for years. What is more dehumanizing than taking a thinking, feeling person and calling them a “consumer?” Someone once described consumers as “mindless wallets eating shit and crapping cash.”

Warriors have to clearly delineate the concepts of “us” and “them” in order to do what they have to do. But as my room full of marketers realized, when it comes to marketing, “them” is “us.” In a recent Pew study, 80% of social network users were worried that their data would be accessed by advertisers. That means 4 out of 5 people don’t trust you, Ms. or Mr. Marketer. They’d rather you didn’t know who they were. If you knocked on their door, they wouldn’t answer. Maybe it’s because you keep calling them a consumer or a target market. I’m also betting that if you were asked that question, you’d answer the same way. Because even though you’re a marketer, you don’t trust other marketers.

In a recent interview, I was asked what one piece of advice I would pass on to other marketers. I said, “Be an empathic marketer.” Or, in plainer terms, don’t numb yourself to your market. I’m not alone in saying we can be better. Fellow Spinner Cory Treffiletti talked about the importance of emotion in ad messages. And Katie Meier recently asked the question, “What if data wasn’t about numbers, but instead we made it about the people the numbers represent?”

Technology has put us at a crossroads. We could use it to further distance and dehumanize our market, turning real people into digital data points. We could become “high-altitude” marketers, never coming face to face with the humans we’re trying to connect with.

Or, we could use it to create, as my friend Scott Brinker likes to say, “markets of one.” But before we do that, we have to make them want to listen to us. They have to answer their door if we knock. And that will take some work. We have to start treating them the way we want to be treated, when we’re not wearing our “marketing” hats.

Why More Connectivity is Not Just More – Why More is Different

Eric Schmidt is predicting from Davos that the Internet will disappear. I agree. I’ve always said that Search will go under the hood, changing from a destination to a utility. Not that Mr. Schmidt or the Davos crew needs my validation. My invitation seems to have got lost in the mail.

Laurie Sullivan’s recent post goes into some of the specifics of how search will become an implicit rather than an explicit utility. Underlying this is a pretty big implication that we should be aware of: the very nature of connectivity will change. Right now, the Internet is a tool, or resource. We access it through conscious effort. It’s a “task at hand.” Our attention is focused on the Internet when we engage with it. The world described by Eric Schmidt and the rest of the panel is much, much different. In this world, the “Internet of Things” creates a connected environment that we exist in. And this has some pretty important implications for us.

First of all, when something becomes an environment, it surrounds us. It becomes our world as we interpret it through our assorted sensory inputs. These inputs have evolved to interpret a physical world – an environment of things. We will need help interpreting a digital world – an environment of data. Our reality, or what we perceive our reality to be, will change significantly as we introduce technologically mediated inputs into it.

Our brains were built to parse information from a physical world. We have cognitive mechanisms that evolved to do things like keep us away from physical harm. Our brains were never intended to crunch endless reams of digital data. So, we will have to rely on technology to do that for us. Right now we have an uneasy alliance between our instincts and the capabilities of machines. We are highly suspicious of technology. There is every rational reason in the world to believe that a self-driving Google car will be far safer than a two-ton chunk of accelerating metal under the control of a fundamentally flawed human, but how many of us are willing to give up the wheel? The fact is, however, that if we want to function in the world Schmidt hints at, we’re going to have to learn not only to trust machines, but also to rely totally on them.

The other implication is one of bandwidth. Our brains have bottlenecks. Right now, our brain, working with our senses, subconsciously monitors our environment and, if the situation warrants, wakes up our conscious mind for some focused and deliberate processing. The busier our environment gets, the bigger this challenge becomes. A digitally connected environment will soon exceed our brain’s ability to comprehend and process information. We will have to set some pretty stringent filtering thresholds, and we will rely on technology to do the filtering. As I said, our physical senses were not built to filter a digital world.
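To make the idea of a filtering threshold concrete, imagine delegating that subconscious-monitoring step to software: scan a stream of environmental signals and wake the “conscious” layer only for those whose salience clears a configurable bar. This is a purely hypothetical sketch – the function, the signals, and the threshold value are all invented for illustration:

```python
# Hypothetical sketch of a machine-side attention filter that mimics the
# subconscious monitoring described above. Names and numbers are invented.

def attention_filter(signals, threshold=0.8):
    """Pass through only signals salient enough to warrant conscious attention.

    signals   -- list of (label, salience) pairs, with salience in 0..1
    threshold -- the filtering bar; everything below it is handled silently
    """
    return [label for label, salience in signals if salience >= threshold]

# A connected environment produces far more signals than we can process.
environment = [
    ("thermostat adjusted itself", 0.10),
    ("calendar conflict detected", 0.85),
    ("fridge restocked the milk", 0.20),
    ("smoke detected in kitchen", 0.99),
]

# Only the high-salience events interrupt us; the rest stays machine-handled.
print(attention_filter(environment))
```

The hard part, of course, is not the filter itself but choosing the threshold – which is exactly the guideline-setting problem raised in the next paragraph.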

The relationship we’ll have to develop with technology will be an odd one. Even if we lower our guard and let machines do much of our “thinking” (in terms of processing environmental inputs for us), we still have to learn how to give machines guidelines so they know what our intentions are. This raises the question: how smart do we want machines to become? Do we want machines that can learn about us over time, without explicit guidance from us? Are we ready for technology that guesses what we want?

One of the comments on Laurie’s post was from Jay Fredrickson: “Sign me up for this world, please. When will this happen and be fully rolled out? Ten years? 20 years?” Perhaps we should be careful what we wish for. While this world may seem to be a step forward, we will actually be stepping over a threshold into a significantly different reality. As we step over that threshold, we will change what it means to be human. And there will be no stepping back.