Have the Odds Caught Up with Apple?

Google has just surpassed Apple as the most valuable brand in the world. In diving deeper on this, there are several angles one could take. If you live in the intersection of brand and technology marketing, as I have for the last several years, this is noteworthy on many levels. One, for instance, is the dramatic shift in Millward Brown’s assigned brand value for the two companies – with Google soaring 40 percent, and Apple plunging 20 percent. According to Millward Brown’s BrandZ™ study, Google’s brand is worth $158 billion, up from $113 billion last year. And the post-Jobs Apple is down to $147 billion from last year’s $185 billion number one spot. Combined, that’s an $83 billion swap in valuations. Apple was one of the few brands to actually lose ground in this year’s report.

I personally find this interesting because of some recent research I’ve been doing on corporate strategy for an upcoming book. It comes as a surprise to no one reading this column that I’m a big believer in corporate strategy. But in my research, I’ve been forced to admit that strategy is a little-understood and over-hyped concept. Actually, let me clarify that – strategy as it’s taught in most MBA programs is little understood and over-hyped. Executives and consultants pull matrices and strategic frameworks out of thin air, and injudiciously apply them to any and all situations. With all due deference to the Michael Porters, Peter Druckers, Jim Collinses and Tom Peterses of the world, I suspect the world of corporate strategy is more complex than five universal steps, a four-box matrix or simple models illustrated with a few circles and arrows. The mistaken assumption behind all this is that all strategic wisdom must flow from top to bottom.

Let’s go back to Apple and Google. Apple, under Jobs, was a traditional hierarchy. More than this, it was a hierarchy ensconced in an ivory tower. Due, no doubt, to the considerable hubris of Mr. Jobs, Apple believed that all good things had to be laboriously squeezed out of their own design process and mercilessly tweaked to perfection.

Google, on the other hand, fully embraces the concept of a market to drive innovation. Notice I say “a market,” not “the market.” Here, I refer to markets as a tool, not an entity. The distinction is important. Markets are built to facilitate exchanges. They use valuation mechanisms (such as pricing) to protect fairness and introduce equilibrium in the market. In their most ideal form, markets allow any member of the marketplace to contribute and be judged on the value of their contribution, not their status. In Google’s case, the 20% free time rule, Google Labs and their experimentation with prediction markets all use market dynamics to drive both innovation and corporate strategy. Markets allow for a Darwinian approach to strategy, pulling it up from the bottom rather than driving it down from the top. And, as biochemist Leslie Orgel liked to say, “Evolution is cleverer than you are.”

But there are trade-offs. Bottom-up approaches to strategy need some mechanism to pick winners and losers. There needs to be a corporate equivalent of natural selection. This, again, is where markets can help. Without robust and definitive selection tools, the bottom-up organization can vacillate endlessly, never making any headway. Also, managing execution in bottom-up organizations can be a much more challenging balancing act. Dictatorships might not be a lot of fun for the “dictatees,” but you can definitely get the trains running on time.

Here’s one last thing to keep in mind. Every time we trot out Apple in the era of Steve Jobs as an example of anything to do with corporations, we tend to forget that in the normal distribution of visionary talent, Jobs was an extreme outlier. He was a once in a generation anomaly. You can’t build a corporate strategy around the hope that you have a Steve Jobs on the executive payroll. Sooner or later, the odds will catch up to you.

Will Apple’s brand value bounce back in 2015? Perhaps. But in the dynamically complex market that is today’s reality, I’d be placing my bets on organizations that have learned to adapt and evolve in complexity.

Today, Spend Some Time in Quadrant Two

First published April 17, 2014 in Mediapost’s Search Insider

Last week, I ranted, and it was therapeutic — for me, at least. Some of you agreed that the social media landscape was littered with meaningless crap. Others urged me to “loosen up and take a chill pill,” intimating that I had slipped across the threshold of “grumpy old man-itis.” Guilty, I guess, but there was a point to my rant. We need to spend more time with important stuff, and less time with content that may be popular but trivial.

Hey, I’m the first to admit that I can be tempted into wasting gobs of time with a tweet like: “Prom season sizzles with KFC chicken corsages.” This is courtesy of Guy Kawasaki. Guy’s Twitter feed is a fire hose of enticing trivia. And the man (with the team that supports him) does have a knack of writing tweets with irresistible hooks. Come on. Who could resist checking out a fried chicken corsage?

But here’s the problem. Online is littered with fried chicken corsages. No matter where we turn, we’re bombarded by these tasty little tidbits of brain candy. Publishers have grown quite adept at stringing these together, leading us from trivial link to trivial link. Personally, I’m a sucker for Top Ten lists. But after succumbing to the temptation for “just a second” I find myself, 20 minutes later, having accomplished nothing other than learning what the 10 Biggest Reality Show Blunders were, or where the 10 Most Extravagant Homes in the U.S. happen to be.

Entertaining? Absolutely.

Useful? Doubtful.

Important?  Not a chance.

We need to set aside time for important stuff. A few decades ago, I happened to read Stephen Covey’s “First Things First,” which introduced a concept I still try to live by to this day. Covey called it the Urgent/Important matrix. It’s a simple two-by-two matrix with four quadrants:

1 – Urgent and Important – for example, a fire in your kitchen.

2 – Not Urgent but Important – long-term planning.

3 – Urgent but Not Important – interruptions.

4 – Not Important and Not Urgent – time-wasters.

Covey’s advice? Better balance your time across these quadrants. Quadrant One takes care of itself. We can’t ignore these types of crises. But we should try to minimize the distractions that fall into Quadrant Three and cut down the time we spend in Quadrant Four. Then, we should move as much of this freed-up time as possible into Quadrant Two.
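Covey’s quadrant logic is simple enough to express as a tiny classifier. A minimal sketch (the example tasks and time-allocation labels below are my own inventions, not Covey’s):

```python
def quadrant(urgent: bool, important: bool) -> int:
    """Map a task's urgency and importance to its Covey quadrant (1-4)."""
    if urgent and important:
        return 1  # crises: handle immediately
    if important:
        return 2  # long-term planning: schedule deliberately
    if urgent:
        return 3  # interruptions: minimize
    return 4      # time-wasters: cut

# Hypothetical tasks, tagged (urgent, important)
tasks = [
    ("kitchen fire", True, True),
    ("long-term planning", False, True),
    ("chatty phone call", True, False),
    ("top-ten listicle", False, False),
]

for name, urgent, important in tasks:
    print(f"Q{quadrant(urgent, important)}: {name}")
```

The point of the exercise isn’t the code, of course – it’s that only two honest yes/no judgments are needed to see where an hour of your day is actually going.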

Covey’s Quadrants are more applicable than ever to the online world. I suspect most of us spend the majority of our time in the online equivalents of Quadrant Three (responding to emails or other instant forms of messaging that aren’t really important) or Quadrant Four (online time-wasters). We probably don’t spend much time in Quadrant Two (which I’ll abbreviate to Q2). In fact, in writing this column, I tried to find a quick guide to finding important stuff online. I have a few places I like to go, which I’ll share in a moment, but despite the vast potential of online as a Q2 resource, it doesn’t seem that anyone is making it easy to filter for “importance.” As I said in my last column, we have filters for popularity and recency, but I couldn’t find anything helping me track down Q2 candidates.

So, here is my contribution to helping you set aside more quality Q2 time:

Amazon Kindle and DevonThink: Reading thought-provoking books is my favorite Q2 activity. I try to set aside at least an hour a day to read. Anytime someone suggests a book or I find one referenced, I immediately download it from Kindle and add it to the queue. Then, as I read, I use Kindle’s highlight feature to create a summary of the important ideas. Afterward, I copy my highlighted notes into DevonThink, a tool that helps track and archive notes and resources for future reference.

Scientific American & Science Daily: I’m a science geek. I love learning about the latest advances — in particular, new discoveries in the areas of psychology and neuroscience. When I find an interesting article, I again save it to DevonThink.

Google Scholar and Questia: Every so often, I dive into the world of academia to find research done in a particular area, usually related to a blog post or column idea. Google Scholar usually unearths a number of publicly available papers on most topics. And, if you share my predilection for academic research, a subscription to Questia is worth considering.

Big Think, weforum.org and TED: Looking for big ideas — world-changing stuff? These three sites are the place to find them.

HBR, Wired, The Atlantic and The Economist: Another favorite topic of mine is corporate strategy — particularly how organizations have to adapt to a rapidly evolving environment. I find sites like these great for giving me a sense of what’s happening in the world of business.

Hey, it may not be a fried chicken corsage, but these aren’t bad ways to spend an hour or two a day.


The Bug in Google’s Flu Trend Data

First published March 20, 2014 in Mediapost’s Search Insider

Last year, Google Flu Trends blew it. Even Google admitted it. It over-predicted the occurrence of flu by a factor of almost 2:1. Which is a good thing for the health care system, because if Google’s predictions had been right, we would have had the worst flu season in 10 years.

Here’s how Google Flu Trends works. It monitors a set of approximately 50 million flu-related terms for query volume. It then compares this against data collected from health care providers where Influenza-like Illnesses (ILI) are mentioned during a doctor’s visit. Since the tracking service was first introduced, there has been a remarkably close correlation between the two, with Google’s predictions typically coming within 1 to 2 percent of the number of doctor’s visits where the flu bug is actually mentioned. The advantage of Google Flu Trends is that it is available about two weeks prior to the ILI data, giving a much-needed head start for responsiveness during the height of flu season.
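The core idea can be sketched in a few lines: fit the historical relationship between flu-related query volume and ILI visit rates, then use this week’s query volume to “nowcast” the ILI rate before the official data arrives. All the numbers below are invented, and Google’s actual model is far more sophisticated than a single least-squares fit:

```python
# Hypothetical historical weekly data:
# (flu-related query volume index, % of doctor visits mentioning ILI)
history = [(10, 1.1), (25, 2.0), (40, 3.2), (60, 4.4), (80, 5.9)]

# Ordinary least-squares fit: ili ≈ a * queries + b
n = len(history)
mean_q = sum(q for q, _ in history) / n
mean_i = sum(i for _, i in history) / n
a = (sum((q - mean_q) * (i - mean_i) for q, i in history)
     / sum((q - mean_q) ** 2 for q, _ in history))
b = mean_i - a * mean_q

# Nowcast this week's ILI rate from query volume alone --
# available roughly two weeks before the health-care-provider data.
this_week_queries = 50
estimated_ili = a * this_week_queries + b
print(f"Estimated ILI visit rate: {estimated_ili:.2f}%")
```

As long as the search-to-sickness relationship holds, this kind of model tracks reality closely. The trouble, as the rest of the column explains, starts when that relationship shifts.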

But last year, Google’s estimates overshot actual ILI data by a substantial margin, effectively doubling the size of the predicted flu season.

Correlation is not Causation

This highlights a typical trap with big data – we tend to start following the numbers without remembering what is generating the numbers. Google measures what’s on people’s minds. ILI data measures what people are actually going to the doctor about. The two are highly correlated, but one doesn’t necessarily cause the other. In 2013, for instance, Google speculated that increased media coverage might be the cause of the overinflated predictions. More news coverage would have spiked interest, but not actual occurrences of the flu.
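A toy calculation makes the trap concrete: if search volume tracks actual flu plus media-driven curiosity, a heavy news cycle inflates the estimate without a single extra case. The numbers here are invented purely to illustrate the mechanism:

```python
actual_flu = 100        # true weekly flu cases (arbitrary units)
queries_per_case = 2.0  # historical ratio, learned when media noise was low

# A heavy flu news cycle adds searches from healthy-but-curious people.
media_driven_queries = 180
observed_queries = actual_flu * queries_per_case + media_driven_queries

# The model, knowing only the old ratio, inverts it naively.
predicted_flu = observed_queries / queries_per_case

print(predicted_flu / actual_flu)  # prediction is 1.9x the real caseload
```

The correlation that made the model work was real; it just wasn’t causal, so it broke the moment something other than flu started driving searches.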

Allowing for the Human Variable

In the case of Google Flu Trends, because it’s using a human behavior as a signal – in this case online searching for information – it’s particularly susceptible to network effects and information cascades. The problem with this is that these social signals are difficult to rope into an algorithm. Once they reach a tipping point, they can break out on their own with no sign of a rational foundation. Because Google tracks the human generated network effect data and not the underlying foundational data, it is vulnerable to these weird variables in human behavior.

Predicting the Unexpected

A recent article in Scientific American pointed out another issue with an over-reliance on data models – Google Flu Trends completely missed the non-seasonal H1N1 pandemic in 2009. Why? Algorithmically, Google wasn’t expecting it. In trying to eliminate noise from the model, they actually eliminated signal coming at an unexpected time. Models don’t do very well at predicting the unexpected.

Big Data Hubris

The author of the Scientific American piece, associate editor Larry Greenemeier, nailed another common symptom of our emerging crush on data analytics – big data hubris. We somehow think the quantitative black box will eliminate the need for more mundane data collection – say, actually tracking doctor’s visits for the flu. As I mentioned before, the biggest problem with this is that the more we rely on data, which often takes the form of arm’s-length correlated data, the further we get from exploring causality. We start focusing on “what” and forget to ask “why.”

We should absolutely use all the data we have available. The fact is, Google Flu Trends is a very valuable tool for health care management. It provides a lot of answers to very pertinent questions. We just have to remember that it’s not the only answer.

Will Women Make More Empathetic Marketers?

First published Feb 27, 2014 in Mediapost’s Search Insider

At the risk of sounding sexist, I wonder if women might make better marketers than men.

If you’ll remember, I proposed a new way of defining the job description of a marketer in last week’s column: to understand the customer’s reality, focusing on those areas where we can solve their problems and improve that reality.

If we’re painting with incredibly broad strokes here – which we are – and we had to attach that description to one gender, which gender would you pick?

I know I’m dancing on shaky ground here – or, in my case, thin ice – but I think we can all agree that while equal, men and women are different. Men are better at some things. Women are better at others. Yes, there’s a normal distribution curve in both cases, but for some things, the female curve is going to be further to the right. When I look at the qualities that might make an awesome marketer in the new world order, I have to say it seems better suited to the natural strengths of women. That’s why I don’t believe it was coincidence that more women showed a positive response to my column last week than men.

Let me give you an example of a sex-based difference we found in our own research that will help explain my reasoning. We looked at how men and women navigate websites using an eye-tracking station. When we looked at aggregate heat maps, which showed all activity, there was little difference. But when we sliced the activity into half-second-by-half-second increments, there was a significantly different scan pattern between men and women. Men went right to the navigation bar and started mapping out the architecture of the site. They made a mental wireframe to help them get around. Their first priority was how they were going to get things done. Women, however, first looked at images, especially people, and the main content on the homepage. Their first priority was whom they were dealing with and what the site was about.

That, in a nutshell, sums up a crucial difference between men and women. Men are driven by tasks – they work to get stuff done. Women are empathizers – they work with people.  In the end, both often get to the same place. But they may take very different paths to get there.

The new world of marketing I’m proposing is all about nurturing relationships – true one-to-one relationships. It’s much more about “who” and “why,” and less about “what.” It’s about sensing what the world looks like from the prospect’s perspective and moving an organization’s internal strategy closer to that perspective. I’m not saying men can’t do that, but I am saying that women can do it at least as well as men. And perhaps that can help bring more balance to the world of marketing. While total head counts of men and women in marketing are roughly equal (with some reports giving women a slight edge), the same cannot be said of pay scales. According to the latest Marketing Rewards Survey, published by the Chartered Institute of Marketing, the gap between men’s and women’s salaries has widened by 10% since 2012. This gap shows up most noticeably at the highest levels of the industry, where twice as many men (18%) reach director level as women (7%). This also holds true for marketing heads, with men almost doubling women again – 22% vs. 12%. These numbers are out of the UK, but the Bureau of Labor Statistics has similar numbers for the US. They’ve lumped Marketing and Sales Managers together, but the stats show that women earn about 67.7% of what men earn.

There are going to be some massive shifts in marketing in the coming decades. One of them might be between the genders in who holds the top marketing roles.


Now, That’s a Job Description I Could Get Behind!

First published February 20, 2014 in Mediapost’s Search Insider

I couldn’t help but notice that last week’s column, where I railed against the marketer’s obsession with tricks, loopholes and pat sound bites, got a fair number of retweets. The irony? At least a third of those retweets twisted my whole point – that six seconds (or any arbitrary length of message) isn’t the secret to getting a prospect engaged. The secret is giving them something they want to engage with.


As anyone who has been unfortunate enough to spend some time with me when I’m in a particularly cynical mood about marketing can attest, I go a little nuts with this “Top Ten Tricks” or “The Secret to…” mentality that seems pervasive in marketing. I’m pretty sure that anyone who retweeted last week’s column with a preface like “Does your advertising engage your consumer in 6 seconds or less? If not, you’re likely losing customers” didn’t bother to actually read past the first paragraph. Maybe not even the first line.

And that’s the whole problem. How can we expect marketers to build empathy, usefulness and relevance into their strategy when many of them have the attention span of a small gnat? As my friend Scott Brinker likes to say when it comes to marketers misbehaving, “This is why we can’t have nice things.”

Marketing – good marketing – is not easy but it’s also not a black box. It’s not about secrets or tricks or one-off tactics. It’s about really understanding your customers at an incredibly deep level and then working your ass off to create a meaningful engagement with them. Trying to reduce marketing to anything less than that is like trying to breeze your way through 50 years of marriage by following the Top 3 Tricks to get lucky this Friday night.

Again, this is about meaningful engagements. And when I say meaningful, it’s the customer that gets to decide what’s meaningful. That’s what’s potentially so exciting about breakthroughs like the Oreo Super Bowl campaign. It’s the opportunity to learn what’s meaningful to prospects and then to shift and tailor our responses in real time. Until now, marketing has been “Plan, Push and Pray.” We plan our attack, we push out our message and we pray it finds its target and that they respond by buying stuff. If they don’t buy stuff, something went wrong, probably in the planning stage. But that is an awfully long feedback loop.

You’ll notice something about this approach to marketing. The only role for the prospect is as a consumer. If they don’t buy, they don’t participate.  This comes as a direct result of the current job description of a marketer: Someone who gets someone else to buy stuff. But what if we rethink that description? Technology that enables real time feedback is allowing us to create an entirely new relationship with customers. What would happen if we redefined marketing along these lines: To understand the customer’s reality, focusing on those areas where we can solve their problems and improve that reality?

And as much as that sounds like a pat sound bite, if you really dig into it, it’s far from a quick fix. This is a way to make a radically different organization. And it moves marketing into a fundamentally different role. Previously, marketing got its marching orders from the CEO and CFO. Essentially, they were responsible for moving the top line ever northward. It was an internally generated mandate – to increase sales.

But what if we rethink this? What if the entire organization’s role is to constantly adapt to a dynamic environment, looking for advantageous opportunities to improve that environment? And, in this redefined vision, what if marketing’s role was to become the sense-making interface of the company? What if it were the CMO’s job to consistently monitor the environment, create hypotheses about how best to create adaptive opportunities and then test those hypotheses in a scientific manner?

In this redefinition of the job, Big Data and Real Time Marketing take on significantly new qualities, first as a rich vein of timely information about the marketplace and secondly as a never ending series of instant field experiments to provide empirical backing to strategy.

Now, marketing’s job isn’t to sell stuff, it’s to make sense of the market and, in doing so, help define the overall strategic direction of the company. There are no short cuts, no top ten tricks, but isn’t that one hell of a job description?

The Psychology of Usefulness: The Acceptance of Technology – Part Three

In Part Two of this series, I looked at Davis and Bagozzi’s Technology Acceptance Model, first proposed in 1989.


As I said, while the model was elegant and parsimonious, it seems to simplify the realities of technology acceptance decisions too much. In 2000, Venkatesh and Davis tried to deal with this in TAM 2 – the second version of the Technology Acceptance Model.


In this version, they added several determinants of Perceived Usefulness and demoted Perceived Ease of Use to being just one of the factors that impacted Perceived Usefulness.  Impacting this mental calculation were two mediating factors: Experience and Voluntariness. This rebalancing of factors provides some interesting insights into the mental process we go through when making a decision whether we’ll accept a new technology or not.

Let’s begin with the determinants of Perceived Usefulness in the order they appear in Venkatesh and Davis’s model:

Subjective Norm: TAM 2 resurrects one of the key components of the original Theory of Reasoned Action model – the opinions of others in your social environment.

Image: Venkatesh and Davis also included another social factor in their list of determinants – how would the acceptance of this technology impact your status in your social network? Notice that our calculation of the image enhancement potential has the Subjective Norm as an input. It’s a Bayesian prediction – we start with our perceived social image status (the prior) and adjust it based on new information, in this case the acceptance of a new technology.

Job Relevance: How applicable is the technology to the job you have to do?

Output Quality: How will this technology impact your ability to perform your job well?

Result Demonstrability: How easy is it to show the benefits of accepting the technology?

It’s interesting to note how these factors split: the first two (subjective norm and image) being related to social networks, the next two (Job Relevance and Output Quality) being part of a mental calculation of benefit and the last one, Demonstrability, bridging the two categories: How easy will it be to show others that I made the right decision?
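One way to picture the “rough calculation” these determinants feed is as a weighted score. This is only an illustrative sketch – the scores and weights below are invented, and Venkatesh and Davis estimated these relationships statistically rather than assigning fixed coefficients:

```python
# Hypothetical 1-7 ratings of a new technology on TAM 2's determinants.
scores = {
    "subjective_norm": 5,         # colleagues endorse it
    "image": 4,                   # modest status boost
    "job_relevance": 6,           # directly applicable to my work
    "output_quality": 6,          # should improve my results
    "result_demonstrability": 3,  # benefits are hard to show others
    "perceived_ease_of_use": 4,   # demoted to just one input in TAM 2
}

# Invented weights summing to 1; real influence varies by study and context.
weights = {
    "subjective_norm": 0.15,
    "image": 0.10,
    "job_relevance": 0.25,
    "output_quality": 0.25,
    "result_demonstrability": 0.10,
    "perceived_ease_of_use": 0.15,
}

perceived_usefulness = sum(scores[k] * weights[k] for k in scores)
print(f"Perceived Usefulness: {perceived_usefulness:.2f} / 7")
```

Even this cartoon version makes the structure visible: Ease of Use no longer competes with Usefulness on equal footing – it’s just one contributor among six.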

According to the TAM 2 model, we combine these factors, which mix practical task performance considerations and social status aspirations, into a rough calculation of the perceived usefulness of a technology. After this is done, we start balancing that with how easy we perceive the new technology to be to use. Venkatesh and Davis commented on this and felt that Perceived Ease of Use has a variable influence in two areas: the forming of an attitude towards the technology, and a behavioral intention to use the technology. The first is pretty straightforward. Our attitude is our mental frame regarding the technology. Again, to use a Bayesian term, it’s our prior. If the attitude is positive, it’s very probable that we’ll form a behavioral intention to use the technology. But there are a few mediating factors at this point, so let’s take a closer look at the creation of Behavioral Intention.

In forming our intention, Perceived Ease of Use is just one of the determinants we use in our “Usefulness” calculation, according to the model. And it depends on a few things. It depends on efficacy – how comfortable we judge ourselves to be with the technology in question. It also depends on what resources we feel we will have access to in order to help us up the learning curve. But, in the forming of our attitude (and thereby our intention), Venkatesh and Davis felt that Perceived Usefulness will typically be more important than Perceived Ease of Use. If we feel a technology will bring a big enough reward, we will be willing to put up with a significant degree of pain. At least, we will in what we intend to do. It’s like making a New Year’s resolution to lose weight. At the time we form the intention, the pain involved is sometime in the future, so we go forward with the best of intentions.

As we move forward from Attitude to Intention, this transition is further mediated in the model by our subjective norm – the cognitive context we place the decision in. Into this subjective norm fall our experience (our own evaluation of our efficacy), the attitudes of others towards the technology and also the “Voluntariness” of the acceptance. Obviously, our intention to use will be stronger if it’s a non-negotiable corporate mandate, as opposed to a low-priority choice we have the latitude to make.

What is missing from the TAM 2 model is the link between Perceived Ease of Use and actual Usage. Just like a New Year’s resolution, intentions don’t always become actions. Venkatesh and Davis said Perceived Ease of Use is a moving, iteratively updated calculation. As we gain hands-on experience, we update our original estimate of Ease of Use, either positively or negatively. If it’s positive, it’s more likely that Intention will become Usage. If it’s negative, the technology may fail to become accepted. In fact, I would say this feedback loop is an ongoing process that may repeat several times in the space between Intention and Usage. The model, with a single arrow going in one direction from Intention to Usage, belies the complexity of what is happening here.

Venkatesh and Davis wanted to create a more realistic model, expanding the front end of the model to account for determinants going into the creation of Intention. They also wanted to provide a model of the decision process that better represented how we balance Perceived Usefulness and Perceived Ease of Use. I think they made some significant gains here. But the model is still a linear one – going in one direction only. What they missed is the iterative nature of acceptance decisions, especially in the gap between Intention and Behavior.

In Part Four, we’ll look at TAM 3 and see how Venkatesh further modified his model to bring it closer to the real world.

So, Six Seconds is the Secret, Huh?

First published February 13, 2014 in Mediapost’s Search Insider

Apparently, the new official time limit for customer engagement is 6 seconds, according to a recent post on Real Time Marketing. How did we come up with 6? Well, in the world of social media engagement it seemed like a good number, and no one has called bullshit on it yet, so 6 it is.

Marketers love to talk about time – just in time, real time, right time. At the root of all this “time talk” is the realization that customers really don’t have any time for us, so we have to somehow jam our messages into the tiny little cracks that may appear in the wall of willful ignorance they carefully build against marketing. The marketer’s goal is to erode their defenses by looking for any weakness that may appear.

Look at the supposed poster child for Real Time Marketing – the Oreo coup staged during the blackout in the 2013 Super Bowl. Because the messaging was surprising and clever, and because, let’s face it, we weren’t doing much of anything else anyway, Oreo managed to gain a foothold in our collective consciousness for a few precious seconds. So, marketers being marketers, we all stumbled over ourselves to proclaim a new channel and launch a series of new micro-attacks on consumers. That’s where the 6 seconds came from. Apparently, that’s the secret to storming the walls. Five seconds and you’re golden. Seven seconds and you’re dead.

Oreo surprised us, and it wasn’t because the message was 6 seconds long. It was because we weren’t expecting a highly relevant, highly timely message. Humans are built to respond to things that don’t fit within our expected patterns. The whole approach of marketing is to constantly blanket us with untimely, irrelevant messages. Marketers, to be fair, try to deliver the right message at the right time to the right person, but it’s really hard to do that. So, we overcompensate by delivering lots of messages all the time to everyone, hoping to get lucky. Not to take anything away from the cleverness and nimbleness of the Oreo campaign, but they got lucky. We were surprised and we let our defenses down long enough to be amused and entertained. Real time marketing wasn’t a brilliant new channel; it was a shot in the dark – literally.

And there’s no six-second gold standard of engagement. If you can deliver the right message at the right time to the right person, you can spend hours talking to your prospective customer. It’s only when you’re trying to interrupt someone with something irrelevant that you have to somehow shoehorn it into their consciousness. Think of it like a Maslow’s hierarchy of advertising effectiveness. At its best, advertising should be useful. This sits at the top of the pyramid. After usefulness comes relevance – even if I don’t find the ad useful to me right now, at least you’re talking to the right person. After relevance comes entertainment – I’ll willingly give you a few seconds of my time if I find your message amusing or emotionally engaging. I may not buy, but I’ll spend some time with you. After entertainment comes the category the majority of advertising falls into – a total waste of my time. Not useful, irrelevant, not emotionally engaging. And making an ad that falls into this category 5 seconds long, no matter what channel it’s delivered through, won’t change that. You may fool me once, but next time, I’m still going to ignore you.

There was something important happening during the Oreo campaign at the 2013 Super Bowl, but it had nothing to do with some new magic formula, some recently discovered loophole in our cognitive defenses. It was a sign of what may, hopefully, emerge as a trend in advertising – nimble, responsive marketing that establishes a true feedback loop with prospects. What may have happened when the lights went out in New Orleans is that we may have found a new, very potent way to make sense of our market and establish a truly interactive, responsive dialogue with it. If this is the case, we may have just found a way to climb a rung or two on the Advertising Effectiveness Hierarchy.

How Can Humans Co-Exist with Data?

First published February 6, 2014 in Mediapost’s Search Insider

Last week, I talked about our ability to ignore data. I positioned this as a bad thing. But Pete Austin called me on it, with an excellent counterpoint:

“Ignoring Data is the most important thing we do. Only the people who could ignore the trees and see the tiger, in real-time, survived to become our ancestors.”

Too true. We’re built to subconsciously filter and ignore vast amounts of input data in order to maintain focus on critical tasks, such as avoiding hungry tigers. If you really want to dive into this, I would highly recommend Daniel Simons and Christopher Chabris’s “The Invisible Gorilla.” But, as Simons and Chabris point out, with example after example of how our intuitions (which we use as filters) can mislead us, this “inattentional blindness” is not always a good thing. In the adaptive environment in which we evolved, it was pretty effective at keeping us alive.  But in a modern, rational environment, it can severely inhibit our ability to maintain an objective view of the world.

But Pete also had a second, even more valid point:

“What you need to concentrate on now is “curated data”, where the junk has already been ignored for you.”

And this brought to mind an excellent example from a recent interview I did as background for an upcoming book I’m working on. This idea of pre-filtered, curated data becomes a key consideration in the new world of Big Data.

Nowhere are the stakes higher for the use of data than in healthcare. It’s what led to the publication of a manifesto in 1992 calling for a revolution in how doctors made life-and-death decisions. One of the authors, Dr. Gordon Guyatt, coined the term “evidence-based medicine.” The rationale is simple: by taking an empirical approach not just to diagnosis but also to the best prescriptive path, doctors can rise above the limitations of their own intuition and achieve higher accuracy. It’s data-driven decision-making, applied to health care. Makes perfect sense, right? But even though evidence-based medicine is now over 20 years old, it’s still difficult to consistently apply at the level of the individual doctor and patient.

I had the chance to ask Dr. Guyatt why this was:

“Essentially after medical school, learning the practice of medicine is an apprenticeship exercise and people adopt practice patterns according to the physicians who are teaching them and their role models and there is still a relatively small number of physicians who really do good evidence-based practice themselves in terms of knowing the evidence behind what they’re doing and being able to look at it critically.”

The fact is, a data-driven approach to any decision-making domain that used to rely on intuition just doesn’t feel – well – very intuitive. It’s hard work. It’s time-consuming. And, to Mr. Austin’s point, it runs directly counter to our tiger-avoidance instincts.

Dr. Guyatt confirms that physicians are not immune to this human reliance on instinct:

“Even the best folks are not going to do it – maybe the best folks – but most folks are not going to be able to do that very often.”

The answer in healthcare, and likely everywhere else that data should back up intuition, is the creation of solid, data-based resources that adhere to empirical best practices without requiring every single practitioner to do the necessary heavy lifting. Dr. Guyatt has seen exactly this trend emerge in the last decade:

“What you need is preprocessed information. People have to be able to identify good preprocessed evidence-based resources where the people producing the resources have gone through that process well.”

The promise of curated, preprocessed data looms large in the world of marketing. The challenge is that, unlike medicine, where data is commonly shared and archived, in marketing much of the most important data stays proprietary. We have to start thinking about a truly empirical, scientific way to curate, analyze and filter our own data for internal consumption, so it can be readily applied in real-world situations without falling victim to human bias.

Never Underestimate the Human Ability to Ignore Data

First published January 30, 2014 in Mediapost’s Search Insider

It’s one thing to have data. It’s another to pay attention to it.

We marketers are stumbling over ourselves to move to data-driven marketing. No one would say that’s a bad thing. But here’s the catch: data-driven marketing is all well and good when it’s a small-stakes game – optimizing spend, targeting, conversion rates and so on. If we gain a point or two on the topside, so much the better. And if we screw up and lose a point or two – well – mistakes happen, and as long as we fix it quickly, no permanent harm done.

But what if the data is telling us something we don’t want to know? I mean – something we really don’t want to know. For instance, that our brand messaging is complete BS in the eyes of our target market, or that they feel our products suck, or that our primary revenue source appears to be drying up, or that our entire strategic direction is heading over a cliff. What then?

This reminds me of a certain CMO of my acquaintance who was a “Numbers Guy.” In actual fact, he was a numbers guy only if the numbers said what he wanted them to say. If not, then he’d ask for a different set of numbers that confirmed his view of the world. This data hypocrisy generated a tremendous amount of bogus activity in his team, as they ran around grabbing numbers out of the air and massaging them to keep their boss happy. I call this quantifiable bullshit.

I think this is why data tends to be used to optimize tactics, but why it’s much more difficult to use data to inform strategy. The stakes are much higher and even if the data is providing clear predictive signals, it may be predicting a future we’d rather not accept. Then we fall back on our default human defense: ignore, ignore, ignore.

Let me give you an example. Any human who functions even slightly above the level of brain dead has to accept the data that says our climate is changing. The signals couldn’t be clearer. And if we choose to pay attention to the data, the future looks pretty damn scary. Best-case scenario – we’re probably screwing up the planet for our children and grandchildren. Worst-case scenario – we’re definitely screwing up the planet, and it will happen in our lifetime. And we’re not talking about an increased risk of sunburn. We’re talking about the potential end of our species. So what do we do? We ignore it. Even when flooding, drought and ice storms without historic precedent are happening in our backyards. Even when Atlanta is paralyzed by a freak winter storm. Nothing about what is happening is good news, and it’s going to get worse. So, damn the data, let’s just look the other way.

In a recent poll by the Wall Street Journal, out of a list of 15 things that Americans believed should be top priorities for President Obama and Congress, climate change came out dead last – behind pension reform, Iran’s nuclear program and immigration legislation. Yet, if we look at the data the UN and the World Economic Forum collect quantifying the biggest threats to our existence, climate change is consistently near the top, both in likelihood and in impact. But it’s really hard to do something about it. It’s a story we don’t want to hear, so we just ignore the data, like the aforementioned CMO.

As we get access to more and more data, it will be harder and harder to remain uninformed, but I suspect it will have little impact on our ability to be ignorant. If we don’t know something, we don’t know it. But if we can know something, and we choose not to, that’s a completely different matter. That’s embracing ignorance. And that’s dangerous. In fact, it could be deadly.

What’s Apple’s Plan for 2014?

First published January 2, 2014 in Mediapost’s Search Insider

When new markets open, value chains first build up, then across. Someone first creates a vertically integrated experience, and then the market opens up as free competition drives efficiency. This is the challenge that currently lies ahead of Apple.

Apple has been the acknowledged master at creating seamless vertically integrated experiences. They did it with the personal computer. They did it with music. They did it with mobile. They did it with tablets. The advantage of working within a closed value chain is that you control every aspect of the experience. You can make sure that everyone plays nice with each other.

The challenge is that at some point, as adoption heats up, you simply cannot scale fast enough to meet market demand. Openness invites horizontal competition, which drives down prices. The lack of control up and down the chain introduces some short-term user pain, but eventually the dynamics of an open market overcome this, and the advantages of having several companies working on an opportunity outweigh the disadvantages.

Apple loves early markets. Or, at least, they have in the past. Under Jobs, they had a knack for creating an elegantly integrated experience that was carefully crafted from top to bottom within the walls of Cupertino. The vision and obsession with detail that defined the Jobs era was a potent combination when it came to building vertical experiences. Somehow, Apple was able to open new markets over and over again, seemingly at will. They were able to bridge Geoffrey Moore’s “Chasm” by making new experiences painless enough for the front end of the adoption bell curve. As adoption rode up the curve, markets turned from vertical to horizontal, driving a decline in margins and prices. This is where Apple tended to kick out and look for the next wave to catch.

But that was then, and this is now. As mentioned, Apple doesn’t do very well when markets turn horizontal. They depend on high margins. Only once, with the Mac, were they able to come back and stake out a respectable claim in a horizontal market – and they almost disappeared in the process. The number of fortunate circumstances that would have to line up to repeat that trick makes me doubt they’re eager to go down the same path with the iPhone or iPad.

In the year-end summaries, many are talking about a seeming anomaly: despite Android’s massive market-share dominance over iOS (81% vs. 12.9%, according to a recent Forbes article), it’s Apple that’s ringing up the holiday sales with mobile shoppers (23% vs. Android’s paltry 5%). This becomes more understandable when you put it in the context of a vertical market that is becoming horizontal. Shopping experiences are still much less painful on iOS. And Apple has a user base that is much more comfortable with mobile ecommerce because they’re on the leading edge of the adoption curve – they’ve had a mobile device for a number of years now. Android users, in general, tend to be further back on the curve. As the benefits of Darwinian competition redefine the mobile marketplace along more horizontal lines, those ecommerce numbers will revert to a more natural balance, but it will take some time.

As this inevitable change in the marketplace happens, the question becomes: what does Apple do next? Can they find the next wave? And, if they do, does an Apple without Jobs still have what it takes to create the vertical experience that can open up a new market? There are plenty of opportunities – the two most notable being connected entertainment devices (the much-rumored new generation of Apple TV) and wearable technology (iWatches, etc.).

Apple has always been known for keeping their cards glued to their chest. In 2014, it remains to be seen whether they have anything amazing up their sleeve.