#AlexfromTarget – An Unexpected Consequence of Technology

Yes, I’m belatedly jumping on the #AlexfromTarget bandwagon, but it’s in service of a greater truth that I’m trying to illustrate. Last column, I spoke about the Unintended Consequences of Technology. I think this qualifies. And furthermore, this brings us full circle to Kaila Colbin’s original point, which started this whole prolonged discussion.

It is up to us to decide what is important, to create meaning and purpose. And, personally, I think we could do a better job than we’re doing now.

So, why did the entire world go ga-ga over a grocery bagger from Texas? What could possibly be important about this?

Well – nothing – and that’s the point. Thinking about important things is hard work. Damned hard work – if it’s really important. Important things are complex. They make our brains hurt. It’s difficult to pin them down long enough to plant some hooks of understanding in them. They’re like eating broccoli, or doing push-ups. They may be good for us, but that doesn’t make them any more fun.

Remember the Yir Yoront from my last column – the tribal society that was thrown into a tailspin by the introduction of steel axes? The intended consequence of that introduction was to make the Yir Yoront more productive. The axes did make the tribe more productive, in that they were able to do the essential tasks more quickly, but the result was that the Yir Yoront spent more time sleeping.

Here’s the thing about technology. It allows us to be more human – and by that I mean the mixed bag of good and bad that defines humanity. It extends our natural instincts. It’s natural to sleep if you don’t have to worry about survival. And it’s also natural for young girls to gossip about adorable young boys. These are hard-wired traits. Deep philosophical thought is not a hard-wired trait. Humans can do it, but it takes conscious effort.

Here’s where the normal distribution curve comes in. Any genetically determined trait will have a normal distribution over the population. How we apply new technologies will be no different. The vast majority of the population will cluster around the mean. But here’s the other thing – that “mean” is a moving target. As our brains “re-wire” and adapt to new technologies, the mean that defines typical behavior will move over time. We adapt strategies to incorporate our new technology-aided abilities. This creates a new societal standard, and it is also human to follow the unwritten rules of society. This creates a cause-and-effect cycle. Technologies enable new behaviors that are built on top of the foundations of human instinct – society determines whether these new behaviors are acceptable – and if they are acceptable, they become the new “mean” of our behavioral bell curve. We bounce new behaviors off the backboard of society.

So, much as we may scoff at the fan-girls who gave “Alex” insta-fame – ultimately it’s not the girls’ fault, or technology’s. The blame lies with us. It also lies with Ellen DeGeneres, the New York Times, and the other barometers of societal acceptance that offered endorsement of the phenomenon.

It’s human to be distracted by the titillating and trivial. It’s also human to gossip about it. There’s nothing new here. It’s just that these behaviors used to remain trapped within the limited confines of our own social networks. Now, however, they’re amplified through technology. It’s difficult to determine what the long-term consequences of this might be. Is Nicholas Carr right? Is technology leading us down the garden path to imbecility, forever distracted by bright, shiny objects? Or is our finest moment yet to come?

The Unintended Consequences of Technology

In last Friday’s Online Spin Column, Kaila Colbin asks a common question when it comes to the noise surrounding the latest digital technologies: Who Cares? Colbin rightly points out that we tend to ascribe unearned importance to whatever digital technology we seem to be focused on at any given time. This is called, aptly enough, the focusing illusion, and in the words of Daniel Kahneman, who coined the term, “Nothing in life is as important as you think it is, while you are thinking about it.”

But there’s another side to this. How important are the things we aren’t thinking about? For example, because it’s difficult to wrap our minds around big picture consequences in the future, we tend not to think as much as we should about them. In the case of digital technology shifts such as the ones Kaila mentioned, what we should care about is the overall shift caused by the cumulative impact of these technologies, not the individual components that make up the wave.

When we introduce a new technology, we usually have some idea of the impact it will have. These are the intended consequences. And we focus on these, which makes them more important in our minds. But some things will catch us totally by surprise. These are called unintended consequences. We won’t know them until they happen, but when they do, we will very much care about them. To illustrate that point, I’d like to tell the story of the introduction of one technology that dramatically changed one particular society.

yiryorontThe Yir Yoront were a nomadic tribe in Australia that somehow managed to avoid significant contact with the western world until well into the 20th century. In Yir Yoront society, one of the most valuable things you could possess was a stone axe. The making of these axes took time and skill and was typically done by elder males. In return, these “axe-makers” were conferred special status in aboriginal society. Only a man could own an axe and if a woman or child needed one, they had to borrow it. A complex social network evolved around the ownership of axes.

In 1915 the Anglican Church established a mission in Yir Yoront territory. The missionaries brought with them a large supply of steel hatchets. They distributed these freely to any Yir Yoront that asked for them. The intended consequence was to make life easier for the tribe and trigger an improvement in living conditions.

As anthropologist Lauriston Sharp chronicled, steel axes spread rapidly through the Yir Yoront. But they didn’t spread evenly. Elder males held on to their stone axes, both as a symbol of their status and because of their distrust of the missionaries. It was the younger men, women and children – those who previously had to borrow stone axes – who eagerly adopted the new steel axes. The steel axes were more efficient, and so jobs were done in much less time. But, to the missionaries’ horror, the Yir Yoront spent most of their extra leisure time sleeping.

Sleeping, however, was the least of the unintended consequences. Social structures, which had evolved over thousands of years, were dismantled overnight. Elders were forced to borrow steel axes from what would have been their social inferiors. People no longer attended important intertribal gatherings, which were once the exchange venues for stone axes. Traditional trading channels and relationships disappeared. Men began prostituting their daughters and wives in exchange for someone else’s steel axe. The very fabric of Yir Yoront society began unraveling as a consequence of the introduction of steel axes by the Anglican missionaries.

Now, one may argue that there were aspects of this culture that were overdue for change. Traditional Yir Yoront society was undeniably chauvinistic. But the point of this story is not to pass judgment. My only purpose here is to show how new technologies can bring massive and unanticipated disruption to a society.

Everett Rogers used the Yir Yoront example in his seminal book Diffusion of Innovations. In it, he said that introductions of new technologies typically have three components: Form, Function and Meaning. The first two of these tend to be understood and intended during the introduction. Both the Yir Yoront and the Anglican missionaries understood the form and function of the steel axe. But neither understood the meaning, because meaning is determined over time through the absorption of the technology into the receiving culture. This is where unintended consequences come from.

When it comes to digital technologies, we usually talk about form and function. We focus on what a technology is and what it will do. We seldom talk about what the meaning of a new technology might be. This is because form and function can be intentionally designed and defined. Meaning has to evolve. You can’t see it until it happens.

So, to return to Kaila’s question: Who cares? Specifically, who cares about the meaning of the new technologies we’re all voraciously adopting? If the story of the Yir Yoront is any lesson, we all should.

The Virtuous Cycle and the End of Arm’s Length Marketing

Last week I wrote what should have been an open-and-shut column – looking at why SEO never really lived up to the potential of the business opportunity. Then my friend Scott Brinker had to respond with this comment:

“Seems like Google has long been focused on making SEO a “result” of companies doing good things, rather than a search-specific optimization “cause” to generate good rankings. They seem to have gotten what they wanted. Now as Google starts to do that with paid search, the world gets interesting for those agencies too.”

Steven Aresenault jumped on the bandwagon with this:

“Companies are going to wake up to the reality that part of their marketing is really about creating content. Content is everywhere and everything. Reality is I believe that it is a new way of thinking.”

As they both point out, SEO should be a natural result of a company doing good things, not the outcome of artificial manipulations practiced by a third party. It has to be baked into and permeate through the operating DNA of a company. But, as I started this column, I realized that this doesn’t stop at SEO. This is just the tip of a much bigger iceberg. Marketing, at least the way it’s been done up to now, is fundamentally broken. And it’s because many companies still rely on what I would call “Arm’s Length Marketing.”

Brand Stewardship = B.S.

Here is a quote lifted directly from the Ogilvy & Mather website:

We believe our role as 360 Degree Brand Stewards is this: Creating attention-getting messages that make a promise consistent and true to the brand’s image and identity. And guiding actions, both big and small, that deliver on that brand promise. To every audience that brand has. At every brand intersection point. At all times.

Now, Ogilvy is very good at crafting messages and this one is no exception. Who could possibly argue with their view of brand stewardship? The problem comes when you look at what “stewardship” means. Here’s the Merriam-Webster definition:

the conducting, supervising, or managing of something; especially: the careful and responsible management of something entrusted to one’s care

The last five words are the key – “something entrusted to one’s care”. This implies that the agency has functional control of the brand, and with due apologies to David Ogilvy and his cultural legacy, that is simply bullshit.

Brands = Experience

[Image: Hmmm – coincidence?]

Maybe Arm’s Length Brand Stewardship was possible in the era of David Ogilvy, Don Draper and Darrin Stephens (now, there’s a pop culture trifecta for you), when brand messaging defined the brand. But that era is long gone. Brands used to be crafted from exposure, but now they’re created through experience, amplified through the resonant network of the online community. And an arm’s length third party cannot, and should not, control that experience. It has to live at the heart of the company. For decades, companies abdicated the responsibility of brand stewardship to the communication experts – or, to do a little word crafting – they “entrusted (it) to (their) care.” That has to change. Marketing has to come back home.

The Virtuous Marketing Cycle

Scott talked about the SEO rewards that come from doing good things. Steven talked about authentic content creation being one of those good things. But this is a much bigger deal. This is about forcefully moving marketing’s place in the strategic chain. Currently, the order is this: Management > Strategy > Marketing > Revenue. Marketing’s current job is to execute on strategy, which comes from management. And, in that scenario, it’s plausible to execute at arm’s length. Also, things like SEO and content management fall well down the chain, typically beneath the threshold of senior management awareness. By the way, usability and other user-centric practices typically suffer the same fate.

But what if we moved our thinking from a chain to a cycle: Marketing > Management > Strategy > Marketing > Revenue > Marketing (and repeat)? Let me explain. To begin with, Marketing is perfectly situated to become the “sensemaking” interface with the market. This goes beyond market research, which very seldom truly informs strategy. Market research in its current form is typically intended to optimize the marketing program.

I’m talking about a much bigger role – Marketing would define the “outside in” view of the company, which would form the context within which strategy would be determined by Management. Sensemaking as it applies to corporate strategy is a huge topic, but for brevity’s sake, let’s suppose that Marketing fills the role of the corporation’s five senses, defining what reality looks (and smells and sounds and tastes and feels) like. Then, when strategy is defined within that context, Marketing is well positioned to execute on it. Finally, execution is not the end – it is the beginning of another cycle. Sensemaking is an iterative process. Marketing then redefines what reality looks like and the cycle starts over again.

Bringing stewardship of marketing back to the very heart of the organization fundamentally changes things like arm’s length agency partnerships. It creates a virtuous cycle that runs through the length and breadth of a company’s activities. Things like SEO, content creation and usability naturally fall into place.

Why SEO Never Lived Up to Its Potential

IAB Canada President Chris Williams asked me a great question last week.

We had just finished presenting the results of the new eye tracking study I told you about in the last three columns. I had mentioned that about 84% of all the clicks on the page in the study were on some type of non-paid result. I had also polled the audience of some 400-plus Internet marketers about how many were doing some type of organic optimization. A smattering of hands (which, in case you’re wondering, is somewhere south of a dozen, or about 3% of the audience) went up. Williams picked up on the disconnect right away. “We have a multi-billion dollar interactive advertising industry here in Canada, and you’re telling me that (on search at least) it only represents about 16% of the potential traffic? Why isn’t SEO a massive industry?”

Like I said – great question. I wish I had responded with a great answer. But the best I could do fell well short of the mark: “Uhh..well…(pick up slight whining tone at this point)…SEO is just really, really hard!”

Okay, maybe I was slightly more eloquent than that – but the substance of my reply was essentially that flimsy. SEO is a backbreaking way to earn a living, whether you’re a lone consultant, an agency or an in-house marketer.

Coincidentally, I was also in an inaugural call last week with a dear friend of mine who asked me to serve on the advisory board of his successful digital agency. I asked if they offered SEO services. I got the same answer from him – SEO was just too hard to make it profitable. They dropped it a few years ago from their services portfolio.

It Was a Case of Showing Search the Money

The potential value of SEO hasn’t changed in the almost 20 years since I started in this biz. In fact, it’s probably greater than ever. But SEO never seems to gain traction. The reason becomes clear when you start following the money. GoTo.com (which became Overture, which was swallowed by Yahoo) sealed SEO’s fate when it started auctioning off search ads in 1998. Google eventually followed suit in 2000 and the rest, along with SEO, was history. Even devout SEOers (myself included) eventually followed the money trail to the paid side of the house. The reasons why were abundantly and painfully clear when you consider this one particular example. We had the SEO contract with one Fortune 500 brand that brought in about $300K annually. At the time, it would have been our biggest SEO contract, but it also was resource intensive. We had an entire team working on it. We did well, securing a number of first-page results for some very high traffic terms. Based on what analytics we had, it appeared that SEO was driving about 90% of the traffic and was converting substantially better than any other traffic source, including paid search. This translated into hundreds of millions of dollars in business yearly. But we could never seem to grow our contract beyond that $300K ceiling.

Paid search was another story. From fairly humble beginnings, that same brand became one of Google’s top advertisers, spending over $30 million per year. The management of that contract became a multimillion-dollar account. Unfortunately, it wasn’t our account. It belonged to another agency – a much smarter and more profitable agency.

Why We Got Pigeonholed with SEO

If, as a service provider, you live and die by SEO, it’s probable that you’ll end up dying by SEO. Here’s why. To gain any traction you need to have influence over almost every aspect of the business. SEO has to become systemic. It has to be baked into the way an organization does business. It can’t be done as window dressing.

Most organizations don’t get that. They get tantalized by initial easy wins – things like cleaning up code, improving crawlability and doing some basic content optimization. Organic traffic skyrockets and everyone cheers. Life is good. But then it gets hard. The next step means rolling up your sleeves and diving deep into the guts of the organization. And if that organization isn’t ready to open the kimono to the SEO consultant at all levels, you hit a brick wall. This is typically where the organization falls prey to more unscrupulous SEO promises and practices from other vendors, which invariably get slammed by a future algo-update. And that brings us to the last challenge for SEO.

Flip Your SEO Coin

Even the best SEOers can get blindsided by Google. A tweak in an algorithm or a shift in ranking factors can drop you like a rock from the first page. And, if the recent study showed anything, it was that you can’t afford to drop from the first page. Traffic can go from a roar to a whisper overnight. That’s tough for the marketing department of an organization to swallow. People in the C-Suite that sign off on a sizable SEO contract have a tough time understanding why their investment suddenly got flushed down Google’s drain, perhaps never to resurface. They love control, and SEO offers anything but. As important as SEO is, it’s not predictable. You can’t bank on it.

So Chris…thanks for the question. Like I said, it was a really good one. And I hope this is a little better answer than the one I came up with on the spot.

Evolved Search Behaviors: Takeaways for Marketers

In the last two columns, I first looked at the origins of the original Golden Triangle, and then looked at how search behaviors have evolved in the last 9 years, according to a new eye tracking study from Mediative. In today’s column, I’ll try to pick out a few “so whats” for search marketers.

It’s Not About Location, It’s About Intent

In 2005, search marketing was all about location. It was about grabbing a part of the Golden Triangle, and the higher, the better. The delta between scanning and clicks from the first organic result to the second was dramatic – by a factor of 2 to 1! Similar differences were seen in the top paid results. It’s as if, given the number of options available on the page (usually between 12 and 18, depending on the number of ads showing), searchers used position as a quick and dirty way to filter results, reasoning that the higher the result, the better match it would be to their intent.

In 2014, however, it’s a very different story. Because the first scan is now to find the most appropriate chunk, the importance of being high on the page is significantly lessened. Also, once the second step of scanning has begun within a results chunk, there seems to be more vertical scanning within the chunk and less lateral scanning. Mediative found that in some instances, it was the third or fourth listing in a chunk that attracted the most attention, depending on content, format and user intent. For example, in the heat map shown below, the third organic result actually got as many clicks as the first, capturing 26% of all the clicks on the page and 15% of the time spent on the page. The reason could be that it was the only listing that had the Google Ratings rich snippet, thanks to the proper use of structured data markup. In this case, the information scent that promised user reviews was a strong match with user intent, but you would only know this if you knew what that intent was.

[Image: heat map of Google results for a Ford Fiesta search]
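As an aside, here’s roughly what that kind of structured data looks like in practice – a minimal sketch of schema.org AggregateRating markup of the sort that can earn a ratings rich snippet. The product name and numbers are hypothetical placeholders, and whether the snippet actually shows is always at Google’s discretion:

```python
import json

# A hypothetical schema.org Product with an AggregateRating - the kind
# of structured data behind the ratings rich snippet mentioned above.
snippet = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Product",  # placeholder, not a real listing
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
}

# The JSON-LD block would be embedded in the page's <head>.
print('<script type="application/ld+json">')
print(json.dumps(snippet, indent=2))
print("</script>")
```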

This change in user scanning strategies makes it more important than ever to understand the most common intents that would make users turn to a search engine. What decision steps will they go through, and at which of those steps might they turn to a search engine? Would it be to discover a solution to an identified need, to find out more about a known solution, to help build a consideration set for direct comparisons, to look for one specific piece of information (i.e., a price) or to navigate to one particular destination, perhaps to order online? If you know why your prospects might use search, you’ll have a much better idea of what you need to do with your content to ensure you’re in the right place at the right time with the right content. Nothing shows this more clearly than the following comparison of heat maps. The one on the left was the heat map produced when searchers were given a scenario that required them to gather information. The one on the right resulted from a scenario where searchers had to find a site to navigate to. You can see the dramatic difference in scanning behaviors.

[Image: heat maps compared – informational intent (left) vs. navigational intent (right)]

If search used to be about location, location, location, it’s now about intent, intent, intent.

Organic Optimization Matters More than Ever!

Search marketers have been saying that organic optimization has been dying for at least two decades now, ever since I got into this industry. Guess what? Not only is organic optimization not dead, it’s now more important than ever! In Enquiro’s original 2005 study, the top two sponsored ads captured 14.1% of all clicks. In Mediative’s 2014 follow-up, that number really didn’t change much, edging up to 14.5%. What did change was the relevance of the rest of the listings on the page. In 2005, all the organic results combined captured 56.7% of the clicks. That left about 29% of the users either going to the second page of results, launching a new search or clicking on one of the side sponsored ads (these only accounted for a small fraction of the clicks). In 2014, the organic results, including all the different category “chunks,” captured 74.6% of the clicks. This leaves only 11% either clicking on the side ads (again, a tiny percentage), going to the second page or launching a new search. That means that Google has upped its first-page success rate to an impressive 90%.
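If you want to check that arithmetic, here’s a quick sketch using the percentages as reported above:

```python
# Click shares as reported in the 2005 Enquiro and 2014 Mediative studies.
studies = {
    2005: {"top_sponsored": 14.1, "organic": 56.7},
    2014: {"top_sponsored": 14.5, "organic": 74.6},
}

for year, shares in studies.items():
    first_page = shares["top_sponsored"] + shares["organic"]
    elsewhere = 100 - first_page  # side ads, page two, or a new search
    print(f"{year}: first page captured {first_page:.1f}% of clicks, "
          f"leaving {elsewhere:.1f}% elsewhere")

# 2005: first page captured 70.8% of clicks, leaving 29.2% elsewhere
# 2014: first page captured 89.1% of clicks, leaving 10.9% elsewhere
```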

First of all, that means you really need to break onto the first page of results to gain any visibility at all. If you can’t do it organically, make sure you pay for presence. But secondly, it means that of all the clicks on the page, some type of organic result is capturing 84% of them. The trick is to know which type of organic result will capture the click – and to do that you need to know the user’s intent (see above). But you also need to optimize across your entire content portfolio. With my own blog, two of the biggest traffic referrers happen to be image searches.

Left Gets to Lead

The left side of the results page has always been important, but the evolution of scanning behaviors now makes it vital. The heat map below shows just how important it is to seed the left-hand side of your results with information scent.

[Image: heat map showing scanning concentrated on the left-hand side of Google’s results]

Last week, I talked about how the categorization of results had caused us to adopt a two-stage scanning strategy: the first stage to determine which “chunks” of result categories are the best match to intent, and the second to evaluate the listings in the most relevant chunks. The vertical scan down the left-hand side of the page is where we decide which “chunks” of results are the most promising. And, in the second scan, because of the improved relevancy, we often make the decision to click without a lot of horizontal scanning to qualify our choice. Remember, we’re only spending a little over a second scanning the result before we click. This is just enough to pick up the barest whiffs of information scent, and almost all of the scent comes from the left side of the listing. Look at the three choices above that captured the majority of scanning and clicks. The search was for “home decor store toronto.” The first popular result was a local result for the well-known brand Crate and Barrel. This reinforces how important brands can be if they show up on the left side of the result set. The second popular result was a website listing for another well-known brand – Pottery Barn. The third was a link to Yelp – a directory site that offered a choice of options. In all cases, the scent found in the far left of the result was enough to capture a click. There was almost no lateral scanning to the right. When crafting titles, snippets and metadata, make sure you stack information scent to the left.
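To make “stacking scent to the left” concrete, here’s a hypothetical before-and-after title tag for that same “home decor store toronto” query – the wording is invented purely for illustration:

```python
# Two hypothetical title tags for the same page. In a roughly one-second
# scan down the left edge of the results, only the first few words
# register, so the query-matching scent has to come first.
before = "Welcome to Our Site | Great Selection and Prices | Home Decor"
after = "Home Decor Store in Toronto | Free Local Delivery"

# Roughly what survives a quick left-edge scan:
for title in (before, after):
    print(title[:28] + "...")
```

The second title spends its scarce left-edge real estate on the words the searcher is actually scanning for.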

In the end, there are no magic bullets from this latest glimpse into search behaviors. It still comes down to the five foundational planks that have always underpinned good search marketing:

  1. Understand your user’s intent
  2. Provide a rich portfolio of content and functionality aligned with those intents
  3. Ensure your content appears at or near the top of search results, either through organic optimization or well run search campaigns
  4. Provide relevant information scent to capture clicks
  5. Make sure you deliver on what you promise post-click

Sure, the game is a little more complex than it was 9 years ago, but the rules haven’t changed.

Strategic Planning as though the Future Matters – Strategy and Leadership


Why do organizations get blindsided by market transformations that could have been anticipated? After all, scenario planning has been a widely used strategic planning tool for decades and most managers are familiar with the process of considering how they would operate in alternative futures. The reason most organizations get surprised by game-changing events, in my experience, is not that their planning methods are bad. The problem is that they undertake strategic planning processes like scenario development without seeing them as a unique opportunity for learning about and exploring the future. In some cases this is because management lacks sufficient appreciation for the uncertainty and ambiguity their organizations face. More often, however, management is fully aware of the uncertainty of their situation but is seemingly powerless to prepare to adapt to new business realities, especially unpleasant ones.

To help planners avoid strategic surprise, Monitor 360 has created a five-step strategic planning process that has been tested in interactions with leaders in the military, intelligence community, and corporations. By systematically incorporating plausible but challenging future scenarios into their learning processes, decision makers can both mitigate risk and decrease the likelihood of being unprepared for discontinuities. This approach overcomes the paralysis that sometimes happens when people see all the uncertainty their organization faces, as well as the denial that happens when they don’t.

Multiple Futures

When thinking about the future, many strategic planners make the mistake of asking, “What will the future be?” Because the future is the net result of so many complex and interdependent issues, the question is daunting, and perhaps unanswerable.

A more realistic question is, “What are the possible challenging futures?” Exploring multiple ways the future could unfold, including ways that would require the organization to radically adapt, enables leaders to better prepare for a wide range of contingencies, and to manage the consequences more effectively when surprises do occur.

Scenario analysis can provide planners with a systematic way of imagining the future and identifying winning long-term strategies that respond to the many ways the future could play out. It helps individuals and their organizations identify and challenge their entrenched mental models and assumptions about what the future might hold, while helping bound the uncertainties they face.

Instead of attempting to predict what’s going to happen, the scenario methodology offers a way to see the forces as they are taking shape and not be blindsided when they lead to major changes. Anticipating the future gives decision-makers the ability to look in the right place for game-changing events, to rehearse the appropriate responses and to systematically track indicators of change.

Five Mindsets for Managing Uncertainty

Scenario thinking is the foundation of our five-step toolkit because of the unique ways it allows leaders to explore and exploit the unknown, and because it offers managers a methodology to consider alternatives in the face of uncertainty. To make scenario planning more effective, we’ve identified five discrete steps in the process, each of which should be undertaken with a distinct mindset. It is important to take these steps one at a time and in order, rather than skipping right away to decision-making.

Create Scenarios — Unleash your Imagination

Scenarios are plausible narratives about futures that are distinctly different from the present. If they are well prepared, they allow for a thorough exploration of future risks and opportunities. Scenario thinkers begin at the same place as traditional risk managers, skillfully making an inventory of what is known about the future. After exploring issues such as demographics as well as aspects of industry structure and customer behavior, scenario thinkers turn to the unknown, the unknowable, and the perceptions that should be challenged. Following a rigorous analytical process aimed at articulating the range of uncertainties an organization could face and all of the relevant outcomes, scenario thinkers design a number of cogent narratives about relevant futures.

Scenarios are written as plausible stories — not probable ones. Traditional risk management is based on probabilities, actuary tables, and other known and measurable quantities. But scenarios are intended to provoke the imagination and provide a more comprehensive view of risk, so that the results can shed light on critical strategic decisions.

It is important to note that scenario developers create multiple futures, rather than just one. This allows for a more complete exploration of the future, thus avoiding getting wedded to a specific set of assumptions about how uncertainties will unfold. The process of developing multiple scenarios helps to increase the possibility that leaders will not be surprised, because it allows them to rehearse multiple unique futures. Importantly, it also grounds decision-makers in the reality that, in most circumstances, they cannot accurately predict the future. Rather than falsely assuming one outcome will happen, leaders learn that they must make decisions in light of the true uncertainty they face.

As an example of this process, the U.S. Navy developed a set of scenarios to help guide the development of the first unified strategy of all the country’s maritime forces, “A Cooperative Strategy for 21st Century Seapower,” released in October 2007. The first step was to develop four working scenarios. These were discussed and refined in a series of eight working sessions around the country with people from the business, government, and academic sectors who could provide valuable insight about issues the Navy needed to address in the future. The participation of these experts, and their feedback, helped to test the validity of the scenarios, which were then refined for publication and dissemination.

The scenarios had a significant impact on the future strategy of the Navy. For example, they helped to provide a new mission for the Navy in its response to humanitarian crises. The report concluded: “Building on relationships forged in times of calm, we will continue to mitigate human suffering as the vanguard of interagency and multinational efforts, both in a deliberate, proactive fashion and in response to crises. Human suffering moves us to act, and the expeditionary character of maritime forces uniquely positions them to provide assistance.”

Determine Required Capabilities for each Scenario — Give your Creativity Free Rein

The second step of the process is to identify what it takes to be successful in each of the futures identified. After the scenario process has imagined distinctly different future worlds that the organization’s leaders have acknowledged are plausible, relevant, and important, what would a high-performing organization look like in each of these worlds? That is, if an organization were dealt the card of one scenario, what would it need to do in order to be successful?

To answer this question, planners need to make a list of key success factors and capabilities. Key capabilities for militaries or intelligence agencies might be the ability to project force rapidly abroad or the ability to collect and process open-source information. Capabilities often start with “the ability to….” For companies these might be the ability to build brands that address customer needs and inspire loyalty as well as the ability to launch products quickly.

As a case in point, a major software company needed to determine where to invest its limited resources to succeed in a market roiled by new competitors. In one scenario, the company needed good relationships with its value-added resellers and excellent customer service. In another scenario, it needed an entirely different set of capabilities, including low cost and operating system integration.

It’s important to address the capabilities question as if it were a set of independent problems: what would it take to be a winner in a given scenario? Doing so encourages bold, creative thinking, and avoids the trap of limiting the alternatives to those that are doable with current capabilities and resources. By keeping this step separate from the next one, assessing current capabilities, planners are not hobbled by thinking only about what they are good at today, nor do they have to struggle with imagining themselves in four different worlds at the same time.

It is often a wrenching experience for leaders to simply look for the absolute best strategic posture for their organization in each scenario. This is one measure of how hard it is for them to imagine doing business in any future that has totally different success factors from the current environment.

Assess Current Capabilities — Be Painfully Realistic

Separate from the critical examination of the capabilities needed for success in each scenario, planners must ask: What are we good at right now? The answers could be human capital, relationships, or operational efficiency. These capabilities are generally described as competitive assets that cannot be bought and sold on the free market. Organizations can’t just say, “We’ll invest $100 million next month, and then we’ll have that ability” or “We want to do that.” They have to build the capability over time.

Often an outside perspective — for example, based on detailed discussions with customers — can be helpful in getting an unbiased evaluation of which capabilities an organization excels in.

Identify Gaps — Provide Honest Analysis

Next, the organization should compare its own capabilities with the capabilities needed to succeed in the scenarios. Such capability maps will not only highlight which capabilities it needs to develop — the capability gaps — but also which capabilities it has already invested in that may become redundant.

Make Choices — Consider your Options

Once organizations have analyzed the gap between their strengths and the capabilities needed in each scenario, they face some big decisions. There could be capabilities that they need in all the scenarios imagined but that they don’t currently have. As a first step, an organization might safely develop these. We call those “no-regrets moves.”

Other moves are what we call “big bets.” These are capabilities needed in a particular scenario or a small number of scenarios, but not in others. Organizations make these bets consciously, after systematically thinking through the types of capabilities, their relationship to the environment around them, and the futures that they feel are likely to occur. They can adjust their decisions as more data is collected or events unfold in the world. This process is based on the theory of real options, which suggests an organization can gain an advantage by making many small bets and, as information accumulates, start to increase or decrease those bets accordingly.

The crucial questions for organizations to ask when making choices are: What would be the risk if a scenario happened and we didn’t have this capability? And what would be the risk if a scenario didn’t happen and we did have it?

What’s Clouding the Future?

It’s painfully difficult for individual leaders to keep their minds open to multiple futures and to follow a systematic process like the one described above. IBM’s famous story illustrates this all-too-common tendency.

In 1980, the personal computer represented a tiny niche market. When IBM was considering developing a computer for the masses, it convened a working group to forecast its potential market. The team projected that the market for PCs would total a mere 241,683 units over the next five years, and that demand would peak after two years and then trend downward. They believed that since existing computers had such a small amount of processing power, people would not want to purchase a second one.

As a result, IBM determined that there was no potential in the marketplace and effectively killed its effort to dominate the personal computer market, ceding the operating system to Microsoft and the processor to Intel. IBM lost out on a $240 billion market, one in which nearly every household in the developed world would eventually want one or more of the machines and then would want to upgrade them every few years.

But what if someone in the room had asked, “What if people want a PC on every desktop? What if individuals start carrying PCs in their pockets? What if PCs develop a communications capability? What if they are widely used to play games? Maybe we should think of a different scenario where the market would be more like 20 million units?” These would have been completely off-the-wall, outrageous ideas at the time, but if just one person in the room had explored such different lines of thought, the futures of Microsoft, Intel and IBM might have evolved differently.

The Benefits of a Systematic, Disciplined Approach

Anticipating the future isn’t just about avoiding strategic surprise or minimizing the downside risk. There’s also a huge upside: You are creating the future that you want and making sense of how the world may play out. Understanding your choices can be an empowering process.

When planners follow a process that systematically cuts through the barriers to effective group learning and decision-making, and combine that process with principles that give discipline and robustness to the entire endeavor, the future, and our place in it, comes into a much sharper focus.

Google’s Golden Triangle – Nine Years Later

Last week, I reviewed why the Golden Triangle existed in the first place. This week, we’ll look at how the scanning patterns of Google users have evolved in the past 9 years.

The reason I wanted to talk about Information Foraging last week is that it really sets the stage for understanding how the patterns have changed with the present Google layout. In particular, one thing was true for Google in 2005 that is no longer true in 2014 – back then, all results sets looked pretty much the same.

Consistency and Conditioning

If humans do the same thing over and over again and usually achieve the same outcome, we stop thinking about what we’re doing and we simply do it by habit. It’s called conditioning. But habitual conditioning requires consistency.

In 2005, the Google results page was a remarkably consistent environment. There were always 10 blue organic links, and usually there were up to three sponsored results at the top of the page. There may also have been a few sponsored results along the right side of the page. Also, Google would put what it determined to be the most relevant results, both sponsored and organic, at the top of the page. This meant that for any given search, no matter the user intent, the top 4 results should presumably include the most relevant one or two organic results and a few hopefully relevant sponsored options for the user. If Google did its job well, there should be no reason to go beyond these top 4 results, at least in terms of a first click. And our original study showed that Google generally did do a pretty good job – over 80% of first clicks came from the top 4 results.

In 2014, however, we have a much different story. The 2005 Google was a one-size-fits-all solution. All results were links to a website. Now, not only do we have a variety of results, but even the results page layout varies from search to search. Google has become better at anticipating user intent and dynamically changes the layout on each search to be a better match for intent.

[Image: heat map of a 2014 Google results page]

What this means, however, is that we need to think a little more whenever we interact with a search page. Because the Google results page is no longer the same for every single search we do, we have exchanged consistency for relevancy. This means that conditioning isn’t as important a factor as it was in 2005. Now, we must adopt a two-stage foraging strategy. This is shown in the heat map above. Our first foraging step is to determine what categories – or “chunks” of results – Google has decided to show on this particular results page. This is done with a vertical scan down the left side of the results set. In this scan, we’re looking for cues on what each chunk offers – typically in category headings or other quickly scanned labels. This first step is to determine which chunks are most promising in terms of information scent. Then, in the second step, we go back to the most relevant chunks and start scanning in a more deliberate fashion. Here, scanning behaviors revert to the “F” shaped scan we saw in 2005, creating a series of smaller “Golden Triangles.”

What is interesting about this is that although Google’s “chunking” of the results page forces us to scan in two separate steps, it’s actually more efficient for us. The time spent scanning each result is half of what it was in 2005, 1.2 seconds vs. 2.5 seconds. Once we find the right “chunk” of results, the results shown tend to be more relevant, increasing our confidence in choosing them.  You’ll see that the “mini” Golden Triangles have less lateral scanning than the original. We’re picking up enough scent on the left side of each result to push our “click confidence” over the required threshold.
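To put the two-stage strategy in more concrete terms, here’s a toy model of it. The chunk labels and “scent” scores are hypothetical stand-ins for the split-second judgments the study describes:

```python
# A toy model of the two-stage foraging strategy described above.
# Stage 1: a vertical scan down the left side ranks the "chunks" by the
# strongest scent cue each offers. Stage 2: an F-shaped scan within the
# winning chunk picks the listing to click. All scores are hypothetical.
page = {
    "local results": [0.5, 0.4, 0.3],
    "organic links": [0.9, 0.7, 0.6, 0.5],
    "image results": [0.2, 0.2],
}

# Stage 1: pick the most promising chunk.
best_chunk = max(page, key=lambda name: max(page[name]))

# Stage 2: scan within that chunk only.
listings = page[best_chunk]
choice = listings.index(max(listings))

print(f"scan the '{best_chunk}' chunk, click listing #{choice + 1}")
# -> scan the 'organic links' chunk, click listing #1
```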

A Richer Visual Environment

Google also offers a much more visually appealing results page than they did 9 years ago. Then, the entire results set was text based. There were no images shown. Now, depending on the search, the page can include several images, as the example below (a search for “New Orleans art galleries”) shows.

[Image: heat map of Google results for “New Orleans art galleries,” including image results]

The presence of images has a dramatic impact on our foraging strategies. First of all, images can be parsed much more quickly than text. We can determine the content of an image in fractions of a second, where text requires a much slower and more deliberate type of mental processing. This means that our eyes are naturally drawn to images. You’ll notice that the above heat map has a light green haze over all the images shown. This is typical of the quick scan we do immediately upon page entry to determine what the images are about. Heat in an eye tracking heat map is produced by duration of foveal focus, and this can be misleading when we’re dealing with images. The fovea centralis is, predictably, in the center of our eye, where our focus is the sharpest. We use it extensively when reading, but it’s not as important when we’re glancing at an image. We can make a coarse judgment about what a picture is without focusing on it. We don’t need our fovea to know it’s a picture of a building, or a person, or a map. It’s only when we need to determine the details of a picture that we’ll recruit the fine-grained resolution of our fovea.

Our ability to quickly parse images makes it likely that they will play an important role in our initial orientation scan of the results page. We’ll quickly scan the available images looking for information scent. If the image does offer scent, it will also act as a natural entry point for further scanning. Typically, when we see a relevant image, we look in the immediate vicinity to find more reinforcing scent. We often see scanning hot spots on titles or other text adjacent to relevant images.

We Cover More Territory – But We’re Also More Efficient

So, to sum up, it appears that with our new two step foraging strategy, we’re covering more of the page, at least on our first scan, but Google is offering richer information scent, allowing us to zero in on the most promising “chunks” of information on the page. Once we find them, we are quicker to click on a promising result.

Next week, I’ll look at the implications of this new behavior on organic optimization strategies.

The Evolution of Google’s Golden Triangle

In search marketing circles, most everyone has heard of Google’s Golden Triangle. It even has its own Wikipedia entry (which is more than I can say). The “Triangle” is rapidly coming up to its 10th birthday (it was March of 2005 when Did It and Enquiro – now Mediative – first released the study). This year, Mediative conducted a new study to see if what we found a decade ago still continues to be true. Another study, from the Institute of Communication and Media Research in Cologne, Germany, also looked at the evolution of search user behaviors. I’ll run through the findings of both studies to see if the Golden Triangle still exists. But before we dive in, let’s look back at the original study.

Why We Had a Golden Triangle in the First Place

To understand why the Golden Triangle appeared in the first place, you have to understand how humans look for relevant information. For this, I’m borrowing heavily from Peter Pirolli and Stuart Card at PARC and their Information Foraging Theory (by the way, absolutely every online marketer, web designer and usability consultant should be intimately familiar with this theory).

Foraging for Information

Humans “forage” for information. In doing so, they are very judicious about the amount of effort they expend to find the available information. This is largely a subconscious activity, with our eyes rapidly scanning for cues of relevancy. Pirolli and Card refer to these cues as “information scent.” Picture a field mouse scrambling across a table looking for morsels to eat and you’ll have an appropriate mental context in which to understand the concept of information foraging. In most online contexts, our initial evaluation of the amount of scent on a page takes no more than a second or two. In that time, we also find the areas that promise the greatest scent and go directly to them. To use our mouse analogy, the first thing she does is scurry quickly across the table and see where the scent of possible food is the greatest.

The Area of Greatest Promise

Now, imagine that same mouse comes back day after day to the same table, and every time she returns, she finds the greatest amount of food is always in the same corner. After a week or so, she learns that she doesn’t have to scurry across the entire table. All she has to do is go directly to that corner and start there. If, by some fluke, there is no food there, then the mouse can again check out the rest of the table to see if there are better offerings elsewhere. The mouse has been conditioned to go directly to the “Area of Greatest Promise” first.

[Image: the original 2005 Golden Triangle heat map]

F-Shaped Scanning

This was exactly the case when we did the first eye tracking study in 2005. Google had set a table of available information, but they always put the best information in the upper left corner. We became conditioned to go directly to that area of greatest promise. The triangle shape came about because of the conventions of how we read in the western world. We read top to bottom, left to right. So, to pick up information scent, we would first scan down the beginning of each of the top 4 or 5 listings. If we saw something that seemed to be a good match, we would scan across the title of the listing. If it was still a good match, we would quickly scan the description and the URL. If Google was doing its job right, there would be more of this lateral scanning on the top listing than there would be on the subsequent listings. This F-shaped scanning strategy would naturally produce the Golden Triangle pattern we saw.

Working Memory and Chunking

There was another behavior we saw that helped explain the heat maps that emerged. Our ability to actively compare options requires us to hold in our mind information about each of the options. This means that the number of options we can compare at any one time is restricted by the limits of our working memory. George Miller, in a famous paper in 1956, determined this to be 7 pieces of information, plus or minus two. The actual number depends on the type of information to be retained and the dimension of variability. In search foraging, the dimension is relevancy, and the inputs to the calculation will be quick judgments of information scent based on a split-second scan of the listing. This is a fairly complex assessment, so we found that the number of options to be compared at once by the user tends to max out at about 3 or 4 listings. This means that the user “chunks” the page into groupings of 3 or 4 listings and determines if one of the listings is worthy of a click. If not, the user moves on to the next chunk. We also see this in the heat map shown. Scanning activity drops dramatically after the first 4 listings. In our original study, we found that over 80% of first clicks on all the results pages tested came from the top 4 listings. This is also likely why Google restricted the paid ads shown above the organic results to 3 at most.
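Here’s a minimal sketch of that chunk-by-chunk evaluation. The scent scores are hypothetical, and the threshold simply stands in for the user’s “click confidence”:

```python
# A toy model of chunked scanning: evaluate listings in working-memory
# sized groups of 3-4, and click the first listing in a chunk whose
# information scent clears the confidence threshold.
CHUNK_SIZE = 4         # listings held in working memory at once
CLICK_THRESHOLD = 0.7  # hypothetical "click confidence" cutoff

def first_click(scent_scores, chunk_size=CHUNK_SIZE):
    """Scan chunk by chunk; return the index of the first listing
    that beats the threshold, or None (page two / new search)."""
    for start in range(0, len(scent_scores), chunk_size):
        chunk = scent_scores[start:start + chunk_size]
        best = max(chunk)
        if best >= CLICK_THRESHOLD:
            return start + chunk.index(best)
    return None

# Ten listings; the scent concentrates in the first chunk, consistent
# with 80%+ of first clicks landing on the top 4 listings in 2005.
print(first_click([0.9, 0.6, 0.5, 0.4, 0.3, 0.2, 0.2, 0.1, 0.1, 0.1]))  # -> 0
```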

So, that’s a quick summary of our findings from the 2005 study. Next week, we’ll look at how search scanning has changed in the past 9 years.

Note: Mediative and SEMPO will be hosting a Google+ Hangout talking about their research on October 14th. Full details can be found here.

The Metaphysical Corporation and The Death of Capitalism?

Something strange is happening to companies. More and more, their business is being conducted in non-physical markets. Businesses used to produce stuff. Now, they produce ideas. A recent op-ed piece from Wharton speculated that companies are working their way up Maslow’s Hierarchy. The traditional business produced things that met the needs of the lowest levels of the pyramid – shelter, food, warmth, security. As consumerism spread, companies worked their way up to next levels: entertainment, attainment and enjoyment.  Now, the things that companies sell sit at the top of the pyramid – fulfillment, creativity, self-actualization.

The post also talks about another significant shift that’s happening on the balance sheets of Corporate America. Not only are the things that corporations sell changing, but the things that make up the value of the company itself are also changing. According to research by Ocean Tomo, a merchant bank that specializes in intellectual property, the asset mix of companies has shifted dramatically in the past 40 years. In 1975, tangible assets (buildings, land, equipment, inventory) made up 83% of the market value of the S&P 500 companies. By 2010, that had flipped – intangible assets (patents, trademarks, goodwill and brand) made up 80% of the market value of the S&P 500.

Chains vs Networks and the Removal of Friction

Barry Libert, Jerry Wind and Megan Beck Finley, the authors of the Wharton piece, focus mainly on the financial aspects of this shift. They point out that generally accepted accounting principles (GAAP) are quickly falling behind this corporate evolution. For example, employees are still classified as an expense, rather than an asset. I’m personally more interested in what this shift means for the very structure of a corporation.

If you built stuff, you needed a supply chain. Vertical integration was the way to remove physical transactional friction from the manufacturing process. Vertical integration bred hierarchical management styles. Over time, technology would remove some of the friction, and some parts of the chain might evolve into open markets. The automotive industry is a good example. Many of the components of your 2015 Fusion are supplied to Ford by independent vendors. Despite this, makers of “stuff” still want to control the entire chain through centralized management.

But if you sell ideas, you need to have a network. Intangible products don’t have any physical friction, so supply chains are not required. And if you try to control a network with a centralized hierarchy, branches of your network soon wither and die.

The New Real Thing

Coke has not been a maker of stuff for quite some time now. Sure, they make beverages, so technically they’re quenching our thirst, but the true value of Coke lies in its brand and our connection to that brand. The “Real Thing” is, ironically and quite literally, a figment of our imagination. If you were to place Coke on Maslow’s Hierarchy, it wouldn’t sit on the bottom level (Physiological) but on the third (Love/Belonging) or even the fourth (Esteem).

Coke is very aware of its personal connection with its customers and the intangibles that come with it. That’s why the Coca-Cola Freestyle vending machine comes with the marketing tag line: “So many options. Thirst isn’t one of them.” You can customize your own formulation from over 100 choices, and if you have the Freestyle app, you can reorder your brand at any Coke Freestyle machine in the world. Of course, Coke is quietly gathering all the customer data that’s generated, including consumption patterns and regional preferences. Again, this intimate customer insight is just one of the intangibles that is becoming increasingly valuable.

Coke is not only changing how it distributes its product. It’s also grappling with changing its very structure. In a recent conversation I had with CMO Joe Tripodi, he talked a lot about Coke’s move towards becoming a networked corporation. Essentially, Coke wants to make sure that worldwide innovation isn’t choked off by commands coming from Atlanta.

The Turning Point of Capitalism

As corporate America moves away from the making of physical stuff and towards the creation of concepts that it shares with customers, what does that mean for capital markets? In his new book The Zero Marginal Cost Society, Jeremy Rifkin contends that capitalism is dying a slow death. Eventually, it will be replaced by a new collaborative common market made possible by the steady shrinkage of marginal costs. As we move from the physical to the metaphysical, the cost of producing consumable services or digital concept-based products (books, music, video, software) drops dramatically. Capital was required to overcome physical transactional friction. If that friction disappears, so does the need for capital. Rifkin doesn’t believe the death of capitalism will come any time soon, but he does see an inevitable trend towards a new type of market he calls the Collaborative Commons.

Get Intimate

My last takeaway is this – if future business depends on connecting with customers and their conceptual needs, it becomes essential to know those customers on a deeply intimate level.  Throw away any preconceptions from the days of mass marketing and start thinking about how to connect with the “Market of One.”

A Prospect Ignored isn’t Really a Prospect

I’ve ranted about this before and – oh yes – I shall rant again!

But first – the back-story.

I needed some work done at a property I own. I found three contractors online and reached out to each of them to get a quote.

Cue crickets.

No response. Nothing! So a few days later I politely followed up with each to prod the process along. Again, nothing. After 4 weeks of repeated e-nagging, one finally coughed up a quote. Most of the details were wrong, but at least someone at the other end was responding with minimal signs of consciousness.

Fast-forward 2 months. The work is still not done. At this point, I’m still trying to convey the specifics of the job and to get an estimated timeline. If I had an option, I’d take it. But the sad fact is, as spotty as the communication is with my contractor of choice, it’s still better than his competitors. One never did respond, even after a number of emails and voicemails. One finally sent a quote, but it was obvious he didn’t want the work. Fair enough. If the laws of supply and demand are imbalanced this much in their favor, who am I to fight it?

But here’s the thing. Market balances can change on a dime. Someday I’ll be in the driver’s seat and they’ll be scrambling to line up work to stay in business. And when they reach out to their contact list, a lot of those contacts will respond with an incredulous WTF. If you didn’t want my business when I needed you, why would you think I’d give it to you when you need me? A prospect spurned has a long memory for the specifics of said spurning. So, Mr. (or Ms.) Contractor, you can go take a flying leap.

If you’re going to use online channels to build your business, don’t treat it like a tap you can turn on and off at your discretion. Your online prospects have to be nurtured. If you can’t take any new business on, that’s fine. But at least have enough respect for them to send a polite response explaining the reason you can’t do the work. As long as we prospects are treated with respect, you’d be amazed at how reasonable we can be. Perhaps we can schedule the job for when you do have time. At the very least, we won’t walk away from the interaction with a bitter taste that will linger for years to come.

In 2005, Benchmark Portal did a study to compare response rates for email requests. The results were discouraging. Over 50% of SMBs never responded at all. Only a small fraction actually managed to respond within 24 hours of the request.

I would encourage you to do a little surreptitious checking on your own response rates. Prospects contacting you need your help, and none of us like to hear our pleas for help go unanswered. 24 hours may seem like a reasonable time frame to you, but if you’re on the other end, it’s more than enough time to see your enthusiasm cool dramatically. Make it someone’s job to field online requests and set a 4-hour response time limit. I’m not talking about an auto-generated generic email here. I’m talking about a personalized response that makes it clear that someone has taken the time to read your request and is working on it. Also give a clear indication of how long it will take to follow up with the required information.

Why are these initial responses so critical? It’s not just to keep your field of potential prospects green and growing. It’s also because we prospects are using something called “signaling” to judge future interactions with a business. When we reach out to a new business we find online, we have no idea what it will be like to be their customer. We don’t have access to that information. So, we use things we do know as a proxy for that information. These things provide “signals” to help us fill in the blanks in our available information. An example would be hiring new employees. We don’t know how the person we’re interviewing will perform as an employee, so we look for certain things in a resume or an interview to act as signals that would indicate that the candidate will perform well on the job if hired.

If I’m a prospect looking for a business – especially one providing a service that will require an extended relationship between the business and myself – I need signals to show me how reliable the business will be if I choose them. Will they get the work done in a timely manner? Will the quality of the work be acceptable? Will they be responsive and accommodating to my requirements? If problems arise, will they be willing to work through those problems? Those are all questions I don’t have the answer to. All I have are indications based on my current interactions with the business. And if those interactions have required my constant nagging and clarification to avoid incorrect responses, guess what my level of confidence might be with said business?