The Straw that Broke the Market’s Back

First published May 9, 2013 in Mediapost’s Search Insider

Customers are fickle — and I suspect they’re getting more fickle. Perhaps they’re even feeling a little entitled. A recent survey shows that customers tend to bail on a company not because of one big-time screw-up, but because of the accumulation of a lot of little annoyances. Soon, their frustration reaches a tipping point and they look elsewhere.

It would be easy to point the finger at the companies and demand that they get their collective acts together. But I suspect there’s more at play here. It would be my guess that customers are getting harder to please. And I would further guess that the Web is largely to blame. I think it comes down to a constant rise in our collective expectations, while the reality of our experiences falls behind.

The balance between our expectations and the actual experience determines our loyalty to any course of action. If we have low expectations and a poor experience, we aren’t really surprised, which dampens our subsequent disappointment and leaves us more willing to forgive and forget.  If we have low expectations but a good experience, we’re pleasantly surprised, making us more apt to return. If we have high expectations and a good experience, we get a double hit of happiness. First, we enjoy the anticipation, then we appreciate that the experience actually lives up to our expectations. For a vendor, the scariest scenario is the last of the four: high expectations but a poor experience. In this case, we walk away disappointed and frustrated.

Now, balancing expectations and experience wouldn’t be that difficult for any moderately competent company if those expectations were realistic. But I suspect that more and more of us are entering into our respective experiences with unrealistic expectations. We’re setting our vendors up to fail.

Expectations are set partly based on our past experiences, but they’re also set by the experiences of others. We create our expectation set points based, in part, on what we hear from others.

The Web has created an open, accessible market of experiences and hearsay. We hear about the bad, a feedback loop that increasingly calls out poor customer service. But we also hear about the good. Correction — we hear about the exceptional. The “good” is not remarkable. It generally falls within our expectations and so goes without comment. But either the very good or the very bad is exceptional, and we are more apt to comment on it online. Not only do we comment, we also embellish, accentuating the pluses and minuses to make it a better story. Therefore, what we hear from others sets either a very low or very high bar. We steer clear of the low bars, but the high bars stick with us, contributing to the setting of future expectations.

The other thing the Web has done is create expectations that overlap domains.  Previously, when our expectations were set based on our own experiences, they tended to stay domain-specific. We had an expectation of what it would be like to buy a car, stay at a hotel, eat at a restaurant or purchase a new pair of shoes. With the Web, cross-pollination between domains is increasingly common. A head marketer for a well-known industrial manufacturer once said to me, “When it comes to online experience, my competitors are not the traditional ones. I’m competing against Amazon and eBay. That type of experience is what people expect.”

This “nudging up” of expectations is done without much rational consideration. We don’t care much for the reality of operational logistics in any particular domain. We just want our expectations to be met, no matter where those expectations might come from. And when they’re not, we pull the plug on that particular vendor, assuming another vendor can do better in meeting our inflated expectations. The Web has also engendered a virulent “grass is always greener” view of the world. We know a competitor is just a click away (whether or not that vendor is any better than the incumbent).

I’ll be the first to call out a bad customer experience, but when it comes to the increasing fickleness of customers, we should remember that there are two sides to this particular story.

Anchoring and Search

First published in Mediapost’s Search Insider – April 25, 2013

A few columns back, I talked about psychological priming and how it could play out in a search environment. In today’s column, I’d like to talk about a related concept: value anchoring.

In almost every product category, with the exception of the things we buy very frequently (in my case, chocolate bars, beer and books), we don’t really know what the current going price would be. Either we don’t buy the item often enough, or its price is subject to market volatility. We may have a rough idea of prices, but we need to adjust that estimate to current market conditions.

We need a pricing framework because, as consumers, we need to establish in our own minds what a “fair” price would be. This concept of fairness taps into some pretty deep emotional triggers — ones that vendors should be aware of. I’ll explain in a minute how these concepts of fairness can play out in a typical purchase journey.

Remember, our determination of what price is fair is totally arbitrary. It’s not as if we know objectively what the “fair” price for a carton of eggs, a big-screen TV, or a hotel room in San Francisco is. We make these pricing decisions based on comparisons to available information. And it just so happens that the first piece of information that is available to us tends to play a significantly bigger role than any of the subsequent information that we may come upon. That first price we’re exposed to anchors our heuristic comparisons and tends to linger in our subconscious, triggering emotions that drive our perceptions of fairness.

If we have to adjust our pricing expectations upwards, because the first price baseline is too low, we feel frustrated and taken advantage of. Our brain’s warning signals go off and we suddenly feel anxious and go on the defensive. Our mood takes a turn for the worse.

If, on the other hand, we are able to adjust our pricing expectations downward because we’re finding prices substantially lower than the first price encountered, we’re almost euphoric. The reward center of our brain is telling us we’re getting a great deal and the resulting dopamine hit gives us a buying high.

Once again, these feelings are based on nothing more than us grasping at the first number we see, and then judging all subsequent pricing information against it. But the fact that this is nothing more than a gut call is exactly the point; its lack of rationality does nothing to diminish its emotional punch.
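The anchor-relative judgment described above can be sketched in a few lines of code. This is a toy illustration, not a model from the column: the function name, the 10% "close enough" tolerance and the example prices are all my own assumptions.

```python
# Toy sketch of anchor-relative price judgment. The linear "reaction"
# rule and the 10% tolerance band are illustrative assumptions.

def reaction_to_price(anchor: float, observed: float) -> str:
    """Judge an observed price against the first price seen (the anchor)."""
    delta = (observed - anchor) / anchor  # relative gap from the anchor
    if delta > 0.10:
        return "frustrated"   # expectations must adjust upward
    if delta < -0.10:
        return "euphoric"     # a perceived bargain
    return "neutral"          # close enough to still feel fair

# The same $450 hotel room feels very different depending on the anchor:
print(reaction_to_price(anchor=300, observed=450))  # frustrated
print(reaction_to_price(anchor=600, observed=450))  # euphoric
```

The point of the sketch is that the observed price never changes; only the first number we happened to see does, and that alone flips the emotional response.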

Now, let’s look at how this might play out in search. Remember, there’s a pretty good likelihood that many consumer journeys may start with a search engine. It’s also likely that many search advertisers might advertise the lowest price possible in order to capture the click. Given this, it wouldn’t be surprising to see that the initial benchmark price could be a very low one. There’s nothing wrong with this, as long as the price the buyer eventually pays lands in the same ballpark.

But, as is often the case, if prices start rising quickly because of the inevitable “fine print” exclusions, conditions and lack of availability, the advertiser is going to trigger all the wrong emotional reactions in the prospect. Rather than “hooking” prospects with that unobtainably low price, the advertiser instead unleashes a wave of negative emotions. Even if it still captures the sale (because the competition can’t beat the inflated price), it won’t engender any brand “love.”

This is yet another example of focusing on the end result without thinking about the journey. If we become myopically focused on conversion rates, for example, to the exclusion of all else, we might be ignorant of the long-term brand damage we might be causing by capturing those clicks through a digital version of the classic bait-and-switch con.


Psychological Priming and the Path to Purchase

First published March 27, 2013 in Mediapost’s Search Insider

In marketing, I suspect we pay too much attention to the destination, and not enough to the journey. We don’t take into account the cumulative effect of the dozens of subconscious cues we encounter on the path to our ultimate purchase. We certainly don’t understand the subtle changes of direction that can result from these cues.

Search is a perfect example of this.

As search marketers, we believe that our goal is to drive a prospect to a landing page. Some of us worry about the conversion rates once a prospect gets to the landing page. But almost none of us think about the frame of mind of prospects once they reach the landing page.

“Frame” is the appropriate metaphor here, because the entire interaction will play out inside this frame. It will impact all the subsequent “downstream” behaviors. The power of priming should not be taken lightly.

Here’s just one example of how priming can wield significant unconscious power over our thoughts and actions. Participants primed by exposure to a stereotypical representation of a “professor” did better on a knowledge test than those primed with a representation of a “supermodel.”

A simple exposure to a word can do the trick. It can frame an entire consumer decision path. So, if many of those paths start with a search engine, consider the influence that a simple search listing may have.

We could be primed by the position of a listing (higher listings = higher quality alternatives).  We could be primed (either negatively or positively) by an organization that dominates the listing real estate. We could be primed by words in the listing. We could be primed by an image. A lot can happen on that seemingly innocuous results page.

Of course, the results page is just one potential “priming” platform. Priming could happen on the landing page, a third-party site or the website itself. Every single touch point, whether we’re consciously interacting with it or not, has the potential to frame, or even sidetrack, our decision process.

If the path to purchase is littered with all these potential landmines (or, to take a more positive approach, “opportunities to persuade”), how do we use this knowledge to become better marketers? This does not fall into the typical purview of the average search marketer.

Personally, I’m a big fan of the qualitative approach (I know — big surprise) in helping to lay down the most persuasive path possible. Actually talking to customers, observing them as they navigate typical online paths in a usability testing session, and creating some robust scenarios to use in your own walk-throughs will yield far better results than quantitative number-crunching. Excel is not particularly good at being empathetic.

Jakob Nielsen has said that online, branding is all about experience, not exposure. As search marketers, it’s our responsibility to ensure that we’re creating the most positive experience possible, as our prospects make their way to the final purchase.

The devil, as always, is in the details — whether we’re paying conscious attention to them or not.

Weighing Positive and Negative Impacts on Users

First published January 31, 2013 in Mediapost’s Search Insider

We humans hate loss. In fact, we seem to weigh losing something about twice as heavily as gaining something. For example, imagine I gave you a coffee cup and then offered to buy it back from you. That’s scenario 1. In scenario 2, I ask you to buy the same coffee cup from me. The price you assign to the coffee cup in the first scenario will be, on average, about twice as much as in the second. And yes, there’s research to back this up.

When it comes to winning and losing, it’s been shown that “losses loom larger than gains.” It’s just one of the weird glitches in our logical circuitry. We tend to be hardwired to look at glasses as half empty.

Recently, I was reviewing an academic study done in 2008, with this scintillating title: “Procedural Priming and Consumer Judgment: Effects on the Impact of Positively and Negatively Valenced Information” by Shen and Wyer. If you can get beyond the rather dry title, you find a treasure trove of tidbits to consider when crafting your online user experience.

For example, when we evaluate a product for potential purchase, we may run across both positive and negative information. The order we run into this information can have a dramatic impact on what we do downstream from that interaction. To use psychological terms, it “primes” our mental framework.  And, because we tend to focus on negatives, less favorable information has a greater impact on our decision than positive information.

But it’s not just that we pay more attention to bad news than good news. It’s that bad news can hijack the entire consideration process. According to Shen and Wyer, if we run into negative information, it can change our information-seeking strategies, leading us down further negatively biased channels to confirm the initial information we saw. Bad news tends to lead to more bad news.

Also, we can get “bad news” hangovers. If we compare negatives in one decision process, that negative mental framework can carry over to an entirely different decision that has nothing to do with the first, giving us a heightened awareness of negative information in the new situation.

Here’s another interesting finding. If we’re rushed for time, this preoccupation with the negatives will dramatically affect the decision we make. But, if we have all the time in the world, the impact is relatively insignificant. Given time, we seem to cancel out our inherently negative biases.

All this news is not bad for marketers, however. It seems that simply getting users to state their preference for one feature over another, even though they’re not actively considering purchase at that time, leads to a much greater likelihood of purchase in the future. It seems that if you can get users to compare alternatives — and, more importantly, to commit to saying they prefer one alternative over another — they clear the mental hurdle of deciding “will I buy?” and instead start considering  “what will I buy?”

Finally, there is also a recency effect, especially if prospects had ample time to consider all their alternatives. Shen and Wyer found that the last information considered seemed to have the greatest effect on the buyer.  So, if information was both positive and negative, it was good to get the least favorable information in front of the prospect early, and then move to the most favorable information. Again, this is true only if the user had plenty of time to weigh the options. If they were rushed, the opposite was true.

All in all, these are all intriguing concepts to consider when crafting an ideal online user experience. They also underscore the importance of first impressions, especially negative ones.

The Balancing of Market Information

First published October 25, 2012 in Mediapost’s Search Insider

In my three previous columns on disintermediation, I made a rather large assumption: that the market will continue to see a balancing of information available both to buyers and sellers. As this information becomes more available, the need for the “middle” will decrease.

Information Asymmetry Defined

Let’s begin by exploring the concept of information asymmetry, courtesy of George Akerlof, Michael Spence and Joseph Stiglitz.  In markets where access to information is unbalanced, bad things can happen.

If the buyer has more information than the seller, then we can have something called adverse selection. Take life and health insurance, for example. Smokers (on the average) get sick more often and die younger than non-smokers. If an insurance company has 50% of policyholders who are smokers, and 50% who aren’t, but the company is not allowed to know which is which, it has a problem with adverse selection. It will lose money on the smokers so it will increase rates across the board. The problem is that non-smokers, who don’t use insurance as much, will get angry and may cancel their policy. This will mean the “book of business” will become even less profitable, driving rates even higher.   The solution, which we all know, is simple: Ask policy applicants if they smoke. Imperfect information is thus balanced out.
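The spiral in the insurance example above is easy to see in a toy simulation. All of the numbers here — the per-group claim costs, the 50/50 split, and the assumption that half the non-smokers lapse each round — are invented for illustration, not drawn from any actual insurance data.

```python
# A toy simulation of the adverse-selection spiral described above.
# All figures (claim costs, pool sizes, lapse behavior) are illustrative.

def premium_spiral(smoker_cost=1200.0, nonsmoker_cost=400.0,
                   smokers=50, nonsmokers=50, rounds=3):
    """Each round: price to the pool's average cost; the now-overcharged
    non-smokers lapse, skewing the pool and forcing the next increase."""
    history = []
    for _ in range(rounds):
        total = smokers + nonsmokers
        avg_cost = (smokers * smoker_cost + nonsmokers * nonsmoker_cost) / total
        history.append(round(avg_cost, 2))
        # Non-smokers paying well above their own expected cost cancel;
        # here, half of them leave each round (an assumption).
        nonsmokers //= 2
    return history

print(premium_spiral())  # premiums ratchet upward as the pool skews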

If the seller has more information than the buyer, then we have a “market for lemons” (the title of Akerlof’s paper). Here, buyers assume risk in a purchase without knowingly accepting that risk, because they’re unaware of problems the seller knows exist. Think about buying a used car without the benefit of an inspection, past maintenance records or any type of independent certification. All you know is what you can see by looking at the car on the lot. The seller, on the other hand, knows the exact mechanical condition of the car. This tends to drive down the prices of all products in the market (even the good ones), because buyers assume quality will be suspect. The balancing of information in this case helps eliminate the lemons and has the long-term effect of improving the average quality of all products on the market.

Getting to Know You…

These two forces — the need for sellers to know more about their buyers, and the need for buyers to know more about what they’re buying — are driving a tremendous amount of information-gathering and dissemination. On the seller’s side, behavioral tracking and customer screening are giving companies an intimate glimpse into our personal lives. On the buyer’s side, access to consumer reviews, third-party evaluations and buyer forums is helping us steer clear of lemons. Both are being facilitated through technology.

But how does disintermediation impact information asymmetry, or vice versa?

If we didn’t have adequate information, we needed some other safeguard against being taken advantage of. So, failing a rational answer to this particular market dilemma, we found an irrational one: We relied on gut instinct.

Relying on Relationships

If we had to place our trust in someone, it had to be someone we could look in the eye during the transaction. The middle was composed of individuals who acted as the face of the market. Because they lived in the same communities as their customers, went to the same churches, and had kids that went to the same schools, they had to respect their markets. If they didn’t, they’d be run out of town. Often, their loyalties were also in the middle, balanced somewhere between their suppliers and their customers.

In the absence of perfect information, we relied on relationships. Now, as information improves, we still want relationships, because that’s what we’ve come to expect. We want the best of both worlds.

Will Customer Service Disappear with the Elimination of the “Middle”?

First published October 18, 2012 in Mediapost’s Search Insider

In response to my original column on disintermediation, Joel Snyder worried about the impact on customer service: “The worst casualty is relationships and people skills. As consumers circumvent middlemen, they become harder to deal with. As merchants become more automated, customer service people have less power and less skills (and lower pay).”

Cece Forrester agreed: “Disintermediation doesn’t just let consumers be rude. It also lets organizations treat their customers rudely.”

So, is rudeness an inevitable byproduct of disintermediation?

Rediscovering the Balance between Personalization and Automation

Technology introduces efficiency. It streamlines the “noise” and marketplace friction that comes with human interactions. But with that “noise” comes all the warm and fuzzy aspects of being human. It’s what both Joel and Cece fear may be lost with disintermediation. I, however, have a different view.

Shifts in human behavior don’t typically happen incrementally, settling gently into a new norm. They swing like a pendulum, going too far one way, then the other, before stability is reached. Some force — in this case, new technological capabilities — triggers the change. Momentum then carries society too far in one direction, which triggers an opposing force that pushes back against the trend. Eventually, balance is reached.

A Redefinition of Relationships

In this case, the opposing force will be our need for those human factors. Disintermediation won’t kill relationships. But it will force a redefinition of relationships. The challenge here is that existing market relationships were all tied to the “Middle,” which served as the bridge between producers and consumers. Because the Middle owned the end connection with the customer, it formed the relationships that currently exist. Now, as anyone who has experienced bad customer service will tell you, some who lived in the Middle were much better at relationships than others. Joel and Cece may be guilty of looking at our current paradigm through rose-colored glasses. I have encountered plenty of rudeness even with the Middle firmly in place.

But it’s also true that producers, who suddenly find themselves directly connected with their markets, have little experience in forming and maintaining these relationships. However, the market will eventually dictate new expectations for customer service, and producers will have to meet those expectations. One disintermediator, Zappos, figured that out very early in the game.

Ironically, disintermediation will ultimately be good for relationships. Feedback loops are being shortened. Technology is improving our ability to know exactly what our customers think about us. We’re actually returning to a much more intimate marketplace, enabled through technology. Producers are quickly educating themselves on how to create and maintain good virtual relationships. They can’t eliminate customer service, because we, the market, won’t let them. It will take a bit for us to find the new normal, but I venture to say that wherever we find it, we’ll end up in a better place than we are today.

A Decade with the Database of Intentions

First published September 27, 2012 in Mediapost’s Search Insider

It’s been over 10 years since John Battelle first started considering what he called the “Database of Intentions.” It was, and is:

The aggregate results of every search ever entered, every result list ever tendered, and every path taken as a result. It lives in many places, but three or four places in particular hold a massive amount of this data (i.e., MSN, Google, and Yahoo). This information represents, in aggregate form, a placeholder for the intentions of humankind – a massive database of desires, needs, wants, and likes that can be discovered, subpoenaed, archived, tracked, and exploited to all sorts of ends. Such a beast has never before existed in the history of culture, but is almost guaranteed to grow exponentially from this day forward. This artifact can tell us extraordinary things about who we are and what we want as a culture. And it has the potential to be abused in equally extraordinary fashion.

When Battelle considered the implications, it overwhelmed him. “Once I grokked this idea (late 2001/early 2002), my head began to hurt.” Yet, for all its promise, marketers have only marginally leveraged the Database of Intentions.

In the intervening time, the possibilities of the Database of Intentions have not diminished. In fact, they have grown exponentially:

My mistake in 2003 was to assume that the entire Database of Intentions was created through our interactions with traditional web search. I no longer believe this to be true. In the past five or so years, we’ve seen “eruptions” of entirely new fields, each of which, I believe, represent equally powerful signals – oxygen flows around which massive ecosystems are already developing. In fact, the interplay of all of these signals (plus future ones) represents no less than the sum of our economic and cultural potential.

Sharing Battelle’s predilection for “Holy Sh*t” moments, a post by MediaPost’s Laurie Sullivan this Tuesday got me thinking again about Battelle’s “DBoI.” A recent study by Google and EA showed that using search data can predict 84% of video game sales. But the data used in the prediction is only scratching the surface of what’s possible. Adam Stewart from Google hints at what might be possible: “Aside from searches, Google plans to build in game quality, TV investment, online display investment, and social buzz to create a multivariate model for future analysis.”
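The basic machinery behind this kind of prediction is nothing exotic. As a heavily simplified sketch — a single predictor rather than the multivariate model Stewart describes, with data points invented purely for illustration — it looks something like this:

```python
# Minimal sketch of search-signal sales prediction: fit past search
# interest to past sales, then project a new title. Data is invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (single predictor)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical history: pre-launch search index vs. first-month sales (000s)
search_index = [10, 20, 30, 40]
unit_sales = [110, 210, 310, 410]

a, b = fit_line(search_index, unit_sales)
print(round(a * 25 + b))  # predicted sales for a title with search index 25
```

A real multivariate model would add the other signals Stewart lists (TV spend, display spend, social buzz) as additional regression inputs; the principle is the same.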

This is very doable stuff. Everything we need to create predictive models that match (and probably far exceed) the degree of accuracy already demonstrated is in place. The data is just sitting there, waiting to be interpreted. The implications for marketing are staggering, but to Battelle’s point, let’s not be too quick to corral this simply for the use of marketers. The DBoI has implications that reach into every aspect of our society and lives. This is big — really big! If that sounds unduly ominous to you, let me give you a few reasons why you should be more worried than you are.

Typically, if we were to predict patterns in human behavior, there would be two sources of signals. One comes from an understanding of how humans act. As we speak, this is being attacked on multiple fronts. Neuroscience, behavioral economics, evolutionary psychology and a number of other disciplines are rapidly converging on a vastly improved understanding of what makes us tick. From this base understanding, we can then derive hypotheses of predicted behaviors in any number of circumstances.

This brings us to the other source of behavior signals. If we have a hypothesis, we need some way to scientifically test it. Large-scale collections of human behavioral data allow us to search for patterns and identify underlying causes, which can then serve as predictive signals for future scenarios. The Database of Intentions gives us a massive source of behavior signals that capture every dimension of societal activity. We can test our hypotheses quickly and accurately against the tableau of all online activity, looking for the underlying influences that drive behaviors.

At the intersection of these two is something of tremendous import. We can start predicting human behavior on a massive scale, with unprecedented accuracy. With each prediction, the feedback loop between qualitative prediction and quantitative verification becomes faster and more efficient. Throw a little processing power at it and we suddenly have an artificially intelligent, self-improving predictive model that will tell us, with startling accuracy, what we’re likely to do in the future.

This ain’t just about selling video games, people. This is a much, much, much bigger deal.

Paralyzed by Choice

First published June 28, 2012 in Mediapost’s Search Insider

In last week’s column, I looked at how Harvard Business Review bloggers Karen Freeman, Patrick Spenner and Anna Bird spelled the end of the purchase funnel. Today, I’d like to look at the topic they tackled in the second of the three-part series, “If Customers Ask for More Choice, Don’t Listen.”

Barry Schwartz, the author of “The Paradox of Choice,” believes we’re overloaded with choices. In fact, we have so many choices to make, often about inconsequential things, that we live with the constant anxiety of making the wrong choice.

This paradox meets today’s consumer head on, over and over, in situation after situation. The other factor, which I’ve seen play a massive role in buying behaviors, is the degree of risk in the purchase. The bigger the purchase, the higher the risk.

The final piece of the buying puzzle is the reward that lies at the end of the potential purchase. Our brains are built to balance risk and reward in fractions of a second. But we don’t do it by a calm, rational weighing of pros and cons, thus engaging the enlightened thinking part of our brains. We do it by unleashing emotions from the dark, primitive core of our brain. The risk/reward balance whips up a potent mix of neural activity that sets our decision-making engine in motion.

The degree of risk or reward sets the emotional framework for a purchase. High reward, low risk generally means a fairly fast purchase, such as an impulse buy. High risk, low reward may mean a very long purchase cycle with an extended consideration process. Whatever the buying path, there will be an undercurrent of emotion running just below the surface.

Now, let’s match up the findings of the HBR team. High-risk purchases automatically ramp up the level of anxiety we feel. We’re afraid we’ll make the wrong decision. And, in a complex purchase, there’s not just one decision to be made – there are several. At each decision point, we’re bombarded by choices. If the hundreds of purchase path evaluations I’ve done are any indication, the seller spends little time worrying about presenting those choices in a user-friendly way. Catalog pages are jammed with useless and irrelevant items. Internal site search results are generally abysmal. And product information typically takes the form of a long shopping list of features. Very little of it speaks to buyers in a language they care about.

This is a dangerous combination. We have the natural anxiety that comes with risk. We have a gauntlet of decisions to make, each raising the level of anxiety. And we have websites that contribute greatly to the frustration by making it difficult to navigate the information that does exist, which is either too little, too much, too irrelevant or too salesy — never does it seem to be just right.

Again, Freeman, Spenner and Bird ask us to make it simpler for the buyer. Provide buyers with fewer choices, and make those choices as relevant and compelling as possible. Ease the burden of risk by providing information that reassures. Realize that one of the components of risk is the degree of bias in the information we’re given. If that information reeks of marketing hyperbole, it will be discounted immediately.

In our numerous eye-tracking studies, we’ve found that in most instances, three to four options seems to be the right number to consider on a Web page. These can be easily loaded into working memory and compared without causing undue wear on our mental mechanics. So, on a landing or home page, three or four groups of coherent and relevant information seems to be an optimal level. We call them “intent clusters.” For navigation bar options, we try to keep it between five and seven choices. If we expect mostly transactional traffic, we ensure there is a “fast path” to purchase. If we expect a lot of purchase research, we aim for rich promises of relevant and reliable information.

As Freeman, Spenner and Bird remind us, “The harder consumers find it to make purchase decisions, the more likely they are to overthink the decision and repeatedly change their minds or give up on the purchase altogether. In fact, regression analysis points to decision complexity and resulting cognitive overload as the single biggest barrier to purchase.”

As marketers, our job is to eliminate the barriers, not erect new ones.

The Death of the Purchase Funnel

First published June 21, 2012 in Mediapost’s Search Insider

A recent series of three posts on the Harvard Business Review blog by Karen Freeman, Patrick Spenner and Anna Bird explored some of the myths about how consumers make decisions. I think each of these has direct implications for search marketers, so over the next three weeks I want to explore them one at a time.

The first, titled “What Do Consumers Really Want? Simplicity,” talks about the breakdown of the purchase funnel. The HBR bloggers contend the funnel, which has been around for well over a hundred years, no longer applies to consumer behaviors. I concur, and said as much in my book, “The BuyerSphere Project.”

We differ a little on the reason for the demise, however. The HBR team attributes it to cognitive overload on the part of the consumer. We’re simply bombarded by too much information on the purchase path to fit it all into the nice, simple, rational filtering process captured in St. Elmo Lewis’s elegant funnel-shaped model. The accompanying research, a survey of 7,000 consumers, shows decision simplicity was the number-one thing people wanted when making a purchase.

I agree that information overload is part of it, but I also believe that two other factors have led to the end of the purchase funnel. First, the purchase funnel assumes a rational filtering of options based on careful consideration of a consumer’s requirements. I don’t think this was ever the case. Emotions drive our decisions, and more often than not, rationality is applied after the fact to justify our choices. Prior to the Internet, emotion was tough to distinguish from rationality, as buyers didn’t have much control over the content they accessed during the consideration process. They were limited to whatever the marketer pushed out at them. So, whether driven by emotion or logic, they tended to go down the same path and display many of the same behaviors. Given the pervasive belief in humans as rational animals at the time, it was not surprising that a logic-driven model emerged.

The other factor, as I alluded to, was that the Internet shifted the balance of power during the purchase process. Suddenly, we could choose which paths we took during the consideration process. We weren’t all forced down the same path, according to some arbitrary notion of a funnel-shaped model.

What became clear, when consumers could choose their own path, was that the simplicity of the funnel model bore little relation to the actual paths consumers took. And those paths were driven by emotion. People bounced all around, depending on what they were looking to buy. They could go all the way to a shopping cart, then suddenly abandon it and go back to a destination that would be considered “upper funnel” and start all over again. From the outside looking in, this resembled a bowl of spaghetti much more than it did a funnel.

So, we have a trio of suspects in the death of the purchase funnel: cognitive overload, emotion trumping logic, and consumers gaining more control over their consideration path. All lead to an interesting concept to consider: laying an online path that anticipates the emotional needs of the buyer, yet keeps the information presented from overwhelming them. For example, marketing has traditionally taken a “turf war” approach to persuading a prospect: “as long as they’re on our turf, we do everything possible to close the sale.”

But this doesn’t really match up with the three trends we’re talking about. What online consumers are looking for, according to the HBR research, is a safe online zone that will make their decision easier. Rather than going from site to site, collecting information and filtering out overt marketing hyperbole, what consumers want is a single information source they can trust. They want to be able to lower their “anti-BS” shields, because being a rational, cynical shopper takes a lot of time and effort.

Today, it’s extremely rare to find that trustworthy information on a site you can actually purchase from, but it’s starting to happen in some high-activity categories, where independent portals facilitate this simplified approach to shopping. Travel comes to mind.

But let’s consider what would happen if a brand’s website took this approach. Rather than bombard a prospect with exaggerated sales pitches, putting them on the defensive, what if a more neutral, objective experience was provided?  After all, why shouldn’t the decision path be built on your own turf, giving you a home field advantage?

Brand Beliefs and the Facebook Factor

First published May 17, 2012 in Mediapost’s Search Insider

Last week I talked about the power of our beliefs to shape our view of the world around us. I also mentioned how our belief constructs impact our view of brands. As luck would have it, two separate pieces crossed my path this week, both of which provide excellent examples of how we may perceive brands, and how marketers often get it wrong when trying to shepherd a brand through the marketplace.

The first piece was “Does Branding Need to be Rebranded?” by Mediapost’s Matt Straz in Online Spin. In it, Matt mentioned the backlash against Sir James Dyson (he of the cool vacuums) when he dared to mention that he doesn’t believe in branding. Now, to clarify, Dyson doesn’t believe in branding the way it’s practiced by many companies, where through sheer force of advertising, their heavily controlled (and often contrived) brand story is theoretically imprinted in your brain.  This isn’t so much branding as brain-washing. Let’s call it “brand-washing.”

But let’s go back to how our beliefs define our view of brands. We use beliefs as a heuristic shortcut, allowing us to operate efficiently in our world. We form beliefs so we don’t have to endlessly think through every single decision. Beliefs form based on our own experience, but they are also shaped by what we’re exposed to. All this input gets synthesized into a reasonably coherent and remarkably resilient belief. Once in place, this belief guides our actions.

So, from our perspective, a brand can be defined as what the buyer believes it to be. In the ad community, there is much debate about the definition of a brand. But, in the final analysis, the only definition of brand that matters is the one that rests in the mind of the buyer. All else is simply input into that final mental model, which is created solely by the customer.

James Dyson believes the best way to build that belief is to produce great products and then let them speak for themselves. If you create products that consistently exceed expectations, that is enough to build an authentic and enduring brand belief. It’s hard to argue with that logic, and, in fact, it’s what P&G called the Second Moment of Truth with consumers: their experience when your product is in their hands. In this definition, brand is intimately coupled with the product itself.

But, if Dyson is right, why is there an advertising industry at all? Even Dyson buys ads to sell vacuum cleaners. This brings us to the second piece that I saw in the past week: a report out of Forrester called “The Facebook Factor.” This is a bit of a tangential detour, so bear with me.

The report posits that we can now quantify the value of a Facebook “like.” The reasoning is fairly simple. If you add a few questions to a typical customer survey, you can start to quantify the correlation between someone liking you on Facebook and subsequent purchasing of your product. But, as Forrester points out in the report, there is a correlation/causation trap here that could lead to many marketers making the wrong conclusion.

If you try to equate people who felt motivated to “like” you on Facebook with likelihood to purchase, you run the risk of mistaking correlation for causation. People didn’t buy your product as a result of “liking” you on Facebook. The Facebook “like” came as a result of a positive belief about your brand. It was an effect, not a cause. At best, the Facebook Factor should be considered nothing more than a leading indicator of brand preference.

But many marketers will confuse cause and effect. They will believe that driving Facebook “likes” will drive higher brand loyalty. This is where brand and product can potentially become decoupled. Once marketers start assigning a value to a Facebook “like” based on Forrester’s methodology, they will begin treating “likes” as the end goal, trusting in the mistaken belief that a Facebook “like” will always correlate positively with purchase behavior.

Once this decoupling happens, the value of the Facebook “like” starts to erode. The motivation for the “like” often has little to do with a positive brand experience. It’s driven by a promotion or campaign that has just one aim: to drive as many likes as possible. From the customer’s perspective, it’s easy to hit the “like” button. They have no skin in the game. There is no belief behind the action.

In the end, I believe Dyson’s definition of brand is the more authentic one. It goes back to the very roots of branding, which was a reassurance to buyers that they were buying what they believed they were buying.
