Search and The Path to Purchase

Just how short do we want the path to purchase to be anyway?

A few weeks back, MediaPost reporter Laurie Sullivan brought this question up in her article detailing how Instagram is building e-commerce into its app. While Instagram is not usually considered a search platform, Sullivan muses on the connecting of two dots that seem destined to be joined: search and purchase. But is that a destiny that users can “buy into”?

Again, this is one of those questions where the answer is always, “It depends.”  And there are at least a few dependencies in this case.

The first is whether our perspective is that of a marketer or a consumer. Marketers always want the path to purchase to be as short as possible. When we have that hat on, we won’t be fully satisfied until the package hits our front step at about the same time we get the first mental inkling to buy.

Amazon has done the most to truncate the path to purchase. Marketers look longingly at its one-click ordering path – requiring mere seconds and a single click to go from search to successful fulfillment. If only all purchases were this streamlined, the marketer in us muses.

But if we’re leading our double life as a consumer, there is a second “It depends…” And that depends on what our shopping intentions are. There are times when we – as consumers – also want the fastest possible path to purchase. But that’s not true all the time.

Back when I was looking at purchase behaviors in the B2B world, I found that there are variables that lead to different intentions on the part of the buyer. Essentially, it boils down to the degree of risk and reward in the purchase itself. I first wrote about this almost a decade ago now.

If there’s a fairly high degree of risk inherent in the purchase itself, the last thing we want is a frictionless path to purchase. These are what we call high consideration purchases.

We want to take our time, feeling that we’ve considered all the options. One-click ordering scares the bejeezus out of us.

Let’s go back to the Amazon example. Today, Amazon is the search engine of choice for product searches, outpacing Google by a margin rapidly approaching double digits. But this is not really an apples-to-apples comparison. We have to factor in the deliberate intention of the user. We go to Amazon to buy, so a faster path to purchase is appropriate. We go to Google to consider. And for reasons I’ll get into soon, we would be less accepting of a “buy” button there.

The buying paths we would typically take in a social platform like Instagram are probably not that high risk, so a fast path to purchase might be fine. But there’s another factor that we need to consider when shortening the path to purchase – or building a path in the first place – in what has traditionally been considered a discovery platform. Let’s call it a mixing of motives.

Google has been dancing around a shorter path to purchase for years now. As Sullivan said in her article, “Search engines have strength in what’s known as discovery shopping, but completing the transaction has never been a strong point — mainly because brands decline to give up the ownership of the data.”

Data ownership is one thing, but even if the data were available, including a “buy now” button in search results can also lead to user trust issues. For many purchases, we need to feel that our discovery engine has no financial motive in the ordering of their search results. This – of course – is a fallacy we build in our own minds. There is always a financial motive in the ordering of search results. But as long as it’s not overt, we can trick ourselves into living with it. A “buy now” button makes it overt.

This problem of mixed motives is not just a problem of user perception. It also can lead publishers down a path that leaves objectivity behind and pursues higher profits ahead. One example is TripAdvisor. Some years ago, they made the corporate decision to parlay their strong position as a travel experience discovery platform into an instant booking platform. In the beginning, they separated this booking experience onto its own platform under the brand Viator. Today, the booking experience has been folded into the main TripAdvisor results and – more disturbingly – is now the default search order. Every result at the top of the page has a “Book Now” button.

Speaking as a sample of one, I trust TripAdvisor a lot less than I used to.

 

Personal Endeavour in the Age of Instant Judgement

No one likes to be judged — not even gymnasts and figure skaters. But at least in those sports, the judges supposedly know what it is they’re judging. So, in the spirit of instant feedback, let me rephrase: No one likes to be judged by a peanut gallery*. Or, to use a more era-appropriate moniker, by a troll’s chorus.

Because of this, I feel sorry for David Benioff and D.B. Weiss, the showrunners of “Game of Thrones.” Those poor bastards couldn’t be any more doomed if they had been invited to a wedding of the red variety.

At least they were aware of their fate. In an interview with Entertainment Weekly, they disclosed their plans for the airing of the final episode. “We’ll be in an undisclosed location, turning off our phones and opening various bottles,” Weiss admitted. “At some point, if and when it’s safe to come out again, somebody like [HBO’s ‘GOT’ publicist] will give us a breakdown of what was out there without us having to actually experience it.” Added Benioff: “I plan to be very drunk and very far from the internet.”

Like it or not, we now live in an era of instant judgement, from everyone. It’s the evil twin of social virality. It means we have to grow thicker skins than your average full-grown dragon**. And because I’m obsessively fixated on unintended consequences, this got me to thinking. How might all this judgement impact our motivation to do stuff?

First of all, let’s look at the good that comes from this social media froth kicked up by fervent fans. There is a sense of ownership and emotional investment in shows like “Game of Thrones” that’s reached a pitch never seen before — and I truly believe we’re getting better TV because of it.

If you look at any of the lists of the best TV shows of all time, they are decidedly back-end loaded. “Game of Thrones,” even at its worst, is better than almost any television of the ’80s or ’90s. And it’s not only because of the advances in special effects and CGI wizardry. There is a plethora of thoughtful, exquisitely scripted and superbly acted shows that have nary an enchantress, dragon or apocalypse of the walking dead in sight. There is no CGI in “Better Call Saul,” “Master of None” or “Atlanta.”

But what about the dark side of social fandom?

I suspect instant judgement might make it harder for certain people to actually do anything that ends up in the public arena. All types of personal endeavors require failure and subsequent growth as an ingredient for success. And fans are getting less and less tolerant of failure. That makes the entry stakes pretty high for anyone producing output that is going to be out there, available for anyone to pass judgement on.

We might get self-selection bias in arenas like the arts, politics and sports. Those averse to criticism that cuts too deep will avoid making themselves vulnerable. Or — upon first encountering negative feedback — they may just throw in the towel and opt for something less public.

The contributors to our culture may just become hard-nosed and impervious to outside opinion — kind of like Cersei Lannister. Or, even worse, they may be so worried about what fans think that they oscillate, trying to keep all factions happy. That would be the Jon Snows of the world.

Either way, we lose the contributions of those with fragile egos and vulnerable hearts. If we applied that same filter retroactively to our historic collective culture, we’d lose most of what we now treasure.

In the end, perhaps David Benioff got it right. Just be “very drunk and very far from the internet.”

* Irrelevant Fact #1: The term peanut gallery comes from vaudeville, where the least expensive seats were occupied by the rowdiest members of the audience. The cheapest snack was peanuts, which the audience would throw at the performers.

** Irrelevant Fact #2: Dragons have thick skin because they don’t shed their skins. It just keeps getting thicker and more armor-like. The older the dragon, the thicker the skin.

The Gap Between People and Platforms

I read with interest fellow Spinner Dave Morgan’s column about how software is destroying advertising agencies, but not the need for them. I do want to chime in on what’s happening in advertising, but I need a little more time to think about it.

What did catch my eye was a comment at the end by Harvard Business School professor Alvin Silk: “You can eliminate the middleman, but not his/her function.”

I think Dave and Alvin have put their collective thumbs on something that extends beyond our industry: the growing gap between people and platforms. I’ll use my current industry as an example – travel. It’s something we all do so we can all relate to it.

Platforms and software have definitely eaten this industry. In terms of travel destination planning, the 800-pound gorilla is TripAdvisor. It’s impossible to overstate its importance to operators and business owners. TripAdvisor almost single-handedly ushered in an era of do-it-yourself travel planning. For any destination in the world, we can now find the restaurants, accommodations, tours and attractions that are the favorites of other travellers. It allows us to both discover and filter while planning our next trip, something that was impossible 20 years ago, before TripAdvisor came along.

But for all its benefits, TripAdvisor also leaves some gaps.

The biggest gap in travel is what I’ve heard called the “Other Five.” I live in Canada’s wine country (yes, there is such a thing). Visitors to our valley – the Okanagan – generally come with five wineries they plan to visit. The chances are very good that those wineries were selected with the help of TripAdvisor. But while they’re visiting, they also visit the “other five” – five wineries they discovered once they got to the destination. These discoveries depend on more traditional means – either word of mouth or sheer serendipity. And it’s often one of these “other five” that provides the truly memorable and authentic experiences.

That’s the problem with platforms like TripAdvisor, which are based on general popularity and algorithms. Technically, platforms should help you discover the long tail, but they don’t. Everything automatically defaults to the head of the curve. It’s the Matthew Effect applied to travel – advantage accumulates to those already blessed. We all want to see the same things – up to a point.

But then we want to explore the “other five,” and that’s where we find the gap between platforms and people. We have been trained by Google not to look beyond the first page of online results. It’s actually worse than that: we don’t typically scan beyond the top five. And – by the very nature of ratings-based algorithms – beyond the top five is exactly where the “other five” sit. They languish in the middle of the results, sometimes taking years to bump up even a few spots. It’s why there’s still a market – and a rapidly expanding one at that – for a tour guided by an actual human. Humans can think beyond an algorithm, asking questions about what you like and pulling from their own experience to make very targeted and empathetic suggestions.
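To see why, here’s a minimal toy simulation of the dynamic (Python, with invented numbers – it illustrates the rich-get-richer loop of popularity-based ranking, not any platform’s actual algorithm). Visitors mostly pick from the top five of a list sorted by review count, and every pick deepens the incumbents’ advantage:

```python
import random

# A toy ratings-ranked platform: 50 listings with modest starting
# review counts. All numbers are invented for illustration.
random.seed(42)
reviews = [random.randint(5, 25) for _ in range(50)]

VISITORS = 10_000
EXPLORE_RATE = 0.05  # the rare visitor who looks past the top five

for _ in range(VISITORS):
    # Rank listings by review count, as a popularity-based sort would.
    ranked = sorted(range(len(reviews)), key=lambda i: reviews[i], reverse=True)
    if random.random() < EXPLORE_RATE:
        choice = random.choice(ranked[5:])   # serendipity
    else:
        choice = random.choice(ranked[:5])   # most of us stop at the top five
    reviews[choice] += 1  # the visit produces another review

ranked_counts = sorted(reviews, reverse=True)
print(f"Top 5 share of all reviews: {sum(ranked_counts[:5]) / sum(reviews):.0%}")
```

In most runs of this sketch, the top five end up holding roughly nine out of every ten reviews, while the middle of the list barely moves – which is exactly where the “other five” live.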

The problem with platforms is their preoccupation with scale. They feel they have to be all things to all people. I’ll call it Unicornitis – the obsession with gaining a massive valuation. They approach every potential market focused on how many users they can capture. By doing so, they have to target the lowest common denominator. The web thrives on scale and popularity; the rich get richer and the poor get poorer. Yes, there are niche players out there, but they’re very hard to find. They are the “other five” of the Internet, sitting on the third page of Google results.

This has almost nothing to do with advertising, but I think it’s the same phenomenon at work. As we rely more on software, we gain a false confidence that it replaces human-powered expertise. It doesn’t. And a lot of things can slip through the gap that’s created.

 

The Importance of Playing Make-Believe

One of my favourite sounds in the world is children playing. Although our children are well past that age, we have stayed in a neighbourhood where new families move in all the time. One of the things that has always amazed me is a child’s ability to make believe. I used to do this but I don’t any more. At least, I don’t do it the same way I used to.

Just take a minute to think about the term itself: make-believe. The very words connote the creation of an imaginary world that you and your playmates can share, even in that brief and fleeting moment. Out of the ether, you can create an ephemeral reality where you can play God. A few adults can still do that. George R.R. Martin pulled it off. J.K. Rowling did likewise. But for most of us, our days of make-believe are well behind us.

I worry about the state of play. I am concerned that rather than making believe themselves, children today are playing in the manufactured and highly commercialized imaginations of profit-hungry corporations. There is no making — there is only consuming. And that could have some serious consequences.

Although we don’t use imagination the way we once did, it is the foundation for the most important cognitive tasks we do. It was Albert Einstein who said, “Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.”

It is imagination that connects the dots, explores the “what-ifs” and peeks beyond the bounds of the known. It is what separates us from machines.

In that, Einstein presciently nailed the importance of imagination. Only here does the mysterious alchemy of the human mind somehow magically weave fully formed worlds out of nothingness and snippets of reality. We may not play princess anymore, but our ability to imagine underpins everything of substance that we think about.

The importance of playing make-believe is more than just cognition. Imagination is also essential to our ability to empathize. We need it to put ourselves in the place of others. Our “theory of mind” is just another instance of the many facets of imagination.

This thing we take for granted has been linked to a massive range of essential cognitive developments. In addition to the above examples, pretending gives children a safe place to begin to define their own place in society. It helps them explore interpersonal relationships. It creates the framework for them to assimilate information from the world into their own representation of reality.

We are not the only animals that play when we’re young. It’s true for many mammals, and scientists have discovered it’s also essential in species as diverse as crocodiles, turtles, octopuses and even wasps.

For other species, though, it seems play is mainly intended to help them come to terms with surviving in the physical world. We’re alone in our need for elaborate play involving imagination and cognitive games.

With typical human hubris, we adults have been on a century-long mission to structure the act of play. In doing so, we have been imposing our own rules, frameworks and expectations on something we should be keeping as is. Much of the value of play comes from its very lack of structure. Playing isn’t as effective when it’s done under adult supervision. Kids have to be kids.

Play definitely loses much of its value when it becomes passive consumption of content imagined and presented by others through digital entertainment channels. Childhood is meant to give us a blank canvas to colour with our imagination.

As we grow, the real world encroaches on this canvas.  But the delivery of child-targeted content through technology is also shrinking the boundaries of our own imagination.

Still, despite corporate interests that run counter to playing in its purest sense, I suspect that children may be more resilient than I fear. After all, I can still hear the children playing next door. And their imaginations still awe and inspire me.

Selfies: A Different Take on Reality

It was a perfect evening in Sydney Harbour. I was there for a conference, and the organizers had arranged an event for the speakers at Milsons Point – under the impressive span of the Harbour Bridge. It was dusk, and the view of downtown Sydney spread out in front of us with awesome breadth and scope. It was one of those moments that takes your breath away. That minute seemed eternal.

After some time, I turned around. There was another attendee who was intently focused on taking a selfie and posting it to social media. His back was turned to the view behind him. At first, I thought I should do the same. Then I changed my mind. I’d rely on my memory and actually try to stay in the moment. My phone stayed in my pocket.

In the age of selfies, it turns out that my mini-existential crisis is getting more common. According to a new study published in the Journal of Consumer Research, something called “self-presentational concern” can creep into these lifetime moments and suck the awe right out of them. One of the study authors, Alixandra Barasch, explains, “When people take photos to share, they remember their experience more from a third-person perspective, suggesting that taking photos to share makes people consider how the event (and the photos) would be evaluated by an observer.”

Simply stated, selfies take us “out of the moment.” But this effect depends on why we’re taking the selfie in the first place. The experimenters didn’t find the effect when people took selfies with the intent of just remembering the moment. It showed up when the selfie was taken for the express purpose of sharing on social media. Suddenly, we are more worried about how we look than where we are and what we’re doing.

Dr. Terri Apter, a professor of psychology at Cambridge University, has been looking at the emergence of selfies as a form of “self-definition” for some time. “We all like the idea of being sort of in control of our image and getting attention, being noticed, being part of the culture.” But when does this very human urge slip over the edge into a destructive spiral? Dr. Apter explains, “You can get that exaggerated or exacerbated by celebrity culture that says unless you’re being noticed, you’re no one.”

I suspect what we’re seeing now is a sort of selfie arms race. Can we upstage the rest of our social network by posting selfies in increasingly exotic locations, doing exceptional things and looking ever more “Mahvelous”? That’s a lot of pressure to put on something we do when we’re just supposed to be enjoying life.

A 2015 study explored the connection between personality traits and the posting of selfies. In particular, the authors of the study looked at narcissism, psychopathy and self-objectification. They found that frequent posting of selfies and being overly concerned with how you look in them can be tied to both self-objectification and narcissism. This is interesting, because those two things are at opposite ends of the self-esteem spectrum. Narcissists love themselves, while those who self-objectify tend to suffer from low self-esteem. In both cases, selfies represent a way to advertise their personal brands to a wider audience.

There’s another danger with selfie-preoccupation that goes hand-in-hand with distancing yourself from the moment you’re in – you can fall victim to bad judgement. It happened to Barack Obama at Nelson Mandela’s memorial ceremony. In a moment when he should have been acting with appropriate gravitas, he decided to take a selfie with Danish Prime Minister Helle Thorning-Schmidt and then-British Prime Minister David Cameron. It was a stunningly classless moment from a usually classy guy. If you check a photo taken at the time, you can see that Michelle Obama was not amused. I agree.

Like many things tied to social media, selfies can represent a troubling trend in how we look at ourselves in a social context. These things seem to be pointing in the same direction: we’re spending more time worrying about an artificial reality of our own making and less time noticing reality as it actually exists.

We just have to put the phone down sometimes and admire the view across the harbour.

 

Don’t Be So Quick to Eliminate Friction

If you have the mind of an engineer, you hate friction. When you worship at the altar of optimization, friction is something to be ruthlessly eliminated – squeezed out of the equation. Friction equals inefficiency. It saps the energy out of our efforts.  It’s what stands between reality and a perfect market, where commerce theoretically slides effortlessly between participants. Much of what we call tech today is optimized with the goal of eliminating friction.

But there’s another side of friction. And perhaps we shouldn’t be too quick to eliminate it.  Without friction, there would be no traction, so you wouldn’t be able to walk. Your car would have no brakes. Nails, bolts, screws, glue and tape wouldn’t work. Without friction, there would be nothing to keep the world together.

And in society, it’s friction that slows us down and helps us smell the roses. That’s because another word for friction – when we talk about our experiential selves – is savouring.

Take conversations, for instance. A completely efficient, friction-free conversation would be pretty damn boring. It would get the required information from participant A to participant B – and vice versa – in the minimum number of words. There would be no embellishment, no nuance, no humanity. It would not be a conversation we would savour.

Savouring is all about slowing down. According to Maggie Pitts, a professor at the University of Arizona who studies how we savour conversations, “Savouring is prolonging, extending, and lingering in a positive or pleasant feeling.” And you can’t prolong anything without friction.

But what about friction in tech itself? As I said before, the rule of thumb in tech is to eliminate as much friction as possible. But can the elimination of friction go too far? Product designer Jesse Weaver says yes. In an online essay, he says we friction-obsessed humans should pay more attention to the natural world, where friction is still very much alive and well, thank you:

“Nature is the ultimate optimizer, having run an endless slate of A/B tests over billions of years at scale. And in nature, friction and inconvenience have stood the test of time. Not only do they remain in abundance, but they’ve proven themselves critical. Nature understands the power of friction while we have become blind to it.”

A couple of weeks ago, I wrote about the Yerkes-Dodson law, which states that there can be too much of a good thing – or, in this case, too little of a supposedly bad thing. According to a 2012 study, when it comes to assigning value, we actually appreciate a little friction. It’s known as the IKEA effect. There is a sweet spot for optimal effort. Too much and we get frustrated. Too little and we feel that it was too easy. When it’s just right, we have a crappy set of shelves that we love more than we should because we had to figure out how to put them together.
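That sweet spot is just an inverted-U curve: perceived value climbs with effort up to a point, then frustration drags it back down. A throwaway sketch of the shape (the curve and its constants are invented for illustration, not taken from the study):

```python
# A toy inverted-U: perceived value peaks at moderate effort.
# The quadratic and its constants are invented, purely to show the shape.
def perceived_value(effort, sweet_spot=0.5):
    return max(0.0, 1.0 - 4.0 * (effort - sweet_spot) ** 2)

for effort in (0.0, 0.25, 0.5, 0.75, 1.0):
    value = perceived_value(effort)
    print(f"effort {effort:.2f} | value {value:.2f} {'#' * int(value * 20)}")
```

One-click ordering sits at the far left of that curve; the IKEA shelves sit near the peak.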

Weaver feels the same is true for tech. As examples, he points to Amazon’s Dash smart button and Facebook’s Frictionless Sharing. In the first case, Amazon claims the need for the button has been eliminated by voice-activated shopping through Alexa. In the second case, we had legitimate privacy concerns. But Weaver speculates that perhaps both things just moved a little too fast for our comfort, removing our sense of control. We need a little bit of friction in the system so we feel we can apply the brakes when required.

If we eliminate too much friction, we’ll slip over that hump into not valuing the tech-enabled experiences we’re having. He cites the 2018 World Happiness Report, which has been tracking our satisfaction with life on a global basis for over a decade. In that time, despite our tech capabilities increasing exponentially, our happiness has flatlined.

I have issues with his statistical logic – there is a bushel basket full of confounding factors in the comparison he’s trying to make – but I generally agree with Weaver’s hypothesis. We do need some friction in our lives. It applies the brakes to our instincts. It forces us to appreciate the here and now that we’re rushing through. It opens the door to serendipity and makes allowances for savouring.

In the end, we may need a little friction in our lives to appreciate what it means to be human.

 

The Social Acceptance of Siri

There was a time, not too long ago, when I did a fairly exhaustive series of posts on the acceptance of technology. The psychology of how and when we adopted disruptive tech fascinated me. So Laurie Sullivan’s article on how more people are talking to their phones caught my eye.

If you look at tech acceptance, there’s a bucketful of factors you have to consider. Utility, emotions, goals, ease of use, cost and our own attitudes all play a part. But one of the biggest factors is social acceptance. We don’t want to look like a moron in front of friends and family. It was this, more than anything else, that killed Google Glass the first time around. Call it the Glasshole factor.

So, back to Laurie’s article and the survey she referred to in it. Which shifts in the social universe are making it more acceptable to shoot the shit with Siri?

The survey has been done for the last three years by Stone Temple, so we’re starting to see some emerging trends. And here are the things that caught my attention. First of all, the biggest shifts from 2017 to 2019, in terms of percentage, are: at the gym, in public restrooms and in the theatre. Usage at home has actually slipped a little (one might assume that these conversations have migrated to Alexa and other home-based digital assistants). If we’re looking at acceptance of technology and the factors driving it, one thing jumps out from the survey: all the shifts have to do with how comfortable we feel talking to our phones in publicly visible situations. There is obviously a moving threshold of acceptability here.

As I mentioned, the three social “safe zones” – those instances where we wouldn’t be judged for speaking to our phones – have shown little movement in the last three years. These are “Home Alone,” “Home with Friends” (public but presumably safe from social judgment) and “Office Alone.” As much as possible in survey-based research, this isolates the social factor from all the other variables rather nicely and shows its importance in our collective jumping on the voice technology bandwagon.

This highlights an important lesson in the acceptance of new technologies: you have to budget in the time required for society to absorb and accept them. The more the technology will be utilized in visibly social situations, the more time you need to budget. Otherwise, the tech will only be adopted by a tiny group of socially obtuse techno-weenies and will be stranded on the wrong side of the bleeding edge. As technology becomes more personal and tags along with us in more situations, the designers and marketers of that tech will have to understand this.

This places technology acceptance in a whole new ballpark. As the tech we use increasingly becomes part of our own social-facing brand, our carefully constructed personas and the social norms we have in place become key factors that determine the pace of acceptance.

This becomes a delicate balancing act. How do you control social acceptance? As an example, let’s take one of my favorite marketing punching bags – influencer marketing – and ask whether we could accelerate acceptance by seeding a new technology with a few key social connectors. That same strategy failed miserably when it came to promoting Google Glass to the public. And there’s a perfectly irrational reason for it. It had nothing to do with rational stuff like use cases, aesthetics or technology. It had to do with Google picking the wrong influencers – the so-called Google Glass Explorers. As a group, they tended to be tech-obsessed, socially awkward and painfully uncool. They were the people you avoid getting stuck in the corner with at a party because you just aren’t up for a 90-minute conversation on the importance of regular hard drive hygiene. No one wants to be them.

If this survey tells us anything, it tells us that – sometimes – you just have to hope and wait. Ever since Everett Rogers first sketched it out in 1962, we’ve known that innovation diffusion happens on a bell curve. Some innovations get stranded on the upside of the slope and wither away to nothingness while some make it over the hump and become part of our everyday lives. Three years ago, there were certainly people talking to their phones on buses, in gyms and at movie theatres. They didn’t care if they were judged for it. But most of us did care. Today, apparently, the social stigma has disappeared for many of us. We were just waiting for the right time – and the right company.
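For what it’s worth, Rogers’ curve is simple enough to sketch. His adopter categories are just slices of a normal distribution of time-to-adopt, and the running total traces the familiar S-curve. A quick illustration (the timeline is hypothetical; the standard-deviation cut points and the resulting percentages are Rogers’ own):

```python
from statistics import NormalDist

# Rogers' adopter categories fall out of a normal distribution of
# time-to-adopt. The mean and spread here are hypothetical; the cut
# points (multiples of the standard deviation) are Rogers' own.
adoption = NormalDist(mu=5.0, sigma=1.5)  # assume: average adopter at year 5
mean, sd = adoption.mean, adoption.stdev

categories = [
    ("Innovators",     float("-inf"), mean - 2 * sd),
    ("Early adopters", mean - 2 * sd, mean - sd),
    ("Early majority", mean - sd,     mean),
    ("Late majority",  mean,          mean + sd),
    ("Laggards",       mean + sd,     float("inf")),
]

for name, lo, hi in categories:
    print(f"{name:<15} {adoption.cdf(hi) - adoption.cdf(lo):5.1%}")

# Cumulative adoption over time -- adoption.cdf(year) -- traces the
# S-curve; the "hump" sits around the early-adopter boundary.
```

Running it gives the canonical 2.5% innovators, 13.5% early adopters, 34% each for the two majorities and 16% laggards. Talking to our phones, it seems, has just made it over the hump.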