The Psychology of Social: Are We Hardwired to Use Social Media?

Man is by nature a social animal; an individual who is unsocial naturally and not accidentally is either beneath our notice or more than human. Society is something that precedes the individual. Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god. 

Aristotle

I’ve looked at online entertainment and I’ve looked at online tools, both in a quest to see where loyal and stable audiences might be found. But that leaves one huge part of the online landscape unexplored – online social media. In both my previous explorations, the scope of the quest quickly exploded into several posts. I think social media will be as difficult to restrict to a few posts, if not more so.

One thing that both entertainment and usefulness had in common was their foundation – our human drives. In any area I’ve explored up to now, I’ve always found our interactions with technology, as fickle as they may be, are layered over innate human drives with origins reaching back several thousands of generations. In entertainment, although the channels may have changed drastically in the past few decades (digital media, video games, virtual environments), our responses are predictably human. The things that make us cry, jump in our seats or laugh out loud really haven’t changed that much in many thousands of years. Humans adapt quickly to new technology, but our tastes remain reliably consistent.

Usefulness is a little different. In this case, our expectations of utility and the ever-rising bar of technology form somewhat of an arms race, with each upping the ante for the other. New tools allow us to do new things, which reset our expectations. These reset expectations cause us to periodically review the tools we use, and if they no longer match our expectations, we go looking for new tools. But even if we’re on the hunt for increased usefulness, we still use strategies that appear to have evolved hundreds of thousands of years ago on the savannah. I believe we forage for and evaluate useful technologies the same way we forage for food. This means that while technologies may change quickly, our behaviors towards them are remarkably predictable.

So, what should we expect as we explore how the human need for society plays out in new online arenas? Again, I think it’s safe to say that our behaviors will be driven by innate human needs and strategies. So that seems to be as good a place as any to start.

In their book “Driven: How Human Nature Shapes Our Choices,” Harvard professors Paul Lawrence and Nitin Nohria tried to reduce human nature down to the lowest possible number of non-redundant factors. They came up with four irreducible drives:

  • The Need to Acquire
  • The Need to Bond
  • The Need to Learn
  • The Need to Defend

All human actions, all cultural trends, all societal behaviors will be driven by one or a combination of these factors. If Lawrence and Nohria are right, then the usage of social media should be no exception. Let’s look at the four to see how they might map onto social media usage.

The Need to Bond

I’ll start with the most obvious one – the need to Bond. Social media is all about bonding. This hits squarely at the heart of our social nature. As Aristotle said, we’re not built to be alone. Humans thrive in herds. And social media provides us a digitally mediated way to bond.

The complexity of our social bonds is staggering. It’s amazing to think of all the dimensions we impose on our social relationships. Things like status, gossip, empathy, reciprocity, jealousy, xenophobia, admiration, loyalty, love, hate and so many other emotionally charged factors constantly occupy our minds as we try to navigate the stormy waters of our social connections. We might be tempted to throw up our hands in frustration and live in social isolation, but we don’t. Why? Because evolution has proven conclusively that we’re better together than apart. That strategy has been hardwired into our genes. As much as maintaining a social network is a complete pain in the ass sometimes, it’s a necessary part of the human experience. Most times, the benefits outweigh the drawbacks.

The challenge, however, is that all this baggage will be hauled over to whatever new platforms we use to connect with others. This includes online social media. To be effective and engaging, a social media tool has to allow us to do the things we have always done to survive and thrive in our respective herd – whether it’s to increase the frequency of connection with family, gossip in real time, brag more effectively to all of our acquaintances at once or reconnect with those who lie in the farther-flung regions of our networks. While they’re all very human, these activities, when brought onto a publishing platform (which is a major feature of all social media), introduce a significant signal-to-noise issue.

The Need to Acquire

While we don’t usually acquire physical things through social media, we sure as hell use it to brag about the things we do acquire in the real world. An unhealthy proportion of social media activity is devoted to the acquisition of new cars, clothes, jewelry, trips, houses, boats – you name it, we tweet (or Facebook, or Instagram) about it. The arms race of social status is being waged daily on social media.

The Need to Learn

One of the biggest reasons why humans became social animals is that it was a much more efficient way to learn. In a herd, we don’t have to learn every lesson ourselves – we can learn from the experiences of others. Of course, that requires a way for lessons to spread throughout our networks. Stories, gossip, rumors – these are all social forms of information transmission. And they have all migrated onto our digital social media platforms.

The Need to Defend

This is probably the least social of Nohria and Lawrence’s Four Drives, at least as it might apply to the use of social media. Humans need to defend ourselves, our kin, our community (or tribe, or nation), our possessions, our reputation, our status, our beliefs and our security. But, like all the drives, the need to defend, especially the defense of our beliefs, status or reputation, does play out in the online forum as well.

When looked at in the context of these four innate drives, it’s clear that the use of social media aligns well with our evolved requirements. It is just another channel we can use to let our pre-wired social tendencies play out. So, it passes the first gut-test. This is something we would do naturally, with or without the tools of social media. The next question is, how might our social activities change, for the good and the bad, when they’re mediated through digital channels? I’ll come back here in the next post.

 

#Meaningless #Crap

First published April 10, 2014 in Mediapost’s Search Insider

Everybody should have a voice – I get that. Thank goodness that the web and social media have democratized publication. Because of that, the power to say what’s on our mind is just a click away. From this power, great things have and will continue to come – the overthrow of tyrants, the quest for truth, freedom from oppression. I’m pretty sure those are all good things. Important things.

But I’m also pretty sure the signal to noise ratio in social media content is infinitesimal – verging on undetectable. For every post that moves humanity incrementally forward, there are thousands that drive us over the brink into mind numbing mediocrity.

For example, Justin Bieber has 51 million followers, and has tweeted 26,508 times. That, in case you’re wondering, has produced 1.35 trillion “Bieberisms,” or 193 little Bieber-tweets for every man, woman and child on planet Earth. Here’s one of his finest: “Put your heart into everything you do”. Perhaps the Biebs would be better served by using his head a little bit too. But no matter, he tweets on, sharing his special brand of wisdom. No wonder over 70% of all tweets never get read.
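The back-of-the-envelope arithmetic here checks out, as a couple of lines will confirm (the 7 billion world population is a round assumption for 2014):

```python
followers = 51_000_000            # Bieber's follower count at the time
tweets = 26_508                   # his lifetime tweet count
world_population = 7_000_000_000  # rough 2014 figure (assumption)

# Each tweet potentially lands in every follower's feed.
bieberisms = followers * tweets
per_person = bieberisms / world_population

print(f"{bieberisms / 1e12:.2f} trillion Bieberisms")  # → 1.35 trillion
print(f"{per_person:.0f} per person")                  # → 193 per person
```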

And, for God’s sake – stop hashtagging everything! First of all, it only belongs on Twitter and Instagram. It’s not a universal punctuation mark. And it doesn’t belong in front of every word of your post! If you’re writing about something that falls under a topic category that people actually care about – then by all means slip a hashtag in there. For example:

“Witnessing special forces retaking the capitol building in Kiev – #ukrainecrisis”

Or:

“Just discovered key gene in early detection of Alzheimer’s – #alzheimerresearch”

See how it works? You’re adding key content to a topic that people care about and may actually be searching for on Twitter. This is how not to use hashtags:

“Off to a funeral #selfie #zebra #sunglasses #bling #hairdown #polo #countrygirl #aero #dodge #ram #cute”

All I can say is #shoot #me.

The other problem is that with this diarrheic explosion of content flooding online, it becomes impossible to sift through all of it to find things that are truly important. Generally, most content filters use one of two criteria – recency or popularity. Recency is fine if you’re looking for breaking news. It’s a clearly understood parameter. Popularity, however, has some issues. The theory here is that the wisdom of crowds can be relied on to push the best content to the top. But that’s not really how the wisdom of crowds works. Just because something is popular doesn’t necessarily mean it’s good. And it certainly doesn’t mean it’s important. All too often, it just means that it panders to the lowest common denominator. Do we really want that to be our filtering criterion? Should Kanye West and Keeping Up with the Kardashians be our cultural high-water mark?
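The two filter criteria amount to nothing more than two different sort keys, which is exactly why neither captures importance. A minimal sketch (the sample posts and field names are illustrative, not from any real platform API):

```python
from datetime import datetime

posts = [
    {"text": "Just discovered key gene in early detection of Alzheimer's",
     "likes": 40, "posted": datetime(2014, 4, 10, 9, 0)},
    {"text": "Off to a funeral #selfie #bling",
     "likes": 900, "posted": datetime(2014, 4, 9, 22, 0)},
    {"text": "Put your heart into everything you do",
     "likes": 12_000, "posted": datetime(2014, 4, 8, 15, 0)},
]

# Recency: newest first -- fine for breaking news.
by_recency = sorted(posts, key=lambda p: p["posted"], reverse=True)

# Popularity: most-liked first -- rewards the lowest common denominator.
by_popularity = sorted(posts, key=lambda p: p["likes"], reverse=True)

print(by_recency[0]["text"])     # the research post wins on recency
print(by_popularity[0]["text"])  # the Bieberism wins on popularity
```

Neither key knows anything about the content itself, which is the point: popularity surfaces whatever panders best, not whatever matters most.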

One last rant. “Epic” is not the right adjective to apply to concert tickets, Saturday nights at the club, bowls of chili or, when incorrectly combined with the verb “fail”, your company’s Christmas party. According to this post,

“the word epic should only be used to describe two or three things, ever. In fact, here’s a comprehensive list of all things epic: 1. Oceans 2. Lengthy Narratives 3. The Cosmos.”

That’s it.

Feel free to retweet if you wish. Or not. No one will read it anyway.

Five Years Later – An Answer to Lance’s Question (kind of)

It never ceases to amaze me how writing can take you down the most unexpected paths, if you let it. Over 5 years ago now, I wrote a post called “Chasing Digital Fluff – Who Cares about What’s Hot?” It was a rant, and it was aimed at marketers’ preoccupation with the latest bright shiny object. At the time, it was social. My point was that true loyalty needs stable habits to emerge. If you’re constantly chasing the latest thing, your audience will be in a constant state of churn. You’d be practicing “drive-by” marketing. If you want to find stability, target what your audience finds useful.

This post caused my friend Lance Loveday to ask a very valid question: “What about entertainment?” Do we develop loyalty to things that are entertaining? So, I started with a series of posts on the Psychology of Entertainment. What types of things do we find entertaining? How do we react to stories, or humor, or violence? And how do audiences build around entertainment? As I explored the research on the topic, I came to the conclusion that entertainment is a by-product of several human needs – the need to bond socially, the need to be special, our appreciation for others whom we believe to be special, a quest for social status and artificially stimulated tweaks to our oldest instincts – to survive and to procreate. In other words, after a long and exhausting journey, I concluded that entertainment lives in our phenotype, not our genotype. Entertainment serves no direct evolutionary purpose, but it lives in the shadows of many things that do.

So, what does this mean for stability of an audience for entertainment? Here, there is good news, and bad news. The good news is that the raw elements of entertainment haven’t really changed that much in the last several thousand years. We can still be entertained by a story that the ancient Romans might have told. Shakespeare still plays well to a modern audience. Dickens is my favorite author and it’s been 144 years since his last novel was published. We haven’t lost our evolved tastes for the basic building blocks of entertainment. But, on the bad news side, we do have a pretty fickle history when it comes to the platforms we use to consume our entertainment.

This then introduces a conundrum for the marketer. Typically, our marketing channels are linked to platforms, not content. And technology has made this an increasingly difficult challenge. While we may connect to, and develop a loyalty for, specific entertainment content, it’s hard for marketers to know which platform we may consume that content on. Take Dickens for example. Even if you, the marketer, know there’s a high likelihood that I may enjoy something by Dickens in the next year, you won’t know if I’ll read a book on my iPad, pick up an actual book or watch a movie on any one of several screens. I’m loyal to Dickens, but I’m agnostic as to which platform I use to connect with his work. As long as marketing is tied to entertainment channels, and not entertainment content, we are restricted to targeting our audience in an ad hoc and transitory manner. This is one reason why brands have rushed to use product placement and other types of embedded advertising, where the message is set free from the fickleness of platform delivery challenges. If you happen to be a fan of American Idol, you’re going to see the Coke and Ford brands displayed prominently whether you watch on TV, your laptop, your tablet or your smartphone.

It’s interesting to reflect on the evolution of electronic media advertising and how it’s come full circle in this one regard. In the beginning, brands sponsored specific shows. Advertising messages were embedded in the content. Soon, however, networks, which controlled the only consumption choice available, realized it was far more profitable to decouple advertising from the content and run it in freestanding blocks during breaks in their programming. This decoupling was fine as long as there was no fragmentation in the channels available to consume the content, but obviously this is no longer the case. We now watch TV on our schedule, at our convenience, through the device of our choice. Content has been decoupled from the platform, leaving the owners of those platforms scrambling to evolve their revenue models.

So – we’re back to the beginning. If we want to stabilize our audience to allow for longer-term relationship building, what are our options? Obviously, entertainment offers some significant challenges in this regard, due mainly to the fragmentation of platforms we use to consume that content. If we use usefulness as a measure, the main factors in determining loyalty are frequency and stability. If you provide a platform that becomes a habit, as Google has, then you’ll have a fairly stable audience. It won’t destabilize until there is a significant enough resetting of user expectations, forcing the audience to abandon habits (always very tough to do) and start searching for another useful tool that is a better match for the reset expectations. If this happens, you’ll be continually following your audience through multiple technology adoption curves. Still, it seems that usefulness offers a better shot at a stable audience than entertainment.

But there’s still one factor we haven’t explored – what part does social connection play? Obviously, this is a huge question that the revenue models of Facebook, Twitter, Snapchat and others will depend on. So, with entertainment and usefulness explored ad nauseam, I’ll start tracking down the Psychology of Social Connection in this series of posts.

Letting the Foxes into Journalism’s Hen(Hedgehog) House

First published March 27, 2014 in Mediapost’s Search Insider

I am rooting for Nate Silver and fivethirtyeight.com, his latest attempt to introduce a little data-driven veracity into the murky and anecdotal world of journalism. But I may be one of the few, at least if we take the current backlash as a non-scientific, non-quantitative sample:

I have long been a fan of Nate Silver, but so far I don’t think this is working. – Tyler Cowen, Marginal Revolution

Nate Silver’s new venture may become yet another outlet for misinformation when it comes to the issue of human-caused climate change. – Michael Mann, director of the Earth System Science Center at Pennsylvania State University

Here’s hoping that Nate Silver and company up their game, soon. – Paul Krugman, NY Times

Krugman also states:

You can’t be an effective fox just by letting the data speak for itself — because it never does. You use data to inform your analysis, you let it tell you that your pet hypothesis is wrong, but data are never a substitute for hard thinking.

Now, Nate Silver doesn’t disagree with this. In fact, he says pretty much the same thing in his book, The Signal and the Noise:

The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning.

But he goes on,

Like Caesar, we may construe them in self-serving ways that are detached from their objective reality.

And it’s this construal that Silver is hoping to nip in the bud with FiveThirtyEight. In essence, he wants to do it by being a Fox, to borrow from Isaiah Berlin’s analogy.

‘The fox knows many things, but the hedgehog knows one big thing.’ We take a pluralistic approach and we hope to contribute to your understanding of the news in a variety of ways.

Silver thinks the media’s preoccupation with punditry is a dangerous thing. Pundits, whether they’re coming from the right or left, are Hedgehogs. They get paid for their expertise on “one big thing.” And the more controversial their stand, the more attention they get. This can lead to a dangerous spiral, as researcher Philip Tetlock found out:

What experts think matters far less than how they think. If we want realistic odds on what will happen next, coupled to a willingness to admit mistakes, we are better off turning to experts who embody the intellectual traits of Isaiah Berlin’s prototypical fox—those who “know many little things,” draw from an eclectic array of traditions, and accept ambiguity and contradiction as inevitable features of life.

Tetlock was researching how expertise correlated with the ability to make good predictions. What he found was actually an inverse relationship: the higher the degree of expertise, the more likely the person in question was a hedgehog. Media pundits are usually extreme versions of hedgehogs, who not only hold one worldview but also love to talk about it. Nate Silver believes that to get an objective view of world events, you need first to be a fox – and second, a fox that’s good at sifting through data:

Conventional news organizations on the whole are lacking in data journalism skills, in my view. Some of this is a matter of self-selection. Students who enter college with the intent to major in journalism or communications have above-average test scores in reading and writing, but below-average scores in mathematics.

So, all this makes sense. The problem with Silver’s approach is that journalism is the way it is because that’s the way humans want it. While I applaud Silver’s determination to change it, he may be trying to push water uphill. Pundits exist not just because the media keeps pushing them in front of us – they exist because we keep listening. Humans like opinions and anecdotes. We’re not hardwired to process data and objectively rationalize. We connect with stories and we’re drawn to decisive opinion leaders. Silver will have to find some middle ground here, and that seems to be where the problems arise. The minute writers add commentary to data, they have to impose an ideological viewpoint. It’s impossible not to. And when you do that, you introduce a degree of abstraction.

The backlash against Fivethirtyeight.com generally falls into two camps: foxes like Silver who have no problem with the approach but disagree with the specific data put forward, and hedgehogs who just don’t like the entire concept. The first camp may come onside as Silver and his team work out the inevitable hiccups in their approach. The second, which, it should be noted, has a large number of pundits in its midst, will never become fans of Silver and his foxlike approach.

In the end though, it really doesn’t matter what columnists and journalists think. It’s up to the consumers of news media. We’ll decide what we like better – hedgehogs or foxes.

Can Facebook Maintain High Ground?

 First published March 13, 2014 in Mediapost’s Search Insider

As I said in my last column, Facebook’s recent acquisition spree seems to indicate that they’re trying to evolve from being our Social Landmark to being a virtual map that guides us through our social activity. But, as Facebook rolls out new features or acquires one-time competitors in order to complete this map of the social landscape, will we use it? Snapchat CEO Evan Spiegel apparently doesn’t think so. That’s part of the reason he turned down $3 billion from Facebook.

At the end of 2012, Mark Zuckerberg paid Spiegel and his team a visit. The purpose of the visit was to scare the bejeezus out of Snapchat by threatening to crush them with the rollout of Poke. Of course, we now know that Poke was a monumental flop while Snapchat rolled along quite nicely, thank you. Several months later, Zuck flew out to meet with the Snapchat team again, taking a decidedly different tone this time. He also brought along a very big checkbook. Snapchat said thanks, but no thanks.

So, how can a brash startup like Snapchat beat the 800-lb. gorilla in its own backyard? Why was Poke DOA? Was it a one-of-a-kind miscue on the part of Facebook – or part of a trend?

Part of the answer may lie in how we feel about novelty vs. familiarity in the things we deal with. As I said in the last column, we go through 3 stages when we explore new landscapes. We move from navigating by landmarks to memorizing routes and finally, we create our own mental maps of the space, allowing us to plot our own routes as needed. If we apply this to navigating a virtual space like the online social sphere, we should move from relying on landmarks (like Facebook) to using routes (single-purpose apps like Snapchat) and finally, to creating our own map that allows us to switch back and forth between apps as required. Facebook wants to jump from the first stage to the last in order to remain dominant in the social market, maintaining our map for us by becoming a hub for all required social functionality. But if the Poke story is any indication, we may not be willing to go along for the ride.

But there’s a subtle psychological point to how we learn to navigate new landscapes – we gain mastery over our environment. With this increased confidence comes a reluctance to feel we’re moving backward. We tend to discard the familiar and embrace novelty as we gain confidence. This squares with research done on familiarity and novelty seeking in humans. We look for familiarity in things that carry high degrees of risk, in the faces of others around us or when we’re operating on autopilot. But when we’re actively considering and judging options and looking for new opportunities, we are drawn to new things.

Humans are natural foragers. We have built-in rules of conduct when we go out seeking things that will improve our lot, whether it be food, shelter or tools. Ideally, we look for things that will offer us a distinct advantage over the status quo with a reasonable investment of effort. We balance the two – advantage against effort. If the new options come from an overly familiar place, we tend to mentally discount the potential advantage because we no longer feel we’re moving forward. Over time, this builds into a general feeling of malaise toward the overly familiar.
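One way to make this advantage-versus-effort trade-off concrete is as a simple scoring rule, with familiarity discounting the perceived advantage. This is purely an illustrative sketch of the idea, not a model from the research literature, and the numbers are made up:

```python
def forage_score(advantage: float, effort: float, familiarity: float) -> float:
    """Perceived payoff of adopting a new option.

    All inputs are normalized to [0, 1]. Familiarity discounts the
    advantage: an option from an overly familiar source no longer
    feels like forward progress, so its upside gets mentally shrunk.
    """
    perceived_advantage = advantage * (1.0 - familiarity)
    return perceived_advantage - effort

# A novel single-purpose app vs. a familiar incumbent's clone,
# at identical objective advantage and effort:
novel = forage_score(advantage=0.6, effort=0.2, familiarity=0.1)
clone = forage_score(advantage=0.6, effort=0.2, familiarity=0.9)

print(novel > clone)  # True: at equal advantage and effort, novelty wins
```

Under this toy rule, Poke could match Snapchat feature-for-feature and still lose, because the familiarity discount wipes out its perceived advantage before effort is even counted.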

Time will tell if Evan Spiegel was prescient or just plain stupid in turning down Facebook’s offer. The question is not so much will Facebook prevail, but rather will Snapchat end up emerging as a key part of the social landscape on a continuing basis? That particular landscape is notoriously unstable and it’s been known to swallow up many, many other companies with nary a burp. Perhaps Spiegel should have taken the money and run.

But then I wouldn’t be betting the farm on Facebook’s chances of permanence either.

Finding Our Way in the Social Landscape

First published March 6, 2014 in Mediapost’s Search Insider

Last month, Om Malik (of GigaOM fame) wrote an article in Fast Company about the user backlash against Facebook. To be fair, it seems that what’s happening to Facebook is not so much a backlash as apathy. You have to care to lash back. This is more of a wholesale abandonment, as millions of users are going elsewhere – using single-purpose apps to get their social media fix. According to the article,

“we cycle between periods in which we want all of our Internet activity consolidated and other times in which we want a bunch of elegant monotaskers. Clearly we have reentered a simplification phase.”

There’s a reason why Facebook has been desperately trying to acquire Snapchat for a reported $3 billion. There’s also a reason why they picked up Instagram for a billion last year. It’s because these simple little apps are leaving the homegrown Facebook alternatives in the dust. Snapchat is killing Facebook’s Poke – as Mashable pointed out in this comparison. Snapchat has consistently stayed near the top of App Annie’s most popular download chart for the past 18 months, a period that coincides exactly with Facebook’s release of Poke.


Download rates of Facebook Poke


Download rates of Snapchat

Malik indicates it’s because we want a simpler, streamlined experience. A recent article in Business Insider goes one step further – Facebook is just not cool anymore. The mere name induces extended eye rolling in teenagers. It’s like parking the family mini-van in the high school parking lot.  “I hate Facebook. It’s just so boring,” said one of the teens interviewed. Hate! That’s a pretty strong word. What did the Zuck ever do to garner such contempt? Maybe it’s because he’s turning 30 in a few months. Maybe it’s because he’s an old married man.

Or maybe it’s just that we have a better alternative. Malik has a good point. He indicates that we tend to oscillate between consolidation and specialization. I take a bit different view. What’s happening in social media is that we’re getting to know the landscape better. We’re finding our way. This isn’t so much about changing tastes as it is about increased familiarity and a resetting of expectations.

If you look at how humans navigate new environments, you’ll notice some striking similarities. When we encounter a new landscape, we go through three phases of way finding. We begin with relying on landmarks. These are the “highest ground” in a new, unfamiliar landscape and we navigate relative to them. They become our reference points and we don’t stray far from them. Facebook is, you guessed it, a landmark.

The next phase is called “Route Knowledge.” Here, we memorize the routes we use to get from landmark to landmark. We come to recognize the paths we take all the time. In the world of online landscapes, you could substitute the word “app” for “route.” Instagram, Snapchat, Vine and the rest are routes we use to get where we need to go quickly and easily. They’re our virtual “shortcuts.”

The last stage of way finding is “Survey Knowledge.”  Here, we are familiar enough with a landscape that we’ve acquired a mental “map” of it and can mentally calculate alternative routes to get to our destination. This is how you navigate in your hometown.

What’s happening to Facebook is not so much that our tastes are swinging. It’s just that we’re confident enough in our routes/apps that we’re no longer solely reliant on landmarks. We know what we want to do and we know the right tool to use. The next stage of wayfinding, Survey Knowledge, will require some help, however. I’ve talked in the past about the eventual emergence of meta-apps. These will sit between us and the dynamic universe of tools available. They may be largely or even completely transparent to us. What they will do is learn about us and our requirements while maintaining an inventory of all the apps at our disposal. Then, as our needs arise, they will serve up the right app for the job. These meta-apps will maintain our survey knowledge for us, keeping a virtual map of the online landscape to allow us to navigate at will.

As Facebook tries to gobble up the Instagrams and Snapchats of the world, they’re trying to become both a landmark and a meta-app. Will they succeed? I have my thoughts, but those will have to wait until a future column.

Now, That’s a Job Description I Could Get Behind!

First published February 20, 2014 in Mediapost’s Search Insider

I couldn’t help but notice that last week’s column, where I railed against marketers’ obsession with tricks, loopholes and pat sound bites, got a fair number of retweets. The irony? At least a third of those retweets twisted my whole point – that six seconds (or any arbitrary length of message) isn’t the secret to getting a prospect engaged. The secret is giving them something they want to engage with.


As anyone who has been unfortunate enough to spend some time with me when I’m in a particularly cynical mood about marketing can attest, I go a little nuts with this “Top Ten Tricks” or “The Secret to…” mentality that seems pervasive in marketing. I’m pretty sure that anyone who retweeted last week’s column with a preface like “Does your advertising engage your consumer in 6 seconds or less? If not, you’re likely losing customers” didn’t bother to actually read past the first paragraph. Maybe not even the first line.

And that’s the whole problem. How can we expect marketers to build empathy, usefulness and relevance into their strategy when many of them have the attention span of a small gnat? As my friend Scott Brinker likes to say when it comes to marketers misbehaving, “This is why we can’t have nice things.”

Marketing – good marketing – is not easy but it’s also not a black box. It’s not about secrets or tricks or one-off tactics. It’s about really understanding your customers at an incredibly deep level and then working your ass off to create a meaningful engagement with them. Trying to reduce marketing to anything less than that is like trying to breeze your way through 50 years of marriage by following the Top 3 Tricks to get lucky this Friday night.

Again, this is about meaningful engagements. And when I say meaningful, it’s the customer that gets to decide what’s meaningful. That’s what’s potentially so exciting about breakthroughs like the Oreo Super Bowl campaign. It’s the opportunity to learn what’s meaningful to prospects and then to shift and tailor our responses in real time. Until now, marketing has been “Plan, Push and Pray.” We plan our attack, we push out our message and we pray it finds its target and that they respond by buying stuff. If they don’t buy stuff, something went wrong, probably in the planning stage. But that is an awfully long feedback loop.

You’ll notice something about this approach to marketing. The only role for the prospect is as a consumer. If they don’t buy, they don’t participate.  This comes as a direct result of the current job description of a marketer: Someone who gets someone else to buy stuff. But what if we rethink that description? Technology that enables real time feedback is allowing us to create an entirely new relationship with customers. What would happen if we redefined marketing along these lines: To understand the customer’s reality, focusing on those areas where we can solve their problems and improve that reality?

And as much as that sounds like a pat sound bite, if you really dig into it, it’s far from a quick fix. This is a recipe for a radically different organization. And it moves marketing into a fundamentally different role. Previously, marketing got its marching orders from the CEO and CFO. Essentially, marketing was responsible for moving the top line ever northward. It was an internally generated mandate – to increase sales.

But what if we rethink this? What if the entire organization’s role is to constantly adapt to a dynamic environment, looking for advantageous opportunities to improve that environment? And, in this redefined vision, what if marketing’s role was to become the sense-making interface of the company? What if it were the CMO’s job to consistently monitor the environment, create hypotheses about how best to create adaptive opportunities and then test those hypotheses in a scientific manner?

In this redefinition of the job, Big Data and Real Time Marketing take on significant new qualities, first as a rich vein of timely information about the marketplace and second as a never-ending series of instant field experiments to provide empirical backing to strategy.

Now, marketing’s job isn’t to sell stuff, it’s to make sense of the market and, in doing so, help define the overall strategic direction of the company. There are no short cuts, no top ten tricks, but isn’t that one hell of a job description?

The Psychology of Usefulness: The Acceptance of Technology – Part Two

In my last post, I talked about how the Theory of Reasoned Action and the original Technology Acceptance Model tried to predict both intention and usage of new technologies. As a quick recap, let’s look again at Davis and Bagozzi’s original model.

[Figure: Davis and Bagozzi’s Technology Acceptance Model]

In aiming for the simplest model possible, there was significant conflation applied to the front end of the model – with just one box representing external variables, which then led to two similarly conflated boxes: Perceived Usefulness and Perceived Ease of Use. While this simplification was admirable in the quest for parsimony, in real world situations it seemed like it went too far in this direction. There was a lot happening between the three boxes at the front of the model that demanded closer examination.

Davis indicated that there was an interesting relationship between Perceived Usefulness and Perceived Ease of Use. One of the mechanisms at play that has to be understood is self-efficacy. In understanding adoption of technology, self-efficacy is a key factor. Essentially, it means that the easier a system is to use, the greater the user’s sense of efficacy. They believe they have control over what they are doing. And control, especially in a work context, is a strong motivational driver. There is an extensive body of work exploring the psychological importance of control. If we feel we’re in control, we also feel empowered to mitigate risk. The concept of self-efficacy helps to highlight the importance of the Perceived Ease of Use box. But what about the other box: Perceived Usefulness?

Davis, in his accompanying notes and research, indicated that Perceived Usefulness is a stronger indicator of intention than Perceived Ease of Use. In other words, we are willing to put up with some pain to learn a new technology if we feel it will offer a significant improvement in our ability to complete a task. This balancing equation requires two heuristic evaluations on the part of the user: the allocation of cognitive resources required to gain proficiency and the expected usefulness of the tool once proficiency is gained. This is exactly the same equation used in Charnov’s Marginal Value Theorem, applied in a different context. In optimal foraging, we (and all animals who forage) balance expenditure of resources required to reach a food patch against the expected food value to be derived from that patch. In technology adoption, we balance expenditure of resources required to master a new technology against the increased usefulness that technology offers.
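The balancing act described above can be made concrete with a toy sketch. This is not Davis’s or Charnov’s actual mathematics – the function names, numbers and the simple linear trade-off are all hypothetical, chosen only to illustrate the shape of the decision: a one-time learning cost weighed against usefulness gains accumulated over future use.

```python
# Toy model of the adoption trade-off: a one-time learning cost versus
# a per-task usefulness gain accumulated over an expected usage horizon.
# All names and numbers are illustrative, not from the research.

def adoption_gain(learning_cost: float, usefulness_gain: float,
                  horizon: int) -> float:
    """Net benefit of adopting: gain per task times expected number of
    future tasks, minus the up-front cost of climbing the learning curve."""
    return usefulness_gain * horizon - learning_cost

def should_adopt(learning_cost: float, usefulness_gain: float,
                 horizon: int) -> bool:
    # Adopt only if the accumulated usefulness outweighs the up-front pain.
    return adoption_gain(learning_cost, usefulness_gain, horizon) > 0

# A steep learning curve (40 units of effort) is worth it when the tool
# saves 0.5 units per task over 200 expected tasks (gain of 100)...
print(should_adopt(40, 0.5, 200))  # True
# ...but not over only 50 expected tasks (gain of 25).
print(should_adopt(40, 0.5, 50))   # False
```

The same structure describes foraging: substitute travel-and-search effort for the learning cost and food value for usefulness, and you have the patch-leaving decision of the Marginal Value Theorem.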

In this heuristic evaluation, there are four key marketing lessons for anyone whose business model relies on the adoption of new technology:

1)   Lessen the intimidation of the learning curve. Persuade the user (and this is a key point that I’ll return to in point 4) that this is a reasonable investment of resources. Build a sense of perceived ease of use. Provide visible links to intuitive learning resources. Often, marketers overplay the feature benefits of their products to show how powerful they are. But, as they’re doing this, they fail to realize that this upsets the balance between perceived usefulness and perceived ease of use.

2)   Provide clear examples of perceived usefulness in terms that are immediately relevant. Remember, this is the key factor in the equation the prospect is trying to balance. The more salient you can make the perceived usefulness, the more likely the user is to adopt it, even if a learning curve is present. Ideally, get that usefulness across with very specific, industry relevant examples that allow the user to visualize usage of the technology.

3)   Remember that the user is balancing the two factors. Ease of use is great, but it can’t come at the expense of overall usefulness. In fact, in calculating the right balance (which should be done with extensive testing feedback from target customers) it should offer a significant gain in usefulness (as measured against any incumbent technologies) with a relatively manageable investment of resources.

4)   Remember that you’re talking to a user. When trying to strike the right balance, remember that you’ll probably be talking to different people as the decision progresses. For the user, the right balance between perceived usefulness and perceived ease of use must be struck. But at some point, you’ll be talking to a buyer, not a user, before the sale is actually closed. This would be one of those external variables that fall outside the scope of the Technology Acceptance Model. This switching of roles from “doers” to “buyers” is dealt with extensively in my book, The BuyerSphere Project.

In the next post, I’ll talk about how the Technology Acceptance Model has been modified over the past 2 decades so it better reflects real world decision making.

Who Owns Your Data (and Who Should?)

First published January 23, 2014 in Mediapost’s Search Insider

Last week, I talked about a backlash to wearable technology. Simon Jones, in his comment, pointed to a recent post where he raised the very pertinent point – your personal data has value. Today, I’d like to explore this further.

I think we’re all on the same page when we say there is a tidal wave of data that will be created in the coming decade. We use apps – which create data. We use/wear various connected personal devices – which create data. We go to online destinations – which create data. We interact with an ever-increasing number of wired “things” – which create data. We interact socially through digital channels – which create data.  We entertain ourselves with online content – which creates data. We visit a doctor and have some tests done – which creates data. We buy things, both online and off, and these actions also create data. Pretty much anything we do now, wherever we do it, leaves a data trail. And some of that data, indeed, much of it, can be intensely personal.

As I said some weeks ago, all this data is creating an ecosystem that is rapidly multiplying and, in its current state, is incredibly fractured and chaotic. But, as Simon Jones rightly points out, there is significant value in that data. Marketers will pay handsomely to have access to it.

But what, or who, will bring order to this chaotic and emerging market? The value of the data compounds quickly when it’s aggregated, filtered, cross-tabulated for correlations and then analyzed. As I said before, the captured data in its fragmented state is akin to a natural resource. To get to a more usable end state, you need to add a value layer on top of it. This value layer will provide the required additional steps to extract the full worth of that data.

So, to retrace my logic, data has value, even in its raw state. Data also has significant privacy implications. And right now, it’s not really clear who owns what data. To move forward into a data market that we can live with, I think we need to set some basic ground rules.

First of all, most of us who are generating data have implicitly agreed to a quid pro quo arrangement – we’ll let you collect data from us if we get an acceptable exchange of something we value. This could be functionality, monetary compensation (usually in the form of discounts and rewards), social connections or entertainment. But here’s the thing about that arrangement – up to now, we really haven’t quantified the value of our personal data. And I think it’s time we did that. We may be trading away too much for much too little.

To this point we haven’t worried much about what we traded off and to whom, because any data trails we left have been so fragmented and specific to one context. But, as that data gains more depth and, more importantly, as it combines with other fragments to provide much more information about who we are, what we do, where we go, who we connect with, what we value and how we think, it becomes more and more valuable. It represents an asset for those marketers who want to persuade us, but more critically, that data – our digital DNA – becomes vitally important to us. In it lies the quantifiable footprint of our lives and, like all data, it can yield insights we may never gain elsewhere. In the right hands, it could pinpoint critical weaknesses in our behavioral patterns, red flags in our lifestyle that could develop into future health crises, financial opportunities and traps, and ways to allocate time and resources more efficiently. As the digitally connected world becomes denser, deeper and more functional, that data profile will act as our key to it. All the potential of a new fully wired world will rely on our data.

There are millions of corporations that are more than happy to warehouse their respective data profiles of you and sell them back to you on demand as you need them to access their services or tools. They will also be happy to sell them to anyone else who may need them for their own purposes. Privacy issues aside (at this point, data is commonly aggregated and anonymized), a more fundamental question remains – whose data is this? Whose data should it be? Is this the reward they reap for harvesting the data? Or, because this represents you, should it remain your property, with you deciding who uses it and for what?

This represents a slippery slope we may already be starting down. And, if you believe this is your data and should remain so, it also marks a significant change from what’s currently happening. Remember, the value is not really in the fragments. It’s in bringing them together to create a picture of who you are. And we should be asking the question – who should have the right to create that picture of you: you, or a corporate data marketplace that exists beyond your control?

The Psychology of Usefulness: How Online Habits are Broken

Last post, I talked about how Google became a habit – Google being the most extreme case of online loyalty based on functionality I could think of. But here’s the thing with functionally based loyalty – it’s very fickle. In the last post I explained how Charnov’s Marginal Value Theorem dictates how long animals spend foraging in a patch before moving on to the next one. I suspect the same principles apply to our judging of usefulness. We only stay loyal to functionality as long as we believe there are no more functional alternatives available to us for an acceptable investment of effort. If that functionality has become automated in the form of a habit, we may stick with it a little longer, simply because it takes our rational brain a while to figure out there may be better options, but sooner or later it will blow the whistle and we’ll start exploring our options. Charnov’s internal algorithm will tell us it’s time to move on to the next functional “patch.”

Habits break down when there’s a shift in one of the three prerequisites: frequency, stability or acceptable outcomes.

If we stop doing something on a frequent basis, the habit will slowly decay. But because habits tend to be stored at the limbic level (in the basal ganglia), they prove to be remarkably durable. There’s a reason we say old habits die hard. Even after a long hiatus we find that habits can easily kick back in. Reduction of frequency is probably the least effective way to break a habit.

A more common cause of habitual disruption is a change in stability. If something significant suddenly changes in our task environment, our “habit scripts” start running into obstacles. Think about the last time you did a significant upgrade to a program or application you use all the time. If menu options or paths to common functions change, you find yourself constantly getting frustrated because things aren’t where you expect them to be. Your habit scripts aren’t working for you anymore and you are being forced to think. That feeling of frustration is how the brain protects habits and shows how powerful our neural energy-saving mode is. But even if the task environment becomes unstable for a time, chances are the instability is temporary. The brain will soon reset its habits and we’ll be back plugging subconsciously away at our tasks. Instability does break a habit, but the brain just builds a new one to take its place.

A more permanent form of habit disruption comes when outcomes are no longer acceptable. The brain hates these types of disruptions, because it knows that finding an alternative could require a significant investment of effort. It basically puts us back at square one. The amount of investment required depends on a number of things: the scope of change required (is it just one aspect of a multi-step task or the entire procedure?), current awareness of acceptable alternatives (is a better solution near at hand or do we have to find it?), the learning curve involved (how different is the alternative from what we’re used to using?), other adoption requirements (do we have to make an investment of resources, including time and/or money?) and how much downtime will be involved in adopting the alternative. All these questions are the complexities that can factor into the Marginal Value Theorem.
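One way to picture how these factors combine is as a weighted friction score that the expected gain has to beat before a habit gets abandoned. To be clear, this is a hypothetical sketch: the factor names echo the list above, but the weights and the linear scoring are invented for illustration, not drawn from the research.

```python
# Hypothetical sketch of the habit-switching evaluation described above.
# The weights are illustrative assumptions, not empirical values.

SWITCH_FACTORS = {
    "scope_of_change": 0.30,        # one sub-task or the entire procedure?
    "alternative_awareness": 0.20,  # is a better option already known to us?
    "learning_curve": 0.25,         # how different is the alternative?
    "adoption_cost": 0.15,          # time/money investment required
    "downtime": 0.10,               # disruption during the transition
}

def switching_cost(scores: dict) -> float:
    """Weighted sum of friction factors; each score runs 0 (easy) to 1 (hard)."""
    return sum(SWITCH_FACTORS[name] * score for name, score in scores.items())

def will_switch(expected_gain: float, scores: dict) -> bool:
    # Break the habit only when the expected gain in outcomes
    # exceeds the total friction of switching.
    return expected_gain > switching_cost(scores)

# A change confined to one sub-task, with a known alternative and a
# shallow learning curve, is low-friction: even a modest gain wins.
scores = {"scope_of_change": 0.2, "alternative_awareness": 0.1,
          "learning_curve": 0.2, "adoption_cost": 0.1, "downtime": 0.1}
print(will_switch(0.5, scores))  # True (friction is only 0.155)
```

The point of the sketch is the threshold, not the numbers: frequency and stability shifts rarely clear it, while an unacceptable outcome can push the expected gain of any alternative well past the friction of switching.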

Now, let’s look at how each of these potential habit breakers applies to Google. First of all, frequency probably won’t be a factor because we will search more, not less, in the future.

Stability may be a more likely cause. The fact is, the act of online searching hasn’t really changed that much in the last 20 years. We still type in a query and get a list of results. If you look at Google circa 1998, it looks a little clunky and amateurish next to today’s results page, but given that 16 years have come and gone, the biggest surprise is that the search interface hasn’t changed more than it has.

[Figure: Google’s search results page, 1998 and today]

A big reason for this is to maintain stability in the interface, so habits aren’t disrupted. The search page relies on ease of information foraging, so it’s probably the most tested piece of online real estate in history. Every pixel of what you see on Google, and, to a lesser extent, its competitors, has been exhaustively tested.

That has been true in the past but because of the third factor, acceptability of outcomes, it’s not likely to remain true in the future. We are now in the age of the app. Searching used to be a discrete function that was just one step of many required to complete a task. We were content to go to a search engine, retrieve information and then use that information elsewhere with other tools or applications. In our minds, we had separate chunks of online functionality that we would assemble as required to meet our end goal.

Let me give you an example. Let’s imagine we’re going to London for a vacation. In order to complete the end goal – booking flights, hotels and whatever else is required – we know we will probably have to go to many different travel sites, look up different types of information and undertake a number of actions. We expect that this will be the best path to take to our end goal. Each chunk of this “master task” may in turn be broken down into separate sub-tasks. Along the way, we’ll be relying on those tools that we’re aware of and a number of stored procedures that have proven successful in the past. At the sub-task level, it’s entirely possible that some of those actions have been encoded as habits. For an example of how these tasks and stored procedures would play out in a typical search, see my previous post, A Cognitive Walkthrough of Searching.

But we have to remember that the only reason the brain is willing to go to all this work is that it believes it’s the most efficient route available to it. If there were a better alternative that would produce an acceptable outcome, the brain would take it. Our expectation of what constitutes an acceptable outcome would be altered, and our Marginal Value algorithm would be reset.

Up to now, functionality and information didn’t intersect too often online. There were places we went to get information, and there were places we went to do things. But from this point forward, expect those two aspects of online to overlap more and more often. Apps will retrieve information and integrate it with usefulness. The travel aggregator sites like Kayak and Expedia are an early example of this. They retrieve pricing information from vendors, user content from review sites and even some destination related information from travel sites. This ups the game in terms of what we expect from online functionality when we book a trip. Our expectation has been reset because Kayak offers a more efficient way to book travel than using search engines and independent vendor sites. That’s why we don’t immediately go to Google when we’re planning a trip.

Let’s fast-forward a few years to see how our expectations could be reset in the future. I suspect we’re not too far away from having an app where our travel preferences have been preset. This proposed app would know how we like to travel and the things we like to do when we’re on vacation. It would know the types of restaurants we like, the attractions we visit, the activities we typically do, the types of accommodation we tend to book, etc.  It would also know the sources we tend to use when qualifying our options (i.e. TripAdvisor). If we had such an app, we would simply put in the bare details of our proposed trip: departure and return dates, proposed destinations and an approximate itinerary. It would then go and assemble suggestions based on our preferences, all in one location. Booking would require a simple click, because our payment and personal information would be stored in the app. There would be no discrete steps, no hopping back and forth between sites, no cutting and pasting of information, no filling out forms with the same information multiple times. After confirmation, the entire trip and all required information would be made available on your mobile device.  And even after the initial booking, the app would continue to comb the internet for new suggestions, reviews or events that you might be interested in attending.

This “mega-app” would take the best of Kayak, TripAdvisor, Yelp, TripIt and many other sites and combine it all in one place. If you love travel as much as I do, you couldn’t wait to get your hands on such an app. And the minute you did, your brain would have reset its idea of what an acceptable outcome would be. There would be a cascade of broken habits and discarded procedures.

This integration of functionality and information foraging is where the web will go next. Over the next 10 years, usefulness will become the new benchmark for online loyalty. As this happens, our expectation set points will be changed over and over again. And this, more than anything, will be what impacts user loyalty in the future. This changing of expectations is the single biggest threat that Google faces.

In the next post I’ll look at what happens when our expectations get reset and we have to look at adopting a new technology.