The Primacy of the Patch: Information Foraging is the Key to Behavior

As I said, this week I want to dissect some aspects of human behavior to show why Rupert Murdoch is seriously out of touch and how Bing can’t corner the news market.

The primary reason is that we’re changing how we get information. The implications are fascinating, because they will soon spread through every marketplace and aspect of our society. And it comes down to one important factor to consider: humans are inherently lazy.

Laziness is a Good Thing

Now, before you get all morally indignant on me, let me explain: humans are lazy in the evolutionary sense, the same way that Richard Dawkins’ genes are selfish. We’re lazy because it’s a natural advantage, built into our genome. To be more accurate, we’re lazy when the expenditure of more energy doesn’t make sense. We’re lazy in a subtle, subconscious way. And, like all aspects of human behavior, we’re not all equally lazy. There’s a bell curve of laziness. Laziness has gotten a bad rap in our puritanical, WASPish culture, but the fact is, when it comes to survival, laziness is often the optimal strategy.

Look at it a different way. Say you need to drive from Detroit to Chicago. The only goal is to get to Chicago and pay as little for gas as possible along the way. What vehicle are you going to take – a Hummer or a Prius? The Prius is a no-brainer. In terms of fuel efficiency, the Prius is a lazy car. It does what it has to do more efficiently than a Hummer. In a vehicle, this is a virtue, but somewhere in our twisted culture, it’s become a bad thing for humans.

Fat And Lazy? Maybe Not …

Calories are a human’s gas tank. We’ve been genetically hardwired to be very fuel efficient. In fact, we’ve developed very sophisticated subconscious mechanisms to ingest as many calories as possible without expending calories to find them. This worked well when we lived on the African savanna and the only food source was the odd baobab tree. It doesn’t work so well when there’s a McDonald’s around every corner. It’s not a cruel joke that we’re attracted to high-fat, high-sugar foods. These provide lots of calories in one sitting. That’s why our society is fat (fat and lazy – how’s your self-esteem so far?).

So, what the hell does this all have to do with search? Well, when humans are faced with new challenges, we’re stuck using the tools that evolution has endowed us with. We borrow from other abilities. The technical term for this is exaptation. When digital information came along, we had to look into our evolutionary toolkit and find something that would work.

Foraging for Information

At Xerox’s PARC in the late ’90s, Peter Pirolli was exploring how humans navigated hyperlinked information environments. The invention of hyperlinking introduced a new challenge in information retrieval. Throughout history, information had been structured into an imposed taxonomy or hierarchy. We sorted it alphabetically or by the Dewey decimal system. And, because information was static, it stayed within the boundaries we built for it. But the creation of the hyperlink meant that information suddenly became unstructured and organic. Topical links from source to source meant that imposed editorial restrictions no longer worked. Links kept leaping over the boundaries we tried to impose on information.

Given this new challenge, Pirolli wanted to explore the subconscious strategies we used to navigate this unstructured information environment. He wanted to reduce it to a predictable algorithm. Time after time, he was frustrated. Humans would start down a predictable path, only to suddenly take an unexpected turn. The patterns didn’t seem logical. But, as chance would have it, he had recently read some work on biological foraging patterns and decided to overlay that on the behaviors he was observing. It was Pirolli’s “a-ha” moment. Suddenly, the patterns made sense. Humans, Pirolli (along with Stuart Card and others) discovered, forage for information. We use the same strategies to navigate the web that we use to look for food. And, just as is the case with calories, laziness (or efficiency) is a pretty good strategy for finding information.

In information foraging, there is one overriding concern: take the most efficient path possible to the information you seek. I won’t get too far into the mechanics of how we do that, except to say this – it’s not a conscious calculation. We’re constantly scanning the environment to see if a richer information “patch” is on the horizon. Information foraging is fundamental to understanding human behavior online. Jakob Nielsen called it “the most important concept to emerge from Human-Computer Interaction research since 1993.”
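For the quantitatively curious, the patch model Pirolli borrowed from biology has a simple core rule (Charnov’s marginal value theorem): stay in a patch while it’s still yielding information quickly, and leave as soon as your rate of gain there drops below the average rate you could get elsewhere. Here’s a minimal sketch in Python of that rule – the function names, the diminishing-returns curve, and the parameter values are my own illustrative assumptions, not Pirolli’s actual model:

```python
# Minimal sketch of the patch-leaving rule from optimal foraging theory
# (Charnov's marginal value theorem), applied to information patches.
# The exponential gain curve and all parameters are illustrative.

import math

def gain(t, value=10.0, rate=0.5):
    """Cumulative information gained after t seconds in a patch.
    Diminishing returns: early seconds yield more than later ones."""
    return value * (1 - math.exp(-rate * t))

def marginal_rate(t, dt=0.01):
    """Instantaneous rate of gain at time t (numerical derivative)."""
    return (gain(t + dt) - gain(t)) / dt

def time_to_leave(env_rate, dt=0.01, max_t=60.0):
    """Leave the patch once the in-patch rate of gain falls below the
    average rate obtainable elsewhere in the environment (env_rate)."""
    t = 0.0
    while t < max_t and marginal_rate(t) > env_rate:
        t += dt
    return t

# A forager in a rich environment (high average rate elsewhere) abandons
# a patch sooner than one in a poor environment.
print(round(time_to_leave(env_rate=2.0), 2))  # rich environment: leave early
print(round(time_to_leave(env_rate=0.5), 2))  # poor environment: stay longer
```

The same logic explains why we bail on a web page or a results set so quickly: the richer the rest of the environment looks, the lower our tolerance for a patch that has stopped paying off.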

So, let’s look at how this applies to the Murdoch-Bing scenario. For almost 20 years now, we’ve been retrieving information online, using our foraging strategies. In that time, we’ve become conditioned to go to the most efficient sources of information – the places where we get the biggest information “bang” for our buck – and in this case, our investment is our time. As I’ve said before, this is a conditioned behavior. We don’t consciously think our way through it. Our subconscious efficiency circuits kick in and we do it by habit. To appreciate how powerful these subconscious loops are, just think about how hard it is to walk past a cookie lying on the counter. It’s not that you’re a bad person if you pick it up. It’s not that you’re stupid, eating it even though you know it’s not good for you. It’s those inherent human behaviors taking over. It’s powerful stuff!

So, for well over a decade, we’ve discovered that the shortest line between our need for information and the right online destination is a search engine. If there were a more efficient retrieval mechanism, we’d use it. This isn’t about brand or loyalty. It’s just walking past the kitchen counter and seeing a cookie there. We’ll do it without thinking.

Murdoch’s strategy is flawed because he doesn’t realize that we now seek information differently. In the past, we picked the editorial channel that best met our needs. The Wall Street Journal may have been one of our favored patches because we agreed with its “editorial voice,” it met a sufficient number of our information needs and we felt the investment of our time was warranted by the information we retrieved. In return for that, we started to build up loyalty to the brand, giving the publishers the right to sell advertising against that loyalty.

But the hyperlink and the internet didn’t just make information patchy; they also created a “just in time” need for information. Thirty years ago, we didn’t suddenly develop the need to know who the director of “Booty Call” was, because there was no easy way to retrieve the information. It wasn’t worth the investment. But Google made instant retrieval of information possible. It dramatically improved the efficiency of information retrieval. We started Googling everything because we could, without wasting huge amounts of time.

It’s this paradigm shift in information consumption that Murdoch is completely missing. Yesterday in Search Engine Land, Danny Sullivan did a good job showing how the social web and the indexing of content make any attempt to wall content off to preserve a revenue model futile. The one thing I disagree with Danny on is his assertion that a mutually exclusive Murdoch/Google relationship won’t hurt Google or Murdoch:

So what happens if the WSJ is out of Google? Nothing. Seriously, nothing. Remember, for years the WSJ was NOT in Google, and yet Google grew just fine. Also, the WSJ seems to have been fine. Neither is crucial to each other.

What we have here is a significant shift in human behavior, and right now we’re in the transition period. Google and other engines have dramatically changed the game of information retrieval, and that means a huge upheaval in the industry. Society is moving en masse from one behavior, which publishers had built a revenue model around, to another behavior, which still hasn’t been fully monetized (Google has only monetized one small slice of it). To say that both will do fine is to ignore the lessons of history. These massive behavioral shifts are ALWAYS a zero-sum game… somebody wins and somebody loses. Guess who will lose? Hint: it won’t be Google.

So, what about Bing? If my theory is correct, will Bing become the new favored patch by signing with Murdoch? I doubt it. There’s just not enough critical mass there to disrupt conditioned behaviors. The “just in time” information economy has eroded our brand affinity for favored patches. We’ve become more publisher-agnostic. Again, this isn’t universally true. We still appreciate “editorial voice” for some types of information and may seek out one specific publisher, but our new promiscuity means an erosion of page views and traffic, which is killing the traditional publishing revenue model. But more about this in tomorrow’s and Thursday’s posts.
