The Status Quo Bias – Why Every B2B Vendor has to Understand It

It’s probably the biggest hurdle any B2B vendor has to get over. It’s called the Status Quo Bias, and it’s deadly in any high-risk purchase scenario. According to Wikipedia, the bias occurs when the current baseline (or status quo) is taken as a reference point, and any change from that baseline is perceived as a loss. In other words: if it ain’t broke, don’t fix it. We believe that simply because something exists, it must have merit. The burden of proof then falls on the vendor to overcome this complacency.

The Status Quo Bias is actually a bundle of other common biases – including the Endowment Effect, Loss Aversion, the Existence Bias and the Mere Exposure Effect – plus other psychological factors that tend to continually jam the cogs of B2B commerce. Why B2B? The Status Quo Bias is common in any scenario where risk is high and reward is low, but B2B in particular is subject to it because these are group buying decisions. And, as I’ll soon explain, groups tend to default to the Status Quo Bias with irritating regularity. The new book from CEB (recently acquired by Gartner) – The Challenger Customer – is all about the Status Quo Bias.

So why is the bias particularly common with groups? Think of the dynamics at play here. Generally speaking, most people have some level of the Status Quo Bias; some have it more than others, depending on their level of risk tolerance. But let’s look at what happens when we lump all those people together in a group and force them to come to a consensus. Generally, you’re going to have one or two people in the group who are driving for change. Typically, these will be the ones who have the most to gain and a risk tolerance threshold that allows the deal to go forward. On the other end of the spectrum, you have some people who have low risk tolerance and nothing to gain. They may even stand to lose if the deal goes forward (think of the IT people who have to implement a new technology). In between, you have the moderates, whose gain factor and risk tolerance net out to close to zero. Given that those who have something to gain will say yes and those who have nothing to gain will say no, it’s this middle group that will decide whether the deal lives or dies.

Without the Status Quo Bias, the deal might have a 50/50 chance. But the bias stacks the deck toward negative outcomes for the vendor. Even if it tips the balance just a little bit toward “no” – that’s all that’s required to stop a deal dead in its tracks. The more disruptive the deal, the greater the Status Quo Bias. Let’s remember – this is B2B. There are no emotional rewards that can introduce a counteracting bias. At least one study (Baker, Laury and Williams, 2008) has shown that groups tend to be more risk averse than the individuals who make them up. When groups start discussing and – inevitably – disagreeing, it’s typically easier to do nothing.
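To make the arithmetic of that tipping point concrete, here’s a minimal Monte Carlo sketch. Everything in it is an illustrative assumption – the uniform gain and risk-tolerance distributions, the seven-person committee, the majority-vote stand-in for consensus – none of it comes from the CEB research. The bias is modelled as a flat penalty applied to every member’s inclination to change.

```python
import random

def committee_approves(n_members, bias, rng):
    """One buying committee votes on a change.

    Each member leans "yes" only if their perceived gain outweighs
    their risk aversion plus the status quo penalty. The committee
    approves on a simple majority (a stand-in for consensus)."""
    yes_votes = 0
    for _ in range(n_members):
        gain = rng.uniform(0, 1)    # what this member stands to gain
        risk = rng.uniform(0, 1)    # this member's aversion to change
        if gain - risk - bias > 0:  # the bias tilts everyone toward "no"
            yes_votes += 1
    return yes_votes > n_members / 2

def approval_rate(trials=10_000, n_members=7, bias=0.0, seed=42):
    """Fraction of simulated deals that get approved."""
    rng = random.Random(seed)
    wins = sum(committee_approves(n_members, bias, rng) for _ in range(trials))
    return wins / trials

print(approval_rate(bias=0.0))   # the unbiased deal is roughly 50/50
print(approval_rate(bias=0.15))  # a small, uniform penalty sinks far more deals
```

In runs like this, a penalty of just 0.15 (on a gain scale of 0 to 1) cuts the approval rate by more than half – a toy demonstration of the point above: the bias only needs to tip each individual a little to kill the group decision.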

So, how do we stickhandle past this bias? The common approach is to divide and conquer – identifying the players and tailoring messages to speak directly to each of them. The counterintuitive finding of the CEB Challenger Customer research was that dividing and conquering is absolutely the wrong thing to do. It actually lessens the possibility of making a sale. While this sounds just plain wrong, it makes sense if we shift our perspective from the selling side to the buying side.

With our vendor goggles on, we believe that if we tailor messaging to appeal to every individual’s own value proposition, we can build consensus and drive the deal forward. And that would be true if every member of our buying committee were acting rationally. But as we soon see when we put on the buying goggles, they’re not. Their irrational biases are firmly stacked up on the “do nothing” side of the ledger. And by tailoring messaging in different directions, we’re actually just giving them more things to disagree about. We’re creating dysfunction rather than eliminating it. Disagreements almost always default back to the status quo, because it’s the least risky option. The group may not agree about much, but they can agree that the incumbent solution creates the least disruption.

So what do you do? Well, I won’t steal the CEB’s thunder here, because the Challenger Customer is absolutely worth a read if you’re a B2B vendor. The authors, Brent Adamson, Matthew Dixon, Pat Spenner and Nick Toman, lay out a step-by-step strategy for getting around the Status Quo Bias. The trick is to create a common psychological frame in which everyone can agree that doing nothing is the riskiest alternative. But biases are notoriously sticky things. Setting up a commonly understood frame requires a deep understanding of the group dynamics at play. The one thing I really appreciate about CEB’s approach is that it’s “psychologically sound.” They make no assumptions about buyer rationality. They know that emotions ultimately drive all human behavior, and B2B purchases are no exception.

We’re Becoming Intellectually “Obese”

Humans are defined by scarcity. All our evolutionary adaptations tend to be built to ensure survival in harsh environments. This can sometimes backfire on us in times of abundance.

For example, humans are great at foraging. We have built-in algorithms that tell us which patches are most promising and when we should give up on the patch we’re in and move to another patch.

We’re also good at borrowing strategies that evolution designed for one purpose and applying them for another purpose. This is called exaptation. For example, we’ve exapted our food foraging strategies and applied them to searching for information in an online environment. We use these skills when we look at a website, conduct an online search or scan our email inbox. But as we forage for information – or food – we have to remember, this same strategy assumes scarcity, not abundance.

Take food for example. Nutritionally we have been hardwired by evolution to prefer high fat, high calorie foods. That’s because this wiring took place in an environment of scarcity, where you didn’t know where your next meal was coming from. High fat, high calorie and high salt foods were all “jackpots” if food was scarce. Eating these foods could mean the difference between life and death. So our brains evolved to send us a reward signal when we ate these foods. Subsequently, we naturally started to forage for these things.

This was all good when our home was the African savannah. Not so good when it’s Redondo Beach, there’s a fast food joint on every corner and the local Wal-Mart’s shelves are filled to overflowing with highly processed pre-made meals. We have “refined” food production to continually push our evolutionary buttons, gorging ourselves to the point of obesity. Foraging isn’t a problem here. Limiting ourselves is.

So, evolution has made humans good at foraging when things are scarce, but not so good at filtering in an environment of abundance. I suspect the same thing that happened with food is today happening with information.

Just like we are predisposed to look for food that is high in fats, salt and calories, we are drawn to information that:

  1. Leads to us having sex
  2. Leads to us having more than our neighbors
  3. Leads to us improving our position in the social hierarchy

All those things make sense in an evolutionary environment where there’s not enough to go around. But, in a society of abundance, they can cause big problems.

Just like food, for most of our history information was in short supply. We had to make decisions based on too little information, rather than too much. So most of our cognitive biases were developed to allow us to function in a setting where knowledge was in short supply and decisions had to be made quickly. In such an environment, these heuristic short cuts would usually end up working in our favor, giving us a higher probability of survival.

These evolutionary biases become dangerous as our information environment becomes more abundant. We weren’t built to rationally seek out and judiciously evaluate information. We were built to make decisions based on little or no knowledge. There is an override switch we can use if we wish, but it’s important to know that just like we’re inherently drawn to crappy food, we’re also subconsciously drawn to crappy information.

Whether or not you agree with the mainstream news sources, the fact is that there was a thoughtful editorial process, which was intended to improve the quality of the information we were provided. Entire teams of people were employed to spend their days rationally thinking about gathering, presenting and validating the information that would be passed along to the public. In Nobel laureate Daniel Kahneman’s terminology, they were “thinking slow” about it. And because the transactional costs of getting that information to us were so high, there was a relatively strong signal-to-noise ratio.

That is no longer the case. Transactional costs have dropped to the point that it costs almost nothing to get information to us. This allows information providers to completely bypass any editorial loop and get it in front of us. Foraging for that information is not the problem. Filtering it is. As we forage through potential information “patches” – whether they be on Google, Facebook or Twitter – we tend to “think fast” – clicking on the links that are most tantalizing.

I would never have dreamed that having too much information could be a bad thing. But most of the cautionary columns I’ve written in the last few years seem to have the same root cause – we’re becoming intellectually “obese.” We’ve developed an insatiable appetite for fast, fried, sugar-frosted information.


Damn You Technology…

Quit batting your seductive visual sensors at me. You know I can’t resist. But I often wonder what I’m giving up when I give in to your temptations. That’s why I was interested in reading Tom Goodwin’s take on the major theme at SXSW – the Battle for Humanity. He broke this down into three subthemes. I agree with them. In fact, I’ve written on all of them in the past. They were:

Data Trading – We’re creating a market for data. But when you’re the one who generated that data, who should own it?

Shift to No Screens – An increasing number of connected devices will change our concept of what it means to be online.

Content Tunnel Vision – As the content we see is increasingly filtered based on our preferences, what does that do for our perception of what is real?

But while we’re talking about our imminent surrender to the machines, I feel there are some other themes that also merit some discussion. Let’s limit it to two today.

A New Definition of Connection and Community

Robert Sapolsky

A few weeks ago I read an article that I found fascinating by neuroendocrinologist and author Robert Sapolsky. In it, he posits that understanding Capgras Syndrome is the key to understanding the Facebook society. Capgras, first identified by French psychiatrist Joseph Capgras, is a disorder where we can recognize a face of a person but we can’t retrieve feelings of familiarity. Those afflicted can identify the face of a loved one but swear that it’s actually an identical imposter. Recognition of a person and retrieval of emotions attached to that person are handled by two different parts of the brain. When the connection is broken, Capgras Syndrome is the result.

This bifurcation of how we identify people is interesting. There is the yin and yang of cognition and emotion. The fusiform gyrus cognitively “parses” the face and then the brain retrieves the emotions and memories that are associated with it. To a normally functioning brain, it seems seamless and connected, but because two different regions (or, in the case of emotion, a network of regions) are involved, they can neurologically evolve independently of each other. And in the age of Facebook, that could mean a significant shift in the way we recognize connections and create “cognitive communities.” Sapolsky elaborates:

Through history, Capgras syndrome has been a cultural mirror of a dissociative mind, where thoughts of recognition and feelings of intimacy have been sundered. It is still that mirror. Today we think that what is false and artificial in the world around us is substantive and meaningful. It’s not that loved ones and friends are mistaken for simulations, but that simulations are mistaken for them.

As I said in a column a few months back, we are substituting surface cues for familiarity. We are rushing into intimacy without all the messy, time-consuming work of understanding and shared experience that generally accompanies it.

Brains do love to take shortcuts. They’re not big on heavy lifting. Here’s another example of that…

Free Will is Replaced with An Algorithm

Yuval Harari

In a conversation with historian Yuval Harari, author of the best seller Sapiens, Derek Thompson from the Atlantic explored “The Post Human World.” One of the topics they discussed was the End of Individualism.

Humans (or, at least, most humans) have believed our decisions come from a mystical soul – a transcendental something that lives above our base biology and is in control of our will. Wrapped up in this is the concept of us as an individual and our importance in the world as free thinking agents.

Over the past few decades, there has been a growing realization that our notion of “free will” is just the result of a cascade of biochemical processes. There is nothing magical here; there is just a chain of synaptic switches being thrown. And that being the case – if a computer can process things faster than our brains, should we simply relegate our thinking to a machine?

In many ways, this is already happening. We trust Google Maps or our GPS device more than we trust our ability to find our own way. We trust Google Search more than our own memory. We’re on the verge of trusting our wearable fitness trackers more than our own body’s feedback. And in all these cases, our trust in tech is justified: these things are usually right more often than we are. But when it comes to humans vs. machines, they represent a slippery slope that we’re already well down. Harari speculates about what might be at the bottom:

What really happens is that the self disintegrates. It’s not that you understand your true self better, but you come to realize there is no true self. There is just a complicated connection of biochemical connections, without a core. There is no authentic voice that lives inside you.

When I lie awake worrying about technology, these are the types of things that I think about. The big question is – is humanity an outmoded model? The fact is that we evolved to be successful in a certain environment. But here’s the irony: we were so successful that we changed that environment into one where it is the tools we’ve created, not their creators, that are the most successful adaptation. We may have made ourselves obsolete. And that’s why really smart humans, like Bill Gates, Elon Musk and Stephen Hawking, are so worried about artificial intelligence.

“It would take off on its own, and re-design itself at an ever increasing rate,” said Hawking in a recent interview with BBC. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

Worried about a machine taking your job? That may be the least of your worries.


The Chaos Theory of Marketing

Last week, I wrote about why marketers are struggling with job security. In an effort to provide career counseling to an industry, I would offer this suggestion: start learning about the behavior of non-linear dynamic systems. You’re going to have to get comfortable with the special conditions that accompany complexity.

Markets are always complex, but there’s a phenomenon that gives them the illusion of predictability. This phenomenon is potential. Potential, in this instance, means the gap between the current market state and a possible future state. The presence of potential creates market demand. Every time a new product is introduced, a new potential gap is created. Supply and demand are knocked out of balance. Until balance is regained, the market becomes more predictable.

Here’s an analogy that makes it a little easier to understand how this potential can impact the behavior of a complex market. A model that’s often used to explain complexity is a pool table filled with balls. The twist is that each of these balls is self-propelled and can move in any direction at random. Imagine how difficult it would be to predict where any single ball might go.

Now, imagine taking this same pool table and lifting one of the corner legs up six inches, introducing the force of gravity as a variable. Individual predictions are still difficult, but you’d be pretty safe in saying that the pocket diagonally opposite the raised leg would eventually collect more than its fair share of balls. In this example, gravity plays the role of market potential. The market still behaves in a complex manner, but there is a consistent force – the force of gravity – that exerts its influence on that complexity and makes it more predictable.
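The tilted table is easy to simulate. The sketch below is entirely hypothetical – the random step sizes, the 1×1 table, the "pocket" defined as one corner region are all made-up parameters – but it shows the claimed effect: individual paths stay unpredictable, while a tiny, consistent drift (the "gravity" standing in for market potential) dominates the aggregate outcome.

```python
import random

def corner_share(n_balls=500, steps=200, tilt=0.0, seed=1):
    """Random-walking balls on a unit-square table.

    `tilt` adds a small constant drift toward the (1, 1) corner on
    every step - the analogue of raising the opposite table leg.
    Returns the fraction of balls that end in that corner pocket."""
    rng = random.Random(seed)
    in_pocket = 0
    for _ in range(n_balls):
        x = y = 0.5                               # start mid-table
        for _ in range(steps):
            x += rng.uniform(-0.05, 0.05) + tilt  # random motion + drift
            y += rng.uniform(-0.05, 0.05) + tilt
            x = min(max(x, 0.0), 1.0)             # stay on the table
            y = min(max(y, 0.0), 1.0)
        if x > 0.9 and y > 0.9:                   # the "downhill" pocket
            in_pocket += 1
    return in_pocket / n_balls

print(corner_share(tilt=0.0))   # no tilt: the pocket gets only its fair share
print(corner_share(tilt=0.01))  # a tiny per-step drift collects most of the balls
```

Note that the tilt makes no single ball’s path any easier to predict; only the distribution of outcomes changes – which is exactly the kind of statement complexity still allows marketers to make.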

Marketing is built on exploiting potential – on capitalizing on (or creating) gaps between what we have and what we want. These gaps have always been around, but their nature has changed. When this potential was aimed further down Maslow’s hierarchy, it was pretty easy to predict purchasing behavior. When it comes to the basics – meeting our needs for food, water, shelter and safety – humans are all pretty much alike. But when it comes to purchases higher up the hierarchy – at the levels of self-esteem or self-actualization – things become tougher to predict.

Collectively, the western world has moved up Maslow’s hierarchy. A 2011 study from Heritage.org showed that even those living below the poverty line have a standard of living that exceeds that of all but the highest income levels just a few decades before. In 2005, 98.7% of homes had a TV, 84% had air conditioning, 79% had satellite or cable TV and 68% had a personal computer.

But it’s not only the diversification of consumer demand that’s increasing the complexity of markets. The more connected that markets become, the more unpredictable they become. Let’s go back to our overly simplified pool ball analogy. Let’s imagine that not only are our pool balls self-propelled, but they also tend to randomly change direction every time they collide with another ball. The more connected the market, the greater the number of collisions and subsequent direction changes. In marketing, those “collisions” could be a tweet, a review, a Facebook post, a Google search – well – you get the idea. It’s complex.

These two factors – the fragmentation of consumer demand and the complexity of a highly interconnected market – make predicting consumer behavior a mug’s game. The challenge is that marketing – in a laudable attempt to become more scientific – is following in science’s footsteps by taking a reductionist path. Our marketing mantra is to reduce everything down to testable variables, and there’s certainly nothing wrong with that. I’ve said it myself on many occasions. But, as with science, we must realize that when we’re dealing with dynamic complexity, the whole can be much greater than the sum of its testable parts. There are patterns that can be perceived only at a macro scale. Here there be “black swans.” It’s the old issue of ignoring the global maxima or minima by focusing too closely on the local ones.

Reduction and testing tend to lead to a feeling of control and predictability. And, in some cases (such as a market with a common potential), things seem to go pretty much according to plan. But sooner or later, complexity rears its head and those best-laid plans blow up in your face.


Drowning in a Sea of Tech

The world is becoming a pretty technical place. The Internet of Things is surrounding us. Which sounds exciting. Until the Internet of Things doesn’t work.

Then what?

I know all these tech companies have scores of really smart people who work to make their own individual tech as trouble-free as possible. Although the term has lost its contextual meaning, we’re all still aiming for “plug and play.” For people of a certain age – me, for example – this used to refer to a physical context: being able to plug stuff into a computer and have it simply start working. Now, we plug technology into our lives and hope it plays well with all the other technology that it finds there.

But that isn’t always the case – is it? Sometimes, as Mediapost IoT Daily editor Chuck Martin recently related, technology refuses to play nice together. And because we now have so much technology interacting in so many hidden ways, it becomes very difficult to root out the culprit when something goes wrong.

Let me give you an example. My wife has been complaining for some time that her iPhone has been unable to take a picture because it has no storage available, even though it’s supposed to magically transport stuff off to the “Cloud”. This past weekend, I finally dug in to see what the problem was. The problem, as it turned out, was that the phone was bloated with thousands of emails and Messenger chats that were hidden and couldn’t be deleted. They were sucking up all the available storage. After more than an hour of investigation, I managed to clear up the Messenger cache but the email problem – which I’ve traced back to some issues with configuration of the account at her email provider – is still “in progress.”

We – and by “we” I include me and all you readers – are a fairly tech savvy group. With enough time and enough Google searches, we can probably hunt down and eliminate most bugs that might pop up. But that’s us. There are many more people who are like my wife. She doesn’t care about incorrectly configured email accounts or hidden caches. She just wants shit to work. She wants to be able to take a picture of my nephew on his 6th birthday. And when she can’t do that, the quality of my life takes a sudden downturn.

The more that tech becomes interconnected, the more likely it is that stuff can stop working for some arcane reason that only a network or software engineer can figure out. It’s getting to the point where all of us are going to need a full-time IT tech just to keep our households running. And I don’t know about you, but I don’t know where they’re going to sleep. Our guest room is full of broken down computers and printers right now.

For most of us, there is a triage sequence of responses to tech-related pains in the ass:

  1. First, we ignore the problem, hoping it will go away.
  2. Second, we reboot every piece of tech related to the problem, hoping it will go away.
  3. If neither of the above works, we marginalize the problem, working around it and hoping that eventually it will go away.
  4. If none of this works, we try to upgrade our way out of the problem, buying newer tech and hoping that by tossing our old tech baby out the window, the problem will be flushed out along with the bath water.
  5. Finally, in rare cases (with the right people), we actually dig into the problem and try to resolve it.

By the way, it hasn’t escaped my notice that there’s a pretty significant profit motive in point number 4 above. A conspiracy, perchance? Apple, Microsoft and Google wouldn’t do that to us, would they?

I’m all for the Internet of Things. I’m ready for self-driving cars, smart houses and bio-tech enhanced humans. But my “when you get a chance could you check…” list is getting unmanageably long. I’d be more than happy to live the rest of my life without having to “go into settings” or “check my preferences.”

Just last night I dreamt that I was trying to swim to a deserted tropical island but I kept drowning in a sea of Apple Watches. I called for help but the only person that could hear me was Siri. And she just kept saying, “I’m really sorry about this but I cannot take any requests right now. Please try again later…”

Do you think it means anything?


The Magic of the Internet Through My Dad’s Eyes

“Would you rather lose a limb or never be able to access the Internet?” My daughter looked at me, waiting for my answer.

“Well?”

We were playing the game “Would You Rather” during a lull in the Christmas festivities. The whole point of the game is to pose two random and usually bizarre alternatives to choose from. Once you do, you see how others have answered. It’s a hard game to take seriously.

Except for this question. This one hit me like a hammer blow.

“I have to say I’d rather lose a limb.”

Wow. I would rather lose an arm or a leg than lose something I didn’t even know existed 20 years ago. That’s a pretty sobering thought. I am so dependent on this technical artifact that I value it more than parts of my own body.

During the same holiday season, my stepdad came to visit. He has two cherished possessions that are always with him. One is a pocketknife his father gave him. The other is an iPhone 3 that my sister gave him when she upgraded. Dad doesn’t do much on his phone. But what he does do is critically important to him. He texts his kids and he checks the weather. If you grew up on a farm on the Canadian prairies during the 1930s, you literally lived and died according to the weather. So, for Dad, it’s magic of the highest sort to be able to know what the temperature is in the places where his favorite people live. We kids have added all our home locations to his weather app, as well as that of his sister-in-law. Dad checks the weather in Edmonton (Alberta), Calgary (Alberta), Kelowna (BC), Orillia (Ontario) and his hometown of Sundre (Alberta) constantly. It’s his way of keeping tabs on us when he can’t be with us.

I wonder what Dad would say if I asked him to choose between his iPhone and his right arm. I suspect he’d have to think about it. I do know the first thing I have to do when he comes to our place is set him up on our home wifi network.

It’s easy to talk about how Millennials or Gen-Xers are dependent on technology. But for me, it really strikes home when I watch people of my parents’ generation hold on to some aspect of technology for dear life because it enables them to do something fundamentally important to them. They understand something we don’t. They understand what Arthur C. Clarke meant when he said,

“Any sufficiently advanced technology is indistinguishable from magic.”

To understand this, look for a moment through the eyes of my Dad when he was a child. He rode a horse to school – a tiny one-room building heated with a wood stove. Its library consisted of two bookshelves on the back wall. The world he was aware of was bounded by a circle whose radius was defined by how far you could drive a wagon in a single day. That world consisted of several farms, the Eagle Hill Co-op store, the tiny town of Sundre, his school and the post office. The last was particularly important, because that’s where the packages you ordered from the Eaton’s catalogue (the Canadian equivalent of Sears Roebuck) would arrive.

It’s to this post office that my stepdad dragged his sleigh about 75 years ago. He didn’t know it at the time, but he was picking up his Christmas present. His mother, whose own paternal grandfather was a contemporary and friend of Charles Darwin, had saved milk money for several months to purchase a three-volume encyclopaedia for the home. Nobody else they knew had an encyclopaedia. Books were rare enough. But for Isobel (Buckman) Leckie, knowledge was an investment worth making. Those three books became the gift of a much bigger world for my Dad.

It’s easy to make fun of seniors for their simultaneous amazement at and bewilderment by technology. We chuckle when Dad does his third “weather round-up” of the day. We get frustrated when he can’t seem to understand how wifi works. But let’s put this in the context of the change he has seen in his life on this earth. This is not just an obsolete iPhone 3 he holds in his hand. This is something for which the adjective “magical” seems apt.

Perhaps it’s even magic you’d pay an arm and a leg for.

Back to the Coffee House: Has Journalism Gone Full Circle?

First, let’s consider two facts about Facebook that ran in Mediapost in the last two weeks. The first:

“A full 65% of people find their next destination through friends and family on Facebook.”

Let’s take this out of the context of just looking for your next travel destination. Let’s think about it in terms of a risky decision. Choosing somewhere to go on a vacation is a big decision. There’s a lot riding on it: beyond the expense, there’s also your personal experience. The fact that two out of three people choose Facebook as the platform on which to make that decision is rather amazing when you think about it. It shows just how pervasive and influential Facebook has become.

Now, the next fact:

“Facebook users are two-and-a-half times more likely to read fake news fed through the social network than news from reputable news publishers.”

There’s really no reason to elaborate on the above – ’nuff said. It’s pretty clear that Facebook has emerged as the dominant public space in our lives. It is perhaps the most important platform in our culture today for forming beliefs and opinions.

Sorry, Mark Zuckerberg, but no matter what you may have said in the past about not being a media outlet, you can’t duck this responsibility. If our public opinions are formed on your private property – an unimaginably powerful platform – then, as Spidey’s Uncle Ben said (or the French National Convention of 1793, depending on whom you prefer to quote as a source), “With great power comes great responsibility.” If you provide a platform and an audience to news providers – fake or real – you are, ipso facto, a media outlet.

But Facebook is more than just an outlet. It is also the forum where news is digested and shared. It is both a gristmill and a cauldron where beliefs are formed and opinions expressed. This isn’t the first time something like this has happened, although the previous occurrence was in a different time and a very different place. It actually contributed directly to the birth of modern journalism – which is, ironically – under threat from this latest evolution of news.

If you were an average citizen of London in 1700, your sources for news were limited. First of all, there was a very good chance that you were illiterate, so reading the news wasn’t an option. The official channel for the news of the realm was royal proclamations read out by town criers. Unfortunately, this wasn’t so much news as whatever the ruling monarch felt like proclaiming.

There was another reality of life in London – if you drank the water, it could kill you. You could drink beer in a pub – which most did – or if you preferred to stay sober, you could drink coffee. Starting in the mid-1600s, coffee houses began to pop up all over London. It wasn’t the quality of the coffee that made these public spaces all the rage; it was the forum they provided for the sharing of news. Each new arrival was greeted with, “Your servant, sir. What news have you?” Pamphlets, journals, broadsheets and newsletters from independent (a.k.a. “non-royal”) publishers were read aloud, digested and debated. Given the class-bound society of London, coffee houses were remarkably democratic. “Pre-eminence of place none here should mind,” proclaimed the Rules and Orders of the Coffee-House (1674), “but take the next fit seat he can find.” Lords, fishmongers, baronets, barristers, butchers and shoe-blacks could and did all share the same table. The coffee houses of London made a huge contribution to our current notion of media as a public trust, with all that entails.

In a 2011 article, the Economist drew the same parallel between coffee houses and digitally mediated news, foreshadowing a dramatic shift in our concept of news:

“The internet is making news more participatory, social, diverse and partisan, reviving the discursive ethos of the era before mass media. That will have profound effects on society and politics.”

The last line was prescient. Seismic disruption has fundamentally torn the political and societal landscape asunder. But I have a different take on the “discursive ethos” of news consumption. I assume the Economist used this phrase to mean a verbal interchange of thought related to the news. But that doesn’t happen on Facebook. There is no thought and there is little discourse. The share button is hit before there is any chance to digest the news, let alone vet it for accuracy. This is a much different atmosphere from that of the coffee house. There is a dynamic that happens when our beliefs are called on the mat in a public forum. It is there that beliefs may be altered; they can never change in a vacuum. The coffee house provided the ideal forum for the challenging of beliefs. As mentioned, it was perhaps the most heterogeneous forum in all of England at the time. Most of all, it was an atmosphere infused with physicality and human interaction – a melting pot of somatic feedback. Debate was civil but passionate. That dynamic is totally missing from its online equivalent. The rules and realities of the 18th-century coffee house forced thoughtfulness and diverse perspectives upon the discourse. Facebook lets you do an end run around both as you hit your share button.