We’re Becoming Intellectually “Obese”

Humans are defined by scarcity. All our evolutionary adaptations tend to be built to ensure survival in harsh environments. This can sometimes backfire on us in times of abundance.

For example, humans are great at foraging. We have built-in algorithms that tell us which patches are most promising and when we should give up on the patch we’re in and move to another patch.

We’re also good at borrowing strategies that evolution designed for one purpose and applying them to another. This is called exaptation. For example, we’ve exapted our food foraging strategies and applied them to searching for information in an online environment. We use these skills when we look at a website, conduct an online search or scan our email inbox. But as we forage for information – or food – we have to remember that this strategy assumes scarcity, not abundance.

Take food for example. Nutritionally we have been hardwired by evolution to prefer high fat, high calorie foods. That’s because this wiring took place in an environment of scarcity, where you didn’t know where your next meal was coming from. High fat, high calorie and high salt foods were all “jackpots” if food was scarce. Eating these foods could mean the difference between life and death. So our brains evolved to send us a reward signal when we ate these foods. Subsequently, we naturally started to forage for these things.

This was all good when our home was the African savannah. Not so good when it’s Redondo Beach, there’s a fast food joint on every corner and the local Wal-Mart’s shelves are filled to overflowing with highly processed pre-made meals. We have “refined” food production to continually push our evolutionary buttons, gorging ourselves to the point of obesity. Foraging isn’t a problem here. Limiting ourselves is.

So, evolution has made humans good at foraging when things are scarce, but not so good at filtering in an environment of abundance. I suspect the same thing that happened with food is today happening with information.

Just like we are predisposed to look for food that is high in fats, salt and calories, we are drawn to information that:

  1. Leads to us having sex
  2. Leads to us having more than our neighbors
  3. Leads to us improving our position in the social hierarchy

All those things make sense in an evolutionary environment where there’s not enough to go around. But, in a society of abundance, they can cause big problems.

Just like food, for most of our history information was in short supply. We had to make decisions based on too little information, rather than too much. So most of our cognitive biases were developed to allow us to function in a setting where knowledge was in short supply and decisions had to be made quickly. In such an environment, these heuristic shortcuts would usually end up working in our favor, giving us a higher probability of survival.

These evolutionary biases become dangerous as our information environment becomes more abundant. We weren’t built to rationally seek out and judiciously evaluate information. We were built to make decisions based on little or no knowledge. There is an override switch we can use if we wish, but it’s important to know that just like we’re inherently drawn to crappy food, we’re also subconsciously drawn to crappy information.

Whether or not you agreed with the mainstream news sources, the fact is that there was a thoughtful editorial process, which was intended to improve the quality of the information we were provided. Entire teams of people were employed to spend their days rationally thinking about gathering, presenting and validating the information that would be passed along to the public. In Nobel laureate Daniel Kahneman’s terminology, they were “thinking slow” about it. And because the transactional costs of getting that information to us were so high, there was a relatively strong signal-to-noise ratio.

That is no longer the case. Transactional costs have dropped to the point that it costs almost nothing to get information to us. This allows information providers to completely bypass any editorial loop and get it in front of us. Foraging for that information is not the problem. Filtering it is. As we forage through potential information “patches” – whether they be on Google, Facebook or Twitter – we tend to “think fast” – clicking on the links that are most tantalizing.

I would never have dreamed that having too much information could be a bad thing. But most of the cautionary columns I’ve written in the last few years seem to have the same root cause – we’re becoming intellectually “obese.” We’ve developed an insatiable appetite for fast, fried, sugar-frosted information.

 

The Winona Ryder Effect

I was in the U.S. last week. It was my first visit in the Trump era.

It was weird. I was in California, so the full effect was muted, but I watched my tongue when meeting strangers. And that’s speaking as a Canadian, where watching your tongue is a national pastime. (As an aside, my US host, Lance, told me about a recent post on a satire site: “Concerned, But Not Wanting To Offend, Canada Quietly Plants Privacy Hedge Along Entire U.S. Border.” That’s so us.) There was a feeling that I had not felt before. As someone who has spent a lot of time in the US over the past decade or two, I felt a little less comfortable. There was a disconnect that was new to me.

Little did I know (I’ve had my mobile CNN alerts turned off since January 20th, because I was slipping into depression), but just after I whisked through Sea-Tac airport with all the privilege that being a white male affords you, Washington Governor Jay Inslee would hold a press conference denouncing the new Trump Muslim ban in no uncertain terms. On the other side of the TSA security gates, a thousand protesters were gathering. I didn’t learn about any of this until I got home.

Like I said, it was weird.

And then there were the SAG awards on Sunday night. What the hell was the deal with Winona Ryder?

When the Stranger Things cast got on stage to accept their ensemble acting award, spokesperson David Harbour unleashed a fiery anti-Trump speech. But despite his passion and volume, it was Winona Ryder, standing beside him, who lit up the share button. And she didn’t say a word. Instead, her face contorted through a series of twenty-some different expressions in under 2 minutes. She became, as one Twitter post put it, a “human gif machine.”

Now, by her own admission, Winona is fragile. She has battled depression and anxiety for much of her professional life. Maybe she was having a minor breakdown in front of the world. Or maybe this was a premeditated and choreographed social media master stroke. Either way, it says something about us.

The Stranger Things cast hadn’t even left the stage before the Twitterverse started spreading the Ryder meme. If you look at Google Trends there was a huge spike in searches for Winona Ryder starting right around 6:15 pm (PST) Sunday night. It peaked at 6:48 pm with a volume about 20 times that of queries for Ms. Ryder before the broadcast began.

It was David Harbour who delivered the speech Ryder was reacting to. The words were his, and while there was also a spike in searches for him coinciding with the speech, he didn’t come close to matching the viral popularity of the Ryder meme. At its peak, there were 5 searches for “Winona Ryder” for every search for “David Harbour.”

Ryder’s mugging was – premeditated or not – extremely meme-worthy. It was visual, it was over the top and – most importantly – it was a blank canvas we could project our own views onto. Winona didn’t give us any words, so we could fill in our own. We could use it to provide a somewhat bizarre exclamation point to our own views, expressed through social media.

As I was watching this happen, I knew this was going to go viral. Maybe it’s because it takes something pretty surreal to make a dent in an increasingly surreal world that leaves us numb. When the noise that surrounds us seems increasingly unfathomable, we need something like this to prick our consciousness and make us sit up and take notice. Then we hunker down again before we’re pummelled with the next bit of reality.

Let me give you one example.

As I was watching the SAG awards Sunday night, I was unaware that gunmen had opened fire on Muslim worshippers praying in a mosque in Quebec City. I only found out after I flicked through the channels after the broadcast ended. Today, as I write this, I now know that six are dead because someone hated Muslims that much. Canada also has extreme racism.

I find it hard to think about that. It’s easier to think about Winona Ryder’s funny faces. That’s not very noble, I know, but sometimes you have to go with what you’re actually able to wrap your mind around.

The Vanishing Value of the Truth

You know, the very powerful and the very stupid have one thing in common. They don’t alter their views to fit the facts. They alter the facts to fit the views.

Doctor Who, 1977

We might be in a period of ethical crisis. Or not. It’s tough to say. It really depends on what you believe. And that, in a nutshell, is the whole problem.

Take this past weekend for example. Brand new White House Press Secretary Sean Spicer, in his very first address, lied about the size of the inauguration crowd. Afterwards, a very cantankerous Kellyanne Conway defended the lying when confronted by Chuck Todd on Meet the Press. She said they weren’t lies…they were “Alternate Facts”.


So, what exactly is an alternate fact? It’s something that is not a fact at all, but a narrative intended to be believed by a segment of the population, presumably to gain something from them.

To use a popular turn of phrase, it’s “Faking It til You Make It!”

And there you have the mantra of our society. We’re rewarding alternate facts on the theory that the end justifies the means. If we throw a blizzard of alternate facts out there that resonate with our audience’s beliefs, we’ll get what we want.

The Fake It Til You Make It syndrome is popping up everywhere. It’s always been a part of marketing and advertising. Arguably, the entire industry is based on alternate facts. But it’s also showing up in the development of new products and services, especially in the digital domain. While Eric Ries never espoused dishonesty in his book, The Lean Startup, the idea of a Minimum Viable Product certainly lends itself to the principle of “faking it until you make it.” Agile development, in its purest sense, is about user feedback and rapid iteration, but humans being humans, it’s tough to resist the temptation to oversell each iteration, treading dangerously close to pitching “vaporware.” Then we hope like hell that the next development cycle will bridge some of the gap between reality and the alternate facts we sold the prospective customer.

I think we have to accept that our world may not place much value on the truth any more. It’s a slide that started about 100 years ago.

The Seven Habits of Highly Effective People author Stephen Covey reviewed the history of success literature in the US from the 1700s forward. In the first 150 years of America’s history, all the success literature was about building character. Character was defined by words like integrity, kindness, virtue and honor. The most important thing was to be a good person.

Honesty was a fundamental underpinning of the Character Ethic. This coincided with the Enlightenment in Europe. Intellectually, this movement elevated truth above belief. Our modern concept of science gained its legs: “a branch of knowledge or study dealing with a body of facts or truths.” The concepts of honor and honesty were intertwined.

But Covey noticed that things changed after the First World War. Success literature became preoccupied with the concept of personality. It was important to be likeable, extroverted, and influential. The most important thing was to be successful. Somehow, being truthful got lost in the noise generated by the rush to get rich.

Here’s the interesting thing about personality and character. Psychologists have found that your personality is resistant to change. Personality tends to work below the conscious surface and scripts play out without a lot of mindful intervention. You can read all the self-help books in the world and you probably won’t change your personality very much. But character can be worked on. Building character is an exercise in mindfulness. You have to make a conscious choice to be honest.

The other interesting thing about personality and character is how other people see you. We are wired to pick up on other people’s personalities almost instantly. We start picking up the subconscious cues immediately after meeting someone. But it takes a long time to determine a person’s character. You have to go through character-testing experiences before you can know whether they’re really a good person. Character cuts to the core, whereas personality is skin deep. But in this world of “labelability” (where we think we know people better than we actually do), we often substitute personality cues for character. If a person is outgoing, confident and fun, we believe them to be trustworthy, moral and honest.

This all adds up to some worrying consequences. If we have built a society where success is worth more than integrity, then our navigational bearings become dependent on context. Behavior becomes contingent on circumstances. Things that should be absolute become relative. Truth becomes what you believe is the most expedient and useful in a given situation.

Welcome to the world of alternate facts.

Branding in the Post Truth Age

If 2016 was nothing else – it was a watershed year for the concept of branding. In the previous 12 months, we saw a decoupling in the two elements we have always believed make up brands. As fellow Spinner Cory Treffiletti said recently:

“You have to satisfy the emotional quotient as well as the logical quotient for your brand.  If not, then your brand isn’t balanced, and is likely to fall flat on its face.”

But another Mediapost article highlighted an interesting trend in branding:

“Brands will strive to be ‘meticulously un-designed’ in 2017, according to WPP brand agency Brand Union.”

This, I believe, speaks to where brands are going. And depending on which side of the agency desk you happen to be on, this could either be good news or downright disheartening.

Let’s start with the logical side of branding. In their book Absolute Value, Itamar Simonson and Emanuel Rosen sounded the death knell for brands as a proxy for consumer information. Their premise, which I agree with, is that in a market that is increasingly moving towards perfect information, brands have lost their position of trust. We would rather rely on information that comes from non-marketing sources.

But brands have been aspiring to transcend their logical side for at least 5 decades now. This is the emotional side of branding that Treffiletti speaks of. And here I have to disagree with Simonson and Rosen. This form of branding appears to be very much alive and well, thank you. In fact, in the past year, this form of branding has upped the game considerably.

Brands, at their most potent, embed themselves in our belief systems. It is here, close to our emotional hearts, that the Promised Land for brands lies. Read Montague’s famous Coke neuro-imaging experiment showed that for Coke drinkers, the brand became part of who they are. Research I was involved in showed that favored brands are positively responded to in a split second, far faster than the rational brain can act. We are hardwired to believe in brands, and the more loved the brand, the stronger the reaction. So let’s look at beliefs for a moment.

Not all beliefs are created equal. Our beliefs have an emotional valence – some beliefs are defended more strongly than others. There is a hierarchy of belief defense. At the highest level are our core beliefs: how we feel about things like politics and religion. Brands are trying to intrude on this core belief space. There has been no better example of this than the brand of Donald Trump.

Beliefs are funny things. From an evolutionary perspective, they’re valuable. They’re mental shortcuts that guide our actions without requiring us to think. They are a type of emotional auto-pilot. But they can also be quite dangerous for the same reason. We defend our beliefs against skeptics – and we defend our core beliefs most vigorously. Reason has nothing to do with it. It is this type of defense system that brands would love to build around themselves.

We like to believe our beliefs are unique to us – but in actual fact, beliefs also materialize out of our social connections. If enough people in our social network believe something is true, so will we. We will even create false memories and narratives to support the fiction. The evolutionary logic is quite simple. Tribes have better odds for survival than individuals, and our tribe will be more successful if we all think the same way about certain things. Beliefs create tribal cohesion.

So, the question is – how does a brand become a belief? It’s this question that possibly points the way in which brands will evolve in the Post-Truth future.

Up to now, brands have always been unilaterally “manufactured” – carefully crafted by agencies as a distillation of marketing messages and delivered to an audience. But now, brands are multilaterally “emergent” – formed through a network of socially connected interactions. All brands are now trying to ride the amplified waves of social media. This means they have to be “meme-worthy” – which really means they have to be both note- and share-worthy. To become more amplifiable, brands will become more “jagged,” trying to act as catalysts for going viral. Branding messages will naturally evolve towards outlier extremes in their quest to be noticed and interacted with. Brands are aspiring to become “brain-worms” – wait, that’s not quite right – brands are becoming “belief-worms,” slipping past the rational brain if at all possible to lodge themselves directly in our belief systems. Brands want to be emotional shorthand notations that resonate with our most deeply held core beliefs. We have constructed a narrative of who we are, and brands that fit that narrative are adopted and amplified.

It’s this version of branding that seems to be where we’re headed – a socially infectious virus that creates its own version of the truth and builds a bulwark of belief to defend itself. Increasingly, branding has nothing to do with rational thought or a quest for absolute value.

The Magic of the Internet Through My Dad’s Eyes

“Would you rather lose a limb or never be able to access the Internet?” My daughter looked at me, waiting for my answer.

“Well?”

We were playing the game “Would You Rather” during a lull in the Christmas festivities. The whole point of the game is to pose two random and usually bizarre alternatives to choose from. Once you do, you see how others have answered. It’s a hard game to take seriously.

Except for this question. This one hit me like a hammer blow.

“I have to say I’d rather lose a limb.”

Wow. I would rather lose an arm or a leg than lose something I didn’t even know existed 20 years ago. That’s a pretty sobering thought. I am so dependent on this technical artifact that I value it more than parts of my own body.

During the same holiday season, my stepdad came to visit. He has two cherished possessions that are always with him. One is a pocketknife his father gave him. The other is an iPhone 3 that my sister gave him when she upgraded. Dad doesn’t do much on his phone. But what he does do is critically important to him. He texts his kids and he checks the weather. If you grew up on a farm on the Canadian prairies during the 1930s, you literally lived and died according to the weather. So, for Dad, it’s magic of the highest sort to be able to know what the temperature is in the places where his favorite people live. We kids have added all our home locations to his weather app, as well as that of his sister-in-law. Dad checks the weather in Edmonton (Alberta), Calgary (Alberta), Kelowna (BC), Orillia (Ontario) and his hometown of Sundre (Alberta) constantly. It’s his way of keeping tabs on us when he can’t be with us.

I wonder what Dad would say if I asked him to choose between his iPhone and his right arm. I suspect he’d have to think about it. I do know the first thing I have to do when he comes to our place is set him up on our home wifi network.

It’s easy to talk about how Millennials or Gen-Xers are dependent on technology. But for me, it really strikes home when I watch people of my parents’ generation hold on to some aspect of technology for dear life because it enables them to do something so fundamentally important to them. They understand something we don’t. They understand what Arthur C. Clarke meant when he said,

“Any sufficiently advanced technology is indistinguishable from magic.”

To understand this, look for a moment through the eyes of my Dad when he was a child. He rode a horse to school – a tiny one-room building that was heated with a wood stove. Its library consisted of two bookshelves on the back wall. The world of which he was aware was bounded by a circle whose radius was defined by how far you could drive a wagon in a single day. That world consisted of several farms, the Eagle Hill Co-op store, the tiny town of Sundre, his school and the post office. The last was particularly important, because that’s where the packages you ordered from the Eaton’s catalogue (the Canadian equivalent of Sears Roebuck) would come.

It’s to this post office that my step-dad dragged his sleigh about 75 years ago. He didn’t know it at the time, but he was picking up his Christmas present. His mother, whose own paternal grandfather was a contemporary and friend of Charles Darwin, had saved milk money for several months to purchase a three-volume encyclopaedia for the home. Nobody else they knew had an encyclopaedia. Books were rare enough. But for Isobel (Buckman) Leckie, knowledge was an investment worth making. Those three books became the gift of a much bigger world for my Dad.

It’s easy to make fun of seniors for their simultaneous amazement at and bewilderment by technology. We chuckle when Dad does his third “weather round-up” of the day. We get frustrated when he can’t seem to understand how wifi works. But let’s put this in the context of the change he has seen in his life on this earth. This is not just an obsolete iPhone 3 that he holds in his hand. This is something for which the adjective “magical” seems apt.

Perhaps it’s even magic you’d pay an arm and a leg for.

Back to the Coffee House: Has Journalism Gone Full Circle?

First, let’s consider two facts about Facebook that ran in Mediapost in the last two weeks. The first:

“A full 65% of people find their next destination through friends and family on Facebook.”

Let’s take this out of the context of just looking for your next travel destination. Let’s think about it in terms of a risky decision. Choosing somewhere to go on a vacation is a big decision. There’s a lot riding on it. Beyond the expense, there’s also your personal experience. The fact that 2 out of 3 people chose Facebook as the platform upon which to make that decision is rather amazing when you think about it. It shows just how pervasive and influential Facebook has become.

Now, the next fact:

“Facebook users are two-and-a-half times more likely to read fake news fed through the social network than news from reputable news publishers.”

There’s really no reason to elaborate on the above – ’nuff said. It’s pretty clear that Facebook has emerged as the dominant public space in our lives. It is perhaps the most important platform in our culture today for forming beliefs and opinions.

Sorry, Mark Zuckerberg, but no matter what you may have said in the past about not being a media outlet, you can’t duck this responsibility. If our public opinions are formed on your private property, an unimaginably powerful platform, then – as Spidey’s Uncle Ben said (or the French National Convention of 1793, depending on whom you prefer to quote as a source) – “With great power comes great responsibility.” If you provide a platform and an audience to news providers – fake or real – you are, ipso facto, a media outlet.

But Facebook is more than just an outlet. It is also the forum where news is digested and shared. It is both a gristmill and a cauldron where beliefs are formed and opinions expressed. This isn’t the first time something like this has happened, although the previous occurrence was in a different time and a very different place. It actually contributed directly to the birth of modern journalism – which is, ironically – under threat from this latest evolution of news.

If you were an average citizen of London in 1700, your sources for news were limited. First of all, there was a very good chance that you were illiterate, so reading the news wasn’t an option. The official channel for the news of the realm was royal proclamations read out by town criers. Unfortunately, this wasn’t so much news as whatever the ruling monarch felt like proclaiming.

There was another reality of life in London – if you drank the water, it could possibly kill you. You could drink beer in a pub – which most did – or if you preferred to stay sober you could drink coffee. Starting in the mid-1600s, coffee houses started to pop up all over London. It wasn’t the quality of the coffee that made these public spaces all the rage. It was the forum they provided for the sharing of news. Each new arrival was greeted with, “Your servant, sir. What news have you?” Pamphlets, journals, broadsheets and newsletters from independent (a.k.a. “non-royal”) publishers were read aloud, digested and debated. Given the class-bound society of London, coffee houses were remarkably democratic. “Pre-eminence of place none here should mind,” proclaimed the Rules and Orders of the Coffee-House (1674), “but take the next fit seat he can find.” Lords, fishmongers, baronets, barristers, butchers and shoe-blacks could and did all share the same table. The coffee houses of London made a huge contribution to our current notion of media as a public trust, with all that entails.

In a 2011 article, the Economist made the same parallel between coffee houses and digitally mediated news. In it, they foreshadowed a dramatic shift in our concept of news:

“The internet is making news more participatory, social, diverse and partisan, reviving the discursive ethos of the era before mass media. That will have profound effects on society and politics.”

The last line was prescient. Seismic disruption has fundamentally torn the political and societal landscape asunder. But I have a different take on the “discursive ethos” of news consumption. I assume the Economist used this phrase to mean a verbal interchange of thought related to the news. But that doesn’t happen on Facebook. There is no thought and there is little discourse. The share button is hit before there is any chance to digest the news, let alone vet it for accuracy. This is a much different atmosphere from that of the coffee house. There is a dynamic that happens when our beliefs are called on the mat in a public forum. It is in such a forum that beliefs may be altered; they never change in a vacuum. The coffee house provided the ideal setting for the challenging of beliefs. As mentioned, it was perhaps the most heterogeneous forum in all of England at the time. Most of all, it was an atmosphere infused with physicality and human interaction – a melting pot of somatic feedback. Debate was civil but passionate. That dynamic is totally missing from its online equivalent. The rules and realities of the 18th-century coffee house forced thoughtfulness and diverse perspectives upon the discourse. Facebook allows you to do an end run around both as you hit your share button.

The Mindful Democracy Manifesto

 

The best argument against democracy is a five-minute conversation with the average voter.

Winston Churchill

Call it the Frog in Boiling Water Syndrome. It happens when creeping changes in our environment reach a disruptive tipping point that triggers massive change – or – sometimes – a dead frog. I think we’re going through one such scenario now. In this case, the boiling water may be technology and the frog may be democracy.

As I said in Online Spin last week, the network effects of President-elect Donald Trump’s victory may be yet another unintended consequence of technology.

I walked through the dynamics I believe lay behind the election last week in some detail. This week, I want to focus more on the impact of technology on democratic elections in general. In particular, I wanted to explore the network effects of technology, the spread of information and sweeping populist movements like we saw on November 8th.

In an ideal world, access to information should be the bedrock of effective democracy. Ironically, however, now that we have more access than ever, that bedrock is being chipped away. There has been a lot of finger pointing at the dissemination of fake news on Facebook, but that’s just symptomatic of a bigger ill. The real problem is the filter bubbles and echo chambers that formed on social networks. And they formed because friction has been eliminated. The way we were informed in this election looked very different from that in elections past.

Information is now spread more through emergent social networks than through editorially controlled media channels. That makes it subject to unintended network effects. Because the friction of central control has been largely eliminated, the spread of information relies on the rules of emergence: the aggregated and amplified behaviors of the individual agents.

When it comes to predicting behaviors of individual human agents, our best bet is placed on the innate behaviors that lie below the threshold of rational thought. Up to now, social conformity was a huge factor. And the rallying point of that social conformity was largely formed and defined by information coming from the mainstream media. The trend of that information over the past several decades has been to the left end of the ideological spectrum. Political correctness is one clear example of this evolving trend.

But in this past election, there was a shift in individual behavior thanks to the elimination of friction in the spread of information – away from social conformity and towards other primal behaviors. Xenophobia is one such behavior. Much as some of us hate to admit it, we’re all xenophobic to some degree. Humans naturally choose familiar over foreign. It’s an evolved survival trait. And, as American economist Thomas Schelling showed in 1971, it doesn’t take a very high degree of xenophobia to lead to significant segregation. He showed that even people with only a mild preference (about 33%) for being around people like themselves would, given the ability to move wherever they wished, end up in highly segregated neighborhoods. Imagine, then, the segregation that happens when friction is essentially removed from social networks. You don’t have to be a racist to want to be with people who agree with you. Liberals are definitely guilty of the same bias.
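Schelling’s checkerboard model is simple enough to sketch in a few dozen lines. The version below is a minimal illustration; the grid size, vacancy rate and 33% contentment threshold are assumed parameters for demonstration, not Schelling’s exact 1971 setup:

```python
import random

def run_schelling(size=20, vacancy=0.1, threshold=0.33, max_sweeps=50):
    """Toy version of Schelling's segregation model.

    Two types of agents ('A'/'B') occupy a grid. Any agent whose share of
    like-typed neighbors falls below `threshold` moves to a random vacant
    cell. Returns the average like-typed neighbor share at the end.
    """
    n = size * size
    cells = [None] * n
    agents = ['A', 'B'] * int(n * (1 - vacancy) / 2)
    for pos, kind in zip(random.sample(range(n), len(agents)), agents):
        cells[pos] = kind

    def neighbors(i):
        r, c = divmod(i, size)
        return [cells[(r + dr) * size + (c + dc)]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
                and 0 <= r + dr < size and 0 <= c + dc < size]

    def like_share(i):
        occupied = [k for k in neighbors(i) if k is not None]
        if not occupied:
            return 1.0  # isolated agents have nothing to be unhappy about
        return sum(k == cells[i] for k in occupied) / len(occupied)

    for _ in range(max_sweeps):
        movers = [i for i in range(n)
                  if cells[i] and like_share(i) < threshold]
        if not movers:
            break  # everyone is content; the grid has settled
        for i in movers:
            j = random.choice([k for k in range(n) if cells[k] is None])
            cells[j], cells[i] = cells[i], None

    shares = [like_share(i) for i in range(n) if cells[i]]
    return sum(shares) / len(shares)

random.seed(7)
print(f"average like-typed neighbor share: {run_schelling():.2f}")
```

Run it a few times: although no agent demands more than a third of its neighbors be like itself, the settled grid typically ends up far more segregated than that modest preference would suggest.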

What happened in the election of 2016 were the final death throes of the mythical Homo Politicus – the fiction of the rational voter. Just like Homo Economicus – who predeceased him/her thanks to the groundbreaking work of psychologists Amos Tversky and Daniel Kahneman – much as we might believe we make rational voting choices, we are all a primal basket of cognitive biases. And these biases were fed a steady stream of misinformation and questionable factoids thanks to our homogenized social connections.

This was not just a right wing trend. The left was equally guilty. Emergent networks formed and headed in diametrically opposed directions. In the middle, unfortunately, was the future of the country and – perhaps – democracy. Because, with the elimination of information distributional friction, we have to ask the question, “What will democracy become?” I have an idea, but I’ll warn you, it’s not a particularly attractive one.

If we look at democracy in the context of an emergent network, we can reasonably predict a few things. If the behaviors of the individual agents are not uniform – if half always turn left and half always turn right – that dynamic tension will set up an oscillation. The network will go through opposing phases. The higher the tension, the bigger the amplitude and the more rapid the frequency of those oscillations. The country will continually veer right and then veer left.

Because those voting decisions are driven more by primal reactions than rational thought, votes will become less about the optimal future of the country and more about revenge on the winner of the previous election. As the elimination of friction in information distribution accelerates, we will increasingly be subject to the threshold mob effect I described in my last column.

So, is democracy dead? Perhaps. At a minimum, it is debilitated. At the beginning of the column, I quoted Winston Churchill. Here is another quote from Churchill:

Many forms of Government have been tried, and will be tried in this world of sin and woe. No one pretends that democracy is perfect or all-wise. Indeed it has been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time…

We are incredibly reluctant to toy with the idea of democracy. It is perhaps the most cherished ideal we cling to in the Western World. But if democracy is the mechanism for a never-ending oscillation of retribution, perhaps we should be brave enough to consider alternatives. In that spirit, I put forward the following:

Mindful Democracy.

The best antidote to irrationality is mindfulness – forcing our Prefrontal cortex to kick in and lift us above our primal urges. But how do we encourage mindfulness in a democratic context? How do we break out of our social filter bubbles and echo chambers?

What if we made the right to vote contingent on awareness? What if you had to take a test before you cast your vote? The objective of the test is simple: how aware were you not only of your candidate’s position and policies but – more importantly – those of the other side? You don’t have to agree with the other side’s position; you just have to be aware of it. Your awareness score would then be assigned as a weight to your vote. The higher your level of awareness, the more your vote would count.
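Mechanically, the proposal is simple arithmetic. Here is a toy sketch in Python; the candidate names, the 0-to-1 awareness score and the linear weighting are all hypothetical illustrations, not a worked-out electoral mechanism:

```python
from collections import defaultdict

def weighted_tally(ballots):
    """Tally ballots where each vote counts in proportion to the voter's
    awareness score (assumed here to be normalized to the range 0..1)."""
    totals = defaultdict(float)
    for candidate, awareness in ballots:
        if not 0.0 <= awareness <= 1.0:
            raise ValueError("awareness score must be between 0 and 1")
        # A fully aware voter counts as one full vote; a less aware
        # voter counts as a fraction of one.
        totals[candidate] += awareness
    return dict(totals)

# Four hypothetical voters split 2-2 on a raw count, but the more
# aware pair carries the weighted tally:
ballots = [("Left", 0.9), ("Right", 0.4), ("Right", 0.4), ("Left", 0.2)]
print(weighted_tally(ballots))  # → {'Left': 1.1, 'Right': 0.8}
```

The design question this leaves open, of course, is who writes and scores the awareness test, which is exactly where the straw man would be attacked.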

I know I’m tiptoeing on the edge of sacrilege here, but consider it a straw man. I’ve hesitated to go public with this, but I’ve been thinking about it for some time, and I’m not sure it’s worse than the increasingly shaky democratic status quo we currently have. It’s equally fair to the right and left. It encourages mindfulness. It breaks down echo chambers.

It’s worth thinking about.