The Calcification of a Columnist

First: the Caveat. I’m old and grumpy. That is self-evident. There is no need to remind me.

But even with this truth established, the fact is that I’ve noticed a trend. Increasingly, when I come to write this column, I get depressed. The more I look for a topic to write about, the more my mood spirals downward.

I’ve been writing for MediaPost for over 12 years now. Together, between Search Insider and Online Spin, that’s close to 600 columns. Many – if not most – of those have been focused on the intersection between technology and human behavior. I’m fascinated by what happens when evolved instincts meet technological disruption.

When I started this gig I was mostly optimistic. I was amazed by the possibilities and – somewhat naively it turns out – believed it would make us better. Unlimited access to information, the ability to connect with anyone – anywhere, new ways to reach beyond the limits of our own DNA; how could this not make humans amazing?

Why, then, do we seem to be going backwards? What I didn’t realize at the time is that technology is like a magnifying glass. Yes, it can make the good of human nature better, but it can also make the bad worse. Not only that, but technology also has a nasty habit of throwing in unintended consequences: little gotchas we never saw coming that have massive moral implications. Disruption can be a good thing, but it can also rip apart in a trice what took centuries of careful and thoughtful building to put in place. Black Swans have little regard for ethics or morality.

I have always said that technology doesn’t change behaviors. It enables behaviors. When it comes to the things that matter, our innate instincts and beliefs, we are not perceptibly different from our distant ancestors. We are driven by the same impulses. Increasingly, as I look at how we use the outcomes of science and innovation to pursue these objectives, I realize that while technology can enable love, courage and compassion, it can also engender more hate, racism and misogyny. It makes us better while it also makes us worse. We are becoming caricatures of ourselves.

(Image: Diffusion of Ideas – Everett Rogers, 1962)

Everett Rogers plotted the diffusion of technology through the masses on a bell curve and divided us up into innovators, early adopters, early majority, late majority and laggards. The categorization was defined by our acceptance of innovation. Inevitably, then, there would be a correlation between that acceptance and our sense of optimism about the possibilities of technology. Early adopters would naturally see how technology would enable us to be better. But as diffusion rolls through the curve, we would eventually hit those for whom technology is just there – another entitlement, a factor of our environment, oxygen. There is no special magic or promise here. Technology simply is.

So, to recap, I’m old and grumpy. As I started to write yet another column, I was submerged in a wave of weariness. I have to admit I have been emotionally beaten up by the last few years. I’m tired of writing about how technology is making us stupider, lazier and less tolerant when it should be making us great.

But another thing usually comes with age: perspective. This isn’t the first time that humans and disruptive technology have crossed paths. That’s been the story of our existence. Perhaps we should zoom out a bit from our current situation. Let’s set aside for a moment our navel-gazing about fake news, clickbait, viral hatred, connected xenophobia and the erosion of public trust. Let’s look at the bigger picture.

History isn’t sketched in straight lines. History is plotted on a curve. Correction: history is plotted in a series of waves. We are constantly correcting course. Disruption tends to swing a pendulum one way until a gathering of opposing force swings it the other. It takes us a while to absorb disruption, but we do – eventually.

I suspect if I were writing this in 1785, I’d be disheartened by the industrial blight that was enveloping the world. Then, like now, technology was plotting a new course for us. But in this case, we have the advantage of hindsight to put things in perspective. Consider this one fact: between 1200 and 1600, the life span of a British noble didn’t go up by even a single year. But between 1800 and today, life expectancy for white males in the West doubled, from thirty-eight years to seventy-six. Technology made that possible.

Technology, when viewed on a longer timeline, has also made us better. If you doubt that, read psychologist and author Steven Pinker’s “The Better Angels of Our Nature.” His exhaustively researched and reasoned book leads you to the inescapable conclusion that we are better now than we ever have been. We are less violent, less cruel and more peaceful than at any time in history. Technology also made that possible.

It’s okay to be frustrated by the squandering of the promise of technology. But it’s not okay to just shrug and move on. You are the opposing force that can cause the pendulum to change direction. Because, in the end, it’s not technology that makes us better. It’s how we choose to use that technology.


America, You’re Great (But You Might Be Surprised Why)

The first time I went to Washington D.C. I was struck by the extreme polarity I saw there. That day, the Tea Party was staging a demonstration against Obamacare on the Mall in front of the Capitol building. But this wasn’t the only event happening. The Mall was jammed with gatherings of all types – from all political angles: the right, the ultra-right and left, the rich and poor, the eager and entitled, the sage and stupid. The discourse was loud, passionate and boisterous. It was – in a word – chaos.

That chaotic polarity is, of course, defining the current election. After the second presidential debate, commentator Bob Schieffer said, with a mixture of incredulity and disgust, “How have we come to this?” The presidential debates may have hit a new low in presidential decorum, but if you dig deep enough, there is something great here.

Really.

A recent PR campaign has asked Canadians to tweet why America is great. I’m going to do it in a column instead.

You’re great because you argue loudly, passionately and boisterously. You air out ideologies in a very messy and public way. You amplify the bejeezus out of the good and the bad of human nature and then put them both in a cage match to battle it out in broad daylight. You do this knowing there will be no clear winner of this battle, but you hope and trust that the scales will tip in the right direction. There is no other country I know of that has the guts to do this in quite the way you do.

You personally may not agree with Donald Trump, but there are many who do. He is giving voice to the feelings and frustrations of a sizable chunk of the US population. And as much as I personally don’t like how he’s doing it, the fact is he is doing it. Your country, your constitution and your political system have allowed a man like this to take a shot at the highest office in the land by questioning and attacking things that many Americans hold to be inviolable. It’s scary as hell, but I have to admire you for letting it play out the way it has and trusting that eventually the process will prevail. And it has for 240 years. Candidates and elections and campaign rhetoric will all eventually disappear, but the process – your process – has always prevailed.

The polarization of the US is nothing new. It defines you. For a quick history lesson, watch Best of Enemies on Netflix, a documentary on the televised debates of William F. Buckley and Gore Vidal clashing on left vs. right during the 1968 Nixon vs. Humphrey vs. Wallace election. What started as an intellectual duel ended with Buckley threatening to smash Vidal’s face in after being called a crypto-Nazi on live TV.

If you look at the US from the outside, you’d swear that the whole mess is going to end up in a fiery wreck. But you’ve been here before. Many times. And somehow, the resiliency of who you are and how you conduct business wins out. You careen towards disaster, but you always seem to swerve at the last minute and emerge stronger than before.

I honestly don’t know how you do it. As a polite, cautious Canadian, I stand simultaneously in awe and abject terror of how you operate. You defy the physics of what should be.

You’re fundamentally, gloriously flawed, but you are unquestionably resilient. You are an amazing example of emergence. You are, in the words of Nassim Nicholas Taleb, antifragile:

“beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.”

You are discordant, divided and dysfunctional and somehow you’re still the most powerful and successful nation on the planet. I suspect you got there not in spite of your flaws, but because of them.

Perhaps you’re embarrassed by the current election cycle. I understand that. It has been called “unprecedented” many, many times by many, many commentators. And that may be true, but I would say it’s unprecedented only in the vigor and volume of the candidates (or, to be frank, one candidate). The boundaries of what is permissible have been pushed forcefully out. It may not be what certain constituents think is proper, but it is probably an accurate reflection of the diverse moods of the nation and, as such, it needs to be heard. You are a country of many opinions – often diametrically opposed. The US’s unique brand of democracy has had to stretch to its limits to accurately capture the dynamics of a nation in flux.

I don’t know what will happen November 8th. I do know that whatever happens, you will have gone through the fire yet again. You will emerge. You will do what needs to be done. And I suspect that, once again, you’ll be the stronger for it.

What Would a “Time Well Spent” World Look Like?

I’m worried about us. And it’s not just because we seem bent on death by ultra-conservative parochialism and xenophobia. I’m worried because I believe we’re spending all our time doing the wrong things. We’re fiddling while Rome burns.

Technology is our new drug of choice and we’re hooked. We’re fascinated by the trivial. We’re dumping huge gobs of time down the drain playing virtual games, updating social statuses, clicking on clickbait and watching videos of epic wardrobe malfunctions. Humans should be better than this.

It’s okay to spend some time doing nothing. The brain needs some downtime. But something, somewhere has gone seriously wrong. We are now spending the majority of our lives doing useless things. TV used to be the biggest time suck, but in 2015, for the first time ever, the boob tube was overtaken by time spent with mobile apps. According to a survey conducted by Flurry, in the second quarter of 2015 we spent about 2.8 hours per day watching TV. And we spent 3.3 hours on mobile apps. That’s a grand total of 6.1 hours per day, or one third of the time we spend awake. Yes, both things can happen at the same time, so there is undoubtedly overlap, but still – that’s a scary-assed statistic!

And it’s getting worse. In a previous Flurry poll conducted in 2013, we spent a total of 298 hours between TV and mobile apps versus 366 hours in 2015. That’s a 22.8% increase in just two years. We’re spending way more time doing nothing. And those totals don’t even include things like time spent in front of a gaming console. For kids, tack on an average of another 10 hours per week and you can double that for hard-core male gamers. Our addiction to gaming has even led to death in extreme cases.
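For the numerically inclined, the arithmetic behind those figures holds up. Here is a quick sanity check in Python; the 18 waking hours per day is my assumption, not Flurry’s:

```python
# Sanity-checking the Flurry screen-time figures quoted above.
tv, apps = 2.8, 3.3                 # hours per day, Q2 2015
total = tv + apps                   # combined daily hours
waking = 18                         # assumed waking hours per day (my guess, not Flurry's)
share = total / waking              # fraction of waking time on screens

combined_2013, combined_2015 = 298, 366   # combined hours, as quoted
increase = (combined_2015 - combined_2013) / combined_2013 * 100

print(f"{total:.1f} h/day, {share:.0%} of waking time, +{increase:.1f}% since 2013")
# → 6.1 h/day, 34% of waking time, +22.8% since 2013
```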

Even in the wildest stretches of imagination, this can’t qualify as “time well spent.”

We’re treading on very dangerous and very thin ice here. And we no longer have history to learn from. It’s the first time we’ve ever encountered this. Technology is now only one small degree of separation from plugging directly into the pleasure center of our brains. And science has shown that a good shot of self-administered dopamine can supersede everything – water, food, sex. True, those experiments were conducted on rats – primarily because it would be unethical to fully replicate them with humans – but are you willing to risk the entire future of mankind on the bet that we’re really that much smarter than rats?

My fear is that technology is becoming a slightly more sophisticated lever we push to get that dopamine rush. And developers know exactly what they’re doing. They are making that lever as addictive as possible. They are pushing us towards the brink of death by technological lobotomization. They’re lulling us into a false sense of security by offering us the distraction of viral videos, infinitely scrolling social notification feeds and mobile game apps. It’s the intellectual equivalent of fast food – quite literally “brain candy.”

Here the hypocrisy of for-profit interest becomes evident. The corporate response typically rests on individual freedom of choice and the consumer’s ability to exercise will power. “We are just giving them what they’re asking for,” touts the stereotypical PR flack. But if you have an entire industry with reams of developers and researchers all aiming to hook you on their addictive product and your only defense is the same faulty neurological defense system that has already fallen victim to fast food, porn, big tobacco, the alcohol industry and the $350 billion illegal drug trade, where would you be placing your bets?

Technology should be our greatest achievement. It should make us better, not turn us into a bunch of lazy screen-addicted louts. And it certainly could be this way. What would it mean if technology helped us spend our time well? This is the hope behind the Time Well Spent Manifesto. Tristan Harris, a design ethicist and product philosopher at Google, is one of the co-directors. Here is an excerpt from the manifesto:

We believe in a new kind of design, that lets us connect without getting sucked in. And disconnect, without missing something important.

And we believe in a new kind of economy that’s built to help us spend time well, where products compete to help us live by our values.

I believe in the Manifesto. I believe we’re being willingly led down a scary and potentially ruinous path. Worst of all, I believe there is nothing we can – or will – do about it. Problems like this are seldom solved by foresight and good intentions. Things only change after we drive off the cliff.

The problem is that most of us never see it coming. And we never see it coming because we’re too busy watching a video of masturbating monkeys on YouTube.

Can Stories Make Us Better?

In writing this column, I often put ideas on the shelf for a while. Sometimes, world events conspire to make one of these shelved ideas suddenly relevant. This happened this past weekend.

The idea that caught my eye some months ago was an article that explored whether robots could learn morality by reading stories. On the face of it, it was mildly intriguing. But early Sunday morning as the heartbreaking news filtered to me from Orlando, a deeper connection emerged.

When we speak of unintended consequences, as we have before, the media amplification of acts of terror is one of them. The staggeringly sad fact is that shocking casualty numbers have their own media value. And that, said one analyst commenting on ways to deal with terrorism, is a new reality we have to come to terms with. When we in the media business make stories newsworthy, we assign worth not just to news consumers but also to newsmakers – those troubled individuals who have the motivation and the means to blow apart the daily news cycle.

This same analyst, when asked how we deal with terrorism, made the point you can’t prevent lone acts of terrorism. The only answer is to use that same network of cultural connections we use to amplify catastrophic events to create an environment that dampens rather than intensifies violent impulse. We in the media and advertising industries have to use our considerable skills in setting cultural contexts to create an environment that reduces the odds of a violent outcome. And sadly, this is a game of odds. There are no absolute answers here – there is just a statistical lowering of the curve. Sometimes, despite your best efforts, the unimaginable still happens.

But how do we use the tools at our disposal to amplify morality? Here, perhaps the story I shelved some months ago can provide some clues.

In the study from Georgia Tech, Mark Riedl and Brent Harrison used stories as models of acceptable morality. For most of human history, popular culture included at least an element of moral code. We encoded the values we held most dear into our stories. It provided a base for acceptable behavior, either through positive reinforcement of commonly understood virtues (prudence, justice, temperance, courage, faith, hope and charity) or warnings about universal vices (lust, gluttony, greed, sloth, wrath, envy and pride). Sometimes these stories had religious foundations, sometimes they were secular morality fables but they all served the same purpose. They taught us what was acceptable behavior.

Stories were never originally intended to entertain. They were created to pass along knowledge and cultural wisdom. Entertainment came later, when we discovered that the more entertaining the story, the more effective it was at its primary purpose: education. And this is how the researchers used stories. Robots can’t be entertained, but they can be educated.

At some point in the last century, we focused on the entertainment value of stories over education and, in doing so, rotated our moral compass 180 degrees. If you look at what is most likely to titillate, sin almost always trumps sainthood. Review that list of virtues and vices and you’ll see that the stories of our current popular culture focus on vice – that list could be the programming handbook for any Hollywood producer. I don’t intend this as a sermon – I enjoy Game of Thrones as much as the next person. I simply state it as a fact. Our popular culture – and the amplification that comes from it – is focused almost exclusively on the worst aspects of human nature. If robots were receiving their behavioral instruction through these stories, they would be programmed to be psychopathic moral degenerates.

Most of us can absorb this continual stream of anti-social programming and not be affected by it. We still know what is right and what is wrong. But in a world where it’s the “black swan” outliers that grab the news headlines, we have to think about the consequences that reach beyond the mainstream. When we abandon the moral purpose of stories and focus on their entertainment aspect, are we also abandoning a commonly understood value landscape?

If you’re looking for absolute answers here, you won’t find them. That’s just not the world we live in. And am I naïve when I say the stories we chose to tell may have an influence on isolated violent events such as happened in Orlando? Perhaps. Despite all our best intentions, Omar Mateen might still have gone horribly offside.

But all things and all people are, to some extent, products of their environment. And because we in media and advertising are storytellers, we set that cultural environment. That’s our job. Because of this, I believe we have a moral obligation. We have to start paying more attention to the stories we tell.


The World in Bite Sized Pieces

It’s hard to see the big picture when your perspective is limited to 160 characters.

Or when we keep getting distracted from said big picture by that other picture that always seems to be lurking over there on the right side of our screen – the one of Kate Upton tilting forward wearing a wet bikini.

Two things are at work here obscuring our view of the whole: Our preoccupation with the attention economy and a frantic scrambling for a new revenue model. The net result is that we’re being spoon-fed stuff that’s way too easy to digest. We’re being pandered to in the worst possible way. The world is becoming a staircase of really small steps, each of which has a bright shiny object on it urging us to scale just a little bit higher. And we, like idiots, stumble our way up the stairs.

This cannot be good for us. We become better people when we have to chew through some gristle. Or when we’re forced to eat our broccoli. The world should not be the cognitive equivalent of Cap’n Crunch cereal.

It’s here where human nature gets the best of us. We’re wired to prefer scintillation to substance. Our intellectual laziness and willingness to follow whatever herd seems to be heading in our direction have conspired to create a world where Donald Trump can be a viable candidate for president of the United States – where our attention span is measured in fractions of a second – where the content we consume is dictated by a popularity contest.

Our news is increasingly coming to us in smaller and smaller chunks. The exploding complexity of our world, which begs to be understood in depth, is increasingly parceled out to us in pre-digested little tidbits, pushed to our smartphone. We spend scant seconds scanning headlines to stay “up to date.” And an algorithm that is trying to understand where our interests lie usually determines the stories we see.

This algorithmic curation creates both “Filter” and “Agreement” Bubbles. The homogeneity of our social network leads to a homogeneity of content. But if we spend all our time with others who think like us, we end up with an intellectually polarized society in which the factions that sit at opposite ends of any given spectrum are openly hostile to each other. The gaps between our respective ideas of what is right are simply too big, and no one has any interest in building a bridge across them. We’re losing our ideological interface areas, those opportunities to encounter ideas that force us to rethink and reframe, broadening our worldview in the process. We sacrifice empathy, and we look for news that “sounds right” to us, no matter what “right” might be.

This is a crying shame, because there is more thought-provoking, intellectually rich content being produced than ever before. But there is also more sugar-coated crap whose sole purpose is to get us to click.

I’ve often talked about the elimination of friction. Usually, I think this is a good thing. Bob Garfield, in a column a few months ago, called for a whoop-ass can of WD-40 to remove all transactional friction. But if we make things too easy to access, will we also remove those cognitive barriers that force us to slow down and think, giving our rationality a chance to catch up with impulse? And it’s not just on the consumption side where a little bit of friction might bring benefits. The upside of production friction was that it slowed down streams of content just long enough to introduce an editorial voice. Someone somewhere had to give some thought as to what might actually be good for us.

In other words, it was someone’s job to make sure we ate our vegetables.

The “Get It” Gap

Netflix and Chill…

It sounds innocent enough – a night of curling up on the couch in a Snuggie with no plan other than some Orange Is the New Black and Häagen-Dazs bingeing. And that’s how it started out. Innocent. Then those damned kids got hold of it, and its present meaning ended up in a place quite removed from its origin.

Unfortunately, my wife didn’t know that when she used the phrase in a Facebook post for her gift basket company. That is, until one of our daughters walked in the door and before her bags hit the floor yelled from the entryway, “Mom, you have to change your post – right now!”

“What post?”

“The Netflix and Chill one…”

“Why?”

“Unless your basket contains lubricants and condoms, I don’t think your post means what you think it means.”

But how is a middle-aged parent to know? The subversion of this particular phrase has just happened in the last year. It takes me the better part of a year to remember that it’s no longer 2015 when I sign a check. There’s no way a middle-aged brain could possibly keep up with the ongoing bastardization of the English language. The threshold for “getting it” keeps getting higher, driven by the acceleration of memes through social media.

Parents were never intended to “get it.” That’s the whole point. Kids want to speak their own language and have their own cultural reference points. We were no different when we were kids. Neither were our parents.

And kids always “get it.” It’s like a rite of passage. Memes propagate through social networks and when you’re 18, your social network is the most important thing in your life. New memes spread like wildfire and part of belonging to this culture depends on “getting it.” The faster things spread, the more likely it is that you can increase the “Get It” gap between you and your parents. It’s a control thing. If the parents call all the shots about everything in your life, at least you can have this one thing to call your own.

As you start to gain control, the Gap becomes less important. Our daughters are now becoming adults, so they now act as “Get It” translators and, in cases like the one above, Urban Slang enforcement officers. When we transgress, they attempt to bridge the gap.

As you get older, the “stuff” of life gets in the way of continuing to “get it.” Buying a house, getting a job and changing diapers leaves little time left over to Snapchat about Scumbag Steve or tweet “Hell yea finna get crunk!” to your Hommie gee funk-a-nator on a Friday night.

The danger comes when parents unilaterally try to cross over the gap and attempt to tap into the zeitgeist of urban slang. This is always doomed to failure. There are no exceptions. It’s like tiptoeing through a minefield with snowshoes on.

At the very least, run it past your kids before you post anything. Better yet – look it up in Urban Dictionary. Kids can’t be trusted.

“Hotchkiss – Ouuuttt!” (Mic drop here)

Luddites Unite…

Throw off the shackles of technology. Rediscover the true zen of analog pleasures!

The Hotchkisses had a tech-free Christmas holiday – mostly. The most popular activity around our home this year was adult coloring. Whodathunkit?

There were no electronic gadgets, wired home entertainment devices or addictive apps exchanged. No personal tech, no connected platforms, no internet of things (with one exception). There were small appliances, real books printed on real paper, various articles of clothing – including designer socks – and board games.

As I mentioned, I did give one techie gift, but with a totally practical intention. I gave everyone Tiles to keep track of the crap we keep losing with irritating regularity. Other than that, we were surprisingly low tech this year.

Look, I’m the last person in the world who could be considered a digital counter-revolutionary. I love tech. I eat, breathe and revel in stuff that causes my wife’s eyes to repeatedly roll. But this year – nada. Not once did I sit down with a Chinglish manual that told me, “When the unit not work, press ‘C’ and hold on until you hear (you should loose your hands after you hear each sound).”

This wasn’t part of any pre-ordained plan. We didn’t get together and decide to boycott tech this holiday. We were just technology fatigued.

Maybe it’s because technology is ceasing to be fun. Sometimes, it’s a real pain in the ass. It nags us. It causes us to fixate on stupid things. It beeps and blinks and points out our shortcomings. It can lull us into catatonic states for hours on end. And this year, we just said “Enough!” If I’m going to be catatonic, it’s going to be at the working end of a pencil crayon, trying to stay within the lines.

Even our holiday movie choice was anti-tech, in a weird kind of way. We, along with the rest of the world, went to see Star Wars: The Force Awakens. Yes, it’s a sci-fi movie, but no one is going to see this movie for its special effects or CGI gimcrackery. Like the best space opera entries, we want to get reacquainted with the people in the story. The Force’s appeal is that it is a long-awaited (32 years!) family reunion. We want to see if Luke Skywalker got bald and fat, despite the force stirring within him.

I doubt that this is part of any sustained move away from tech. We are tech-dependent. But maybe that’s the point. It used to be that tech gadgets separated us from the herd. It made us look coolly nerdish and cutting edge. But when the whole world is wearing an iWatch, the way to assert your independence is to use a pocket watch. Or maybe a sundial.

And you know what else we discovered? Turning away from tech usually means you turn towards people. We played board games together – actual board games, with cards and dice and boards that were made of pasteboard, not integrated circuits. We were in the same room together. We actually talked to each other. It was a form of communication that – for once – didn’t involve keyboards, emojis or hashtags.

I know this was a fleeting anomaly. We’re already back to our regular tech-dependent habits, our hands nervously seeking the nearest connected device whenever we have a millisecond to spare.

But for a brief, disconnected moment, it was nice.

Talking Back to Technology

The tech world seems to be leaning heavily towards voice activated devices. Siri – Amazon Echo – Facebook M – “OK Google” – as well as pretty much every vehicle in existence. It should make sense that we would want to speak to our digital assistants. After all, that’s how we communicate with each other. So why – then – do I feel like such a dork when I say “Siri, find me an Indian restaurant”?

I almost never use Siri as my interface to my iPhone. On the very rare occasions when I do, it’s when I’m driving. By myself. With no one to judge me. And even then, I feel unusually self-conscious.

I don’t think I’m alone. No one I know uses Siri, except on the same occasions and in the same way I do. This should be the most natural thing in the world. We’ve been talking to each other for several millennia. It’s so much more elegant than hammering away on a keyboard. But I keep seeing the same scenario play out over and over again. We give voice navigation a try. It sometimes works. When it does, it seems very cool. We try it again. And then we don’t do it any more. I base this on admittedly anecdotal evidence. I’m sure there are those who continually chat merrily away to the nearest device. But not me. And not anyone I know, either. So, given that voice activation seems to be the way devices are going, I have to ask why we’re dragging our heels to adopt.

In trying to judge the adoption of voice-activated interfaces, we have to account for mismatches in expected utility. Every time we ask for something – like, for instance, “Play Bruno Mars” – and get the response, “I’m sorry, I can’t find Brutal Cars,” some frustration is natural. This is certainly part of it. But that’s an adoption threshold that will eventually yield to sheer processing brute strength. I suspect our reluctance to talk to an object is found in the fact that we’re talking to an object. It doesn’t feel right. It makes us look addle-minded. We make fun of people who speak when there’s no one else in the room.

Our relationship with language is an intimately nuanced one. It’s a relatively newly acquired skill, in evolutionary terms, so it takes up a fair amount of cognitive processing. Granted, no matter what the interface, we currently have to translate desire into language, and speaking is certainly more efficient than typing, so it should be a natural step forward in our relationship with machines. But we also have to remember that verbal communication is the most social of things. In our minds, we have created a well-worn slot for speaking, and it’s something to be done when sitting across from another human.

Mental associations are critical to how we make sense of things. We are natural categorizers. And if we haven’t found an appropriate category when we encounter something new, we adapt an existing one. I think voice activation may be creating cognitive dissonance in our mental categorization schema. Interacting with devices is a generally solitary endeavor. Talking is a group activity. Something here just doesn’t fit. We’re finding it hard to reconcile our use of language with our interaction with machines.

I have no idea if I’m right about this. Perhaps I’m just being a Luddite. But given that my entire family, and most of my friends, have had voice-activation-capable phones for several years now, and none of them use the feature except on very rare occasions, I thought it was worth mentioning.

By the way, let’s just keep this between you and me. Don’t tell Siri.

Donald Trump, The Clickbait Candidate

Intellectually, I hate clickbait. But do I click on it? You bet. Usually before I stop to think. It hits me in the quick and dirty (in every sense of the phrase) part of my brain. Much as I know I should be better than this, I find myself clicking through more viscerally tantalizing slideshows than I would care to admit. Humans, among whom I number myself, are suckers for sensationalism.

So, I admit to human foibles. But in doing so, I stress that they’re something we should strive to overcome. Reason should rule the day. We should not embrace a future built on the pushing of our collective hot buttons.

That’s why the current ascendancy of one Mr. Trump is scaring the hell out of me.

Donald Trump is not stupid. He’s built his campaign to be one massive, ongoing A/B clickbait test. He floats Outrageous Remark A against Outrageous Remark B to see which generates the bigger response. He’s probing the collective psyche of America to see what goes viral. And he knows that virality cannot live in the middle of the road. It has to live in the extreme margins. In order to be sensational, you have to provoke the senses. You have to push buttons. To get people to love you, you also have to get people to hate you. It was an inevitable evolution of politicking in the Age of the Internet.

To this point, Trump’s tactics appear to be working. He’s outdistancing his Republican opponents by increasing margins (the latest poll has him doubling Jeb Bush’s support, 32% to 16%). He’s even closing in on Hillary Clinton, trailing by just 6% in a recent poll. Trump’s sledgehammer-subtle attack on the quick and dirty shortcuts of our brains seems to be triumphing over any rational appeal to the slow and reasoned loops of logic.

But is this really how we want our leaders to be chosen?

In 1856, America was edging closer to the ideological precipice of the Civil War. It was a time when it was easy to ignite hair-trigger passions. And the country was captivated by one senatorial race in particular – in the state of Illinois. There, incumbent Stephen A. Douglas was running against a little-known lawyer who had served one largely unremarkable term in Congress. His name was Abraham Lincoln. As part of the campaign, Douglas agreed to debate Lincoln on what was the only real issue of the election – the future of slavery. Prior to the debates, popular opinion had it that Douglas would eviscerate Lincoln.

The series of seven debates was spread around the state over a period of 56 days. The stakes were profound. Over 14% of the US population was black. Of them, almost 90% were slaves. The future of the union revolved around the thorny question of the legality of slavery. No matter what side of the issue you were on, whatever came out of your mouth was guaranteed to be provocative.

Each debate was 3 hours in length. The first speaker spoke for 60 minutes, the other candidate had 90 minutes to respond, and the first speaker had an additional 30 minutes for a rejoinder. In total, that was 21 hours of usually eloquent political debate. The full text of all the speeches was published almost verbatim in the nation’s newspapers (papers usually fixed the grammatical errors of whichever candidate they were supporting, while leaving the opponent’s remarks in rough form). Lincoln got off to a rough start, but hit his stride midway through the debates. By the final two debates, in Quincy and Alton, most everyone who was at all objective felt that Lincoln was the clear winner. He ended up losing the senatorial race to Douglas, but emerged as the national champion of abolitionists. The momentum from those debates eventually carried him into the presidency four years later.

In these debates, Lincoln managed to do something extraordinary. He reframed the slavery debate – moving it from a question of social equality to one of legal liberty. This sidestepped some of the fiercely held beliefs and allowed for a more rational examination of the question. Beliefs are the bedrock of the quick and dirty mechanisms of our mind. It’s relatively easy to connect with someone’s beliefs. You just have to know the right buttons to push. It’s much more difficult to encourage people to think, as Lincoln did, and push them to question their beliefs. Beliefs act as bulwarks against open and rational consideration.

By the way, if you’re not familiar with the term, a bulwark is a great wall built to keep things out. Like, for example, a great wall on the US/Mexican border.

Why Disruptive Change is Disruptive

There were a lot of responses to my last column, which looked at why agencies and clients have hit the point of irreconcilable differences. Many of those responses were in agreement. In fact, none were in outright disagreement. This surprised me. A lot of Online Spin readers work for very big agencies. I can only conclude that you elected to show your dissension through your silence.

But there were many that fell into the “Yeah, but” category:

Tiffany Lyman Otten wrote,

“This, like anything, is a sign simply that agencies must evolve – again.”

Jill Montaigne adds,

“Yet, our own ongoing advertiser conversations confirm that rather than walking away from their traditional agency relationships, clients desperately need and want their agencies to evolve.”

David Vawter chimes in,

“As long as there is something to sell, people will be needed to create and produce the ideas that sell it.”

Agreed. But…

All of the above comments pointed to a new trend in the marketing ecosystem – that of a network of specialists, often in the form of micro-agencies, that appear to be finding niches to hang on to in the tidal wave of change that is sweeping over our industry.

I used to head one of these agencies. Our area of specialty was user behavior with search interfaces. We did well in this niche. So well, in fact, that we were eventually acquired by a bigger agency. Bigger agencies are always vertically integrated. As such, they offer clients the one-stop-shop model. They move to that model because it is the model they know. It is the model they are programmed to create. It is an organizational form dictated by their P&L targets. There is no operational wiggle room here. They simply can’t become anything else.

Tiffany, Jill and several others all used the word “evolve,” as if it were a magical formula for survival. But evolution is like a tree. Once your branch has been determined, you have to evolve outward from that branch. You can’t suddenly leap to another branch. If you’re a chimpanzee, you can’t decide one day to evolve into a budgie. You can evolve into a new type of chimpanzee, but you’re still a chimpanzee.

What does happen in evolution, however, is that the environment changes so drastically that the tree is dramatically pruned. Some branches are lopped off so that new branches can sprout. This is called punctuated equilibrium, and, as I’ve said before, it is what I believe we’re going through right now in marketing. Yes, as David rightly notes, “As long as there is something to sell, people will be needed to create and produce the ideas that sell it.” It’s just that the form that takes may be dramatically different from what we currently know. It could be – correction – will be a marketing ecosystem dominated by new species of marketers.

We tend to equate evolution with change – but evolution is a very specific kind of change. It’s change in response to environmental pressures. And while individual species can evolve, so can entire ecosystems. In that bigger picture, some species will emerge and thrive while others disappear. What is happening to agencies now is just a ripple effect from a much bigger environmental change – analogous to a planet-sized asteroid slamming into the business and marketing ecosystem that evolved over the past two centuries.

Big agencies are the result of corporate evolution in the previous ecosystem. We are quick to take them to task for being slow, or dumb, or oblivious to client needs. And perhaps, in the new ecosystem, those things are true. But those are the characteristics of the species. No agency intends to be dumb or unresponsive. It’s just an evolutionary mismatch caused by massive disruption in the environment.

These things happen. It’s actually a good thing. Joseph Schumpeter called it Creative Destruction. But, as the name implies, it’s a zero-sum game. For something to be created, something has to be destroyed.