Drowning in a Sea of Tech

The world is becoming a pretty technical place. The Internet of Things is surrounding us. Which sounds exciting. Until the Internet of Things doesn’t work.

Then what?

I know all these tech companies have scores of really smart people who work to make their own individual tech as trouble-free as possible. Although the term has lost its contextual meaning, we’re all still aiming for “plug and play.” For people of a certain age – me, for example – this used to refer to a physical context: being able to plug stuff into a computer and have it simply start working. Now we plug technology into our lives and hope it plays well with all the other technology it finds there.

But that isn’t always the case – is it? Sometimes, as Mediapost IoT Daily editor Chuck Martin recently related, technologies refuse to play nicely together. And because we now have so much technology interacting in so many hidden ways, it becomes very difficult to root out the culprit when something goes wrong.

Let me give you an example. My wife has been complaining for some time that her iPhone has been unable to take a picture because it has no storage available, even though it’s supposed to magically transport stuff off to the “Cloud”. This past weekend, I finally dug in to see what the problem was. The problem, as it turned out, was that the phone was bloated with thousands of emails and Messenger chats that were hidden and couldn’t be deleted. They were sucking up all the available storage. After more than an hour of investigation, I managed to clear up the Messenger cache but the email problem – which I’ve traced back to some issues with configuration of the account at her email provider – is still “in progress.”

We – and by “we” I include me and all you readers – are a fairly tech-savvy group. With enough time and enough Google searches, we can probably hunt down and eliminate most bugs that might pop up. But that’s us. There are many more people who are like my wife. She doesn’t care about incorrectly configured email accounts or hidden caches. She just wants shit to work. She wants to be able to take a picture of my nephew on his 6th birthday. And when she can’t do that, the quality of my life takes a sudden downturn.

The more that tech becomes interconnected, the more likely it is that stuff can stop working for some arcane reason that only a network or software engineer can figure out. It’s getting to the point where all of us are going to need a full-time IT tech just to keep our households running. And I don’t know about you, but I don’t know where they’re going to sleep. Our guest room is full of broken-down computers and printers right now.

For most of us, there is a triage sequence of responses to tech-related pains in the ass:

  1. First, we ignore the problem, hoping it will go away.
  2. Second, we reboot every piece of tech related to the problem, hoping it will go away.
  3. If neither of the above work, we marginalize the problem, working around it and hoping that eventually it will go away.
  4. If none of this works, we try to upgrade our way out of the problem, buying newer tech hoping that by tossing our old tech baby out the window, the problem will be flushed out along with the bath water.
  5. Finally, in rare cases (with the right people), we actually dig into the problem and try to resolve it.

By the way, it hasn’t escaped my notice that there’s a pretty significant profit motive in point number 4 above. A conspiracy, perchance? Apple, Microsoft and Google wouldn’t do that to us, would they?

I’m all for the Internet of Things. I’m ready for self-driving cars, smart houses and bio-tech enhanced humans. But my “when you get a chance could you check…” list is getting unmanageably long. I’d be more than happy to live the rest of my life without having to “go into settings” or “check my preferences.”

Just last night I dreamt that I was trying to swim to a deserted tropical island but I kept drowning in a sea of Apple Watches. I called for help but the only person that could hear me was Siri. And she just kept saying, “I’m really sorry about this but I cannot take any requests right now. Please try again later…”

Do you think it means anything?


The Winona Ryder Effect

I was in the U.S. last week. It was my first visit in the Trump era.

It was weird. I was in California, so the full effect was muted, but I watched my tongue when meeting strangers. And that’s coming from a Canadian – for us, watching your tongue is a national pastime. (As an aside, my US host, Lance, told me about a recent post on a satire site: “Concerned, But Not Wanting To Offend, Canada Quietly Plants Privacy Hedge Along Entire U.S. Border.” That’s so us.) There was a feeling there that I had not felt before. As someone who has spent a lot of time in the US over the past decade or two, I felt a little less comfortable. There was a disconnect that was new to me.

Little did I know (I’ve had my mobile CNN alerts turned off since January 20th because I was slipping into depression) that just after I whisked through Sea-Tac airport with all the privilege that being a white male affords you, Washington Governor Jay Inslee would hold a press conference denouncing the new Trump Muslim ban in no uncertain terms. On the other side of the TSA security gates, a thousand protesters were gathering. I didn’t learn about any of this until I got home.

Like I said, it was weird.

And then there were the SAG awards on Sunday night. What the hell was the deal with Winona Ryder?

When the Stranger Things cast got on stage to accept their ensemble acting award, spokesperson David Harbour unleashed a fiery anti-Trump speech. But despite his passion and volume, it was Winona Ryder, standing beside him, who lit up the share button. And she didn’t say a word. Instead, her face contorted through a series of twenty-some different expressions in under two minutes. She became, as one Twitter post said, a “human gif machine.”

Now, by her own admission, Winona is fragile. She has battled depression and anxiety for much of her professional life. Maybe she was having a minor breakdown in front of the world. Or maybe this was a premeditated and choreographed social media master stroke. Either way, it says something about us.

The Stranger Things cast hadn’t even left the stage before the Twitterverse started spreading the Ryder meme. If you look at Google Trends, there was a huge spike in searches for Winona Ryder starting right around 6:15 pm (PST) Sunday night. It peaked at 6:48 pm with a volume about 20 times that of queries for Ms. Ryder before the broadcast began.

It was David Harbour who delivered the speech Ryder was reacting to. The words were his, and while there was also a spike in searches for him coinciding with the speech, he didn’t come close to matching the viral popularity of the Ryder meme. At its peak, there were 5 searches for “Winona Ryder” for every search for “David Harbour.”

Ryder’s mugging was – premeditated or not – extremely meme-worthy. It was visual, it was over the top and – most importantly – it was a blank canvas we could project our own views onto. Winona didn’t give us any words, so we could fill in our own. We could use it to provide a somewhat bizarre exclamation point to our own views, expressed through social media.

As I was watching this happen, I knew this was going to go viral. Maybe it’s because it takes something pretty surreal to make a dent in an increasingly surreal world that leaves us numb. When the noise that surrounds us seems increasingly unfathomable, we need something like this to prick our consciousness and make us sit up and take notice. Then we hunker down again before we’re pummelled with the next bit of reality.

Let me give you one example.

As I was watching the SAG awards Sunday night, I was unaware that gunmen had opened fire on Muslim worshippers praying in a mosque in Quebec City. I only found out as I flicked through the channels after the broadcast ended. Today, as I write this, I know that six people are dead because someone hated Muslims that much. Canada has its own extreme racism.

I find it hard to think about that. It’s easier to think about Winona Ryder’s funny faces. That’s not very noble, I know, but sometimes you have to go with what you’re actually able to wrap your mind around.

The Vanishing Value of the Truth

You know, the very powerful and the very stupid have one thing in common. They don’t alter their views to fit the facts. They alter the facts to fit the views.

Doctor Who, 1977

We might be in a period of ethical crisis. Or not. It’s tough to say. It really depends on what you believe. And that, in a nutshell, is the whole problem.

Take this past weekend, for example. Brand-new White House Press Secretary Sean Spicer, in his very first address, lied about the size of the inauguration crowd. Afterwards, a very cantankerous Kellyanne Conway defended the lying when confronted by Chuck Todd on Meet the Press. She said they weren’t lies…they were “Alternate Facts”.

http://www.nbcnews.com/widget/video-embed/860142147643

So, what exactly is an alternate fact? It’s something that is not a fact at all, but a narrative intended to be believed by a segment of the population, presumably to gain something from them.

To use a popular turn of phrase, it’s “Faking It til You Make It!”

And there you have the mantra of our society. We’re rewarding alternate facts on the theory that the end justifies the means. If we throw a blizzard of alternate facts out there that resonate with our audience’s beliefs, we’ll get what we want.

The Fake It Til You Make It syndrome is popping up everywhere. It’s always been a part of marketing and advertising. Arguably, the entire industry is based on alternate facts. But it’s also showing up in the development of new products and services, especially in the digital domain. While Eric Ries never espoused dishonesty in his book, The Lean Startup, the idea of a Minimum Viable Product certainly lends itself to the principle of “faking it until you make it.” Agile development, in its purest sense, is about user feedback and rapid iteration, but humans being humans, it’s tough to resist the temptation to oversell each iteration, treading dangerously close to pitching “vaporware.” Then we hope like hell that the next development cycle will bridge some of the gap between reality and the alternate facts we sold the prospective customer.

I think we have to accept that our world may not place much value on the truth anymore. It’s a slide that started about 100 years ago.

The Seven Habits of Highly Effective People author Stephen Covey reviewed the history of success literature in the US from the 1700s forward. In the first 150 years of America’s history, all the success literature was about building character. Character was defined by words like integrity, kindness, virtue and honor. The most important thing was to be a good person.

Honesty was a fundamental underpinning of the Character Ethic. This coincided with the Enlightenment in Europe. Intellectually, this movement elevated truth above belief. Our modern concept of science gained its legs: “a branch of knowledge or study dealing with a body of facts or truths.” The concepts of honor and honesty were intertwined.

But Covey noticed that things changed after the First World War. Success literature became preoccupied with the concept of personality. It was important to be likeable, extroverted, and influential. The most important thing was to be successful. Somehow, being truthful got lost in the noise generated by the rush to get rich.

Here’s the interesting thing about personality and character. Psychologists have found that your personality is resistant to change. Personality tends to work below the conscious surface and scripts play out without a lot of mindful intervention. You can read all the self-help books in the world and you probably won’t change your personality very much. But character can be worked on. Building character is an exercise in mindfulness. You have to make a conscious choice to be honest.

The other interesting thing about personality and character is how other people see you. We are wired to pick up on other people’s personalities almost instantly. We start picking up the subconscious cues immediately after meeting someone. But it takes a long time to determine a person’s character. You have to go through character-testing experiences before you can know if they’re really a good person. Character cuts to the core, whereas personality is skin deep. But in this world of “labelability” (where we think we know people better than we actually do) we often substitute personality cues for character. If a person is outgoing, confident and fun, we believe them to be trustworthy, moral and honest.

This all adds up to some worrying consequences. If we have built a society where success is worth more than integrity, then our navigational bearings become dependent on context. Behavior becomes contingent on circumstances. Things that should be absolute become relative. Truth becomes what you believe is the most expedient and useful in a given situation.

Welcome to the world of alternate facts.

Watson:2020 – America’s Self-Driving Presidency

Ken Jennings, the second most successful Jeopardy player of all time, has an IQ of 175. That makes him smarter than 99.9998615605% of everybody. If you put him in a city the size of Indianapolis, he’d probably be the smartest person there. In fact, in all of the US, statistics say there are only 443 people who would be smarter than Mr. Jennings.
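For the curious, the percentile and head-count figures above fall straight out of the arithmetic. Here’s a minimal sketch, assuming IQ scores follow a normal distribution with a mean of 100 and a standard deviation of 16 (the older scale; with the more common 15, the numbers shift a bit) and a rough US population of 320 million – both assumptions are mine, not the article’s:

```python
# Back-of-the-envelope check on the IQ 175 claim.
# Assumptions: IQ ~ Normal(mean=100, sd=16); US population ~320 million.
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=16)

below_175 = iq.cdf(175)      # fraction of people with an IQ below 175
above_175 = 1 - below_175    # fraction of people with an IQ above 175

us_population = 320_000_000
print(f"Smarter than {below_175:.10%} of everybody")
print(f"Americans smarter than that: ~{above_175 * us_population:.0f}")
```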

And one machine. Let’s not forget IBM’s Watson whupped Jennings’ ass over two days, piling up $77,147 in winnings to Jennings’ $24,000. It wasn’t even close. Watson won by a factor of more than 3 to 1.

That’s why I think Watson should run for president in 2020. Bear with me.

Donald Trump’s IQ is probably in the 119 range (not 156 as he boasts – but then he also boasted that every woman who ever appeared on the Apprentice flirted with him). Of course we’ll never know. Like his tax returns, any actual evidence of his intelligence is unavailable. But let’s go with 119. That makes him smarter than 88.24% of the population, which isn’t bad, but it also isn’t great. According to Wikipedia, if that IQ estimate were correct, he would be the second-dumbest president in history, slightly ahead of Gerald Ford. Here’s another way to think about it: if you were standing at a moderately busy bus stop, chances are somebody else waiting with you would be smarter than the President-elect of the United States.

Watson won Jeopardy in 2011. Since then, he’s become smarter, becoming an expert in health, law, real estate, finance, weather – even cooking. And when I say expert, I mean Watson knows more about those things than anyone alive.

Donald Trump, on the other hand, has probably learned little in the last 5 years because, apparently, he doesn’t have time to read. But that’s okay, because he reaches the right decisions

“with very little knowledge other than the knowledge I [already] had, plus the words ‘common sense,’ because I have a lot of common sense and I have a lot of business ability.”

In the President-elect’s mind, that also qualifies him to “wing it” with things like international relations, security risks, emerging world events, domestic crises and the other stuff on his daily to-do list. He has also decided that he doesn’t need his regular intelligence briefing, reiterating:

“You know, I’m, like, a smart person. I don’t have to be told the same thing in the same words every single day for the next eight years. Could be eight years — but eight years. I don’t need that.”

That’s right, the future leader of the free world is, “you know, like, a smart person.”

Now, President Watson could also decide to skip the briefing, but that’s because Watson can process 500 gigabytes – the equivalent of a million books – per second. Any analyst or advisor would be hard pressed to keep up.

Let’s talk about technology. Donald Trump doesn’t appear to know how to use a computer. His technical prowess seems to begin and end with midnight use of Twitter. To be fair, Hillary Clinton was also bamboozled by technology, as one errant email server showed all too clearly. But Watson is technology – and if you can follow this description from Wikipedia, apparently pretty impressive technology: “a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system has 2,880 POWER7 processor threads and 16 terabytes of RAM.”

In a presidential debate, or, for that matter, a tweet, Watson can simultaneously retrieve from its onboard 16-terabyte memory, process, formulate and fact-check. Presumably, unlike Trump, Watson could remember whether or not he said global warming was a hoax, how long ISIS has actually been around and whether he in fact had the world’s greatest memory. At the very least, Watson would know how to spell “unprecedented.”

But let’s get down to the real question: whose digit do you want on the button – Trump’s “long and beautiful” fingers or Watson’s bionic thumb? Watson – who can instantly and rationally process terabytes of information to determine optimum alternatives – or Trump – whose philosophy is that “it really doesn’t matter…as long as you’ve got a young and beautiful piece of *ss.”

I know what you’re thinking – this is us finally surrendering to the machines. But at least it’s intelligence – even if it is artificial.

Note: In writing what I thought was satire, I found once again that fact is stranger than fiction. Somebody already thought of this 4 years ago: http://watson2016.com/

The Calcification of a Columnist

First: the Caveat. I’m old and grumpy. That is self-evident. There is no need to remind me.

But even with this truth established, the fact is that I’ve noticed a trend. Increasingly, when I come to write this column, I get depressed. The more I look for a topic to write about, the more my mood spirals downward.

I’ve been writing for Mediapost for over 12 years now. Together, between the Search Insider and Online Spin, that’s close to 600 columns. Many – if not most – of those have been focused on the intersection between technology and human behavior. I’m fascinated by what happens when evolved instincts meet technological disruption.

When I started this gig I was mostly optimistic. I was amazed by the possibilities and – somewhat naively it turns out – believed it would make us better. Unlimited access to information, the ability to connect with anyone – anywhere, new ways to reach beyond the limits of our own DNA; how could this not make humans amazing?

Why, then, do we seem to be going backwards? What I didn’t realize at the time is that technology is like a magnifying glass. Yes, it can make the good of human nature better, but it can also make the bad worse. Not only that, but technology also has a nasty habit of throwing in unintended consequences; little gotchas we never saw coming that have massive moral implications. Disruption can be a good thing, but it can also rip apart in a trice things that took centuries of careful and thoughtful building to put in place. Black Swans have little regard for ethics or morality.

I have always said that technology doesn’t change behaviors. It enables behaviors. When it comes to the things that matter, our innate instincts and beliefs, we are not perceptibly different from our distant ancestors. We are driven by the same impulses. Increasingly, as I look at how we use the outcomes of science and innovation to pursue these objectives, I realize that while technology can enable love, courage and compassion, it can also engender more hate, racism and misogyny. It makes us better while it also makes us worse. We are becoming caricatures of ourselves.

(Figure: Diffusion of Ideas – Everett Rogers, 1962)

Everett Rogers plotted the diffusion of technology through the masses on a bell curve and divided us up into innovators, early adopters, early majority, late majority and laggards. The categorization was defined by our acceptance of innovation. Inevitably, then, there would be a correlation between that acceptance and our sense of optimism about the possibilities of technology. Early adopters would naturally see how technology would enable us to be better. But as diffusion rolls through the curve, we would eventually hit those for whom technology is just there – another entitlement, a factor of our environment, oxygen. There is no special magic or promise here. Technology simply is.

So, to recap, I’m old and grumpy. As I started to write yet another column, I was submerged in a wave of weariness. I have to admit it – I have been emotionally beaten up by the last few years. I’m tired of writing about how technology is making us stupider, lazier and less tolerant when it should be making us great.

But another thing usually comes with age: perspective. This isn’t the first time that humans and disruptive technology have crossed paths. That’s been the story of our existence. Perhaps we should zoom out a bit from our current situation. Let’s set aside for a moment our navel-gazing about fake news, clickbait, viral hatred, connected xenophobia and the erosion of public trust. Let’s look at the bigger picture.

History isn’t sketched in straight lines. History is plotted on a curve. Correction: history is plotted in a series of waves. We are constantly correcting course. Disruption tends to swing a pendulum one way until a gathering of opposing force swings it the other way. It takes us a while to absorb disruption, but we do – eventually.

I suspect if I were writing this in 1785, I’d be disheartened by the industrial blight that was enveloping the world. Then, like now, technology was plotting a new course for us. But in this case, we have the advantage of hindsight to put things in perspective. Consider this one fact: between 1200 and 1600, the life span of a British noble didn’t go up by even a single year. But between 1800 and today, life expectancy for white males in the West doubled, from thirty-eight years to seventy-six. Technology made that possible.

Technology, when viewed on a longer timeline, has also made us better. If you doubt that, read psychologist and author Steven Pinker’s “The Better Angels of Our Nature.” His exhaustively researched and reasoned book leads you to the inescapable conclusion that we are better now than we ever have been. We are less violent, less cruel and more peaceful than at any time in history. Technology also made that possible.

It’s okay to be frustrated by the squandering of the promise of technology. But it’s not okay to just shrug and move on. You are the opposing force that can cause the pendulum to change direction. Because, in the end, it’s not technology that makes us better. It’s how we choose to use that technology.


America, You’re Great (But You Might Be Surprised Why)

The first time I went to Washington D.C. I was struck by the extreme polarity I saw there. That day, the Tea Party was staging a demonstration against Obamacare on the Mall in front of the Capitol building. But this wasn’t the only event happening. The Mall was jammed with gatherings of all types – from all political angles: the right, the ultra-right and left, the rich and poor, the eager and entitled, the sage and stupid. The discourse was loud, passionate and boisterous. It was – in a word – chaos.

That chaotic polarity is, of course, defining the current election. After the second presidential debate, commentator Bob Schieffer said, with a mixture of incredulity and disgust, “How have we come to this?” The presidential debates may have hit a new low in presidential decorum, but if you dig deep enough, there is something great here.

Really.

A recent PR campaign has asked Canadians to tweet why America is great. I’m going to do it in a column instead.

You’re great because you argue loudly, passionately and boisterously. You air out ideologies in a very messy and public way. You amplify the bejeezus out of the good and the bad of human nature and then put them both in a cage match to battle it out in broad daylight. You do this knowing there will be no clear winner of this battle, but you hope and trust that the scales will tip in the right direction. There is no other country I know of that has the guts to do this in quite the way you do.

You personally may not agree with Donald Trump, but there are many who do. He is giving voice to the feelings and frustrations of a sizable chunk of the US population. And as much as I personally don’t like how he’s doing it, the fact is he is doing it. Your country, your constitution and your political system have allowed a man like this to take a shot at the highest office of the land by questioning and attacking many things that many Americans hold to be inviolable. It’s scary as hell, but I have to admire you for letting it play out the way it has and trusting that eventually the process will prevail. And it has for 240 years. Candidates and elections and campaign rhetoric will all eventually disappear – but the process – your process – has always prevailed.

The polarization of the US is nothing new. It defines you. For a quick history lesson, watch Best of Enemies on Netflix, a documentary on the televised debates of William F. Buckley and Gore Vidal clashing over left vs. right during the 1968 Nixon vs. Humphrey vs. Wallace election. What started as an intellectual duel ended with Buckley threatening to smash Vidal’s face in after being called a crypto-Nazi on live TV.

If you look at the US from the outside, you’d swear that the whole mess is going to end up in a fiery wreck. But you’ve been here before. Many times. And somehow, the resiliency of who you are and how you conduct business wins out. You careen towards disaster, but you always seem to swerve at the last minute and emerge stronger than before.

I honestly don’t know how you do it. As a polite, cautious Canadian, I stand simultaneously in awe and abject terror of how you operate. You defy the physics of what should be.

You’re fundamentally, gloriously flawed…but you are unquestionably resilient. You are an amazing example of emergence. You are, in the words of Nassim Nicholas Taleb, Antifragile:

“beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.”

You are discordant, divided and dysfunctional and somehow you’re still the most powerful and successful nation on the planet. I suspect you got there not in spite of your flaws, but because of them.

Perhaps you’re embarrassed by the current election cycle. I understand that. It has been called “unprecedented” many, many times by many, many commentators. And that may be true, but I would say it’s unprecedented only in the vigor and volume of the candidates (or, to be frank, one candidate). The boundaries of what is permissible have been pushed forcefully out. It may not be what certain constituents think is proper, but it is probably an accurate reflection of the diverse moods of the nation and, as such, it needs to be heard. You are a country of many opinions – often diametrically opposed. The US’s unique brand of democracy has had to stretch to its limits to accurately capture the dynamics of a nation in flux.

I don’t know what will happen November 8th. I do know that whatever happens, you will have gone through the fire yet again. You will emerge. You will do what needs to be done. And I suspect that, once again, you’ll be the stronger for it.

What Would a “Time Well Spent” World Look Like?

I’m worried about us. And it’s not just because we seem bent on death by ultra-conservative parochialism and xenophobia. I’m worried because I believe we’re spending all our time doing the wrong things. We’re fiddling while Rome burns.

Technology is our new drug of choice and we’re hooked. We’re fascinated by the trivial. We’re dumping huge gobs of time down the drain playing virtual games, updating social statuses, clicking on clickbait and watching videos of epic wardrobe malfunctions. Humans should be better than this.

It’s okay to spend some time doing nothing. The brain needs some downtime. But something, somewhere has gone seriously wrong. We are now spending the majority of our lives doing useless things. TV used to be the biggest time suck, but in 2015, for the first time ever, the boob tube was overtaken by time spent with mobile apps. According to a survey conducted by Flurry, in the second quarter of 2015 we spent about 2.8 hours per day watching TV. And we spent 3.3 hours on mobile apps. That’s a grand total of 6.1 hours per day or one third of the time we spend awake. Yes, both things can happen at the same time, so there is undoubtedly overlap, but still- that’s a scary-assed statistic!

And it’s getting worse. In a previous Flurry poll conducted in 2013, we spent a total of 298 minutes a day between TV and mobile apps, versus 366 minutes a day in 2015. That’s a 22.8% increase in just two years. We’re spending way more time doing nothing. And those totals don’t even include things like time spent in front of a gaming console. For kids, tack on an average of another 10 hours per week, and you can double that for hard-core male gamers. Our addiction to gaming has even led to death in extreme cases.
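To keep the units straight, here’s the quick arithmetic behind those totals; treat it as a sketch, since the 18-hour waking day is my assumption, not Flurry’s:

```python
# Quick arithmetic on the Flurry figures (daily averages, Q2 2015 vs. 2013).
tv_2015_hours, apps_2015_hours = 2.8, 3.3

total_2015_minutes = (tv_2015_hours + apps_2015_hours) * 60   # 366 minutes/day
total_2013_minutes = 298                                      # per the 2013 poll

increase = (total_2015_minutes - total_2013_minutes) / total_2013_minutes
waking_share = total_2015_minutes / (18 * 60)   # assuming ~18 waking hours/day

print(f"2015: {total_2015_minutes:.0f} min/day, up {increase:.1%} from 2013")
print(f"That's about {waking_share:.0%} of the waking day")
```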

Even in the wildest stretches of imagination, this can’t qualify as “time well spent.”

We’re treading on very dangerous and very thin ice here. And we no longer have history to learn from. It’s the first time we’ve ever encountered this. Technology is now only one small degree of separation from plugging directly into the pleasure center of our brains. And science has shown that a good shot of self-administered dopamine can supersede everything – water, food, sex. True, these experiments were administered on rats – primarily because it’s considered unethical to go too far in replicating the experiments with humans – but are you willing to risk the entire future of mankind on the bet that we’re really that much smarter than rats?

My fear is that technology is becoming a slightly more sophisticated lever we push to get that dopamine rush. And developers know exactly what they’re doing. They are making that lever as addictive as possible. They are pushing us towards the brink of death by technological lobotomization. They’re lulling us into a false sense of security by offering us the distraction of viral videos, infinitely scrolling social notification feeds and mobile game apps. It’s the intellectual equivalent of fast food – quite literally “brain candy.”

Here the hypocrisy of for-profit interest becomes evident. The corporate response typically rests on individual freedom of choice and the consumer’s ability to exercise will power. “We are just giving them what they’re asking for,” touts the stereotypical PR flack. But if you have an entire industry with reams of developers and researchers all aiming to hook you on their addictive product and your only defense is the same faulty neurological defense system that has already fallen victim to fast food, porn, big tobacco, the alcohol industry and the $350 billion illegal drug trade, where would you be placing your bets?

Technology should be our greatest achievement. It should make us better, not turn us into a bunch of lazy screen-addicted louts. And it certainly could be this way. What would it mean if technology helped us spend our time well? This is the hope behind the Time Well Spent Manifesto. Tristan Harris, a design ethicist and product philosopher at Google, is one of the co-directors. Here is an excerpt from the manifesto:

We believe in a new kind of design, that lets us connect without getting sucked in. And disconnect, without missing something important.

And we believe in a new kind of economy that’s built to help us spend time well, where products compete to help us live by our values.

I believe in the Manifesto. I believe we’re being willingly led down a scary and potentially ruinous path. Worst of all, I believe there is nothing we can – or will – do about it. Problems like this are seldom solved by foresight and good intentions. Things only change after we drive off the cliff.

The problem is that most of us never see it coming. And we never see it coming because we’re too busy watching a video of masturbating monkeys on YouTube.

Can Stories Make Us Better?

In writing this column, I often put ideas on the shelf for a while. Sometimes, world events conspire to make one of these shelved ideas suddenly relevant. This happened this past weekend.

The idea that caught my eye some months ago was an article that explored whether robots could learn morality by reading stories. On the face of it, it was mildly intriguing. But early Sunday morning as the heartbreaking news filtered to me from Orlando, a deeper connection emerged.

When we speak of unintended consequences, as we have before, the media amplification of acts of terror is one of them. The staggeringly sad fact is that shocking casualty numbers have their own media value. And that, said one analyst who was commenting on ways to deal with terrorism, is a new reality we have to come to terms with. When we in the media business make stories newsworthy, we assign worth not just for news consumers but also for newsmakers – those troubled individuals who have the motivation and the means to blow apart the daily news cycle.

This same analyst, when asked how we deal with terrorism, made the point that you can’t prevent lone acts of terrorism. The only answer is to use that same network of cultural connections we use to amplify catastrophic events to create an environment that dampens rather than intensifies violent impulses. We in the media and advertising industries have to use our considerable skills in setting cultural contexts to create an environment that reduces the odds of a violent outcome. And sadly, this is a game of odds. There are no absolute answers here – there is just a statistical lowering of the curve. Sometimes, despite your best efforts, the unimaginable still happens.

But how do we use the tools at our disposal to amplify morality? Here, perhaps, the story I shelved some months ago can provide some clues.

In the study from Georgia Tech, Mark Riedl and Brent Harrison used stories as models of acceptable morality. For most of human history, popular culture included at least an element of moral code. We encoded the values we held most dear into our stories. It provided a base for acceptable behavior, either through positive reinforcement of commonly understood virtues (prudence, justice, temperance, courage, faith, hope and charity) or warnings about universal vices (lust, gluttony, greed, sloth, wrath, envy and pride). Sometimes these stories had religious foundations, sometimes they were secular morality fables but they all served the same purpose. They taught us what was acceptable behavior.

Stories were not originally intended to entertain. They were created to pass along knowledge and cultural wisdom. Entertainment came later, when we discovered that the more entertaining the story, the more effective it was at its primary purpose: education. And this is how the researchers used stories. Robots can’t be entertained, but they can be educated.

At some point in the last century, we focused on the entertainment value of stories over education and, in doing so, rotated our moral compass 180 degrees. If you look at what is most likely to titillate, sin almost always trumps sainthood. Review that list of virtues and vices and you’ll see that the stories of our current popular culture focus on vice – that list could be the programming handbook for any Hollywood producer. I don’t intend this as a sermon – I enjoy Game of Thrones as much as the next person. I simply state it as a fact. Our popular culture – and the amplification that comes from it – is focused almost exclusively on the worst aspects of human nature. If robots were receiving their behavioral instruction through these stories, they would be programmed to be psychopathic moral degenerates.

Most of us can absorb this continual stream of anti-social programming and not be affected by it. We still know what is right and what is wrong. But in a world where it’s the “black swan” outliers that grab the news headlines, we have to think about the consequences that reach beyond the mainstream. When we abandon the moral purpose of stories and focus on their entertainment aspect, are we also abandoning a commonly understood value landscape?

If you’re looking for absolute answers here, you won’t find them. That’s just not the world we live in. And am I naïve when I say the stories we choose to tell may have an influence on isolated violent events such as happened in Orlando? Perhaps. Despite all our best intentions, Omar Mateen might still have gone horribly offside.

But all things and all people are, to some extent, products of their environment. And because we in media and advertising are storytellers, we set that cultural environment. That’s our job. Because of this, I believe we have a moral obligation. We have to start paying more attention to the stories we tell.


The World in Bite Sized Pieces

It’s hard to see the big picture when your perspective is limited to 160 characters.

Or when we keep getting distracted from said big picture by that other picture that always seems to be lurking over there on the right side of our screen – the one of Kate Upton tilting forward wearing a wet bikini.

Two things are at work here obscuring our view of the whole: our preoccupation with the attention economy and a frantic scrambling for a new revenue model. The net result is that we’re being spoon-fed stuff that’s way too easy to digest. We’re being pandered to in the worst possible way. The world is becoming a staircase of really small steps, each of which has a bright shiny object on it urging us to scale just a little bit higher. And we, like idiots, stumble our way up the stairs.

This cannot be good for us. We become better people when we have to chew through some gristle. Or when we’re forced to eat our broccoli. The world should not be the cognitive equivalent of Cap’n Crunch cereal.

It’s here where human nature gets the best of us. We’re wired to prefer scintillation to substance. Our intellectual laziness and willingness to follow whatever herd seems to be heading in our direction have conspired to create a world where Donald Trump can be a viable candidate for president of the United States – where our attention span is measured in fractions of a second – where the content we consume is dictated by a popularity contest.

Our news is increasingly coming to us in smaller and smaller chunks. The exploding complexity of our world, which begs to be understood in depth, is increasingly parceled out to us in pre-digested little tidbits, pushed to our smartphone. We spend scant seconds scanning headlines to stay “up to date.” And an algorithm that is trying to understand where our interests lie usually determines the stories we see.

This algorithmic curation creates both “Filter” and “Agreement” Bubbles. The homogeneity of our social network leads to a homogeneity of content. But if we spend our entire time with others who think like us, we end up with an intellectually polarized society in which the factions that sit at opposite ends of any given spectrum are openly hostile to each other. The gaps between our respective ideas of what is right are simply too big, and no one has any interest in building a bridge across them. We’re losing our ideological interface areas, those opportunities to encounter ideas that force us to rethink and reframe, broadening our worldview in the process. We sacrifice empathy and we look for news that “sounds right” to us, no matter what “right” might be.

This is a crying shame, because there is more thought-provoking, intellectually rich content being produced than ever before. But there is also more sugar-coated crap whose sole purpose is to get us to click.

I’ve often talked about the elimination of friction. Usually, I think this is a good thing. Bob Garfield, in a column a few months ago, called for a whoop-ass can of WD-40 to remove all transactional friction. But if we make things too easy to access, will we also remove those cognitive barriers that force us to slow down and think, giving our rationality a chance to catch up with impulse? And it’s not just on the consumption side where a little bit of friction might bring benefits. The upside of production friction was that it slowed down streams of content just long enough to introduce an editorial voice. Someone somewhere had to give some thought as to what might actually be good for us.

In other words, it was someone’s job to make sure we ate our vegetables.

The “Get It” Gap

Netflix and Chill…

It sounds innocent enough – a night of curling up on the couch in a Snuggie with no plan other than some Orange is the New Black and Häagen-Dazs binging. And that’s how it started out. Innocent. Then those damned kids got hold of it, and its present meaning ended up in a place quite removed from its origin.

Unfortunately, my wife didn’t know that when she used the phrase in a Facebook post for her gift basket company. That is, until one of our daughters walked in the door and, before her bags hit the floor, yelled from the entryway, “Mom, you have to change your post – right now!”

“What post?”

“The Netflix and Chill one…”

“Why?”

“Unless your basket contains lubricants and condoms, I don’t think your post means what you think it means.”

But how is a middle-aged parent to know? The subversion of this particular phrase has just happened in the last year. It takes me the better part of a year to remember that it’s no longer 2015 when I sign a check. There’s no way a middle-aged brain could possibly keep up with the ongoing bastardization of the English language. The threshold for “getting it” keeps getting higher, driven by the acceleration of memes through social media.

Parents were never intended to “get it.” That’s the whole point. Kids want to speak their own language and have their own cultural reference points. We were no different when we were kids. Neither were our parents.

And kids always “get it.” It’s like a rite of passage. Memes propagate through social networks and when you’re 18, your social network is the most important thing in your life. New memes spread like wildfire and part of belonging to this culture depends on “getting it.” The faster things spread, the more likely it is that you can increase the “Get It” gap between you and your parents. It’s a control thing. If the parents call all the shots about everything in your life, at least you can have this one thing to call your own.

As you start to gain control, the Gap becomes less important. Our daughters are becoming adults, so they now act as “Get It” translators and, in cases like the one above, Urban Slang enforcement officers. When we transgress, they attempt to bridge the gap.

As you get older, the “stuff” of life gets in the way of continuing to “get it.” Buying a house, getting a job and changing diapers leaves little time left over to Snapchat about Scumbag Steve or tweet “Hell yea finna get crunk!” to your Hommie gee funk-a-nator on a Friday night.

The danger comes when parents unilaterally try to cross over the gap and attempt to tap into the zeitgeist of urban slang. This is always doomed to failure. There are no exceptions. It’s like tiptoeing through a minefield with snowshoes on.

At the very least, run it past your kids before you post anything. Better yet – look it up in Urban Dictionary. Kids can’t be trusted.

“Hotchkiss – Ouuuttt!” (Mic drop here)