Damn You Technology…

Quit batting your seductive visual sensors at me. You know I can’t resist. But I often wonder what I’m giving up when I give in to your temptations. That’s why I was interested in reading Tom Goodwin’s take on the major theme at SXSW – the Battle for Humanity. He broke this down into three subthemes, all of which I agree with and have written about in the past. They were:

Data Trading – We’re creating a market for data. But when you’re the one who generated that data, who should own it?

Shift to No Screens – An increasing number of connected devices will change our concept of what it means to be online.

Content Tunnel Vision – As the content we see is increasingly filtered based on our preferences, what does that do to our perception of what is real?

But while we’re talking about our imminent surrender to the machines, I feel there are some other themes that also merit some discussion. Let’s limit it to two today.

A New Definition of Connection and Community

Robert Sapolsky

A few weeks ago I read a fascinating article by neuroendocrinologist and author Robert Sapolsky. In it, he posits that understanding Capgras Syndrome is the key to understanding the Facebook society. Capgras, first identified by French psychiatrist Joseph Capgras, is a disorder in which we can recognize a person’s face but can’t retrieve any feeling of familiarity. Those afflicted can identify the face of a loved one but swear it’s actually an identical imposter. Recognition of a person and retrieval of the emotions attached to that person are handled by two different parts of the brain. When the connection between them is broken, Capgras Syndrome is the result.

This bifurcation of how we identify people is interesting. There is the yin and yang of cognition and emotion: the fusiform gyrus cognitively “parses” the face, and then the brain retrieves the emotions and memories associated with it. In a normally functioning brain, this seems seamless and connected, but because two different regions (or, in the case of emotion, a network of regions) are involved, they can neurologically evolve independently of each other. And in the age of Facebook, that could mean a significant shift in the way we recognize connections and create “cognitive communities.” Sapolsky elaborates:

Through history, Capgras syndrome has been a cultural mirror of a dissociative mind, where thoughts of recognition and feelings of intimacy have been sundered. It is still that mirror. Today we think that what is false and artificial in the world around us is substantive and meaningful. It’s not that loved ones and friends are mistaken for simulations, but that simulations are mistaken for them.

As I said in a column a few months back, we are substituting surface cues for familiarity. We are rushing into intimacy without all the messy, time-consuming process of understanding and shared experience that generally accompanies it.

Brains do love to take shortcuts. They’re not big on heavy lifting. Here’s another example of that…

Free Will is Replaced with an Algorithm

Yuval Harari

In a conversation with historian Yuval Harari, author of the bestseller Sapiens, Derek Thompson from The Atlantic explored “The Post-Human World.” One of the topics they discussed was the End of Individualism.

Humans (or, at least, most humans) have believed our decisions come from a mystical soul – a transcendental something that lives above our base biology and is in control of our will. Wrapped up in this is the concept of ourselves as individuals and our importance in the world as free-thinking agents.

In the past few decades, there has been a growing realization that our notion of “free will” is just the result of a cascade of biochemical processes. There is nothing magical here; there is just a chain of synaptic switches being thrown. And that being the case – if a computer can process things faster than our brains, should we simply relegate our thinking to a machine?

In many ways, this is already happening. We trust Google Maps or our GPS device more than we trust our ability to find our own way. We trust Google Search more than our own memory. We’re on the verge of trusting our wearable fitness tracking devices more than our own body’s feedback. And in all these cases, our trust in tech is justified. These things are right more often than we are. But when it comes to humans vs. machines, they represent a slippery slope that we’re already well down. Harari speculates on what might be at the bottom:

What really happens is that the self disintegrates. It’s not that you understand your true self better, but you come to realize there is no true self. There is just a complicated connection of biochemical connections, without a core. There is no authentic voice that lives inside you.

When I lie awake worrying about technology, these are the types of things I think about. The big question is: is humanity an outmoded model? The fact is that we evolved to be successful in a certain environment. But here’s the irony in that: we were so successful that we changed that environment into one where the tools we’ve created, not their creators, are the most successful adaptation. We may have made ourselves obsolete. And that’s why really smart humans like Bill Gates, Elon Musk and Stephen Hawking are so worried about artificial intelligence.

“It would take off on its own, and re-design itself at an ever increasing rate,” said Hawking in a recent interview with the BBC. “Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”

Worried about a machine taking your job? That may be the least of your worries.


Too Many Fish in the Sea: The Search for Brand Love

I still see – in a number of MediaPost articles and in other places – a lot of talk about “brand-love.” So let’s talk about that.

My grandfather Jack, who farmed on the Canadian Prairies for most of his life, loved John Deere tractors.

And I mean L-O-V-E-D. Deep love. A love that lasted 50-some years, and never – not once – did he ever consider a rival for his affection. You could have given him a brand new shiny red Massey Ferguson and it would have sat untouched behind the barn. The man bled green and yellow. He wore a John Deere ball cap everywhere. He had a grime-encrusted one for everyday wear and a clean one for formal occasions – things like the christenings of new grandchildren and 50th wedding anniversaries. He wasn’t buried with one, but if he’d had his way, he would have been.

My grandpa Jack loved John Deere tractors because he loved one tractor – his tractor. And there was absolutely no logic to this love.

I’ve heard stories of Jack’s rocky road to farm-equipment romance. His tractor was a mythically cantankerous beast. It often had to be patiently cajoled into turning over. It was literally held together with twine and baling wire. By the end of its life, little of it had originally issued from the John Deere factory floor in Welland, Ontario. Most of it was vintage jury-rigged Jack.

But Jack didn’t love this tractor in spite of all that. He loved it because of all that. Were there better tractors than the ones John Deere made? Perhaps. Were there better tractors than this particular John Deere? Guaranteed. But that wasn’t the point. Over the years, a lot of Jack went into that tractor. It got to the point where he was the only one patient enough to get it to run. But there was also a lot of that tractor in Jack. It made him a more patient man, more resourceful and – much to my grandmother’s never-ending frustration – much more stubborn.

This is the stuff that love is made of. The tough stuff. The maddening stuff. The stuff that ain’t so pretty. A lot of times, love happens because you don’t have an alternative. I suspect love – true love – may be inversely correlated with choice. Jack couldn’t afford a new tractor. And by the time he could, he was too deeply in love to consider it.

This may be the dilemma for brands looking for love in today’s world. We may be attracted to a brand, we may even become infatuated with it, but will we fall in true love – what I call “Jack-love”?

Let me lay out some more evidence of this Love/Choice paradox.

If you believe the claims of online dating sites like Match.com and eHarmony, your odds of ending up in a happy relationship have never been better than when you put yourself in the hands of their matching algorithms. This just makes sense: if you increase the prospects going in the front end and are much smarter about filtering your options, you should come out the winner in the end. But according to an article from the Association for Psychological Science, this claim doesn’t really stand up when subjected to academic rigor: “Regarding matching, no compelling evidence supports matching sites’ claims that mathematical algorithms work – that they foster romantic outcomes that are superior to those fostered by other means of pairing partners.”

A study by Dr. Aditi Paul found that couples who meet through online dating sites are less likely to enter marriage than those who meet through offline channels and, if they do wed, are more likely to split up down the road. Another study (D’Angelo and Toma) showed that the greater the number of options at the beginning, the more likely it was that online daters would question and probably reverse their choice.

What dating sites have done is turn looking for love into an exercise in foraging. And the rule of thumb in foraging is: the more we believe there are options that may be better, the less time we are willing to invest in the current choice. It may seem sacrilegious to apply something as mundane as foraging theory to romance, but the evidence is starting to mount. And if the search for a soul mate has become an exercise in efficient foraging, it’s not a great leap to conclude that everything else that can be determined by a search-and-matching algorithm has suffered the same fate. This may not be a bad thing, but I’m placing a fairly large bet that we’re looking at a very different cognitive processing path here. The brain simply wouldn’t use the same mechanisms or strategies to juggle a large number of promising alternatives as it would to fall deeply in love, like Jack and his John Deere (or my grandmother, for that matter).
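
A quick aside for the formally inclined: the foraging rule of thumb above has a standard mathematical form – Charnov’s marginal value theorem – which the column doesn’t name but which captures exactly this logic. Here is a minimal sketch, assuming diminishing returns within a “patch” (the current relationship or choice):

$$
g'(t^*) = \frac{g(t^*)}{\tau + t^*}
$$

where $g(t)$ is the cumulative benefit of staying with the current choice for time $t$ (concave, so each added minute yields less) and $\tau$ is the expected search time to reach the next option. The optimal commitment $t^*$ is the point where the marginal benefit of staying drops to the overall average rate of the environment. As $\tau$ shrinks – as better alternatives feel only a swipe away – $t^*$ shrinks with it: exactly the “less time invested in the current choice” the column describes.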

The point is this: infatuation happens quickly and can fade just as quickly. Love develops over time, and it requires shared experiences. That’s something that’s pretty tough for an algorithm to predict. As the authors of the APS article said, “these sites are in a poor position to know how the two partners will grow and mature over time, what life circumstances they will confront and coping responses they will exhibit in the future, and how the dynamics of their interaction will ultimately promote or undermine romantic attraction and long-term relationship well-being.”

I’ve always felt uncomfortable with the phrase “brand-love,” but I think it did provide a convenient and mostly accurate label for some brand relationships. I’m not so sure this is still true today. As I said in a previous column, brands are still aiming to engender love by latching on to our emotions, but I suspect they may just be sparking infatuation.

The Winona Ryder Effect

I was in the U.S. last week. It was my first visit in the Trump era.

It was weird. I was in California, so the full effect was muted, but I watched my tongue when meeting strangers. And that’s saying something for a Canadian, for whom watching your tongue is a national pastime. (As an aside, my US host, Lance, told me about a recent post on a satire site: “Concerned, But Not Wanting To Offend, Canada Quietly Plants Privacy Hedge Along Entire U.S. Border.” That’s so us.) There was a feeling I had not felt before. As someone who has spent a lot of time in the US over the past decade or two, I felt a little less comfortable. There was a disconnect that was new to me.

Little did I know (I’ve had my mobile CNN alerts turned off since January 20th because I was slipping into depression), but just after I whisked through Sea-Tac airport with all the privilege that being a white male affords, Washington Governor Jay Inslee was holding a press conference denouncing the new Trump Muslim ban in no uncertain terms. On the other side of the TSA security gates, a thousand protesters were gathering. I didn’t learn about any of this until I got home.

Like I said, it was weird.

And then there were the SAG awards on Sunday night. What the hell was the deal with Winona Ryder?

When the Stranger Things cast got on stage to accept their ensemble acting award, spokesperson David Harbour unleashed a fiery anti-Trump speech. But despite his passion and volume, it was Winona Ryder, standing beside him, who lit up the share button. And she didn’t say a word. Instead, her face contorted through twenty-some different expressions in under two minutes. She became, as one Twitter post put it, a “human gif machine.”

Now, by her own admission, Winona is fragile. She has battled depression and anxiety for much of her professional life. Maybe she was having a minor breakdown in front of the world. Or maybe this was a premeditated and choreographed social media masterstroke. Either way, it says something about us.

The Stranger Things cast hadn’t even left the stage before the Twitterverse started spreading the Ryder meme. If you look at Google Trends, there was a huge spike in searches for Winona Ryder starting right around 6:15 pm (PST) Sunday night. It peaked at 6:48 pm, with a volume about 20 times that of queries for Ms. Ryder before the broadcast began.

It was David Harbour who delivered the speech Ryder was reacting to. The words were his, and while there was also a spike in searches for him coinciding with the speech, he didn’t come close to matching the viral popularity of the Ryder meme. At its peak, there were five searches for “Winona Ryder” for every search for “David Harbour.”

Ryder’s mugging was – premeditated or not – extremely meme-worthy. It was visual, it was over the top and – most importantly – it was a blank canvas we could project our own views onto. Winona didn’t give us any words, so we could fill in our own. We could use it to put a somewhat bizarre exclamation point on our own views, expressed through social media.

As I was watching this happen, I knew this was going to go viral. Maybe it’s because it takes something pretty surreal to make a dent in an increasingly surreal world that leaves us numb. When the noise that surrounds us seems increasingly unfathomable, we need something like this to prick our consciousness and make us sit up and take notice. Then we hunker down again before we’re pummelled with the next bit of reality.

Let me give you one example.

As I was watching the SAG awards Sunday night, I was unaware that gunmen had opened fire on Muslim worshippers praying in a mosque in Quebec City. I only found out when I flicked through the channels after the broadcast ended. Today, as I write this, I know that six people are dead because someone hated Muslims that much. Canada, too, has extreme racism.

I find it hard to think about that. It’s easier to think about Winona Ryder’s funny faces. That’s not very noble, I know, but sometimes you have to go with what you’re actually able to wrap your mind around.

The Vanishing Value of the Truth

You know, the very powerful and the very stupid have one thing in common. They don’t alter their views to fit the facts. They alter the facts to fit the views.

Doctor Who, 1977

We might be in a period of ethical crisis. Or not. It’s tough to say. It really depends on what you believe. And that, in a nutshell, is the whole problem.

Take this past weekend, for example. Brand-new White House Press Secretary Sean Spicer, in his very first address, lied about the size of the inauguration crowd. Afterwards, a very cantankerous Kellyanne Conway defended the lying when confronted by Chuck Todd on Meet the Press. She said they weren’t lies… they were “alternative facts.”


So, what exactly is an alternative fact? It’s not a fact at all, but a narrative intended to be believed by a segment of the population, presumably to gain something from them.

To use a popular turn of phrase, it’s “Faking It Til You Make It!”

And there you have the mantra of our society. We’re rewarding alternative facts on the theory that the end justifies the means. If we throw out a blizzard of alternative facts that resonate with our audience’s beliefs, we’ll get what we want.

The Fake It Til You Make It syndrome is popping up everywhere. It has always been a part of marketing and advertising; arguably, the entire industry is based on alternative facts. But it’s also showing up in the development of new products and services, especially in the digital domain. While Eric Ries never espoused dishonesty in his book The Lean Startup, the idea of a Minimum Viable Product certainly lends itself to the principle of faking it until you make it. Agile development, in its purest sense, is about user feedback and rapid iteration, but humans being humans, it’s tough to resist the temptation to oversell each iteration, treading dangerously close to pitching “vaporware.” Then we hope like hell that the next development cycle will bridge some of the gap between reality and the alternative facts we sold the prospective customer.

I think we have to accept that our world may not place much value on the truth anymore. It’s a slide that started about 100 years ago.

The Seven Habits of Highly Effective People author Stephen Covey reviewed the history of success literature in the US from the 1700s forward. For the first 150 years of America’s history, all the success literature was about building character. Character was defined by words like integrity, kindness, virtue and honor. The most important thing was to be a good person.

Honesty was a fundamental underpinning of this Character Ethic, which coincided with the Enlightenment in Europe. Intellectually, that movement elevated truth above belief. Our modern concept of science gained its legs: “a branch of knowledge or study dealing with a body of facts or truths.” The concepts of honor and honesty were intertwined.

But Covey noticed that things changed after the First World War. Success literature became preoccupied with the concept of personality. It was important to be likeable, extroverted, and influential. The most important thing was to be successful. Somehow, being truthful got lost in the noise generated by the rush to get rich.

Here’s the interesting thing about personality and character. Psychologists have found that your personality is resistant to change. Personality tends to work below the conscious surface and scripts play out without a lot of mindful intervention. You can read all the self-help books in the world and you probably won’t change your personality very much. But character can be worked on. Building character is an exercise in mindfulness. You have to make a conscious choice to be honest.

The other interesting thing about personality and character is how other people see you. We are wired to pick up on other people’s personalities almost instantly. We start picking up the subconscious cues immediately after meeting someone. But it takes a long time to determine a person’s character. You have to go through character-testing experiences before you can know if they’re really a good person. Character cuts to the core, whereas personality is skin deep. But in this world of “labelability” (where we think we know people better than we actually do), we often substitute personality cues for character. If a person is outgoing, confident and fun, we believe them to be trustworthy, moral and honest.

This all adds up to some worrying consequences. If we have built a society where success is worth more than integrity, then our navigational bearings become dependent on context. Behavior becomes contingent on circumstances. Things that should be absolute become relative. Truth becomes what you believe is the most expedient and useful in a given situation.

Welcome to the world of alternative facts.

Branding in the Post-Truth Age

If 2016 was nothing else, it was a watershed year for the concept of branding. In the previous 12 months, we saw a decoupling of the two elements we have always believed make up brands. As fellow Spinner Cory Treffiletti said recently:

“You have to satisfy the emotional quotient as well as the logical quotient for your brand. If not, then your brand isn’t balanced, and is likely to fall flat on its face.”

But another MediaPost article highlighted an interesting trend in branding:

“Brands will strive to be ‘meticulously un-designed’ in 2017, according to WPP brand agency Brand Union.”

This, I believe, speaks to where brands are going. And depending on which side of the agency desk you happen to be on, this could either be good news or downright disheartening.

Let’s start with the logical side of branding. In their book Absolute Value, Itamar Simonson and Emanuel Rosen sounded the death knell for brands as a proxy for consumer information. Their premise, which I agree with, is that in a market that is increasingly moving towards perfect information, brands have lost their position of trust. We would rather rely on information that comes from non-marketing sources.

But brands have been aspiring to transcend their logical side for at least five decades now. This is the emotional side of branding that Treffiletti speaks of. And here I have to disagree with Simonson and Rosen: this form of branding appears to be very much alive and well, thank you. In fact, in the past year, it has upped the game considerably.

Brands, at their most potent, embed themselves in our belief systems. It is here, close to our emotional hearts, that the Promised Land for brands lies. Read Montague’s famous Coke neuro-imaging experiment showed that for Coke drinkers, the brand had become part of who they are. Research I was involved in showed that favored brands get a positive response in a split second, far faster than the rational brain can act. We are hardwired to believe in brands, and the more loved the brand, the stronger the reaction. So let’s look at beliefs for a moment.

Not all beliefs are created equal. Our beliefs have an emotional valence – some are defended more strongly than others. There is a hierarchy of belief defense. At the highest level are our core beliefs: how we feel about things like politics and religion. Brands are trying to intrude on this core belief space. There has been no better example of this than the brand of Donald Trump.

Beliefs are funny things. From an evolutionary perspective, they’re valuable. They’re mental shortcuts that guide our actions without requiring us to think. They are a type of emotional autopilot. But they can also be quite dangerous for the same reason. We defend our beliefs against skeptics – and we defend our core beliefs most vigorously. Reason has nothing to do with it. It is this type of defense system that brands would love to build around themselves.

We like to believe our beliefs are unique to us – but in actual fact, beliefs also materialize out of our social connections. If enough people in our social network believe something is true, so will we. We will even create false memories and narratives to support the fiction. The evolutionary logic is quite simple. Tribes have better odds for survival than individuals, and our tribe will be more successful if we all think the same way about certain things. Beliefs create tribal cohesion.

So, the question is – how does a brand become a belief? It’s this question that possibly points the way in which brands will evolve in the Post-Truth future.

Up to now, brands have always been unilaterally “manufactured” – carefully crafted by agencies as a distillation of marketing messages and delivered to an audience. But now brands are multilaterally “emergent” – formed through a network of socially connected interactions. All brands are now trying to ride the amplified waves of social media. This means they have to be “meme-worthy” – which really means they have to be both note- and share-worthy. To become more amplifiable, brands will become more “jagged,” trying to act as catalysts for going viral. Branding messages will naturally evolve toward outlier extremes in their quest to be noticed and interacted with. Brands are aspiring to become “brain-worms” – wait, that’s not quite right – brands are becoming “belief-worms,” slipping past the rational brain if at all possible to lodge themselves directly in our belief systems. Brands want to be emotional shorthand notations that resonate with our most deeply held core beliefs. We have constructed narratives of who we are, and brands that fit those narratives are adopted and amplified.

It’s this version of branding that seems to be where we’re headed – a socially infectious virus that creates its own version of the truth and builds a bulwark of belief to defend itself. Increasingly, branding has nothing to do with rational thought or a quest for absolute value.

The Magic of the Internet Through My Dad’s Eyes

“Would you rather lose a limb or never be able to access the Internet?” My daughter looked at me, waiting for my answer.

“Well?”

We were playing the game “Would You Rather” during a lull in the Christmas festivities. The whole point of the game is to pose two random and usually bizarre alternatives to choose from. Once you do, you see how others have answered. It’s a hard game to take seriously.

Except for this question. This one hit me like a hammer blow.

“I have to say I’d rather lose a limb.”

Wow. I would rather lose an arm or a leg than lose something I didn’t even know existed 20 years ago. That’s a pretty sobering thought. I am so dependent on this technical artifact that I value it more than parts of my own body.

During the same holiday season, my stepdad came to visit. He has two cherished possessions that are always with him. One is a pocketknife his father gave him. The other is an iPhone 3 that my sister gave him when she upgraded. Dad doesn’t do much on his phone, but what he does do is critically important to him. He texts his kids and he checks the weather. If you grew up on a farm on the Canadian prairies during the 1930s, you literally lived and died according to the weather. So, for Dad, it’s magic of the highest sort to be able to know the temperature in the places where his favorite people live. We kids have added all our home locations to his weather app, along with his sister-in-law’s. Dad checks the weather in Edmonton (Alberta), Calgary (Alberta), Kelowna (BC), Orillia (Ontario) and his hometown of Sundre (Alberta) constantly. It’s his way of keeping tabs on us when he can’t be with us.

I wonder what Dad would say if I asked him to choose between his iPhone and his right arm. I suspect he’d have to think about it. I do know the first thing I have to do when he comes to our place is set him up on our home wifi network.

It’s easy to talk about how Millennials or Gen-Xers are dependent on technology. But for me, it really strikes home when I watch people of my parents’ generation hold on to some aspect of technology for dear life because it enables them to do something fundamentally important to them. They understand something we don’t. They understand what Arthur C. Clarke meant when he said:

“Any sufficiently advanced technology is indistinguishable from magic.”

To understand this, look for a moment through my Dad’s eyes when he was a child. He rode a horse to school – a tiny one-room building heated with a wood stove, whose library consisted of two bookshelves on the back wall. The world he was aware of was bounded by a circle whose radius was defined by how far you could drive a wagon in a single day. That world consisted of several farms, the Eagle Hill Co-op store, the tiny town of Sundre, his school and the post office. The last was particularly important, because that’s where the packages you ordered from the Eaton’s catalogue (the Canadian equivalent of Sears Roebuck) would arrive.

It’s to this post office that my stepdad dragged his sleigh about 75 years ago. He didn’t know it at the time, but he was picking up his Christmas present. His mother, whose own paternal grandfather was a contemporary and friend of Charles Darwin, had saved milk money for several months to purchase a three-volume encyclopaedia for the home. Nobody else they knew had an encyclopaedia. Books were rare enough. But for Isobel (Buckman) Leckie, knowledge was an investment worth making. Those three books became the gift of a much bigger world for my Dad.

It’s easy to make fun of seniors for their simultaneous amazement at and bewilderment by technology. We chuckle when Dad does his third “weather round-up” of the day. We get frustrated when he can’t seem to understand how wifi works. But let’s put this in the context of the change he has seen in his life on this earth. That’s not just an obsolete iPhone 3 he holds in his hand. It’s something for which the adjective “magical” seems apt.

Perhaps it’s even magic you’d pay an arm and a leg for.

The Calcification of a Columnist

First: the Caveat. I’m old and grumpy. That is self-evident. There is no need to remind me.

But even with this truth established, the fact is that I’ve noticed a trend. Increasingly, when I come to write this column, I get depressed. The more I look for a topic to write about, the more my mood spirals downward.

I’ve been writing for MediaPost for over 12 years now. Between the Search Insider and Online Spin, that’s close to 600 columns. Many – if not most – of them have focused on the intersection of technology and human behavior. I’m fascinated by what happens when evolved instincts meet technological disruption.

When I started this gig I was mostly optimistic. I was amazed by the possibilities and – somewhat naively, it turns out – believed it would make us better. Unlimited access to information, the ability to connect with anyone, anywhere, new ways to reach beyond the limits of our own DNA; how could this not make humans amazing?

Why, then, do we seem to be going backwards? What I didn’t realize at the time is that technology is like a magnifying glass. Yes, it can make the good of human nature better, but it can also make the bad worse. Not only that, but technology also has a nasty habit of throwing in unintended consequences; little gotchas we never saw coming that have massive moral implications. Disruption can be a good thing, but it can also rip apart in a trice things that took centuries of careful and thoughtful building to put in place. Black Swans have little regard for ethics or morality.

I have always said that technology doesn’t change behaviors; it enables behaviors. When it comes to the things that matter – our innate instincts and beliefs – we are not perceptibly different from our distant ancestors. We are driven by the same drives. Increasingly, as I look at how we use the outcomes of science and innovation to pursue those objectives, I realize that while technology can enable love, courage and compassion, it can also engender more hate, racism and misogyny. It makes us better while it also makes us worse. We are becoming caricatures of ourselves.

Everett Rogers, 1962

Everett Rogers plotted the diffusion of technology through the masses on a bell curve, dividing us into innovators, early adopters, early majority, late majority and laggards. The categorization was defined by our acceptance of innovation. Inevitably, then, there would be a correlation between that acceptance and our sense of optimism about the possibilities of technology. Early adopters would naturally see how technology could enable us to be better. But as diffusion rolls through the curve, we eventually hit those for whom technology is just there – another entitlement, a factor of our environment, oxygen. There is no special magic or promise here. Technology simply is.

So, to recap, I’m old and grumpy. As I started to write yet another column, I was submerged in a wave of weariness. I have to admit I have been emotionally beaten up by the last few years. I’m tired of writing about how technology is making us stupider, lazier and less tolerant when it should be making us great.

But another thing usually comes with age: perspective. This isn’t the first time humans and disruptive technology have crossed paths. That’s been the story of our existence. Perhaps we should zoom out a bit from our current situation. Let’s set aside for a moment our navel-gazing about fake news, clickbait, viral hatred, connected xenophobia and the erosion of public trust. Let’s look at the bigger picture.

History isn’t sketched in straight lines. History is plotted on a curve. Correction: history is plotted in a series of waves. We are constantly correcting course. Disruption tends to swing the pendulum one way until a gathering of opposing force swings it the other. It takes us a while to absorb disruption, but we do – eventually.

I suspect if I were writing this in 1785, I’d be disheartened by the industrial blight that was enveloping the world. Then, like now, technology was plotting a new course for us. But in this case, we have the advantage of hindsight to put things in perspective. Consider this one fact: between 1200 and 1600, the life span of a British noble didn’t go up by even a single year. But between 1800 and today, life expectancy for white males in the West doubled, from thirty-eight years to seventy-six. Technology made that possible.

Technology, when viewed on a longer timeline, has also made us better. If you doubt that, read psychologist and author Steven Pinker’s The Better Angels of Our Nature. His exhaustively researched and reasoned book leads you to the inescapable conclusion that we are better now than we have ever been. We are less violent, less cruel and more peaceful than at any time in history. Technology also made that possible.

It’s okay to be frustrated by the squandering of the promise of technology. But it’s not okay to just shrug and move on. You are the opposing force that can cause the pendulum to change direction. Because, in the end, it’s not technology that makes us better. It’s how we choose to use that technology.