Personal Endeavour in the Age of Instant Judgement

No one likes to be judged — not even gymnasts and figure skaters. But at least in those sports, the judges supposedly know what it is they’re judging. So, in the spirit of instant feedback, let me rephrase: No one likes to be judged by a peanut gallery*. Or, to use a more era-appropriate moniker, by a troll’s chorus.

Because of this, I feel sorry for David Benioff and D.B. Weiss, the showrunners of “Game of Thrones.” Those poor bastards couldn’t be any more doomed if they had been invited to a wedding of the red variety.

At least they were aware of their fate. In an interview with Entertainment Weekly, they disclosed their plans for the airing of the final episode. “We’ll be in an undisclosed location, turning off our phones and opening various bottles,” Weiss admitted. “At some point, if and when it’s safe to come out again, somebody like [HBO’s ‘GOT’ publicist] will give us a breakdown of what was out there without us having to actually experience it.” Added Benioff: “I plan to be very drunk and very far from the internet.”

Like it or not, we now live in an era of instant judgement, from everyone. It’s the evil twin of social virality. It means we have to grow thicker skins than your average full-grown dragon**. And because I’m obsessively fixated on unintended consequences, this got me to thinking. How might all this judgement impact our motivation to do stuff?

First of all, let’s look at the good that comes from this social media froth kicked up by fervent fans. There is a sense of ownership and emotional investment in shows like “Game of Thrones” that’s reached a pitch never seen before — and I truly believe we’re getting better TV because of it.

If you look at any of the lists of the best TV shows of all time, they are decidedly back-end loaded. “Game of Thrones,” even at its worst, is better than almost any television of the ’80s or ’90s. And it’s not only because of the advances in special effects and CGI wizardry. There is a plethora of thoughtful, exquisitely scripted and superbly acted shows that have nary an enchantress, dragon or apocalypse of the walking dead in sight. There is no CGI in “Better Call Saul,” “Master of None” or “Atlanta.”

But what about the dark side of social fandom?

I suspect instant judgement might make it harder for certain people to actually do anything that ends up in the public arena. All types of personal endeavours require failure and subsequent growth as ingredients for success. And fans are getting less and less tolerant of failure. That makes the entry stakes pretty high for anyone producing output that is going to be out there, available for anyone to pass judgement on.

We might get self-selection bias in arenas like the arts, politics and sports. Those averse to criticism that cuts too deep will avoid making themselves vulnerable. Or — upon first encountering negative feedback — they may just throw in the towel and opt for something less public.

The contributors to our culture may just become hard-nosed and impervious to outside opinion — kind of like Cersei Lannister. Or, even worse, they may be so worried about what fans think that they oscillate, trying to keep all factions happy. That would be the Jon Snows of the world.

Either way, we lose the contributions of those with fragile egos and vulnerable hearts. If we applied that same filter retroactively to our historic collective culture, we’d lose most of what we now treasure.

In the end, perhaps David Benioff got it right. Just be “very drunk and very far from the internet.”

* Irrelevant Fact #1: The term peanut gallery comes from vaudeville, where the least expensive seats were occupied by the rowdiest members of the audience. The cheapest snack was peanuts, which the audience would throw at the performers.

** Irrelevant Fact #2: Dragons have thick skin because they don’t shed their skins. It just keeps getting thicker and more armor-like. The older the dragon, the thicker the skin.

The Importance of Playing Make-Believe

One of my favourite sounds in the world is children playing. Although our children are well past that age, we have stayed in a neighbourhood where new families move in all the time. One of the things that has always amazed me is a child’s ability to make believe. I used to do this but I don’t any more. At least, I don’t do it the same way I used to.

Just take a minute to think about the term itself: make-believe. The very words connote the creation of an imaginary world that you and your playmates can share, if only for a brief and fleeting moment. Out of the ether, you can create an ephemeral reality where you can play God. A few adults can still do that. George R.R. Martin pulled it off. J.K. Rowling did likewise. But for most of us, our days of make-believe are well behind us.

I worry about the state of play. I am concerned that rather than making believe themselves, children today are playing in the manufactured and highly commercialized imaginations of profit-hungry corporations. There is no making — there is only consuming. And that could have some serious consequences.

Although we don’t use imagination the way we once did, it is the foundation for the most important cognitive tasks we do. It was Albert Einstein who said, “Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.”

It is imagination that connects the dots, explores the “what-ifs” and peeks beyond the bounds of the known. It is what separates us from machines.

In that, Einstein presciently nailed the importance of imagination. Only here does the mysterious alchemy of the human mind somehow magically weave fully formed worlds out of nothingness and snippets of reality. We may not play princess anymore, but our ability to imagine underpins everything of substance that we think about.

The importance of playing make-believe is more than just cognition. Imagination is also essential to our ability to empathize. We need it to put ourselves in the place of others. Our “theory of mind” is just another instance of the many facets of imagination.

This thing we take for granted has been linked to a massive range of essential cognitive developments. In addition to the above examples, pretending gives children a safe place to begin to define their own place in society. It helps them explore interpersonal relationships. It creates the framework for them to assimilate information from the world into their own representation of reality.

We are not the only animals that play when we’re young. It’s true for many mammals, and scientists have discovered it’s also essential in species as diverse as crocodiles, turtles, octopuses and even wasps.

For other species, though, it seems play is mainly intended to help them come to terms with surviving in the physical world. We’re alone in our need for elaborate play involving imagination and cognitive games.

With typical human hubris, we adults have been on a century-long mission to structure the act of play. In doing so, we have been imposing our own rules, frameworks and expectations on something we should be keeping as is. Much of the value of play comes from its very lack of structure. Playing isn’t as effective when it’s done under adult supervision. Kids have to be kids.

Play definitely loses much of its value when it becomes passive consumption of content imagined and presented by others through digital entertainment channels. Childhood is meant to give us a blank canvas to colour with our imagination.

As we grow, the real world encroaches on this canvas.  But the delivery of child-targeted content through technology is also shrinking the boundaries of our own imagination.

Still, despite corporate interests that run counter to playing in its purest sense, I suspect that children may be more resilient than I fear. After all, I can still hear the children playing next door. And their imaginations still awe and inspire me.

Selfies: A Different Take on Reality

It was a perfect evening in Sydney Harbour. I was there for a conference and the organizers had arranged an event for the speakers at Milsons Point – under the impressive span of the Harbour Bridge. It was dusk and the view of downtown Sydney spread out in front of us with awesome breadth and scope. It was one of those moments that takes your breath away. That minute seemed eternal.

After some time, I turned around. There was another attendee who was intently focused on taking a selfie and posting it to social media. His back was turned to the view behind him. At first, I thought I should do the same. Then I changed my mind. I’d rely on my memory and actually try to stay in the moment. My phone stayed in my pocket.

In the age of selfies, it turns out that my mini-existential crisis is getting more common. According to a new study published in the Journal of Consumer Research, something called “self-presentational concern” can creep into these lifetime moments and suck the awe right out of them. One of the study authors, Alixandra Barasch, explains, “When people take photos to share, they remember their experience more from a third-person perspective, suggesting that taking photos to share makes people consider how the event (and the photos) would be evaluated by an observer.”

Simply stated, selfies take us “out of the moment”. But this effect depends on why we’re taking the selfie in the first place. The experimenters didn’t find the effect when people took selfies with the intent of just remembering the moment. It showed up when the selfie was taken for the express purpose of sharing on social media. Suddenly, we are more worried about how we look than where we are and what we’re doing.

Dr. Terri Apter, a professor of psychology at Cambridge University, has been looking at the emergence of selfies as a form of “self-definition” for some time. “We all like the idea of being sort of in control of our image and getting attention, being noticed, being part of the culture.” But when does this very human urge slip over the edge into a destructive spiral? Dr. Apter explains, “You can get that exaggerated or exacerbated by celebrity culture that says unless you’re being noticed, you’re no one.”

I suspect what we’re seeing now is a sort of selfie arms race. Can we upstage the rest of our social network by posting selfies in increasingly exotic locations, doing exceptional things and looking ever more “Mahvelous”? That’s a lot of pressure to put on something we do when we’re just supposed to be enjoying life.

A 2015 study explored the connection between personality traits and the posting of selfies. In particular, the authors of the study looked at narcissism, psychopathy and self-objectification. They found that frequent posting of selfies and being overly concerned with how you look in the selfies can be tied to both self-objectification and narcissism. This is interesting, because those two things are at opposite ends of the self-esteem spectrum. Narcissists love themselves, while those who self-objectify tend to suffer from low self-esteem. In both cases, selfies represent a way to advertise their personal brands to a wider audience.

There’s another danger with selfie-preoccupation that goes hand-in-hand with distancing yourself from the moment you’re in – you can fall victim to bad judgement. It happened to Barack Obama at Nelson Mandela’s memorial ceremony. In a moment when he should have been acting with appropriate gravitas, he decided to take a selfie with Danish Prime Minister Helle Thorning-Schmidt and then-British Prime Minister David Cameron. It was a stunningly classless moment from a usually classy guy. If you check a photo taken at the time, you can see that Michelle Obama was not amused. I agree.

Like many things tied to social media, selfies can represent a troubling trend in how we look at ourselves in a social context. These things seem to be pointing in the same direction: we’re spending more time worrying about an artificial reality of our own making and less time noticing reality as it actually exists.

We just have to put the phone down sometimes and admire the view across the harbor.

Why We’re Not Ready for AI to Take the Wheel…Yet

It’s interesting to see how we humans assign trust.

Consider the following scenario. At any time, in any city in the world, we will put our lives in the hands of a complete stranger, in an environment we have no control over, without a second thought. We do it every time we hail a cab. We know nothing about the driver or their safety record. We don’t know if they’re a good person or a psychopath. We place trust without any empirical reason to do so.

Yet a number of recent surveys indicate the majority of us don’t trust self-driving cars. One survey by AAA found that 71% of us would be afraid to ride in a fully self-driving vehicle. I’m one of them. I’m not sure I could slam the door on a self-driven Uber and relax in the back seat while AI takes the wheel. Yet I pride myself on being a fairly rational person, and there are plenty of rational reasons why self-driving cars should be far safer than their human-powered equivalents. Even the most skeptical measured comparisons call it a toss-up.

And that brings us to the key point: we don’t assign trust rationally. We do it emotionally. And emotionally, we have a tortured relationship with technology.

The problem here is two-fold. First, our trust mechanisms are built to work best when we’re face-to-face with the potential recipient of trust. Trust evolved to be a human-dependent process. And that brings us to the second problem. Over the last thousand years or so, we have learned how to trust in institutions. But that type of trust is dissolving rapidly.

Author and academic Rachel Botsman has spent over a decade looking at how technology is transforming trust. In an interview with Fast Company, she unpacks this notion of imploding institutional trust: “Whether it’s banks, the media, government, churches . . . this institutional trust that is really important to society is disintegrating at an alarming rate. And so how do we trust people enough to get in a car with a total stranger and yet we don’t trust a banking executive?”

I think this transformation of trust has something to do with the decoupling phenomenon I wrote about last week. When we relied on vertically integrated supply chains, we had no choice but to trust the institutions that were the caretakers of those chains. But now that our markets have flipped from the vertical to the horizontal, we are redefining our notions of trust. We are digitally connecting with strangers through sharing economy platforms like Airbnb and Uber and, in the process, we are finding new signals to indicate when we should trust and when we shouldn’t.

There is another unique aspect to our decision to trust. We tend to trust when it’s expedient to do so. Like so many things in human behavior, trust is just one factor wrapped up in our ongoing risk-versus-reward calculations. Our emotions will push us to trust when it’s required to get what we want. The fewer the alternatives available to us, the more we tend to trust.

Our lack of trust in self-driving vehicles is more visceral. I don’t think anyone believes the creators of self-driving technology are out to off our species in a self-driven version of a Mad Max conspiracy. We just aren’t wired to trust machines with our lives. There is an innate human hubris that believes that when it comes to self-preservation, our fates are best left in our own hands.

Self-driving proponents believe that with time and exposure, these trust issues will be resolved. The trick to us trusting machines with our lives is to lull us into not thinking about it too much. Millions of us do it every day when we board an airplane. The degree to which our airborne lives are dependent on technology was tragically revealed by the recent Boeing 737 Max incidents. The fact is, if we had any idea how much our living to see tomorrow is dependent on technology, we would dissolve into a shuddering, panic-stricken mess. In this case, ignorance is indeed bliss.

But there are few times when we have to make the same conscious decision to put our lives in the metaphorical hands of a computer to the extent we do in a self-driven car. If we look at how we decide to trust, this is an environment strewn with psychological landmines. Remember, we tend to trust when we have no options. And in this case, our option couldn’t be clearer. The steering wheel is right there, begging us to take over. It freaks us out when the car pulls away from the curb and we see the wheel start turning by itself. It’s small wonder that 71% of us are having some control issues.

Why Are So Many Companies So Horrible At Responding To Emails?

I love email. I hate 62.4% of the people I email.

Sorry. That’s not quite right. I hate 62.4% of the people I email in the futile expectation of a response…sometime…in the next decade or so (I will get back to the specificity of the 62.4% shortly).  It’s you who suck.

You know who you are. You are the ones who never respond to emails, who force me to send email after email with an escalating tone of prickliness, imploring you to take a few seconds from whatever herculean tasks fill your day to actually acknowledge my existence.

It’s you who force me to continually set aside whatever I’m working on to prod you into doing your damned job! And — often — it is you who cause me to eventually abandon email in exasperation and then sink further into the 7th circle of customer service hell: voicemail.

Why am I (and trust me, I’m not alone) so exasperated with you? Allow me to explain.

From our side, when we send an email, we are making a psychological statement about how we expect this communication channel to proceed. We have picked this channel deliberately. It is the right match for the mental prioritization we have given this task.

In 1891, in a speech on his 70th birthday, German scientist Hermann von Helmholtz explained how ideas came to him. He identified four stages that were later labeled by social psychologist Graham Wallas: Preparation, Incubation, Illumination and Verification. These stages have held up remarkably well against the findings of modern neuroscience. Each of these stages has a distinct cognitive pattern and its own set of communication expectations.

  1. Preparation
    Preparation is gathering the information required for our later decision-making. We are actively foraging, looking for gaps in our current understanding of the situation and tracking down sources of that missing information. Our brains are actively involved in the task, but we also have a realistic expectation of the timeline required. This is the perfect match for email as a channel. We’ll come back to our expectations at this stage in a moment, as it’s key to understanding what a reasonable response time is.
  2. Incubation
    Once we have the information we require, our brain often moves the problem to the back burner. Even though it’s not “top of mind,” this doesn’t mean the brain isn’t still mulling it over. It’s the processing that happens while we’re sleeping or taking a walk. Because the brain isn’t actively working on the problem, there is no real communication needed.
  3. Illumination
    This is the eureka moment. You literally “make up your mind”: the cognitive stars align and you settle on a decision. You are now ready to take action. Again, at this stage, there is little to no outside communication needed.
  4. Verification
    Even though we’ve “made up our mind,” there is still one more step before action. We need to make sure our decision matches what is feasible in the real world. Does our internal reality match the external one? As at the Preparation stage, our brains are actively involved, pushing us forward, and there is often some type of communication required.

What we have here — in intelligence terms — is a sensemaking loop. The brain ideally wants this loop to continue smoothly, without interruption. But at two of the stages — the beginning and end — our brain needs to idle, waiting for input from the outside world.

Brains that have put tasks on idle do one of two things: They forget, or they get irritated. There are no other options.

The only variance is the degree of irritation. If the task is not that important to us, we get mildly irritated. The more important the task and the longer we are forced to put it on hold, the more frustrated we get.

Next, let’s talk about expectations. At the Preparation phase, we realize the entire world does not march to the beat of our internal drummer. Using email is our way to accommodate the collective schedules of the world. We are not demanding an immediate response. If we did, we’d use another channel, like a phone or instant messaging. When we use email, we expect those on the receiving end to fit our requirements into their priorities.

A recent survey by Jeff Toister, a customer service consultant, found that 87% of respondents expect a response to their emails within one day. Half of those expect a response in four hours or less. The most demanding are baby boomers — probably because email is still our preferred communication channel.

What we do not expect is for our emails to be completely ignored. Forever.

Yet, according to a recent benchmark study by SuperOffice, that is exactly what happens. In the study, 62.4% of the businesses contacted with a customer service question never responded, and 90.5% never even acknowledged receiving an email. They effectively said to those customers, “Either forget us or get pissed off at us. We don’t really care.”

This lack of response is fine if you really don’t care. I toss a number of emails from my inbox daily without responding. They are a waste of my time. But if you have any expectation of having any type of relationship with the sender, take the time to hit the “reply” button.

There were some red flags that these non-responsive companies had in common. Typically, they could only be contacted through a web form on their site. I know I only fill these out if I have no other choice. If there is a direct email link, I always opt for that. These companies also tended to be smaller and didn’t use auto-responders to confirm a message had been received.

If this sounds like a rant, it is. One of my biggest frustrations is lack of email follow-up. I have found that the bar to surprise and delight me via your email response procedure is incredibly low:

  1. Respond.
  2. Don’t be a complete idiot.

Clear, Simple…and Wrong

“For every complex problem there is an answer that is clear, simple, and wrong.”
H. L. Mencken

We live in a world of complex problems. And – increasingly – we long for simple solutions to those problems. Brexit was a simple answer to a complex problem. Trump’s border wall is a simple answer to a complex problem. The current wave of populism is being driven by the desire for simple answers to complex problems.

But, as H.L. Mencken said, all those answers are wrong.

Even philosophers – who are a pretty complex breed – have embraced the principle of simplicity. William of Ockham, a 14th-century Franciscan friar who studied logic, wrote “Entia non sunt multiplicanda praeter necessitatem.” This translates as “More things should not be used than are necessary.” It has since been called “Occam’s Razor.” In scientific research, it’s known as the principle of parsimony.

But Occam’s Razor illustrates a shortcoming of humans. We will look for the simplest solution even if it isn’t the right solution. We forget the “are necessary” part of the principle. The Wikipedia entry for Occam’s Razor includes this caveat: “Occam’s razor only applies when the simple explanation and complex explanation both work equally well. If a more complex explanation does a better job than a simpler one, then you should use the complex explanation.”

This introduces a problem for humans. Simple answers are usually easier for us. People can grasp them more easily. Given a choice between complex and simple, we almost always default to the simple. For most of our history, this has not been a bad strategy. When all the factors that determine our likelihood to survive are proximate and intent on eating us, simple and fast is almost always the right bet.

But then we humans went and built a complex world. We started connecting things together into extended networks. We exponentially introduced dependencies. Through our ingenuity, we transformed our environments and, in the process, made complexity the rule rather than the exception. Unfortunately, our brains didn’t keep up. They still operate as if our biggest concerns were to find food and to avoid becoming food.

Our brains are causal inference machines. We assign cause and effect without bothering to determine if we are right.  We are hardwired to go for simple answers. When the world was a pretty simple place, the payoff for cognitively crunching complex questions wasn’t worth it. But that’s no longer the case. And when we mistake correlation for causation, the consequences can be tragic.

Let’s go back to the example of Trump’s Wall. I don’t question that illegal (or legal, for that matter) immigration causes pressures in a society. That’s perfectly natural, no matter where those immigrants are coming from. But it’s also a dynamic and complex problem. There are a myriad of interleaved and interdependent factors underlying the visible issue. If we don’t take the time to understand those dynamics of complexity, a simple solution – like a wall – could unleash forces that have drastic and unintended consequences. Even worse, thanks to the nature of complexity, those consequences can be amplified throughout a network.

Simple answers can also provide a false hope that keeps us from digging deeper for the true nature of the problem. It lets us fall into the trap of “one and done” thinking. Why hurt our heads thinking about complex issues when we can put a checkmark beside an item on our to-do list and move on to the next one?

According to Ian McKenzie, this predilection for simplicity is also rotting away the creative core of advertising. In an essay he posted on Medium, he points to a backlash against Digital because of its complexity: “Digital is complex. And because the simplicity bias says complicated is bad, digital and data are bad by association. And this can cause smart people trained in traditional thinking to avoid or tamp down digital ideas and tactics because they appear to be at odds with the simplicity dogma.”

Like it or not, we ignore complexity at our peril. As David Krakauer, President of the Santa Fe Institute and William H. Miller Professor of Complex Systems, warned, “There is only one Earth and we shall never improve it by acting as if life upon it were simple. Complex systems will not allow it.”

Don’t Be So Quick to Eliminate Friction

If you have the mind of an engineer, you hate friction. When you worship at the altar of optimization, friction is something to be ruthlessly eliminated – squeezed out of the equation. Friction equals inefficiency. It saps the energy out of our efforts.  It’s what stands between reality and a perfect market, where commerce theoretically slides effortlessly between participants. Much of what we call tech today is optimized with the goal of eliminating friction.

But there’s another side of friction. And perhaps we shouldn’t be too quick to eliminate it.  Without friction, there would be no traction, so you wouldn’t be able to walk. Your car would have no brakes. Nails, bolts, screws, glue and tape wouldn’t work. Without friction, there would be nothing to keep the world together.

And in society, it’s friction that slows us down and helps us smell the roses. That’s because another word for friction – when we talk about our experiential selves – is savouring.

Take conversations, for instance. A completely efficient, friction-free conversation would be pretty damn boring. It would get the required information from participant A to participant B – and vice versa – in the minimum number of words. There would be no embellishment, no nuance, no humanity. It would not be a conversation we would savour.

Savouring is all about slowing down. According to Maggie Pitts, a professor at the University of Arizona who studies how we savour conversations, “Savouring is prolonging, extending, and lingering in a positive or pleasant feeling.” And you can’t prolong anything without friction.

But what about friction in tech itself? As I said before, the rule of thumb in tech is to eliminate as much friction as possible. But can the elimination of friction go too far? Product designer Jesse Weaver says yes. In an online essay, he says we friction-obsessed humans should pay more attention to the natural world, where friction is still very much alive and well, thank you:

“Nature is the ultimate optimizer, having run an endless slate of A/B tests over billions of years at scale. And in nature, friction and inconvenience have stood the test of time. Not only do they remain in abundance, but they’ve proven themselves critical. Nature understands the power of friction while we have become blind to it.”

A couple of weeks ago, I wrote about the Yerkes-Dodson law, which states that there can be too much of a good thing – or, in this case, too little of a supposedly bad thing. According to a 2012 study, when it comes to assigning value, we actually appreciate a little friction. It’s known as the IKEA effect. There is a sweet spot for optimal effort. Too much and we get frustrated. Too little and we feel that it was too easy. When it’s just right, we have a crappy set of shelves that we love more than we should because we had to figure out how to put them together.

Weaver feels the same is true for tech. As examples, he points to Amazon’s Dash smart button and Facebook’s Frictionless Sharing. In the first case, Amazon claims the need has been eliminated by voice-activated shopping on Alexa. In the second case, we had legitimate privacy concerns. But Weaver speculates that perhaps both things just moved a little too fast for our comfort, removing our sense of control. We need a little bit of friction in the system so we feel we can apply the brakes when required.

If we eliminate too much friction, we’ll slip over that hump into not valuing the tech-enabled experiences we’re having. He cites the 2018 World Happiness Report, which has been tracking our satisfaction with life on a global basis for over a decade. In that time, despite our tech capabilities increasing exponentially, our happiness has flatlined.

I have issues with his statistical logic – there is a bushel basket full of confounding factors in the comparison he’s trying to make – but I generally agree with Weaver’s hypothesis. We do need some friction in our lives. It applies the brakes to our instincts. It forces us to appreciate the here and now that we’re rushing through. It opens the door to serendipity and makes allowances for savouring.

In the end, we may need a little friction in our lives to appreciate what it means to be human.