Personal Endeavour in the Age of Instant Judgement

No one likes to be judged — not even gymnasts and figure skaters. But at least in those sports, the judges supposedly know what it is they’re judging. So, in the spirit of instant feedback, let me rephrase: No one likes to be judged by a peanut gallery*. Or, to use a more era-appropriate moniker, by a troll’s chorus.

Because of this, I feel sorry for David Benioff and D.B. Weiss, the showrunners of “Game of Thrones.” Those poor bastards couldn’t be any more doomed if they had been invited to a wedding of the red variety.

At least they were aware of their fate. In an interview with Entertainment Weekly, they disclosed their plans for the airing of the final episode. “We’ll be in an undisclosed location, turning off our phones and opening various bottles,” Weiss admitted. “At some point, if and when it’s safe to come out again, somebody like [HBO’s ‘GOT’ publicist] will give us a breakdown of what was out there without us having to actually experience it.” Added Benioff: “I plan to be very drunk and very far from the internet.”

Like it or not, we now live in an era of instant judgement, from everyone. It’s the evil twin of social virality. It means we have to grow thicker skins than your average full-grown dragon**. And because I’m obsessively fixated on unintended consequences, this got me thinking: how might all this judgement impact our motivation to do stuff?

First of all, let’s look at the good that comes from this social media froth kicked up by fervent fans. There is a sense of ownership and emotional investment in shows like “Game of Thrones” that’s reached a pitch never seen before — and I truly believe we’re getting better TV because of it.

If you look at any of the lists of the best TV shows of all time, they are decidedly back-end loaded. “Game of Thrones,” even at its worst, is better than almost any television of the ’80s or ’90s. And it’s not only because of the advances in special effects and CGI wizardry. There is a plethora of thoughtful, exquisitely scripted and superbly acted shows that have nary an enchantress, dragon or apocalypse of the walking dead in sight. There is no CGI in “Better Call Saul,” “Master of None” or “Atlanta.”

But what about the dark side of social fandom?

I suspect instant judgement might make it harder for certain people to actually do anything that ends up in the public arena. All types of personal endeavour require failure and subsequent growth as ingredients for success. And fans are getting less and less tolerant of failure. That makes the entry stakes pretty high for anyone producing output that will be out there, available for anyone to pass judgement on.

We might get self-selection bias in arenas like the arts, politics and sports. Those averse to criticism that cuts too deep will avoid making themselves vulnerable. Or — upon first encountering negative feedback — they may just throw in the towel and opt for something less public.

The contributors to our culture may just become hard-nosed and impervious to outside opinion — kind of like Cersei Lannister. Or, even worse, they may be so worried about what fans think that they oscillate, trying to keep all factions happy. That would be the Jon Snows of the world.

Either way, we lose the contributions of those with fragile egos and vulnerable hearts. If we applied that same filter retroactively to our historic collective culture, we’d lose most of what we now treasure.

In the end, perhaps David Benioff got it right. Just be “very drunk and very far from the internet.”

* Irrelevant Fact #1: The term peanut gallery comes from vaudeville, where the least expensive seats were occupied by the rowdiest members of the audience. The cheapest snack was peanuts, which the audience would throw at the performers.

** Irrelevant Fact #2: Dragons have thick skin because they don’t shed their skins. It just keeps getting thicker and more armor-like. The older the dragon, the thicker the skin.

The Gap Between People and Platforms

I read with interest fellow Spinner Dave Morgan’s column about how software is destroying advertising agencies, but not the need for them. I do want to chime in on what’s happening in advertising, but I need a little more time to think about it.

What did catch my eye was a comment at the end by Harvard Business School professor Alvin Silk: “You can eliminate the middleman, but not his/her function.”

I think Dave and Alvin have put their collective thumbs on something that extends beyond our industry: the growing gap between people and platforms. I’ll use my current industry as an example – travel. It’s something we all do, so we can all relate to it.

Platforms and software have definitely eaten this industry. In terms of travel destination planning, the 800-pound gorilla is TripAdvisor. It’s impossible to overstate its importance to operators and business owners. TripAdvisor almost single-handedly ushered in an era of do-it-yourself travel planning. For any destination in the world, we can now find the restaurants, accommodations, tours and attractions that are the favourites of other travellers. It allows us to both discover and filter while planning our next trip, something that was impossible 20 years ago, before TripAdvisor came along.

But for all its benefits, TripAdvisor also leaves some gaps.

The biggest gap in travel is what I’ve heard called the “Other Five.” I live in Canada’s wine country (yes, there is such a thing). Visitors to our valley – the Okanagan – generally arrive with five wineries they plan to visit. The chances are very good that those wineries were selected with the help of TripAdvisor. But while they’re here, they also visit the “other five” – five wineries they discover once they get to the destination. These discoveries depend on more traditional means – either word of mouth or sheer serendipity. And it’s often one of these “other five” that provides the truly memorable and authentic experiences.

That’s the problem with platforms like TripAdvisor, which are based on general popularity and algorithms. Technically, platforms should help you discover the long tail, but they don’t. Everything automatically defaults to the head of the curve. It’s the Matthew Effect applied to travel – advantage accumulates to those already blessed. We all want to see the same things – up to a point.

But then we want to explore the “other five,” and that’s where we find the gap between platforms and people. We have been trained by Google not to look beyond the first page of online results. It’s actually worse than that: we don’t typically scan beyond the top five. But – by the very nature of ratings-based algorithms – beyond the top five is exactly where the “other five” live. They languish in the middle of the results, sometimes taking years to bump up even a few spots. It’s why there’s still a market – and a rapidly expanding one at that – for a tour guided by an actual human. Humans can think beyond an algorithm, asking questions about what you like and pulling from their own experience to make very targeted and empathetic suggestions.
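To see why ratings-based ranking behaves this way, here’s a minimal sketch in Python of the rich-get-richer feedback loop. The numbers and the scoring rule are entirely made up for illustration – this is not TripAdvisor’s actual algorithm – but the mechanic is the one described above: listings that already have more reviews get more exposure, and exposure is what earns new reviews.

```python
import random

# Toy model of a ratings feedback loop (hypothetical numbers and rule,
# not any real platform's algorithm). Exposure roughly tracks a listing's
# share of existing reviews, and exposure is what generates new reviews.
random.seed(42)
reviews = [100, 90, 80, 70, 60, 10, 9, 8, 7, 6]  # the head vs. the "other five"

for month in range(24):
    total = sum(reviews)
    for i, count in enumerate(reviews):
        # A listing's chance of picking up a review this month is
        # roughly proportional to its current share of attention.
        if random.random() < 5 * count / total:
            reviews[i] += 1

print(sorted(reviews, reverse=True))
# The head of the curve pulls further ahead; the middle barely moves.
```

Run it and the gap widens every month – the Matthew Effect in a dozen lines.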

The problem with platforms is their preoccupation with scale. They feel they have to be all things to all people. I’ll call it Unicornitis – the obsession with gaining a massive valuation. They approach every potential market focused on how many users they can capture. By doing so, they have to target the lowest common denominator. The web thrives on scale and popularity; the rich get richer and the poor get poorer. Yes, there are niche players out there, but they’re very hard to find. They are the “other five” of the Internet, sitting on the third page of Google results.

This has almost nothing to do with advertising, but I think it’s the same phenomenon at work. As we rely more on software, we gain a false confidence that it replaces human-powered expertise. It doesn’t. And a lot of things can slip through the gap that’s created.


The Importance of Playing Make-Believe

One of my favourite sounds in the world is children playing. Although our children are well past that age, we have stayed in a neighbourhood where new families move in all the time. One of the things that has always amazed me is a child’s ability to make believe. I used to do this but I don’t any more. At least, I don’t do it the same way I used to.

Just take a minute to think about the term itself: make-believe. The very words connote the creation of an imaginary world that you and your playmates can share, even in that brief and fleeting moment. Out of the ether, you can create an ephemeral reality where you can play God. A few adults can still do that. George R.R. Martin pulled it off. J.K. Rowling did likewise. But for most of us, our days of make-believe are well behind us.

I worry about the state of play. I am concerned that rather than making believe themselves, children today are playing in the manufactured and highly commercialized imaginations of profit-hungry corporations. There is no making — there is only consuming. And that could have some serious consequences.

Although we don’t use imagination the way we once did, it is the foundation for the most important cognitive tasks we do. It was Albert Einstein who said, “Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.”

It is imagination that connects the dots, explores the “what-ifs” and peeks beyond the bounds of the known. It is what separates us from machines.

In that, Einstein presciently nailed the importance of imagination. Only here does the mysterious alchemy of the human mind somehow magically weave fully formed worlds out of nothingness and snippets of reality. We may not play princess anymore, but our ability to imagine underpins everything of substance that we think about.

Playing make-believe matters for more than just cognition. Imagination is also essential to our ability to empathize. We need it to put ourselves in the place of others. Our “theory of mind” is just another instance of the many facets of imagination.

This thing we take for granted has been linked to a massive range of essential cognitive developments. In addition to the above examples, pretending gives children a safe place to begin to define their own place in society. It helps them explore interpersonal relationships. It creates the framework for them to assimilate information from the world into their own representation of reality.

We are not the only animals that play when we’re young. It’s true for many mammals, and scientists have discovered it’s also essential in species as diverse as crocodiles, turtles, octopuses and even wasps.

For other species, though, it seems play is mainly intended to help them come to terms with surviving in the physical world. We’re alone in our need for elaborate play involving imagination and cognitive games.

With typical human hubris, we adults have been on a century-long mission to structure the act of play. In doing so, we have been imposing our own rules, frameworks and expectations on something we should be keeping as is. Much of the value of play comes from its very lack of structure. Playing isn’t as effective when it’s done under adult supervision. Kids have to be kids.

Play definitely loses much of its value when it becomes passive consumption of content imagined and presented by others through digital entertainment channels. Childhood is meant to give us a blank canvas to colour with our imagination.

As we grow, the real world encroaches on this canvas.  But the delivery of child-targeted content through technology is also shrinking the boundaries of our own imagination.

Still, despite corporate interests that run counter to playing in its purest sense, I suspect that children may be more resilient than I fear. After all, I can still hear the children playing next door. And their imaginations still awe and inspire me.

Selfies: A Different Take on Reality

It was a perfect evening in Sydney Harbour. I was there for a conference, and the organizers had arranged an event for the speakers at Milsons Point – under the impressive span of the Harbour Bridge. It was dusk, and the view of downtown Sydney spread out in front of us with awesome breadth and scope. It was one of those moments that takes your breath away. That minute seemed eternal.

After some time, I turned around. Another attendee was intently focused on taking a selfie and posting it to social media, his back turned to the view. At first, I thought I should do the same. Then I changed my mind. I’d rely on my memory and actually try to stay in the moment. My phone stayed in my pocket.

In the age of selfies, it turns out that my mini-existential crisis is becoming more common. According to a new study published in the Journal of Consumer Research, something called “self-presentational concern” can creep into these lifetime moments and suck the awe right out of them. One of the study’s authors, Alixandra Barasch, explains: “When people take photos to share, they remember their experience more from a third-person perspective, suggesting that taking photos to share makes people consider how the event (and the photos) would be evaluated by an observer.”

Simply stated, selfies take us “out of the moment.” But this effect depends on why we’re taking the selfie in the first place. The experimenters didn’t find the effect when people took selfies with the intent of just remembering the moment. It showed up when the selfie was taken for the express purpose of sharing on social media. Suddenly, we are more worried about how we look than where we are and what we’re doing.

Dr. Terri Apter, a psychologist at Cambridge University, has been looking at the emergence of selfies as a form of “self-definition” for some time. “We all like the idea of being sort of in control of our image and getting attention, being noticed, being part of the culture.” But when does this very human urge slip over the edge into a destructive spiral? Dr. Apter explains: “You can get that exaggerated or exacerbated by celebrity culture that says unless you’re being noticed, you’re no one.”

I suspect what we’re seeing now is a sort of selfie arms race. Can we upstage the rest of our social network by posting selfies in increasingly exotic locations, doing exceptional things and looking ever more “Mahvelous”? That’s a lot of pressure to put on something we do when we’re just supposed to be enjoying life.

A 2015 study explored the connection between personality traits and the posting of selfies. In particular, the authors looked at narcissism, psychopathy and self-objectification. They found that frequent posting of selfies, and being overly concerned with how you look in them, can be tied to both self-objectification and narcissism. This is interesting, because those two traits sit at opposite ends of the self-esteem spectrum: narcissists love themselves, while those who self-objectify tend to suffer from low self-esteem. In both cases, selfies represent a way to advertise a personal brand to a wider audience.

There’s another danger with selfie-preoccupation that goes hand-in-hand with distancing yourself from the moment you’re in: you can fall victim to bad judgement. It happened to Barack Obama at Nelson Mandela’s memorial ceremony. In a moment when he should have been acting with appropriate gravitas, he decided to take a selfie with Danish Prime Minister Helle Thorning-Schmidt and British Prime Minister David Cameron. It was a stunningly classless moment from a usually classy guy. If you check a photo taken at the time, you can see that Michelle Obama was not amused. I agree with her.

Like many things tied to social media, selfies can represent a troubling trend in how we look at ourselves in a social context. These things seem to be pointing in the same direction: we’re spending more time worrying about an artificial reality of our own making and less time noticing reality as it actually exists.

We just have to put the phone down sometimes and admire the view across the harbour.


Don’t Be So Quick to Eliminate Friction

If you have the mind of an engineer, you hate friction. When you worship at the altar of optimization, friction is something to be ruthlessly eliminated – squeezed out of the equation. Friction equals inefficiency. It saps the energy out of our efforts.  It’s what stands between reality and a perfect market, where commerce theoretically slides effortlessly between participants. Much of what we call tech today is optimized with the goal of eliminating friction.

But there’s another side of friction. And perhaps we shouldn’t be too quick to eliminate it.  Without friction, there would be no traction, so you wouldn’t be able to walk. Your car would have no brakes. Nails, bolts, screws, glue and tape wouldn’t work. Without friction, there would be nothing to keep the world together.

And in society, it’s friction that slows us down and helps us smell the roses. That’s because another word for friction – when we talk about our experiential selves – is savouring.

Take conversations, for instance. A completely efficient, friction-free conversation would be pretty damn boring. It would get the required information from participant A to participant B – and vice versa – in the minimum number of words. There would be no embellishment, no nuance, no humanity. It would not be a conversation we would savour.

Savouring is all about slowing down. According to Maggie Pitts, a professor at the University of Arizona who studies how we savour conversations, “Savouring is prolonging, extending, and lingering in a positive or pleasant feeling.” And you can’t prolong anything without friction.

But what about friction in tech itself? As I said before, the rule of thumb in tech is to eliminate as much friction as possible. But can the elimination of friction go too far? Product designer Jesse Weaver says yes. In an online essay, he says we friction-obsessed humans should pay more attention to the natural world, where friction is still very much alive and well, thank you:

“Nature is the ultimate optimizer, having run an endless slate of A/B tests over billions of years at scale. And in nature, friction and inconvenience have stood the test of time. Not only do they remain in abundance, but they’ve proven themselves critical. Nature understands the power of friction while we have become blind to it.”

A couple of weeks ago, I wrote about the Yerkes-Dodson law, which states that there can be too much of a good thing – or, in this case, too little of a supposedly bad thing. According to a 2012 study, when it comes to assigning value, we actually appreciate a little friction. It’s known as the IKEA effect. There is a sweet spot for optimal effort. Too much and we get frustrated. Too little and we feel it was too easy. When it’s just right, we have a crappy set of shelves that we love more than we should, because we had to figure out how to put them together.
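If it helps to picture that sweet spot, here’s a minimal sketch in Python. The inverted-U curve and its parameters are my own illustrative assumption, not numbers taken from the IKEA-effect study:

```python
# Toy inverted-U model of the effort sweet spot (illustrative only;
# the shape and parameters are assumed, not measured in the 2012 study).
def perceived_value(effort, sweet_spot=0.5, width=0.35):
    """Value peaks at moderate effort and falls off on both sides.

    effort: 0.0 (totally frictionless) to 1.0 (maximally frustrating)
    """
    return max(0.0, 1.0 - ((effort - sweet_spot) / width) ** 2)

for effort in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"effort={effort:.2f} -> perceived value={perceived_value(effort):.2f}")
```

Too little effort and the shelves feel disposable; too much and we give up. The peak sits in between.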

Weaver feels the same is true for tech. As examples, he points to Amazon’s Dash button and Facebook’s Frictionless Sharing. In the first case, Amazon claims the need has been eliminated by voice-activated shopping through Alexa. In the second, we had legitimate privacy concerns. But Weaver speculates that perhaps both things just moved a little too fast for our comfort, removing our sense of control. We need a little bit of friction in the system so we feel we can apply the brakes when required.

If we eliminate too much friction, we’ll slip over that hump into not valuing the tech-enabled experiences we’re having. He cites the 2018 World Happiness Report, which has been tracking our satisfaction with life on a global basis for over a decade. In that time, despite our tech capabilities increasing exponentially, our happiness has flatlined.

I have issues with his statistical logic – there is a bushel basket full of confounding factors in the comparison he’s trying to make – but I generally agree with Weaver’s hypothesis. We do need some friction in our lives. It applies the brakes to our instincts. It forces us to appreciate the here and now that we’re rushing through. It opens the door to serendipity and makes allowances for savouring.

In the end, we may need a little friction in our lives to appreciate what it means to be human.


The Social Acceptance of Siri

There was a time, not too long ago, when I did a fairly exhaustive series of posts on the acceptance of technology. The psychology of how and when we adopted disruptive tech fascinated me. So Laurie Sullivan’s article on how more people are talking to their phones caught my eye.

If you look at tech acceptance, there’s a bucketful of factors you have to consider. Utility, emotions, goals, ease of use, cost and our own attitudes all play a part. But one of the biggest factors is social acceptance. We don’t want to look like a moron in front of friends and family. It was this, more than anything else, that killed Google Glass the first time around. Call it the Glasshole factor.

So, back to Laurie’s article and the survey she referred to in it. Which shifts in the social universe are making it more acceptable to shoot the shit with Siri?

The survey has been done for the last three years by Stone Temple, so we’re starting to see some emerging trends. Here are the things that caught my attention. First of all, the biggest shifts from 2017 to 2019, in terms of percentage, are at the gym, in public restrooms and in the theatre. Usage at home has actually slipped a little (one might assume those conversations have migrated to Alexa and other home-based digital assistants). If we’re looking at acceptance of technology and the factors driving it, one thing jumps out from the survey: all the shifts have to do with how comfortable we feel talking to our phones in publicly visible situations. There is obviously a moving threshold of acceptability here.

As I mentioned, the three social “safe zones” – those instances where we wouldn’t be judged for speaking to our phones – have shown little movement in the last three years. These are “Home Alone,” “Home with Friends” (social, but presumably safe from judgment) and “Office Alone.” As much as possible in survey-based research, this isolates the social factor from all the other variables rather nicely, and shows its importance in our collective jumping on the voice-technology bandwagon.

This highlights an important lesson in the acceptance of new technologies: you have to budget in the time required for society to absorb and accept them. The more the technology will be used in visibly social situations, the more time you need to budget. Otherwise, the tech will only be adopted by a tiny group of socially obtuse techno-weenies and will be stranded on the wrong side of the bleeding edge. As technology becomes more personal and tags along with us in more situations, the designers and marketers of that tech will have to understand this.

This places technology acceptance in a whole new ballpark. As the tech we use increasingly becomes part of our own social-facing brand, our carefully constructed personas and the social norms we have in place become key factors that determine the pace of acceptance.

This becomes a delicate balancing act. How do you control social acceptance? As an example, let’s take one of my favourite marketing punching bags – influencer marketing – and ask whether we could accelerate acceptance by seeding it with a few key social connectors. That same strategy failed miserably when it came to promoting Google Glass to the public. And there’s a perfectly irrational reason for it. It had nothing to do with rational stuff like use cases, aesthetics or technology. It had to do with Google picking the wrong influencers – the so-called Google Glass Explorers. As a group, they tended to be tech-obsessed, socially awkward and painfully uncool. They were the people you avoid getting stuck in the corner with at a party, because you just aren’t up for a 90-minute conversation on the importance of regular hard-drive hygiene. No one wants to be them.

If this survey tells us anything, it tells us that – sometimes – you just have to hope and wait. Ever since Everett Rogers first sketched it out in 1962, we’ve known that innovation diffusion happens on a bell curve. Some innovations get stranded on the upside of the slope and wither away to nothingness while some make it over the hump and become part of our everyday lives. Three years ago, there were certainly people talking to their phones on buses, in gyms and at movie theatres. They didn’t care if they were judged for it. But most of us did care. Today, apparently, the social stigma has disappeared for many of us. We were just waiting for the right time – and the right company.

Less Tech = Fewer Regrets

In a tech-ubiquitous world, I fear our reality is becoming more “tech” and less “world.” But how do you fight that? Well, if you’re Kendall Marianacci – a recent college grad – you ditch your phone and move to Nepal. In the process, she learned that “paying attention to the life in front of you opens a new world.”

In a recent post, she reflected on lessons learned by truly getting off the grid:

“Not having any distractions of a phone and being immersed in this different world, I had to pay more attention to my surroundings. I took walks every day just to explore. I went out of my way to meet new people and ask them questions about their lives. When this became the norm, I realized I was living for one of the first times of my life. I was not in my own head distracted by where I was going and what I needed to do. I was just being. I was present and welcoming to the moment. I was compassionate and throwing myself into life with whoever was around me.”

It’s sad and a little shocking that we have to go to such extremes to realize how much of our world can be obscured by a little 5-inch screen. Where did tech that was supposed to make our lives better go off the rails? And was the derailment intentional?

“Absolutely,” says Jesse Weaver, a product designer. In a post on Medium.com, he lays out – in alarming terms – our tech dependency and the trade-off we’re agreeing to:

“The digital world, as we’ve designed it, is draining us. The products and services we use are like needy friends: desperate and demanding. Yet we can’t step away. We’re in a codependent relationship. Our products never seem to have enough, and we’re always willing to give a little more. They need our data, files, photos, posts, friends, cars, and houses. They need every second of our attention.

“We’re willing to give these things to our digital products because the products themselves are so useful. Product designers are experts at delivering utility.”

But are they? Yes, there is utility here, but it’s wrapped in a thick layer of addiction. What product designers are really good at is fostering addiction by dangling a carrot of utility. And, as Weaver points out, we often mistake utility for empowerment:

“Empowerment means becoming more confident, especially in controlling our own lives and asserting our rights. That is not technology’s current paradigm. Quite often, our interactions with these useful products leave us feeling depressed, diminished, and frustrated.”

That’s not just Weaver’s opinion. A new study from HumaneTech.com backs it up with empirical evidence. They partnered with Moment, a screen-time tracking app, “to ask how much screen time in apps left people feeling happy, and how much time left them in regret.”

According to 200,000 iPhone users, here are the apps that make people happiest:

  1. Calm
  2. Google Calendar
  3. Headspace
  4. Insight Timer
  5. The Weather
  6. MyFitnessPal
  7. Audible
  8. Waze
  9. Amazon Music
  10. Podcasts

That’s three meditative apps, three utilitarian apps, one fitness app, one entertainment app and two apps that help you broaden your intellectual horizons. If you’re talking human empowerment – according to Weaver’s definition – you could do a lot worse than this roundup.

But here are the apps that left their users with a feeling of regret:

  1. Grindr
  2. Candy Crush Saga
  3. Facebook
  4. WeChat
  5. Candy Crush
  6. Reddit
  7. Tweetbot
  8. Weibo
  9. Tinder
  10. Subway Surf

What’s even more interesting is the average time spent in these apps. For the happy group, average daily usage was 9 minutes. For the regret group, it was 57 minutes! We feel better about apps that do their job, add something to our lives and then let us get on with living. What we hate are time sucks that offer a kernel of functionality wrapped in an interface that ensnares us like a digital spider web.

This study comes from the Center for Humane Technology, headed by ex-Googler Tristan Harris. The goal of the Center is to encourage designers and developers to create apps that move “away from technology that extracts attention and erodes society, towards technology that protects our minds and replenishes society.”

That all sounds great, but what does it really mean for you and me and everybody else who hasn’t moved to Nepal? It all depends on what revenue model is driving the development of these apps and platforms. If it’s anything that depends on advertising – in any form – don’t count on any nobly intentioned shifts in design direction anytime soon. More likely, it will mean some half-hearted placations like Apple’s new Screen Time warning, which pops up on your phone every Sunday, giving you the illusion of control over your behaviour.

Why an illusion? Because things like Apple’s Screen Time are great for our prefrontal cortex, the intent-driven part of our rational brain that puts our best intentions forward. They’re not so good for our lizard brain, which subconsciously drives us to play Candy Crush and swipe our way through Tinder. And when it comes to addiction, the lizard brain has been on a winning streak for most of the history of mankind. I don’t like our odds.

The developers’ escape hatch is always the same: they’re giving us control. It’s our own choice, and freedom of choice is always a good thing. But there’s an unstated deception here. It’s the same lie Mark Zuckerberg told last Wednesday when he laid out the privacy-focused future of Facebook. He’s putting us in control. But he’s not. What he’s doing is making us feel better about spending more time on Facebook. And that’s exactly the problem. The less we worry about the time we spend on Facebook, the less we will think about it at all. The less we think about it, the more time we will spend. And the more time we spend, the more we will regret it afterwards.

If that doesn’t seem like an addictive cycle, I’m not sure what does.