Attention: Divided

I’d like you to give me your undivided attention. I’d like you to – but you can’t. First, I’m probably not interesting enough. Second, you no longer live in a world where that’s possible. And third, even if you could, I’m not sure I could handle it. I’m out of practice.

The fact is, our attention is almost never undivided anymore. Let’s take talking, for example. You know: old-fashioned, face-to-face, sharing-the-same-physical-space communication. It’s the one channel that most demands undivided attention. But when was the last time you had a conversation where you gave it 100 percent of your attention? I actually had one this past week, and I have to tell you, it unnerved me. I was meeting with a museum curator, and she immediately locked eyes with me and gave me her full, undivided attention. I faltered. I couldn’t hold her gaze. As I talked, I scanned the room we were in. It’s probably been years since anyone paid attention to me like that. And nary a smartphone was in sight.

If this is true when we’re physically present, imagine the challenge in other channels. Take television, for instance. We don’t watch TV like we used to. When I was growing up, I would be verging on catatonia as I watched the sparks fly between Batman and Catwoman (the Julie Newmar version – with all due respect to Eartha Kitt and Lee Meriwether). My dad used to call it the “idiot box.” At the time, I thought it was a comment on the quality of programming, but I now realize he was referring to my mental state. You could have dropped a live badger in my lap and not an eye would have been batted.

But that’s definitely not how we watch TV now. A recent study indicates that 177 million Americans have at least one other screen going – usually a smartphone – while they watch TV. According to Nielsen, there are only 120 million TV households. That means that 1.48 adults per household are definitely dividing their attention amongst at least two devices while watching Game of Thrones. My daughters and wife are squarely in that camp. Ironically, I now get frustrated because they don’t watch TV the same way I do – catatonically.

Now, I’m sure watching TV does not represent the pinnacle of focused mindfulness. But this could be a canary in a coal mine. We simply don’t allocate undivided attention to anything anymore. We think we’re multi-tasking, but that’s a myth. We don’t multi-task – we mentally fidget. We have the attention span of a gnat.

So, what is the price we’re paying for living in this attention-deficit world? Well, first, there’s a price to be paid when we do decide to communicate. I’ve already described how unnerving it was for me when I did have someone’s laser-focused attention. But the opposite is also true. It’s tough to communicate with someone who is obviously paying little attention to you. Try presenting to a group that is more interested in chatting with each other. Research shows that our ability to communicate effectively erodes quickly when we’re not getting feedback that the person or people we’re talking to are actually paying attention to us. Effective communication requires an adequate allocation of attention on both ends; otherwise, it spins into a downward spiral.

But it’s not just communication that suffers. It’s our ability to focus on anything. It’s just too damned tempting to pick up our smartphone and check it. We’re paying a price for our mythical multitasking – Boise State professor Nancy Napier suggests a simple test to prove this. Draw two lines on a piece of paper. While having someone time you, write “I am a great multi-tasker” on one, then write down the numbers from 1 to 20 on the other. Next, repeat this same exercise, but this time, alternate between the two: write “I” on the first line, then “1” on the second, then go back and write “a” on the first, “2” on the second and so on. What’s your time? It will probably be double what it was the first time.
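
To see why the time roughly doubles, here’s a minimal back-of-the-envelope sketch in Python. The per-item and switch-cost figures are invented assumptions purely for illustration – not measurements from Napier’s test – but they show how the overhead of each mental switch, rather than the writing itself, is what eats the clock:

    # Toy model of task-switching cost. All timing numbers are invented for illustration.
    ITEM_TIME = 1.0     # assumed seconds to write one character or one number
    SWITCH_COST = 0.8   # assumed seconds of mental overhead every time we switch tasks

    def total_time(items, switches):
        # Total time is the writing itself plus a penalty for every task switch.
        return items * ITEM_TIME + switches * SWITCH_COST

    items = 20 + 20  # roughly 20 characters in the sentence, plus the numbers 1 to 20

    sequential = total_time(items, switches=1)           # finish one line, then the other
    alternating = total_time(items, switches=items - 1)  # switch after every single item

    print(f"Sequential:  {sequential:.0f} seconds")   # ~41
    print(f"Alternating: {alternating:.0f} seconds")  # ~71 - the switching, not the writing, eats the time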

Every time we try to mentally juggle, we’re more likely to drop a ball. Attention is important. But we keep allocating thinner and thinner slices of it. And a big part of the reason is the smartphone that is probably within arm’s reach of you right now. Why? Because of something called intermittent variable rewards. Slot machines use them. And that’s probably why slot machines make more money in the U.S. than baseball, movies and theme parks combined. Tristan Harris, who is taking technology to task for hijacking our brains, explains the concept: “If you want to maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.”

Your smartphone is no different. In this case, the reward is a new email, Facebook post, Instagram photo or Tinder match. Intermittent variable rewards – together with the fear of missing out – make your smartphone as addictive as a slot machine.
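
To make Harris’s point concrete, here’s a minimal sketch in Python of a variable reward schedule. The 30 percent payoff rate and the list of “prizes” are my own assumptions for illustration; the mechanic itself – an unpredictable payoff attached to a repeatable action – is the one he describes:

    import random

    # Toy model of an intermittent variable reward schedule - the slot-machine mechanic.
    # The 30% hit rate is an arbitrary assumption, purely for illustration.
    REWARD_PROBABILITY = 0.3

    def check_phone():
        # One "pull of the lever": sometimes there's a prize, usually there's nothing.
        if random.random() < REWARD_PROBABILITY:
            return random.choice(["new email", "Facebook like", "Instagram comment", "Tinder match"])
        return "nothing"

    # Ten checks: the payoff arrives on an unpredictable schedule,
    # so every single check carries the possibility of a reward.
    for pull in range(1, 11):
        print(f"Check {pull}: {check_phone()}")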

I’m sorry, but I’m no match for all of that.

When Technology Makes Us Better…

I’m always quick to point out the darker sides of technology. So, to be fair, I should also give credit where credit is due. That’s what today’s column is about. Technology, we collectively owe you one. Why? Because without you, we wouldn’t be slowly chipping away at the massive issue of sexual predation. #Metoo couldn’t have happened without you.

I’ve talked before about Mark Granovetter’s threshold model of crowd behavior. In the past, I’ve used it to explain how it can tip collective behavior toward the negative, turning crowds into mobs. But it can also work the other way, turning crowds into movements. Either way, the threshold model depends on connection, and technology makes that connection possible. What’s more, it makes it possible in a very specific way that is important to understand.

Technological connection is often ideological connection. We connect in ad hoc social networks that center around an idea. We find common ground that is not physical but conceptual. In the process, we forge new social connections that are freed from the typical constraints that introduce friction in the growth of social networks. We create links that are unrestricted by how people look, where they live, how much they earn or what church they worship at. All we need is to find resonance within ideas and we can quickly create a viral wave. The cost of connection is reduced.

This in no way diminishes the courage required to post the #metoo hashtag. I have been in the digital world for almost three decades now, and in that time I have met many, many remarkable women. I hope I have judged them as fellow human beings and have treated them as equals. It has profoundly saddened me to see most of them join the #metoo movement in the past few weeks. It has been painful to learn just how pervasive the problem is and to see this light creep into a behavioral basement of which we are becoming more aware. But it is oh-so-necessary. And I must believe that technology – and the comfort it affords by letting you know you’re not alone – has made it just a little bit easier to type those six characters.

As I have always said – technology erases friction. It breaks down those sticking points that used to allow powerful individuals to exert control. Control is needed to maintain the circles of complicity that allow the Harvey Weinsteins of the world to prey on others. But with technology, all we need is one little crack in that circle to set in motion a chain reaction that blasts it apart.

I believe the Weinstein example will represent a sea-change moment in how our society views sexual predation. These behaviors are always part of a power game. For it to continue, the perpetrator must believe in their own power and their ability to maintain it. Once the power goes, so does the predation. #Metoo has shown that your power can disappear immediately and permanently if you get publicly tagged. “If it happened to Harvey, it could happen to me” may become the new cautionary tale.

But I hope it’s not just the fear of being caught that pushes us to be better. I also hope we have learned that it’s not okay to tolerate this. In the incredibly raw and honest post by screenwriter Scott Rosenberg, we had our worst fears confirmed: “Everybody f—ing knew!” And everybody who knew is being sucked into the whirlpool of Harvey’s quickly sinking bulk. I have to believe this is tipping the balance in the right direction. We good men (and women) might be less likely to do nothing next time.

Finally, technology has made us better, whether we believe it or not. In 1961, when I was born, Weinstein’s behavior would have been accepted as normal. It would even have been considered laudable in some circles (predominantly male circles – granted). As a father of two daughters, I am grateful that that’s not the world we live in today. The locker-room mentality that allows the Harvey Weinsteins, Robert Scobles and Donald Trumps of the world to flourish is being chipped away – #metoo post by #metoo post.

And we have technology to thank for that.

Together We Lie

Humans are social animals. We’ve become this way because – evolutionarily speaking – we do better as a group than individually. But there’s a caveat here. If you get a group of usually honest people together, they’re more likely to lie. Why is this?

Martin Kocher and his colleagues at LMU Munich set up a study in which participants watched a video of a single roll of a die and then reported the number that came up. Depending on what they reported, there was a payoff. The researchers tested both individuals and small groups, with group members given the opportunity to chat anonymously with one another before reporting. The result:

“Our findings are unequivocal: People are less likely to lie if they decide on their own.”

Even individuals who had answered honestly on their own started lying once they were part of a group.

The researchers called this a “dishonesty shift.” They blame it on a change in the weight placed on the norm of honesty. Norms are the patterns that guide our behaviors and beliefs. But the norms we follow individually may differ from the ones we follow when we’re part of a group.

“Feedback is the decisive factor. Group-based decision-making involves an exchange of views that may alter the relative weight assigned to the relevant norm.”

Let’s look at how this may play out. Individually, we may default to honesty. We do so because we’re unsure of the consequences of not being honest. But when we get in a group, we start talking to others and it’s easier to rationalize not being honest – “Well, if everyone’s going to lie, I might as well too.”

Why is this important? Because marketing is done in groups, by groups, to groups. The dynamics of group-based ethics are important for us to understand. It could help to explain the most egregious breaches of ethics we see becoming more and more commonplace, either in corporations or in governments.

Four of the seminal studies in psychology and sociology shed further light on why groups tend to shift towards dishonesty. Let’s look at them individually.

In 1955, Solomon Asch showed that even if we individually believe something to be incorrect, if enough people around us voice a different opinion, we’ll go with the group consensus rather than risk being the odd person out. In his famous study, he surrounded a subject with “plants” who, when shown cards with black lines of obviously differing lengths, would insist that two mismatched lines were equal. Fully 75% of the subjects went along with the group at least once rather than risk disagreement. As Asch said in his paper – quoting sociologist Gabriel Tarde – “Social man is a somnambulist.” We have about as much independent will as your average sleepwalker.

Now, let’s continue with Stanley Milgram’s obedience-to-authority study, perhaps the most controversial and frightening of the group. When confronted with an authoritative demeanor, a white coat and a clipboard, 63% of the subjects meekly followed directions and delivered what they believed to be potentially lethal levels of electrical shock to a hapless individual. The results were so disheartening that we’ve been trying to debunk them ever since. But a follow-up study by Stanford psychology professor Philip Zimbardo – in which subjects were arbitrarily assigned roles as guards and inmates in a mock prison scenario – was even more shocking. We’re more likely to become monsters and abandon our personal ethics when we’re in a group than when we act alone. Whether it’s obedience to authority – as Milgram was trying to prove – or social conformity taken to the extreme, we tend to do very bad things when we’re in bad company.

But how do we slip so far, so quickly, from our own personal ethical baseline? Here’s where the last study I’ll cite can shed a little light. Sociologist Mark Granovetter – famous for his Strength of Weak Ties paper – also looked at the viral spread of behaviors in groups. I’ve talked about this in a previous column, but here’s the short version: if we have a choice between two options, each with accompanying social consequences, which one we choose may be driven by social conformity. If we see enough other people around us picking the more disruptive option (e.g., starting a riot), we may follow suit. Even though we all have different thresholds – and we do – the nature of a crowd is such that those with the lowest thresholds pick the disruptive option first, setting off a bandwagon effect that eventually tips the entire group over the threshold.
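
For anyone who wants to see the mechanics, here’s a minimal sketch of a Granovetter-style threshold cascade in Python. The crowd size and the threshold values are arbitrary assumptions for illustration; the point is how a single low-threshold instigator can tip an entire group:

    # Minimal sketch of Granovetter's threshold model of collective behavior.
    # Each person joins in once the number of people already participating
    # meets or exceeds their personal threshold.

    def run_cascade(thresholds):
        # Return how many people end up participating, given everyone's threshold.
        participating = 0
        while True:
            new_count = sum(1 for t in thresholds if t <= participating)
            if new_count == participating:
                return participating
            participating = new_count

    # A crowd of 10 whose thresholds form an unbroken chain: 0, 1, 2, ... 9.
    # The instigator (threshold 0) acts, which tips the threshold-1 person, and so on.
    print(run_cascade(list(range(10))))            # -> 10: the whole crowd joins in

    # Remove the single person with threshold 1 and the chain breaks.
    print(run_cascade([0] + list(range(2, 11))))   # -> 1: only the instigator acts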

These effects were all studied in isolation, because that’s how science works: we study variables in isolation. But it’s when factors combine that we get the complexity that typifies the real world – and the real marketplace. And that’s where predictability goes out the window. The group dynamics in play can create behavioral patterns that make no sense to the average person with an average degree of morality. But it’s happened before, it’s happening now, and it’s sure to happen again.

 

 

Addicted to Tech

A few columns ago, I mentioned one of the aspects that is troubling me about technology – the shallowness of social media. I had mentioned at the time that there were other aspects that were equally troubling. Here’s one:

Technology is addictive – and it’s addictive by design.

Let’s begin by looking at the definition of addiction:

Persistent compulsive use of a substance known by the user to be harmful

So, let’s break it down. I don’t think you can quibble with the “persistent, compulsive use” part. When’s the last time you had your iPhone in your hand? We can simply swap out “substance” for “device” or “technology.” So that leaves us with the last qualifier, “known by the user to be harmful” – and there are two parts to this: is it harmful, and does the user know it’s harmful?

First, let’s look at the neurobiology of addiction. What causes us to use something persistently and compulsively? Here, dopamine is the culprit. Our reward center uses dopamine and the pleasurable sensation it produces as a positive reinforcement to cause us to pursue activities which over many hundreds of generations have proven to be evolutionarily advantageous. But Dr. Gary Small, from the UCLA Brain Research Institute, warns us that this time could be different:

“The same neural pathways in the brain that reinforce dependence on substances can reinforce compulsive technology behaviors that are just as addictive and potentially destructive.”

We like to think of Big Tobacco as the most evil of all evil empires – guilty of promoting addiction to a harmful substance – but is there much separating it from the purveyors of tech – Facebook or Google, for instance? According to Tristan Harris, there may be a very slippery slope between the two. I’ve written about Tristan before. He’s the former Google product manager who launched the Time Well Spent non-profit, devoted to stopping “tech companies from hijacking our minds.” Harris points the finger squarely at the big Internet players for creating platforms that are intentionally designed to suck up as much of our time as possible. There’s empirical evidence to back up Harris’s accusations. Researchers at Michigan State University and two universities in the Netherlands found that even seeing the Facebook logo can trigger a conditioned response in a social media user that starts the dopamine cycle spinning. We start jonesing for a social media fix.

So, what if our smartphones and social media platforms seduce us into using them compulsively? What’s the harm? That’s the second part of the addiction equation: is whatever we’re using actually harmful? After all, it’s not like tobacco, which was proven to cause lung cancer.

Ah, but that’s the thing, isn’t it? We were smoking cigarettes for almost a hundred years before we finally found out they were bad for us. Sometimes it takes a while for the harmful effects of addiction to appear. The same could be true for our tech habit.

Tech addiction plays out at many different levels of cognition. This could potentially be much more sinister than the simple waste of time that Tristan Harris is worried about. There’s mounting evidence that overuse of tech could dramatically alter our ability to socialize effectively with other humans. The debate, which I’ve talked about before, comes when we substitute screen-to-screen interaction for face-to-face. The supporters say this is simply another type of social bonding – one that comes with additional benefits. The naysayers worry that we’re just not built to communicate through screens and that – sooner or later – there will be a price to be paid for our obsessive use of digital platforms.

Dr. Jean Twenge, professor of psychology at San Diego State University, researches generational differences in behavior. It’s here that the full impact of a disruptive environmental factor can be seen. She found a seismic shift in behaviors between Millennials and the generation that followed them – a profound difference in how these generations view the world and where they spend their time. And it started in 2012, the year the proportion of Americans who owned a smartphone surpassed 50 percent. She sums up her concern in unequivocal terms:

“The twin rise of the smartphone and social media has caused an earthquake of a magnitude we’ve not seen in a very long time, if ever. There is compelling evidence that the devices we’ve placed in young people’s hands are having profound effects on their lives—and making them seriously unhappy.”

Not only are we less happy, we may be becoming less smart. As we become more reliant on technology, we do something called cognitive off-loading. We rely on Google rather than our memories to retrieve facts. We trust our GPS more than our own wayfinding strategies to get us home. Cognitive off-loading is a way to move beyond the limits of our own minds, but there may be an unacceptable trade-off here. Brains are like muscles – if we stop using them, they begin to atrophy.

Let’s go back to that original definition and the three qualifying criteria:

  • Persistent, compulsive use
  • Harmful
  • We know it’s harmful

In the case of tech, let’s not wait a hundred years to put check marks after all of these.

 

 

Is Busy the New Alpha?

Imagine you’ve just been introduced into a new social situation. Your brain immediately starts creating a social hierarchy. That’s what we do. We try to identify the power players. The process by which we do this is interesting. The first thing we do is look for obvious cues. In a new job, that would be titles and positions. Then the process becomes very Bayesian: we form a base understanding of the hierarchy almost immediately, then constantly update it as we gain more knowledge. We watch power struggles and update our hierarchy based on the winners and losers. We start assigning values to the people in this particular social network and, more importantly, start assessing our place in the network and our odds of ascending in the hierarchy.
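
As a loose illustration of that “form a prior, then update on wins and losses” loop, here’s a minimal Beta-Bernoulli sketch in Python. The names, the starting pseudo-counts and the observed outcomes are all invented for illustration – it’s a sketch of the updating idea, not a model of how the brain actually does it:

    from collections import defaultdict

    # A loose Beta-Bernoulli sketch of "form a prior, then update on wins and losses".
    # Each person's standing is the estimated probability they come out on top
    # of a status contest. Names and outcomes below are invented for illustration.

    wins = defaultdict(lambda: 1)    # prior: one pseudo-win for everyone
    losses = defaultdict(lambda: 1)  # prior: one pseudo-loss for everyone

    def observe(winner, loser):
        # Update the hierarchy after watching one power struggle.
        wins[winner] += 1
        losses[loser] += 1

    def standing(person):
        # Posterior mean estimate of how often this person wins.
        return wins[person] / (wins[person] + losses[person])

    for winner, loser in [("Alice", "Bob"), ("Alice", "Carol"), ("Carol", "Bob")]:
        observe(winner, loser)

    ranking = sorted({"Alice", "Bob", "Carol"}, key=standing, reverse=True)
    print([(person, round(standing(person), 2)) for person in ranking])
    # -> [('Alice', 0.75), ('Carol', 0.5), ('Bob', 0.25)]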

All of that probably makes sense to you as you read it. There’s nothing really earthshaking or counterintuitive here. But what is interesting is that the cues we use to assign standing are context-dependent. They can also change over time. What’s more, they can vary from person to person or generation to generation.

In other words, like most things, our understanding of social hierarchy is in the midst of disruption.

An understanding of hierarchy appears to be hardwired into us. A recent study found that humans can determine social standing and the accumulation of power pretty much as soon as they can walk. Toddlers as young as 17 months could identify the alphas in a group. One of the study’s authors, University of Washington psychology professor Jessica Sommerville, said that even the very young can “see that someone who is more dominant gets more stuff.” That certainly squares with our understanding of how the world works. “More stuff” has been how we’ve determined social status for hundreds of years. In sociology, it’s called conspicuous consumption, a term coined by sociologist Thorstein Veblen. It’s a signaling strategy that has evolved in humans over our recorded history: the more stuff we had, and the less we had to do to get that stuff, the more status we had. Just over a hundred years ago, Veblen called this the Leisure Class.

But today that appears to be changing. A recent study seems to indicate that we now associate busyness with status. Here, it’s time – not stuff – that is the scarce commodity. Social status signaling is more apt to involve complaining about how we never go on a vacation than about our “summer on the continent”.

At least, this seems to be true in the U.S. The researchers also ran their study in Italy and there the situation was reversed. Italians still love their lives of leisure. The U.S. is the only developed country in the world without a single legally required paid vacation day or holiday. In Italy, every employee is entitled to at least 32 paid days off per year.

In our world of marketing – which is acutely aware of social signaling – this could create some interesting shifts in messaging. I think we’re already seeing this: campaigns aimed at busy people seem to equate scarcity of time with success. The one thing missing in all this social scrambling – whether it be conspicuous consumption or working yourself to death – might be happiness. Last year, a study out of the University of British Columbia found a strong link between happiness and valuing time more than money.

Maybe those Italians are on to something.

Curmudgeon, Chicken Little or Cognoscenti?

Apparently I’m old and out of step. Curmudgeonly, even. And this is from people of my own generation. My previous column about the potential shallowness encouraged by social media drew a few comments that indicated I was just being a grumpy old man. One was from an old industry friend – Brett Tabke:

“The rest of the article is like out of the 70’s in that it is devoid of the reality that is the uber-me generation. The selfie is only a reflection of their inward focus.”

The other was from Monica Emrich, whom I’ve never had the pleasure of meeting:

“‘Social Media Is Barely Skin-Deep.’ ho hum. History shows: when new medium hits, civilization as we know it is over.”

These comments seem to be telling me, “Relax. You just don’t understand because you’re too old. Everything will be great.” And if that’s true, I’d be okay with it. I’m more than willing to be proven a doddering old fool if it means technology is ushering us into a new era of human greatness.

But what if this time is different? What if Monica’s facetious comment actually nailed it? Maybe civilization as we know it will be over. The important part of this is “as we know it.” Every technological disruption unleashes a wave of creative destruction that pushes civilization in a new direction. We seem to blindly assume it will always go in the right direction. And it is true that technology has generally elevated the human race. But not uniformly – and not consistently. What if this shift is different? What if we become less than what we were? It can happen. Brexit – Xenophobia – Trump – Populism, all these things are surfing on the tides of new technology.

Here’s the problem. There are some aspects of technology that we’ve never had to deal with before – at least, not at this scale. One of these aspects (others will no doubt be the topic of a future Media Insider) is that technology is now immersive and ubiquitous. It creates an alternate reality for us, and it has done so in a few short decades. Why is this dangerous? It’s dangerous because evolution has not equipped us to deal with this new reality. In the past, when there has been a shift in our physical reality, it has taken place over several generations. Natural selection had time to reshape the human genome to survive and eventually thrive in the new reality. Along the way, we acquired checks and balances that allowed us to deal with the potentially negative impacts of the environment.

But our new reality is different. It has happened in the space of a single generation. There is no way we could have acquired natural defenses against it. We are operating in an environment for which we are untested. The consequences are yet to be discovered.

Now, your response might be to say, “Yes, evolution doesn’t move this quickly, but our brains can. They are elastic and malleable.” This is true, but there’s a big “but” hidden in this approach. Our brains rewire themselves to better match their environment. This is one of the things humans excel at. But this rewiring happens on top of a primitive platform with some built-in limitations. The assumption is that a better match with our environment provides a better chance for survival of the species.

But what if technology is throwing us a curve ball in this case? No matter what environment we have adapted to, there has been one constant: the history of humans depends on our success in living together. We have evolved to be social animals, but that evolution is predicated on the assumption that our socializing would take place face-to-face. Technology is artificially decoupling our social interactions from the very definition of society we have evolved to handle. A recent Wharton interview with Eden Collinsworth sounds the same alarm bells.

“The frontal lobes, which are the part of the brain that puts things in perspective and allows you to be empathetic, are constantly evolving. But it is less likely to evolve and develop those skills if you are in front of a screen. In other words, those skills come into play when you have a face-to-face interaction with someone. You can observe facial gestures. You can hear the intonation of a voice. You’re more likely to behave moderately in that exchange, unless it’s a just a knock-down, drag-out fight.”

Collinsworth’s premise – which is covered in her new book, Behaving Badly – is that this artificial reality is changing our concepts of morality and ethics. She reminds us the two are interlinked, but they are not the same thing. Morality is our own personal code of conduct. Ethics are a shared code that society depends on to instill a general sense of fairness. Collinsworth believes both are largely learned from the context of our culture. And she worries that a culture that is decoupled from the physical reality we have evolved to operate in may have dire consequences.

The fact is that if our morality and ethics are intended to keep us socially more cohesive, this works best in a face-to-face context. In an extreme example of this, Lt. Col. Dave Grossman, a former paratrooper and professor of psychology at West Point, showed how our resistance to killing another human in combat is inversely related to our physical distance from them. The closer we are to them, the more resistant we are to the idea of killing them. This makes sense in an evolutionary environment where all combat was hand-to-hand. But today, the killer could be in a drone flight control center thousands of miles from his or her intended target.

This evolved constraint on unethical behavior – the social check and balance of being physically close to the people we’re engaging with – is important. And while the two examples I’ve cited – the self-absorbed behavior on social networks and the moral landscape of a drone-strike operator – may seem magnitudes apart in terms of culpability, the underlying neural machinery is related. What we believe is right and wrong is determined by a moral compass set to the bearings of our environment. The fundamental workings of that compass assumed we would be face-to-face with the people we have to deal with. But thanks to technology, that’s no longer the case.

Maybe Brett and Monica are right. Maybe I’m just being alarmist. But if not, we’d better start paying more attention. Because civilization “as we know it” may be ending.

 

Social Media Is Barely Skin Deep

Here’s a troubling fact. According to a study from the Georgia Institute of Technology, half of all selfies taken have one purpose: to show how good the subject looks. They are intended to show the world how attractive we are: our makeup, our clothes, our shoes, our lips, our hair. This category accounts for more selfies than all other categories combined – more than selfies taken with the people or pets we love, more than us doing the things we love, more than being in the places we love, more than eating the food we love. It appears that the one thing we love most is ourselves. The selfies have spoken.

In this study, the authors reference a 1956 work by sociologist Erving Goffman – The Presentation of Self in Everyday Life. Goffman took Shakespeare’s line – “All the world’s a stage, and all the men and women merely players” – quite literally. His theory was that we are all playing the part of the person we want to be perceived as. Our lives are divided into two parts: the “front,” when we’re on stage and playing our part, and the “back,” where we prepare for our role. The roles we play depend on the context we’re in.

 

Goffman’s theory introduces an interesting variable into the mix: the way we play these roles, and the importance we place on them, varies with the individual. For some of us, it’s all about the role and less about the actual person who inhabits it. These people are obsessed with how they are perceived by others. They’re the ones snapping selfies to show the world just how marvelous they look.

Others care little about what the world thinks of them. They are internally centered and focused on living their lives, rather than acting their way through them for the entertainment of – and validation from – others. In between the two extremes is the ubiquitous bell curve of normal distribution. Most of us live somewhere on that curve.

Goffman’s theory was created specifically to provide insight into face-to-face encounters. Technology has again thrown a gigantic wrinkle into things – and that wrinkle may explain why we keep taking those narcissistic selfies.

Humans are pretty damned good at judging authenticity in a face-to-face setting. We pick up subtle cues across a wide swath of interpersonal communication channels: vocal intonations, body language, eye contact, micro-expressions. Together, these inputs give us a pretty accurate “bullshit detector.” If someone comes across as an inauthentic phony, the majority of us will just roll our eyes and quietly start avoiding that person. In face-to-face encounters, there is a social feedback mechanism that keeps the “actors” amongst us at least somewhat honest, in order to remain part of the social network that forms their audience.

But social media platforms provide the ideal incubator for inauthentic presentations of our personas. There are three factors in particular that allow shallow “actors” to flourish – even to the point of going viral.

False Intimacy and Social Distance

In his blog on Psychology Today, counselor Michael Formica talks about two of these factors – social distance and false intimacy. I’ve talked about false intimacy before in another context – the “labelability” of celebrities. Social media removes the transactional costs of retaining a relationship. This has the unfortunate side effect of screwing up the brain’s natural defenses against inauthentic relationships. When we’re physically close to a person, there are no filters for the bad stuff. We get it all. Our brains have evolved to do a cost/benefit analysis of each relationship we have and decide whether it’s worth the effort to maintain it. This works well when we depend on physically proximate relationships for our own well-being.

But social media introduces a whole new context for maintaining social relationships. When the transactional cost is reduced to scanning a newsfeed and hitting the “Like” button, the brain says, “What the hell, let’s add them to our mental friends list. It’s not costing me anything.” In evolutionary terms, intimacy is the highest status we can give to a relationship, and it typically only comes with a thorough understanding of the good and the bad involved in that relationship – gained by being close to the person, both physically and figuratively. With zero relational friction, we’re more apt to afford intimacy, whether or not it’s been earned.

The Illusion of Acceptance

The previous two factors perfectly set the “stage” for false personas to flourish, but it’s the third factor that allows them to go viral. Every actor craves acceptance from his or her audience. Social exclusion is the worst fate imaginable for them. In a face-to-face world, our mental cost/benefit algorithm quickly weeds out false relationships that are not worth the investment of our social resources. But that’s not true online. If it costs us nothing, we may be rolling our eyes – safely removed behind our screen – as we’re also hitting the “Like” button. And shallow people are quite content with shallow forms of acceptance. A Facebook like is more than sufficient to encourage them to continue their act. To make it even more seductive, social acceptance is now measurable – there are hard numbers assigned to popularity.

This is pure catnip to the socially needy. Their need to craft a popular – but entirely inauthentic – persona goes into overdrive. Their lives are not lived so much as manufactured to create a veneer just thick enough to capture a quick click of approval. Increasingly, they retreat to an online world that follows the script they’ve written for themselves.

Suddenly it makes sense why we keep taking all those selfies. When all the world’s a stage, you need a good head shot.