#AlexfromTarget – An Unexpected Consequence of Technology

Yes, I’m belatedly jumping on the #AlexfromTarget bandwagon, but it’s in service of a greater truth that I’m trying to illustrate. Last column, I spoke about the Unintended Consequences of Technology. I think this qualifies. And furthermore, this brings us full circle to Kaila Colbin’s original point, which started this whole prolonged discussion.

It is up to us to decide what is important, to create meaning and purpose. And, personally, I think we could do a better job than we’re doing now.

So, why did the entire world go ga-ga over a grocery bagger from Texas? What could possibly be important about this?

Well – nothing – and that’s the point. Thinking about important things is hard work. Damned hard work – if it’s really important. Important things are complex. They make our brains hurt. It’s difficult to pin them down long enough to plant some hooks of understanding in them. They’re like eating broccoli or doing push-ups. They may be good for us, but that doesn’t make them any more fun.

Remember the Yir Yoront from my last column – the tribal society that was thrown into a tailspin by the introduction of steel axes? The intended consequence of that introduction was to make the Yir Yoront more productive. The axes did make the tribe more productive, in that they were able to do the essential tasks more quickly, but the result was that the Yir Yoront spent more time sleeping.

Here’s the thing about technology. It allows us to be more human – and by that I mean the mixed bag of good and bad that defines humanity. It extends our natural instincts. It’s natural to sleep if you don’t have to worry about survival. And it’s also natural for young girls to gossip about adorable young boys. These are hard-wired traits. Deep philosophical thought is not a hard-wired trait. Humans can do it, but it takes conscious effort.

Here’s where the normal distribution curve comes in. Any genetically determined trait will have a normal distribution over the population. How we apply new technologies will be no different. The vast majority of the population will cluster around the mean. But here’s the other thing – that “mean” is a moving target. As our brains “re-wire” and adapt to new technologies, the mean that defines typical behavior will move over time. We adapt strategies to incorporate our new technology-aided abilities. This creates a new societal standard, and it is also human to follow the unwritten rules of society. The result is a cause-and-effect cycle: technologies enable new behaviors that are built on top of the foundations of human instinct – society determines whether these new behaviors are acceptable – and if they are acceptable, they become the new “mean” of our behavioral bell curve. We bounce new behaviors off the backboard of society.

So, much as we may scoff at the fan-girls who gave “Alex” insta-fame – ultimately it’s not the girls’ fault, or technology’s. The blame lies with us. It also lies with Ellen DeGeneres, the New York Times, and the other barometers of societal acceptance that offered endorsement of the phenomenon.
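
As a thought experiment, that moving mean can be sketched in a few lines of code: each generation, individual behaviors cluster around the current societal norm, and the norm itself drifts as society endorses new technology-enabled behaviors. The drift rate, spread and population size below are arbitrary illustrative assumptions, not measurements of anything.

```python
import random
import statistics

def simulate_norm_drift(generations=10, population=1000,
                        drift_per_generation=0.5, spread=1.0, seed=42):
    """Illustrative model: behavior 'scores' are normally distributed
    around a societal mean, and that mean drifts each generation as
    new technology-enabled behaviors gain acceptance."""
    rng = random.Random(seed)
    mean = 0.0
    observed_means = []
    for _ in range(generations):
        # Most of the population clusters around the current norm.
        behaviors = [rng.gauss(mean, spread) for _ in range(population)]
        observed_means.append(statistics.mean(behaviors))
        # Society endorses the new behaviors; the norm moves.
        mean += drift_per_generation
    return observed_means

means = simulate_norm_drift()
print(means[0], means[-1])  # "typical" behavior is a moving target
```

The point of the sketch is simply that a bell curve with a drifting center never settles: what counts as typical behavior in generation ten would have been an outlier in generation one.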

It’s human to be distracted by the titillating and trivial. It’s also human to gossip about it. There’s nothing new here. It’s just that these behaviors used to remain trapped within the limited confines of our own social networks. Now, however, they’re amplified through technology. It’s difficult to determine what the long-term consequences of this might be. Is Nicholas Carr right? Is technology leading us down the garden path to imbecility, forever distracted by bright, shiny objects? Or is our finest moment yet to come?

The Human Stories that Lie Within Big Data

If I wanted to impress upon you the fact that texting and driving is dangerous, I could tell you this:

In 2011, at least 23% of auto collisions involved cell phones. That’s 1.3 million crashes, in which 3,331 people were killed. Texting while driving makes it 23 times more likely that you’ll be in a car accident.

Or, I could tell you this:

In 2009, Ashley Zumbrunnen wanted to send her husband a message telling him “I love you, have a good day.” She was driving to work and as she was texting the message, she veered across the centerline into oncoming traffic. She overcorrected and lost control of her vehicle. The car flipped and Ashley broke her neck. She is now completely paralyzed.

After the accident, Zumbrunnen couldn’t sit up, dress herself or bathe. She was completely helpless. Now a divorced single mom, she struggles to look after her young daughter, who recently said to her, “I like to go play with your friends, because they have legs and can do things.”

The first example gave you a lot more information. But the second example probably had more impact. That’s because it’s a story.

We humans are built to respond to stories. Our brains can better grasp messages that come in a narrative arc. We do much less well with numbers. Numbers are an abstraction, so our brains struggle with them, especially big numbers.

One company, Monitor360, is bringing the power of narratives to the world of big data. I chatted with CEO Doug Randall recently about Monitor360’s use of narratives to make sense of Big Data.

“We all have filters through which we see the world. And those filters are formed by our experiences, by our values, by our viewpoints. Those are really narratives. Those are really stories that we tell ourselves.”

For example, I suspect the things that resonated with you in Ashley’s story were the reason for the text – telling her husband she loved him – the irony that the marriage eventually failed after her accident, and the pain she undoubtedly felt when her daughter said she likes playing with other moms who can still walk. All of those things, while they don’t really add anything to our knowledge about the incidence rate of texting-and-driving accidents, strike us at a deeply emotional level because we can picture ourselves in Ashley’s situation. We empathize with her. And that’s what a story is: a vehicle to help us understand the experiences of another.

Monitor360 uses narratives to tap into these empathetic hooks that lie in the mountain of information being generated by things like social media. It goes beyond abstract data to try to identify our beliefs and values. And then it uses narratives to help us make sense of our market. Monitor360 does this with a unique combination of humans and machines.

“A computer can collect huge amounts of data, and the computer can even sort that data. But ‘sense making’ is still very, very difficult for computers to do. So human beings go through that information, synthesize that information and pull out what the underlying narrative is.”

Monitor360 detects common stories in the noisy buzz of Big Data. In the stories we tell, we indicate what we care about.

“This is what’s so wonderful about Big Data. The Data actually tells us, by volume, what’s interesting. We’re taking what are the most often talked about subjects…the data is actually telling us what those subjects are. We then go in and determine what the underlying belief system in that is.”
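
The “volume” step Randall describes can be sketched as a toy word count: surface the most frequently mentioned subjects and leave the interpretation of the underlying belief system to a human. The posts and stopword list below are invented for illustration; a real system like Monitor360’s would presumably use far more sophisticated topic extraction.

```python
from collections import Counter

# Tiny function-word list, invented for this toy example.
STOPWORDS = {"a", "and", "by", "is", "should", "while", "you", "your"}

def top_subjects(posts, n=3):
    """Sketch of the 'volume' step: the most frequently mentioned
    content words stand in for the most talked-about subjects."""
    words = Counter()
    for post in posts:
        words.update(w for w in post.lower().split() if w not in STOPWORDS)
    return [word for word, _ in words.most_common(n)]

posts = [
    "texting and driving is dangerous",
    "saw a crash caused by texting",
    "driving while texting should cost you your license",
]
print(top_subjects(posts))  # the data itself surfaces the hot subjects
```

Even this crude count makes the point: nobody tells the program what the conversation is about; the volume of mentions does.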

Monitor360’s realization that it’s the narratives we care about is an interesting approach to Big Data. It’s also encouraging to know that they’re not trying to eliminate human judgment from the equation. Empathy is still something at which we can trump computers.

At least for now.

Social Media: Matching Maturity to the Right Business Model

Last week, I talked about the maturity continuum of social media. This week, I’d like to recap and look at the business model implications of each phase.

Phase One – It’s a Fad. Here, we use a new social media tool simply because it is new. This is a classic early adopter model. The business goal here is to drive adoption as fast and far as possible, hoping that acceptance will go viral. There is no revenue opportunity at this point, as you don’t want to do anything to slow adoption. It’s all about getting it into as many hands as possible.

Phase Two – It’s a statement. You use the tool because it says something about who you are. Revenue opportunities are still limited, but this is the time for cross-promotion with brands that make a similar statement. Messaging and branding become essential at this point. You have to carve a unique niche for yourself and hope that it resonates with segments of your market. The goal is to create an emotional connection with your audience to help shore up loyalty in the next phase. This is the time to start laying the foundations of a user community.

Phase Three – It’s a tool. You use it because it offers the best functionality for a particular task. Here, things have to get more practical. This is where user testing and new feature development have to move as quickly as possible. Revenue opportunities at this point are possible, depending on the usage profile of your app. If there’s high frequency of usage, advertising sponsorship is a possibility. But be aware that this will bring inevitable pushback from your users, especially if there has been no advertising up to this point. This shakes the loyalty of the “Statement” users, as they feel you’re selling out. The functionality will have to be rock solid to prevent attrition of your user base during this phase. Essentially, it will have to be good enough to “lock out” the competition. But there’s another goal here as well. Introducing new functionality allows you to move beyond being a one-trick pony. This is where you have to start moving from being a tool to the next phase…

Phase Four – It’s a Platform. If you’ve successfully transitioned to being a social media platform, you should have the opportunity to finally turn a profit. The stability of the revenue model will be wholly dependent on how high you’ve been able to raise the cost of switching. The more “sticky” your platform is, the more stable your revenue will be. But, be aware that using advertising as your revenue channel is fraught with issues in the world of social media. Unlike search, where we are used to dealing with a crystal clear indication of consumer interest, social media usage seldom comes tied to clear buyer intent. You have to worry about modality and social norms, along with the erosion of your “cool” factor.

In the last two phases, the best revenue opportunities should be directly tied to functionality and intent. The closer you can align your advertising message to the intent of the users “in the moment” the more stable your revenue model will be. In fact, if you can introduce tools that are focused on users when they are in social modes where commercial messaging is appropriate, you will find revenue opportunities dropping into your lap. For example, if users use LinkedIn to crowdsource opinions on B2B purchases, you have a natural monetization opportunity. If they’re using your app to post pictures of their cat playing a xylophone, you’re going to find it much harder to make a buck. Not impossible, but pretty damned difficult.

The Maturity Continuum of Social Media

Social channels will come and go. Why are we still surprised by this? Just last week, Catharine Taylor talked about the ennui that’s threatening to silence Twitter. Frankly, the only thing surprising about this is that Twitter has had as long a run as it has. Let’s face it: if ever there was a social media one-trick pony, it’s Twitter.

The fact is, if you are a player in the social media space, you have to accept that there’s a unique maturity evolution in usage patterns. It’s a much more fickle audience than you would find in something like content publishing or search. The channels we use to express ourselves socially are subject to an extraordinary amount of irrational behavior. We project our own beliefs about who we are and how we fit into our own social networks on them. This leaves them vulnerable to sudden shifts in usage, simply because large chunks of the audience may suddenly have changed their minds about what is socially acceptable. And this is what’s currently happening to Twitter.

This is compounded by the fact that we’re talking about technology here, so where we perceive ourselves to be on the technology acceptance curve will have an impact on the social channels we find acceptable to us. If we think we’re early adopters, we’ll be quicker to move to whatever is new. Not only this, we’ll be unduly influenced by what we see other early adopters doing.

The Maturity Continuum for Social is as follows:

It’s a Fad – You use it because everyone else (in your circle of influence) is doing it. Early adopters are particularly susceptible to this effect. They’ll be the ones to test out new channels and tools, simply because they are new. But that momentum doesn’t last long. New entrants will also have to prove that they have at least a certain amount of functionality, and, more importantly, something unique that users can identify with. If this is the case, they will transition to the second phase:

It’s a Statement – You use it because it makes a statement about who you are. And with technology, it’s usually about how cutting edge you are. This makes it particularly prone to abandonment. But there are other factors at play here. Is it all business (LinkedIn) or all fun (Snapchat)? A small percentage of the user base will stick in this phase, becoming brand loyalists. The majority, however, will move on to the third phase:

It’s a Tool – You use it because it’s the best tool for the job. Here, functionality trumps all. It’s in these last two phases where rationality finally takes hold. The sheen of the BSOS (Bright Shiny Object Syndrome) has faded and we’ll only continue using it if it provides better functionality for the task at hand than any of the other alternatives. The problem here is that functional supremacy is a never-ending arms race. Sooner or later, something better will come along (if it successfully navigates through the first two phases). This is typically the end of the road for most social media one-trick ponies, and this is what is currently staring Twitter in the face.

It’s a Platform – You use it because the landscape is familiar. Here, you rely on habitual “stickiness” with users and something called UI cognitive lock-in. Essentially, this is an online real estate play. If you’ve had a long run as a single purpose tool and have developed a large user base, you have to expand that into a familiar landscape before a new contender unseats you as the tool of choice. This is what Facebook and LinkedIn are currently trying to do. And, to survive, it’s what Twitter must do as well. By assembling a number of tools, you increase the cost of switching to the point where it doesn’t make sense for most users.

Each of these phases has different usage profiles, which directly impact their respective business models. More on that next week.


Two Views of the Promise of Technology

In the last two columns, I’ve looked at how technology may be making us intellectually lazy. The human brain tends to follow the path of least resistance and technology’s goal is to eliminate resistance. Last week, I cautioned that this may end up making us both more shallow in our thinking and more fickle in our social ties. We may become an attention deficit society, skipping across the surface of the world. But, this doesn’t necessarily have to be the case.

The debate is not a new one. Momentous technologies generally come complete with their own chorus of naysayers. Whether it’s the invention of writing, the printing press, electronic communication or digital media, the refrain is the same – this will be the end of the world as we know it. But if history has taught us anything, it’s that new technologies are seldom completely beneficial or harmful. Their lasting impact lies somewhere in the middle. With the good comes some bad.

The same will be true for the current digital technologies. The world will change, both for the positive and for the negative. The difference will come in how individuals use the technology. This will spread out along the inevitable bell curve.

Look at television, for instance. A sociologist could make a pretty convincing case for the benefits of TV. A better understanding of the global community helped ease our xenophobic biases. Public demand led to increased international pressure on repressive regimes. There was a sociological leveling that is still happening across cultures. Civil rights and sexual equality were propelled by the coverage they received. Atrocities still happen with far too much regularity, but I personally believe the world is a less savage and brutal place than it was 100 years ago, partially due to the spread of TV.

On the flip side, we have developed a certain laziness of spirit that is fed by TV’s never ending parade of entertainment to be passively consumed. We spend less time visiting our neighbors. We volunteer less. We’re less involved in our communities. Ironically, we’re  a more idealistic society but we make poorer neighbors.

The type of programming to be found on TV also shows that despite the passive nature of the medium, we didn’t become stupider en masse. Some of us use TV for enlightenment, and some of us use it to induce ourselves into a coma. At the end of the day, I think the positives and negatives of TV as a technology probably net out a little better than neutral.

I suspect the same thing is happening with digital media. Some of us are diving deeper and learning more than ever. Others are clicking their way through site after site of brain-porn. Perhaps there are universal effects that will show up over generations that will tip the scale one way or the other, but we’re too early in the trend to see those yet. The fact is, digital technologies are not changing our brains in a vacuum. Our environment is also changing and perhaps our brains are just keeping up. The 13-year-old who is frustrating the hell out of us today may be a much better match for the world 20 years from now.

I’ll wrap up by leaving three pieces of advice that seem to provide useful guides for getting the best out of new technologies.

First: A healthy curiosity is something we should never stop nurturing. In particular, I find it helpful to constantly ask “how” and “why.”

Second: Practice mindfulness. Be aware of your emotions and cognitive biases and recognize them for what they are. This will help you steer things back on track when they’re leading down an unhealthy path.

Third: Move from consuming content to contributing something meaningful. The discipline of publishing tends to push you beyond the shallows.

If you embrace the potential of technology, you may still find yourself as an outlier, but technology has done much to allow a few outliers to make a huge difference.

Are Our Brains Trading Breadth for Depth?

In last week’s column, I looked at how efficient our brains are. Essentially, if there’s a short cut to an end goal identified by the brain, it will find it. I explained how Google is eliminating the need for us to remember easily retrievable information. I also speculated about how our brains may be defaulting to an easier form of communication, such as texting rather than face-to-face communication.

Personally, I am not entirely pessimistic about the “Google Effect,” where we put less effort into memorizing information that can be easily retrieved on demand. This is an extension of Daniel Wegner’s “transactive memory”, and I would put it in the category of coping mechanisms. It makes no sense to expend brainpower on something that technology can do more easily, quickly and reliably. As John Mallin commented, this is like using a calculator rather than memorizing times tables.

Reams of research have shown that our memories can be notoriously inaccurate. In this case, I partially disagree with Nicholas Carr. I don’t think Google is necessarily making us stupid. It may be freeing up the incredibly flexible power of our minds, giving us the opportunity to redefine what it means to be knowledgeable. Rather than a storehouse of random information, our minds may have the opportunity to become more creative integrators of available information. We may be able to expand our “meta-memory”, Wegner’s term for the layer of memory that keeps track of where to turn for certain kinds of knowledge. Our memory could become an index of interesting concepts and useful resources, rather than ad-hoc scraps of knowledge.

Of course, this positive evolution of our brains is far from a given. And here Carr may have a point. There is a difference between “lazy” and “efficient.” Technology’s freeing up of the processing power of our brain is only a good thing if that power is then put to a higher purpose. Carr’s title, “The Shallows” is a warning that rather than freeing up our brains to dive deeper into new territory, technology may just give us the ability to skip across the surface of the titillating. Will we waste our extra time and cognitive power going from one piece of brain candy to the other, or will we invest it by sinking our teeth into something important and meaningful?

A historical perspective gives us little reason to be optimistic. We evolved to balance the efforts required to find food with the nutritional value we got from that food. It used to be damned hard to feed ourselves, so we developed preferences for high calorie, high fat foods that would go a long way once we found them. Thanks to technology, the only effort required today to get these foods is to pick them off the shelf and pay for them. We could have used technology to produce healthier and more nutritious foods, but market demands determined that we’d become an obese nation of junk food eaters. Will the same thing happen to our brains?

I am even more concerned with the short cuts that seem to be developing in our social networking activities. Typically, our social networks are built both from strong ties and weak ties. Mark Granovetter identified these two types of social ties in the ’70s. Strong ties bind us to family and close friends. Weak ties connect us with acquaintances. When we hit rough patches, as we inevitably do, we treat those ties very differently. Strong ties are typically much more resilient to adversity. When we hit the lowest points in our lives, it’s the strong ties we depend on to pull us through. Our lifelines are made up of strong ties. If we have a disagreement with someone with whom we have a strong tie, we work harder to resolve it. We have made large investments in these relationships, so we are reluctant to let them go. When there are disruptions in our strong tie network, there is a strong motivation to eliminate the disruption, rather than sacrifice the network.

Weak ties are a whole different matter. We have minimal emotional investments in these relationships. Typically, we connect with these either through serendipity or when we need something that only they can offer. For example, we typically reinstate our weak tie network when we’re on the hunt for a job. LinkedIn is the virtual embodiment of a weak tie network. And if we have a difference of opinion with someone to whom we’re weakly tied, we just shut down the connection. We have plenty of them so one more or less won’t make that much of a difference. When there are disruptions in our weak tie network, we just change the network, deactivating parts of it and reactivating others.

Weak ties are easily built. All we need is just one thing in common at one point in our lives. It could be working in the same company, serving on the same committee, living in the same neighborhood or attending the same convention. Then, we just need some way to remember them in the future. Strong ties are different. Strong ties develop over time, which means they evolve through shared experiences, both positive and negative. They also demand consistent communication, including painful communication that sometimes requires us to say we were wrong and we’re sorry. The type of conversation that leaves you either emotionally drained or supercharged is the stuff of strong ties. And a healthy percentage of these conversations should happen face-to-face. Could you build a strong-tie relationship without ever meeting face-to-face? We’ve all heard examples, but I’d place my bets on face-to-face – every time.

It’s the hard work of building strong ties that I fear we may miss as we build our relationships through online channels. I worry that the brain, given an easy choice and a hard choice, will naturally opt for the easy one. Online, our network of weak ties can grow beyond the inherent limits of our social inventory, known as Dunbar’s Number (which is 150, by the way). We could always find someone with whom to spend a few minutes texting or chatting online. Then we can run off to the next one. We will skip across the surface of our social network, rather than invest the effort and time required to build strong ties. Just like our brains, our social connections may trade breadth for depth.

The Pros and Cons of a Fuel Efficient Brain

Your brain will only work as hard as it has to. And if it makes you feel any better, my brain is exactly the same. That’s the way brains work. They conserve horsepower until it’s absolutely needed. In the background, the brain is doing a constant calculation: “What do I want to achieve, and based on everything I know, what is the easiest way to get there?” You could call it lazy, but I prefer the term “efficient.”

The brain has a number of tricks to do this that involve relatively little thinking. In most cases, they involve swapping something that’s easy for your brain to do in place of something difficult. For instance, consider when you vote. It would be extraordinarily difficult to weigh all the factors involved to truly make an informed vote. It would require a ton of brainpower. But it’s very easy to vote for whom you like. We have a number of tricks we use to immediately assess whether we like and trust another individual. They require next to no brainpower. Guess how most people vote? Even those of us who pride ourselves on being informed voters rely on these brain short cuts more than we would like to admit.

Here’s another example that’s just emerging, thanks to search engines. It’s called the Google Effect and it’s an extension of a concept called transactive memory. Researchers Betsy Sparrow, Jenny Liu and Daniel Wegner identified the Google Effect in 2011. Wegner first explained transactive memory back in the ’80s. Essentially, it means that we won’t bother to remember something that we can easily reference when we need it. When Wegner first talked about transactive memory, he used the example of a husband and wife. The wife was good at remembering important dates, such as anniversaries and birthdays. The husband was good at remembering financial information, such as bank balances and when bills were due. The wife didn’t have to remember financial details and the husband didn’t have to worry about dates. All they had to remember was what each other was good at memorizing. Wegner called this “chunking” of our memory requirements “metamemory.”

If we fast-forward 30 years from Wegner’s original paper, we find a whole new relevance for transactive memory, because we now have the mother of all “metamemories”, called Google. If we hear a fact but know that this is something that can easily be looked up on Google, our brains automatically decide to expend little to no effort in trying to memorize it. Subconsciously, the brain goes into power-saver mode. All we remember is that when we do need to retrieve the fact, it will be a few clicks away on Google. Nicholas Carr fretted about whether this and other cognitive short cuts were making us stupid in his book “The Shallows.”

But there are other side effects that come from the brain’s tendency to look for short cuts without our awareness. I suspect the same thing is happening with social connections. Which would you think required more cognitive effort: a face-to-face conversation with someone or texting them on a smartphone?

Face-to-face conversation can put a huge cognitive load on our brains. We’re receiving communication at a much greater bandwidth than with text. When we’re across from a person, we not only hear what they’re saying, we’re reading emotional cues, watching facial expressions, interpreting body language and monitoring vocal tones. It’s a much richer communication experience, but it’s also much more work. It demands our full attention. Texting, on the other hand, can easily be done along with other tasks. It’s asynchronous – we can pause and pick up whenever we want. I suspect it’s no coincidence that younger generations are moving more and more to text-based digital communication. Their brains are pushing them in that direction because it’s less work.

One of the great things about technology is that it makes our life easier. But is that also a bad thing? If we know that our brains will always opt for the easiest path, are we putting ourselves in a long, technology aided death spiral? That was Nicholas Carr’s contention. Or, are we freeing up our brains for more important work?

More on this to come next week.

The Era of Amplification

First published in Mediapost’s Search Insider, May 1, 2014

Mediapost columnist Joseph Jaffe wrote a great piece Tuesday on the Death of Anonymity. He shows how anonymity in the digital era has become both a blessing and a curse, leading to an explosion of cowardly, bone-headed comments and cyber-bullying. This reinforces something I’ve said repeatedly: technology doesn’t change human behavior; it just enables it in new ways. Heroes will find new ways to be heroes, and idiots will find new ways to be idiots.

But there is something important happening here. It’s not that technology is making us meaner, more cowardly or more stupid. I grew up with bullies, my father grew up with bullies and his father grew up with bullies. You could trace a direct line of bullies going back to the first time our ancestors walked erect, and probably further than that. So what’s different today? Why do we now need laws against cyber-bullying?

It’s because we now live in a time of increased amplification. The waves that spread from an individual’s actions go farther than ever before. Technology increases the consequences of those actions. A heroic act can spread through a network and activate other heroes, creating a groundswell of heroism. Unfortunately, the flip side is also true – bullying can beget more bullying. The viral spread of bullying that technology enables can make the situation hopeless for the victim.

Consider the case of Amanda Todd, a grade 10 student from Port Coquitlam, BC, Canada. Todd had been bullied for over a year by a guy who wanted “a show”. She finally relented and flashed her breasts. While not advisable, Todd’s actions were not that unusual. She wasn’t the first 15-year-old to experiment with a little sexual promiscuity after prolonged male pleading. It certainly shouldn’t have turned into a death sentence for Todd. But it did – because of amplification.

First of all, Todd’s tormentor was a man who lived thousands of miles away, in Holland. They never met. Secondly, Todd’s indiscretion was captured in a digital picture and was soon circulated worldwide. Like teen-agers since time began, Todd was mercilessly teased. But it wasn’t just at the hands of a small circle of bullies at her high school. Taunts came from jerks around the world who jumped on the bandwagon. A teen-ager’s psyche is typically a fragile thing, and the amplitude of that teasing was psychologically crushing for Todd. Desperate for escape, she first recorded a plea for understanding that she posted online, and then took her own life. The act that started all this should have been added to that pile of minor regrets we all assemble in our adolescence. It should not have ended the way it did. Unfortunately, Todd was a victim of amplification.

My wife and I have two daughters, one of whom is about the same age as Todd. Because they grew up in the Era of Amplification, we pounded home the fact that anything captured online can end up anywhere. You just can’t be careless, not even for the briefest of moments. But, of course, teenagers are occasionally careless. It’s part of the job description. They’re testing the world as a place to live in – experimenting with what it means to be an adult – and mistakes are inevitable. Unfortunately, the potential price to be paid for those mistakes has been raised astronomically.

Here’s perhaps the most frightening thing about this. Todd’s YouTube video has been seen over 17 million times, so it too has been amplified by technology. Amanda’s story has spread through the world online. The vast majority of comments are those you would hope to see – expressions of sympathy, support, understanding and caring. But there are a handful of hateful comments of the sort that drove Todd to suicide. Technology allows us to sort and filter for negativity. In other words, technology allows bullies to connect to bullies.

In social networks, there is something called “threshold-limited spreading.” Essentially, it means that for a behavior to spread through a network, the number of people already engaging in it has to cross a certain threshold. In the case of bullying, as in the case of rioting or social movements, that threshold depends on the connections between like-minded individuals. If bullies can connect in a cluster, they draw courage from each other. This can then trigger a cascade effect, encouraging those “on the margin” to also engage in bullying. Technology, because of its unique ability to connect those who think alike, can trigger these cascades of bullying. It doesn’t matter if the ratio of positive to negative comments is ten to one or even a hundred to one. All that matters is that there are enough negative comments for the would-be bully to feel that he or she has support.
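The mechanics can be sketched with a toy cascade model. This is my own illustration, not drawn from any real data – the network, the thresholds and the numbers are all invented. The rule is simple: each node adopts a behavior once the fraction of its neighbors already doing so meets that node’s personal threshold.

```python
def cascade(neighbors, thresholds, seeds):
    """Threshold cascade: a node activates once the fraction of its
    neighbors already active meets that node's personal threshold."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in neighbors.items():
            if node in active or not nbrs:
                continue
            if sum(n in active for n in nbrs) / len(nbrs) >= thresholds[node]:
                active.add(node)
                changed = True
    return active

# A tight cluster of like-minded nodes (0-3) with low thresholds, plus
# "marginal" nodes (4-6) who need half or more of their contacts on board.
neighbors = {
    0: [1, 2, 3],
    1: [0, 2, 3],
    2: [0, 1, 3, 4],
    3: [0, 1, 2, 4, 5],
    4: [2, 3, 5, 6],
    5: [3, 4, 6],
    6: [4, 5],
}
thresholds = {0: 0.3, 1: 0.3, 2: 0.3, 3: 0.3, 4: 0.5, 5: 0.5, 6: 0.6}

print(sorted(cascade(neighbors, thresholds, seeds={0})))  # cluster seed
print(sorted(cascade(neighbors, thresholds, seeds={4})))  # marginal seed
```

Notice the asymmetry: seeded inside the tightly connected, low-threshold cluster, the behavior sweeps the whole network, but seeded at a lone “marginal” node, it stalls immediately. That’s the courage-from-the-cluster effect in miniature.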

I don’t know what the lasting impact of the Era of Amplification will be. I do know that technology has made the world a much more promising place than it was when I was born. I also know it’s made it much crueler and more frightening. And it’s not because of any change in who we are. It’s because the ripples of our actions can now spread further than we can even imagine.

The Psychology of Social: Are We Hardwired to Use Social Media?

Man is by nature a social animal; an individual who is unsocial naturally and not accidentally is either beneath our notice or more than human. Society is something that precedes the individual. Anyone who either cannot lead the common life or is so self-sufficient as not to need to, and therefore does not partake of society, is either a beast or a god. 

Aristotle

I’ve looked at online entertainment and I’ve looked at online tools, both in a quest to see where loyal and stable audiences might be found. But that leaves one huge part of the online landscape unexplored – online social media. In both my previous explorations, the scope of the quest quickly expanded into several posts. I suspect social media will be at least as difficult to confine to a few posts, if not more so.

One thing that both entertainment and usefulness had in common was their foundation – our human drives. In every area I’ve explored up to now, I’ve found that our interactions with technology, as fickle as they may be, are layered over innate human drives with origins reaching back thousands of generations. In entertainment, although the channels may have changed drastically in the past few decades (digital media, video games, virtual environments), our responses are predictably human. The things that make us cry, jump in our seats or laugh out loud really haven’t changed much in thousands of years. Humans adapt quickly to new technology, but our tastes remain reliably consistent.

Usefulness is a little different. In this case, our expectations of utility and the ever-rising bar of technology form something of an arms race, with each upping the ante for the other. New tools allow us to do new things, which resets our expectations. Those reset expectations cause us to periodically review the tools we use, and if they no longer measure up, we go looking for new ones. But even when we’re on the hunt for increased usefulness, we still use strategies that appear to have evolved hundreds of thousands of years ago on the savannah. I believe we forage for and evaluate useful technologies the same way we forage for food. This means that while technologies may change quickly, our behaviors towards them are remarkably predictable.

So, what should we expect as we explore how the human need for society plays out in new online arenas? Again, I think it’s safe to say that our behaviors will be driven by innate human needs and strategies. So that seems to be as good a place as any to start.

In their book “Driven: How Human Nature Shapes Our Choices,” Harvard professors Paul Lawrence and Nitin Nohria tried to reduce human nature to the smallest possible number of non-redundant factors. They came up with four irreducible drives:

  • The Need to Acquire
  • The Need to Bond
  • The Need to Learn
  • The Need to Defend

All human actions, all cultural trends, all societal behaviors will be driven by one or a combination of these factors. If Lawrence and Nohria are right, then the usage of social media should be no exception. Let’s look at the four to see how they might map onto social media usage.

The Need to Bond

I’ll start with the most obvious one – the Need to Bond. Social media is all about bonding. This hits squarely at the heart of our social nature. As Aristotle said, we’re not built to be alone. Humans thrive in herds. And social media provides us with a digitally mediated way to bond.

The complexity of our social bonds is staggering. It’s amazing to think of all the dimensions we impose on our social relationships. Things like status, gossip, empathy, reciprocity, jealousy, xenophobia, admiration, loyalty, love, hate and so many other emotionally charged factors constantly occupy our minds as we try to navigate the stormy waters of our social connections. We might be tempted to throw up our hands in frustration and live in social isolation, but we don’t. Why? Because evolution has proven conclusively that we’re better together than apart. That strategy has been hardwired into our genes. As much as maintaining a social network is a complete pain in the ass sometimes, it’s a necessary part of the human experience. Most times, the benefits outweigh the drawbacks.

The challenge, however, is that all this baggage gets hauled over to whatever new platforms we use to connect with others. This includes online social media. To be effective and engaging, a social media tool has to allow us to do the things we have always done to survive and thrive in our respective herds – whether it’s to connect more often with family, gossip in real time, brag more effectively to all of our acquaintances at once or reconnect with those in the farther-flung regions of our networks. While they’re all very human, these activities, when brought onto a publishing platform (which is a major feature of all social media), introduce a significant signal-to-noise issue.

The Need to Acquire

While we don’t usually acquire physical things through social media, we sure as hell use it to brag about the things we do acquire in the real world. An unhealthy proportion of social media activity is devoted to the acquisition of new cars, clothes, jewelry, trips, houses, boats – you name it, we tweet (or Facebook, or Instagram) about it. The arms race of social status is being waged daily on social media.

The Need to Learn

One of the biggest reasons humans became social animals is that it was a much more efficient way to learn. In a herd, we don’t have to learn every lesson ourselves – we can learn from the experiences of others. Of course, that requires a way for lessons to spread throughout our networks. Stories, gossip, rumors – these are all social forms of information transmission. And they have all migrated onto our digital social media platforms.

The Need to Defend

This is probably the least social of Nohria and Lawrence’s four drives, at least as it might apply to the use of social media. Humans need to defend themselves, their kin, their community (or tribe, or nation), their possessions, their reputation, their status, their beliefs and their security. But, like all the drives, the need to defend – especially the defense of our beliefs, status or reputation – does play out in the online forum as well.

When looked at in the context of these four innate drives, it’s clear that the use of social media aligns well with our evolved requirements. It is just another channel we can use to let our pre-wired social tendencies play out. So, it passes the first gut-test. This is something we would do naturally, with or without the tools of social media. The next question is, how might our social activities change, for the good and the bad, when they’re mediated through digital channels? I’ll come back here in the next post.

 

#Meaningless #Crap

First published April 10, 2014 in Mediapost’s Search Insider

Everybody should have a voice – I get that. Thank goodness that the web and social media have democratized publication. Because of that, the power to say what’s on our mind is just a click away. From this power, great things have come and will continue to come – the overthrow of tyrants, the quest for truth, freedom from oppression. I’m pretty sure those are all good things. Important things.

But I’m also pretty sure the signal-to-noise ratio in social media content is infinitesimal – verging on undetectable. For every post that moves humanity incrementally forward, there are thousands that drive us over the brink into mind-numbing mediocrity.

For example, Justin Bieber has 51 million followers, and has tweeted 26,508 times. That, in case you’re wondering, has produced 1.35 trillion “Bieberisms,” or 193 little Bieber-tweets for every man, woman and child on planet Earth. Here’s one of his finest: “Put your heart into everything you do”. Perhaps the Biebs would be better served by using his head a little bit too. But no matter, he tweets on, sharing his special brand of wisdom. No wonder over 70% of all tweets never get read.
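For what it’s worth, the arithmetic holds up. A quick back-of-envelope check (the seven-billion world population is my rough figure for the time of writing, not from the column):

```python
# Rough check of the Bieber numbers quoted above.
followers = 51_000_000
tweets = 26_508
world_population = 7_000_000_000  # approximate figure, circa 2014

impressions = followers * tweets            # potential "Bieberisms" delivered
per_person = impressions / world_population

print(f"{impressions / 1e12:.2f} trillion")  # 1.35 trillion
print(round(per_person))                     # 193 per man, woman and child
```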

And, for God’s sake – stop hashtagging everything! First of all, it only belongs on Twitter and Instagram. It’s not a universal punctuation mark. And it doesn’t belong in front of every word of your post! If you’re writing about something that falls under a topic category that people actually care about – then by all means slip a hashtag in there. For example:

“Witnessing special forces retaking capitol building in Kiev – #ukrainecrisis”

Or:

“Just discovered key gene in early detection of Alzheimer’s – #alzheimerresearch”

See how it works? You’re adding key content to a topic that people care about and may actually be searching for on Twitter. This is how not to use hashtags:

“Off to a funeral #selfie #zebra #sunglasses #bling #hairdown #polo #countrygirl #aero #dodge #ram #cute”

All I can say is #shoot #me.

The other problem is that with this diarrheic explosion of content flooding online, it becomes impossible to sift through it all to find the things that are truly important. Generally, most content filters use one of two criteria – recency or popularity. Recency is fine if you’re looking for breaking news. It’s a clearly understood parameter. Popularity, however, has some issues. The theory is that the wisdom of crowds can be relied on to push the best content to the top. But that’s not really how the wisdom of crowds works. Just because something is popular doesn’t necessarily mean it’s good. And it certainly doesn’t mean it’s important. All too often, it just means it panders to the lowest common denominator. Do we really want that as our filtering criterion? Should Kanye West and Keeping Up with the Kardashians mark our cultural high-water mark?

One last rant. “Epic” is not the right adjective to apply to concert tickets, Saturday nights at the club, bowls of chili or, when incorrectly combined with the verb “fail”, your company’s Christmas party. According to this post,

“the word epic should only be used to describe two or three things, ever. In fact, here’s a comprehensive list of all things epic: 1. Oceans 2. Lengthy Narratives 3. The Cosmos.”

That’s it.

Feel free to retweet if you wish. Or not. No one will read it anyway.