Social Media is Barely Skin Deep

Here’s a troubling fact. According to a study from the Georgia Institute of Technology, half of all selfies taken have one purpose: to show how good the subject looks. They are intended to show the world how attractive we are: our makeup, our clothes, our shoes, our lips, our hair. The category accounts for more selfies than all other categories combined. More than selfies taken with the people or pets we love, more than us doing the things we love, more than being in the places we love, more than eating the food we love. It appears that the one thing we love the most is ourselves. The selfies have spoken.

In this study, the authors reference a 1956 work from sociologist Erving Goffman – The Presentation of Self in Everyday Life. Goffman took Shakespeare’s line – “All the world’s a stage, and all the men and women merely players” – quite literally. His theory was that we are all playing the part of the person we want to be perceived as. Our lives are divided into two parts – the front, when we’re “on stage” and playing our part, and the back, when we prepare for our role. The roles we play depend on the context we’re in.


Goffman’s theory introduces an interesting variable into consideration. The way we play these roles, and the importance we place on them, will vary with the individual. For some of us, it will be all about the role and less about the actual person who inhabits that role. These people are obsessed with how they are perceived by others. They’re the ones snapping selfies to show the world just how marvelous they look.

Others care little about what the world thinks of them. They are internally centered and focused on living their lives, rather than acting their way through them for the entertainment of – and validation from – others. Between the two extremes lies the ubiquitous bell curve of the normal distribution. Most of us live somewhere on that curve.

Goffman’s theory was created specifically to provide insight into face-to-face encounters. Technology has again thrown a gigantic wrinkle into things – and that wrinkle may explain why we keep taking those narcissistic selfies.

Humans are pretty damned good at judging authenticity in a face-to-face setting. We pick up subtle cues across a wide swath of interpersonal communication channels: vocal intonations, body language, eye contact, micro-expressions. Together, these inputs give us a pretty accurate “bullshit detector.” If someone comes across as an inauthentic “phony,” most of us will roll our eyes and start avoiding the person. In face-to-face encounters there is a social feedback mechanism that keeps the “actors” amongst us at least somewhat honest, in order to remain part of the social network that forms their audience.

But social media platforms provide the ideal incubator for inauthentic presentation of our own personas. There are three factors in particular that allow shallow “actors” to flourish – even to the point of going viral.

False Intimacy and Social Distance

In his blog on Psychology Today, counselor Michael Formica talks about two of these factors – social distance and false intimacy. I’ve talked about false intimacy before in another context – the “labelability” of celebrities. Social media removes the transactional costs of maintaining a relationship. This has the unfortunate side effect of screwing up the brain’s natural defenses against inauthentic relationships. When we’re physically close to a person, there are no filters for the bad stuff. We get it all. Our brains have evolved to do a cost/benefit analysis of each relationship we have and decide whether it’s worth the effort to maintain. This works well when we depend on physically proximate relationships for our own well-being.

But social media introduces a whole new context for maintaining social relationships. When the transactional costs are reduced to a scanning of a newsfeed and hitting the “Like” button, the brain says “What the hell, let’s add them to our mental friends list. It’s not costing me anything.” In evolutionary terms, intimacy is the highest status we can give to a relationship and it typically only comes with a thorough understanding of the good and the bad involved in that relationship by being close to the person – both physically and figuratively. With zero relational friction, we’re more apt to afford intimacy, whether or not it’s been earned.

The Illusion of Acceptance

The previous two factors perfectly set the “stage” for false personas to flourish, but it’s the third factor that allows them to go viral. Every actor craves acceptance from his or her audience. Social exclusion is the worst fate imaginable for them. In a face-to-face world, our mental cost/benefit algorithm quickly weeds out false relationships that are not worth the investment of our social resources. But that’s not true online. If it costs us nothing, we may be rolling our eyes – safely removed behind our screen – as we’re also hitting the “Like” button. And shallow people are quite content with shallow forms of acceptance. A Facebook like is more than sufficient to encourage them to continue their act. To make it even more seductive, social acceptance is now measurable – there are hard numbers assigned to popularity.

This is pure catnip to the socially needy. Their need to craft a popular – but entirely inauthentic – persona goes into overdrive. Their lives are not lived so much as manufactured to create a veneer just thick enough to capture a quick click of approval. Increasingly, they retreat to an online world that follows the script they’ve written for themselves.

Suddenly it makes sense why we keep taking all those selfies of ourselves. When all the world’s a stage, you need a good head shot.

The Medium is the Message, Mr. President

Every day that Barack Obama was in the White House, he read 10 letters. Why letters? Because form matters. There’s still something about a letter. It’s so intimate. It uses a tactile medium. Emotions seem to flow more easily through cursive loops and the sound of pen on paper. Letters balance the raw and the reflective. As such, they may be an unusually honest glimpse into the soul of the writer. Obama seemed to get that. An entire team of hundreds of people at the White House reviewed 10,000 letters a day and chose the 10 that made it to Obama, but the intent was to give an unfiltered snapshot of the nation at any given time. It was a mosaic of personal stories that – together – created a much bigger narrative.

Donald Trump doesn’t read letters. He doesn’t read much of anything. The daily presidential briefing has been dumbed down to media more fitting of the President’s 140-character attention span. Trump likes to be briefed with pictures and videos. His information medium of choice? Cable TV. He has turned Twitter into his official policy platform.

Today, technology has exponentially multiplied the number of communication media available to us. And in that multiplication, Marshall McLuhan’s 50-year-old trope about the medium being the message seems truer than ever. The channels we choose – whether we’re on the sending or receiving end – carry their own inherent message. They say who we are, what we value, how we think. They intertwine with the message, determining how it will be interpreted.

I’m sad that letter writing is a dying art, but I’m also contributing to its demise. It’s been years since I’ve written a letter. I do write this column, which is another medium. But even here I’m mislabeling it. Technically this is a blog post. A column is a concept embedded in the medium of print – with its accompanying physical restriction of column inches. But I like to call it a column, because in my mind that carries its own message. A column comes with an implicit promise between you – the readers – and myself, the author. Columns are meant to be regularly recurring statements of opinion. I have to respect the fact that I remain accountable for this Tuesday slot that MediaPost has graciously given me. Week after week, I try to present something that I hope you’ll find interesting and useful enough to keep reading. I feel I owe that to you. To me, a “post” feels more ethereal – with less of an ongoing commitment between author and reader. It’s more akin to drive-by writing.

So that brings me to one of the most interesting things about letters and President Obama’s respect for them. They are meant to be a thoughtful medium between two people. The thoughts captured within are important enough to the writer to be put in writing, but they are intended just for the recipient. They are one of the most effective media ever created for asking empathetic understanding from one person in particular. And that’s how Obama’s Office of Presidential Correspondence treated them. Each letter represented a person who felt strongly enough about something that they wanted to share it with the President personally. Obama used to read his ten letters at the end of the day, when he had time to digest and reflect. He often made notations in the margins, asking pointed questions of his staff or requesting more investigation into the circumstances chronicled in a letter. He chose to set aside a good portion of each day to read letters because he believed in the message carried by the medium: individuals – no matter who they are – deserve to be heard.

Our Brain on Reviews

An interesting new study was just published about how our brains mathematically handle online reviews, and I want to talk about it today. But before I get to that, I want to talk a bit about foraging.

The story of how science discovered our foraging behaviors serves as a mini lesson in how humans tick. The economists of the 1940s and ’50s built the world of micro-economics on the foundation that humans were perfectly rational – we were homo economicus. When making personal economic choices in a world of limited resources, we maximized utility. The economists of the time assumed this was a uniquely human property, bequeathed to us by virtue of the reasoning power of our superior brains.

In the ’60s, behavioral ecologists knocked our egos down a peg or two. It wasn’t just humans that could do this. Foxes could do it. Starlings could do it. Pretty much any species had the same ability to make seemingly optimal choices when faced with scarcity. It was how animals kept from starving to death. This was the birth of foraging theory. This wasn’t some Homo-sapiens-exclusive behavior directed from the heights of rationality downwards. It was an evolved behavior, built from the ground up. It’s just that humans had learned how to apply it to our abstract notion of economic utility.

Three decades later, two researchers at Xerox’s Palo Alto Research Center found another twist. Not only had our ability to forage evolved all the way through our extensive family tree, but we seemed to borrow this strategy and apply it to entirely new situations. Peter Pirolli and Stuart Card found that when humans navigate content in online environments, the exact same patterns emerge. We foraged for information. Those same calculations determined whether we would stay in an information “patch” or move on to more promising territory.

This seemed to indicate three surprising discoveries about our behavior:

  • Much of what we think is rational behavior is actually driven by instincts that have evolved over millions of years.
  • We borrow strategies from one context and apply them in another. We use the same basic instincts to find the FAQ section of a website that we used to find sustenance on the savannah.
  • Our brains seem to use Bayesian logic to continuously calculate and update a model of the world. We rely on this model to survive in our environment, whatever and wherever that environment might be.

So that brings us to the study I mentioned at the beginning of this column. If we take the above into consideration, it should come as no surprise that our brain uses similar evolutionary strategies to process things like online reviews. But the way it does it is fascinating.

The amazing thing about the brain is how it seamlessly integrates and subconsciously synthesizes information and activity from different regions. For example, in foraging, the brain integrates information from the regions responsible for wayfinding – knowing our place in the world – with signals from the dorsal anterior cingulate cortex – an area responsible for reward monitoring and executive control. Essentially, the brain is constantly updating an algorithm about whether the effort required to travel to a new “patch” will be balanced by the reward we’ll find when we get there. You don’t consciously marshal the cognitive resources required to do this. The brain does it automatically. What’s more – the brain uses many of the same resources and algorithm whether we’re considering going to McDonald’s for a large order of fries or deciding what online destination would be the best bet for researching our upcoming trip to Portugal.
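
To make that stay-or-go calculation concrete, here is a minimal Python sketch of a patch-leaving rule in the spirit of the marginal value theorem. It is purely illustrative: the reward curve, travel times and step size are invented for the example, and this is not the actual model used in the neuroscience.

```python
# Illustrative patch-leaving rule in the spirit of the marginal value theorem.
# All numbers and functions here are hypothetical, for demonstration only.

def reward_in_patch(t: float) -> float:
    """Cumulative reward after t minutes in a patch (diminishing returns)."""
    return 10 * (1 - 0.8 ** t)

def marginal_rate(t: float, dt: float = 0.1) -> float:
    """Approximate instantaneous reward rate at time t."""
    return (reward_in_patch(t + dt) - reward_in_patch(t)) / dt

def time_to_leave(travel_time: float, max_stay: float = 30.0) -> float:
    """Leave when the marginal rate drops below the average rate earned so far,
    where the average includes the time spent travelling to the patch."""
    t = 0.1
    while t < max_stay:
        average_rate = reward_in_patch(t) / (t + travel_time)
        if marginal_rate(t) < average_rate:
            return t
        t += 0.1
    return max_stay

# The farther away the next patch, the longer it pays to stay in this one.
for travel in (1, 5, 15):
    print(f"travel time {travel:>2} -> leave after {time_to_leave(travel):.1f} minutes")
```

The only point of the sketch is the shape of the trade-off: as the cost of reaching the next patch goes up, the threshold for abandoning the current one goes up with it.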

In evaluating online reviews, we have a different challenge: how reliable are the reviews? The context may be new – our ancestors didn’t have TripAdvisor or Airbnb ratings for choosing the right cave to sleep in tonight – but the problem isn’t. What criteria should we use when we decide to integrate social information into our decision-making process? Experience has taught us a couple of handy rules of thumb for evaluating sources of social information: the reliability of the source and the consensus of the crowd. If Thorlak the bear hunter tells me there’s a great cave a half-day’s march to the south, should I trust him? Has Thorlak ever lied to us before? Do others in the tribe agree with him? These are hardwired social heuristics. We apply them instantly and instinctively to new sources of information that come from our social network. We’ve been doing it for thousands of years. So it should come as no surprise that we borrow these strategies when dealing with online reviews.
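
As a rough illustration of those two heuristics, here is a small Python sketch that scores a tip by the source’s track record and by how much of the “tribe” agrees. The weights, names and numbers are all invented for the example; they do not come from the study.

```python
# Toy scoring of social information using two hardwired heuristics:
# (1) reliability of the source, (2) consensus of the crowd.
# Weights and numbers are arbitrary, for illustration only.

def source_reliability(correct: int, total: int) -> float:
    """Track record: share of past claims that turned out to be true."""
    return correct / total if total else 0.5  # no history -> stay neutral

def crowd_consensus(agree: int, disagree: int) -> float:
    """Consensus: share of the group that backs the claim."""
    voters = agree + disagree
    return agree / voters if voters else 0.5

def trust_score(correct: int, total: int, agree: int, disagree: int) -> float:
    """Blend the two heuristics into a single 0-1 trust score."""
    return 0.6 * source_reliability(correct, total) + 0.4 * crowd_consensus(agree, disagree)

# Thorlak has been right 8 times out of 10, and 5 of 7 tribe members back him up.
score = trust_score(correct=8, total=10, agree=5, disagree=2)
print(f"Trust in Thorlak's cave tip: {score:.2f}")  # ~0.77, probably worth the march
```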

In a neuro-scanning study from University College London, researchers found that reliability plays a significant role in how our brains treat social information. Once again, a well-evolved capability of the brain is recruited to help us in a new situation. The dorsomedial prefrontal cortex is the area of the brain that keeps track of our social connections. This “social monitoring” ability of the brain worked in concert with the ventromedial prefrontal cortex, an area that processes value estimates.

The researchers found that this part of our brain works like a Bayesian computer when considering incoming information. First we establish a “prior” that represents a model of what we believe to be true. Then we subject this prior to possible statistical updating based on new information – in this case, online reviews. If our confidence in this prior is high and the incoming information is weak, we tend to stick with our initial belief. But if our confidence is low and the incoming information is strong – i.e., a lot of positive reviews – then the brain overrides the prior and establishes a new belief, based primarily on the new information.
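
Here is a minimal numerical sketch of that prior-versus-evidence trade-off. I am using a simple Beta-Binomial update as a stand-in (it is not the model reported in the UCL study), where each review counts as a thumbs-up or thumbs-down observation. A confident prior barely moves under a handful of reviews; a weak prior gets overwhelmed by them.

```python
# Toy Bayesian update of a belief about, say, a hotel being "good."
# Beta-Binomial is my stand-in model, not the one used in the study;
# the priors and review counts below are invented.

def update_belief(prior_up: float, prior_down: float,
                  positive: int, negative: int) -> tuple[float, float]:
    """Return the mean belief (chance the hotel is good) before and after the reviews."""
    before = prior_up / (prior_up + prior_down)
    after = (prior_up + positive) / (prior_up + prior_down + positive + negative)
    return before, after

# Strong prior (lots of prior confidence), weak evidence: belief barely budges.
print(update_belief(prior_up=20, prior_down=20, positive=3, negative=0))  # 0.50 -> ~0.53

# Weak prior, strong evidence (a pile of positive reviews): the reviews take over.
print(update_belief(prior_up=1, prior_down=1, positive=40, negative=5))   # 0.50 -> ~0.87
```

The same shape of calculation, weighting the prior by how much we trust it and the evidence by how strong it is, is what the researchers describe happening below the level of conscious thought.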

While this seems like common sense, the mechanisms at play are interesting. The brain effortlessly pattern-matches new types of information and recruits the region most likely to have evolved to successfully interpret that information. In this case, the brain has decided that online reviews are most like information that comes from social sources. It combines the interpretation of this data with an algorithmic function that assigns value to the new information and calculates a new model – a new understanding of what we believe to be true. And it does all this “under the hood” – sitting just below the level of conscious thought.

Flow and the Machine

“In the future, either you’re going to be telling a machine what to do, or the machine is going to be telling you.”

Christopher Penn – VP of Marketing Technology, Shift Communications.

I often talk about the fallibility of the human brain – those irrational, cognitive biases that can cause us to miss the reality that’s right in front of our face. But there’s another side to the human brain – the intuitive, almost mystical machinations that happen when we’re on a cognitive roll, balancing gloriously on the edge between consciousness and subconsciousness. Malcolm Gladwell took a glancing shot at this in his mega-bestseller Blink. But I would recommend going right to the master of “flow” – Mihaly Csikszentmihalyi (pronounced, if you’re interested, me-hi Chick-sent-me-hi). The Hungarian psychologist coined the term “flow” to refer to a highly engaged mental state where we’re completely absorbed in the work at hand. Csikszentmihalyi calls it the “psychology of optimal experience.”

It turns out there’s a pretty complicated neuroscience behind flow. In a blog post, gamer Adam Sinicki describes a state in which the brain finds an ideal balance between instinctive behavior and total focus on one task. The state is called transient hypofrontality. It can sometimes be brought on by physical exercise, which is why some people can think better while walking, or even jogging. The brain juggles the resources required, and this can force a stepping down of the prefrontal cortex, the part of the brain that causes us to question ourselves. This part of the brain is required in unfamiliar circumstances, but in a situation where we’ve thoroughly rehearsed the actions required, it’s actually better if it takes a break. This allows other – more intuitive – parts of the brain to come to the fore. And that may be the secret of “flow.” It may also be the one thing that machines can’t replicate – yet.

The Rational Machine

If we were to compare the computer to a part of the brain, it would probably be the prefrontal cortex (PFC). When we talk about cognitive computing, what we’re really talking about is building a machine that can mimic – or exceed – the capabilities of the PFC. This is the home of our “executive function” – complex decision making, planning, rationalization and our own sense of self. It’s probably not a coincidence that the part of our brain we rely on to reason through complex challenges like designing artificial intelligence would build a machine in its own image. And in this instance, we’re damned close to surpassing ourselves. The PFC is an impressive chunk of neurobiology in its flexibility and power, but speedy it’s not. In fact, we’ve found that if we happen to make a mistake, the brain slows almost to a standstill. It shakes our confidence and kills any “flow” that might be happening in its tracks. This is what happens to athletes when they choke. With artificial intelligence, we are probably on the cusp of creating machines that can do most of what the PFC can do, only faster, more reliably and with the ability to process much more information.

But there’s a lot more to the brain than just the PFC. And it’s in this ethereal intersection between reason and intuition that the essence of being human might be hiding.

The Future of Flow

What if we could harness “flow” at will, working in partnership with a machine that can crunch data in real time and present us with the inputs required to continue our flow-fueled exploration without the fear of making a mistake? It’s not so much a machine telling us what to do – or the reverse – as it is a partnership between human intuition and machine-based rationalization. It’s analogous to driving a modern car, where intelligent safety and navigation features backstop our ability to drive.

Of course, it may just be a matter of time before machines best us in this area as well. Perhaps machines have already mastered flow, because they don’t have to worry about the consequences of making a mistake. But it seems to me that if humans have a future, it’s not going to be in our ability to crunch data and rationalize. We’ll have to find something a little more magical to stake our claim with.


Shopping is Dead. Long Live Shopping!

Last week, a delivery truck pulled up in my driveway. As the rear door rolled up, I saw the truck was full of Amazon parcels, including one for me. Between the four of us who live in our house, we have at least one online purchase delivered each week. Compared to our total retail spending, perhaps that’s not all that significant, but it’s a heck of a lot more than we used to spend.

We are a microcosm of a much bigger behavioral trend. A recent MediaPost article by Jack Loechner reported that online retail grew by 15.6 percent last year and now represents 11.7 percent of total retail sales. An IRI report shows similar trends in consumer packaged goods. In 2015, ecommerce represented just 1.5% of all consumer packaged goods sales, but IRI projects that to climb to 10% by 2022. In fueling that increase, Amazon is not only leading the pack, but dominating it to an awe-inspiring extent. Between 2010 and last year, Amazon’s sales in North America quintupled, from $16 billion to $80 billion. Hence all those packages in the back of the aforementioned truck.

Now, maybe all this still represents “small potatoes” in the total world of retail, but I think we’re getting close to an inflection point. We are fundamentally changing how we think of shopping, and once we let that demon out of the box (or bubble-wrapped envelope), there is no stuffing it back.

In the nascent days of online shopping, way back in 2001, an academic study looked at the experience of shopping online. The authors, Childers, Carr, Peck and Carson, divided the experience into two aspects: hedonic and utilitarian. I’ll deal with each in turn.

First, the hedonic side of shopping – the touchy-feely joy of buying stuff. It’s mainly the hedonic aspects that purportedly hold up the shaky foundations of all those bricks and mortar stores. And I wonder – is that a generational thing? People of my generation and older still seem to like a little retail therapy now and again. But for my daughters, the act of physically shopping is generally a pain in the ass. If they can get what they want online, they’ll do so with the click of a 1-Click button. They’ll visit a mall only if they have to.

In an article early this year in The Atlantic, Derek Thompson detailed the decimation of traditional retail. Mall visits declined 50 percent between 2010 and 2013, according to the real-estate research firm Cushman and Wakefield, and they’ve kept falling every year since. Retailers are declaring bankruptcy at alarming rates. Thompson points the finger at online shopping, but adds a little more context. Maybe the reason bricks and mortar retail is bleeding so badly is that it represents an experience that is no longer appealing. A quote from that article raises an interesting point:

“[Questions like] ‘What experience will reliably deliver the most popular Instagram post?’ really drive the behavior of people ages 13 and up. This is a big deal for malls,” says Barbara Byrne Denham, a senior economist at Reis, a real-estate analytics firm.

Malls were designed to provide an experience – to the point of ludicrous overkill in mega-malls like Canada’s West Edmonton Mall or Minnesota’s Mall of America. But increasingly, those aren’t the experiences we’re looking for. We’re still hedonistic, but our hedonism has developed different tastes. Things like travel and dining out with friends are booming, especially with younger generations. As Denham points out, our social barometers are determined not so much by what we have as by what we’re doing and whom we’re doing it with. Social proof of such things is just one quick post away.

Now let’s deal with the utilitarian aspects of shopping. According to a recent Harris Poll, the three most popular categories for online shopping are:

  1. Clothing and Shoes
  2. Beauty and Personal Care Products
  3. Food Items

Personally, when I look at the things I’ve recently ordered online, they include:

  • A barbecue
  • Storage shelves
  • Water filters for my refrigerator
  • A pair of sports headphones
  • Cycling accessories

I ordered these things online for one or both of the following reasons:

  • They were heavy, and I didn’t want the hassle of dragging them home from the store.
  • No store in my area was likely to have what I was looking for.

But even if we look beyond these two very good reasons to buy online, “etail” is just that much easier. It’s generally cheaper, faster and more convenient. We have a long, long tail of things to look for, the advantage of objective reviews to help filter our buying and an average shopping trip duration of just a few minutes – start to finish – as opposed to a few hours or half a day. Finally, we don’t have to contend with assholes in the parking lot.

Online already wins on almost every aspect and the delta of “surprise and delight” is just going to keep getting bigger. Mobile devices untether buying from the desktop, so we can do it any place, any time. Voice commands can save our tender fingertips from unnecessary typing and clicking. Storefronts continue to get better as online retailers run bushels of UX tests to continually tweak the buying journey.

But what’s that you say? “There are just some things that you have to see and touch before you buy”? Perhaps, although I personally remain unconvinced about the need for tactile feedback when shopping. People are buying cars online, and if ever there was a candidate for hedonism, it’s an automobile. But let’s say you’re right. I already wrote about how Amazon is changing the bricks and mortar retail game. But Derek Thompson casts his crystal-ball gazing even further into the future when he speculates on what autonomous vehicles might do for retail:

“Once autonomous vehicles are cheap, safe, and plentiful, retail and logistics companies could buy up millions, seeing that cars can be stores and streets are the ultimate real estate. In fact, self-driving cars could make shopping space nearly obsolete in some areas.”

Maybe you should buy some shares in Amazon, if you haven’t already. P.S. You can buy them online.


We’re Becoming Intellectually “Obese”

Humans are defined by scarcity. All our evolutionary adaptations tend to be built to ensure survival in harsh environments. This can sometimes backfire on us in times of abundance.

For example, humans are great at foraging. We have built-in algorithms that tell us which patches are most promising and when we should give up on the patch we’re in and move to another patch.

We’re also good at borrowing strategies that evolution designed for one purpose and applying them to another. This is called exaptation. For example, we’ve exapted our food foraging strategies and applied them to searching for information in an online environment. We use these skills when we look at a website, conduct an online search or scan our email inbox. But as we forage for information – or food – we have to remember that this strategy assumes scarcity, not abundance.

Take food, for example. Nutritionally, we have been hardwired by evolution to prefer high-fat, high-calorie foods. That’s because this wiring took place in an environment of scarcity, where you didn’t know where your next meal was coming from. High-fat, high-calorie and high-salt foods were all “jackpots” when food was scarce. Eating these foods could mean the difference between life and death. So our brains evolved to send us a reward signal when we ate them, and we naturally started to forage for them.

This was all good when our home was the African savannah. Not so good when it’s Redondo Beach, there’s a fast food joint on every corner and the local Wal-Mart’s shelves are filled to overflowing with highly processed pre-made meals. We have “refined” food production to continually push our evolutionary buttons, gorging ourselves to the point of obesity. Foraging isn’t a problem here. Limiting ourselves is.

So, evolution has made humans good at foraging when things are scarce, but not so good at filtering in an environment of abundance. I suspect the same thing that happened with food is today happening with information.

Just like we are predisposed to look for food that is high in fats, salt and calories, we are drawn to information that:

  1. Leads to us having sex
  2. Leads to us having more than our neighbors
  3. Leads to us improving our position in the social hierarchy

All those things make sense in an evolutionary environment where there’s not enough to go around. But, in a society of abundance, they can cause big problems.

Just like food, for most of our history information was in short supply. We had to make decisions based on too little information, rather than too much. So most of our cognitive biases were developed to allow us to function in a setting where knowledge was in short supply and decisions had to be made quickly. In such an environment, these heuristic shortcuts would usually end up working in our favor, giving us a higher probability of survival.

These evolutionary biases become dangerous as our information environment becomes more abundant. We weren’t built to rationally seek out and judiciously evaluate information. We were built to make decisions based on little or no knowledge. There is an override switch we can use if we wish, but it’s important to know that just like we’re inherently drawn to crappy food, we’re also subconsciously drawn to crappy information.

Whether or not you agree with the mainstream news sources, the fact is that there was a thoughtful editorial process intended to improve the quality of the information we were provided. Entire teams of people were employed to spend their days rationally thinking about gathering, presenting and validating the information that would be passed along to the public. In Nobel laureate Daniel Kahneman’s terminology, they were “thinking slow” about it. And because the transactional costs of getting that information to us were so high, there was a relatively strong signal-to-noise ratio.

That is no longer the case. Transactional costs have dropped to the point that it costs almost nothing to get information to us. This allows information providers to completely bypass any editorial loop and get it in front of us. Foraging for that information is not the problem. Filtering it is. As we forage through potential information “patches” – whether they be on Google, Facebook or Twitter – we tend to “think fast” – clicking on the links that are most tantalizing.

I never would have dreamed that having too much information could be a bad thing. But most of the cautionary columns I’ve written in the last few years seem to have the same root cause: we’re becoming intellectually “obese.” We’ve developed an insatiable appetite for fast, fried, sugar-frosted information.


Want to be Innovative? Immerse Yourself!

In a great post earlier this year, VC Pascal Bouvier (along with Aldo de Jong and Harry Wilson) deconstructed the idea that start-ups always equate with successful innovation. Before you jump on the Lean Startup bandwagon, realize that the success rate of a start-up taking ideas to market is about 0.2%. Those slow-moving, monolithic corporations that don’t realize they’re the walking dead? Well, they’re notching a 12.5% hit rate. Sure, they’re not disrupting the universe, but they are protecting their profit margins, and that’s the whole point.

The problem, Bouvier states, is one of context. Start-ups serve a purpose. So do big corporations. But it’s important to realize the context in which they both belong. We are usually too quick to adopt something that appears to be working without understanding why. We then try to hammer it into a place it doesn’t belong.

Start-ups are agents in an ecosystem. Think of them like amino acids in a primordial soup from which we hope, given the right circumstances, life might emerge. The advantage of this market-based ecosystem is that things move freely, without friction. Agents can bump up against each other quickly, and catalysts can take their shot at sparking life. It is a dynamic, emergent system. Start-ups are lean and fast-moving because they have to be. It is the blueprint for their survival. It is also why the success rate of any individual start-up is so low. The market is a Darwinian beast – red in tooth and claw. Losers are ruthlessly weeded out.

A corporation is a different beast, occupying a different niche on the evolutionary timeline. It is a hierarchy of components that has already been tested by the market and has assembled itself into a replicable, successful entity. It is a complex organism that has discovered the rules allowing it to compete in its ecosystem as a self-organized, vertically integrated and, hopefully, sustainable entity. In this way, it bears almost no resemblance to a start-up. Nor should it.

This is why it’s such a daunting proposition for a start-up to transition into a successful corporation. Think of the feat of self-transformation required here. Not only do you have to change your way of doing things – you have to change your very DNA. You have to redefine every aspect of who you are, what you do and how you do it.

If you pull your perspective out dramatically here, you see that this is a wave. Call it a Schumpeterian gale of creative destruction, call it a Kondratiev wave, call it whatever you like – this is not simply a market adaptation, this is a phase transition. The rules on one side of the wave are completely different than on the other side – just as the rules of physics are different for liquids and gases. And that applies to everything, including how you think about innovation.

We commonly believe start-ups are more innovative than corporations. But that’s not actually true. It’s the market that is more innovative. And that innovation has a very distinct characteristic. It comes from agents who are immersed in a particular part of the market. As Bouvier points out in his post, start-up CEOs solve a problem that’s “right in front of their nose.” Think of the typical start-up founder. They are earlobe-deep in whatever they are doing. From this perspective, they see something they believe to be a need. They then set out to create a new solution to that need. This is the sense-making cycle I keep talking about.

For a lot of start-ups, sense-making is ingrained. The entrepreneur is embedded in a context that allows them to make sense of a need that has been overlooked. The magic happens when the switch clicks and the need is matched with a solution. Entrepreneurs are the synaptic connections of the market, but this requires deep immersion in the market.

There’s something else about this immersion that’s important to consider – there is nothing quantitative about it. It’s organic and natural. It’s messy and often chaotic. It’s what I call “steeping in it.” I believe this is also important to innovation. And it’s not just me. A recent study from the University of Toronto shows that creativity thrives in environments free of too much structured knowledge. The authors note, “A hierarchical information structure, compared to a flat information structure, will reduce creativity because it reduces cognitive flexibility.”

Innovation requires insight, and insight comes from being intimately immersed in something. There is a place for data analysis and number crunching, but like most things, that’s the other side of the quant/qual wave. You need both to be innovative.