Who’s Who on the Adoption Curve

For me, the Adoption Curve of the Internet of Things is fascinating to observe. Take the PoloTech shirt from Ralph Lauren, for example. It’s a “smart shirt.” The skintight shirt measures your heart rate, how deeply you’re breathing, how stable you are and a host of other key biometrics. All this is sent to your smartphone. One will set you back a cool 300 bucks. But it’s probably not the price that will separate the adopters from the laggards in this case. With the PoloTech shirt, as with many of the new pieces of wearable tech, it’s likely to be your level of fitness that determines which slope of the adoption curve you’ll end up on.

If you look at the advertising for the PoloTech, it’s clear who the target is: dudes with 0.3% body fat and ridiculously sculpted torsos who live on protein drinks and four-hour workouts. Me? Not so much. The same is true, I suspect, for the vast majority of us. Unless we’re looking for a high-tech girdle to both hold back and monitor the rate of expansion of our guts, I don’t think this particular smart shirt is in my immediate future.

As I said, much of the current generation of wearable technology is designed to tell us just how fit we are. Logic predicts that these devices should offer the greatest benefits to those who are the least fit. They, after all, have the most to gain. But that’s not who’s at the front of the adoption curve. In my world, which is recreational cycling, the ones who are religiously tracking a zillion metrics are the ones who are already on top of the statistical heap. The reason? Technology has created an open market for bragging rights. Humans are naturally competitive. We like to know how we stack up against others. But we don’t bother keeping track until we’re reasonably sure we’re well above average. So, if you log onto Strava, where many cyclists upload their tech-tracked rides, you can find out just who is the “King of the Mountain” at your local version of the Alpe d’Huez.

This brings about an interesting variation on Rogers’ Technology Adoption Curve. Wearable technology often means the generation of personal data, and an appetite for that data will accelerate the adoption of the technologies that generate it. We don’t mind being quantified, as long as that quantification paints us in a good light. We want to live in Lake Wobegon, where all the women are strong, all the men are good-looking and all the children are above average.

Adoption of new technologies, according to Rogers, depends on five factors: Relative Advantage, Compatibility, Complexity, Trialability and Observability. To these, Rogers added a sixth: the status-conferring potential of a new innovation. Physical fitness, by its nature, begs to be quantified. Athletic ability and rankings go hand in hand. Status is literally the name of the game. So there is a natural affinity between fitness culture and wearable technologies that track physical performance.

This introduces some interesting patterns of adoption for new additions to the Internet of Things. Adoption will rapidly saturate certain niches of the population, but may take much longer to cross the chasm to the general masses. And the defining characteristics of the early adopters could be completely different in each case. As more and more things become “smart,” the factors of adoption will become more fragmented and diverse. Early adopters of Coke’s Freestyle vending machine will have little in common with early adopters of the PoloTech shirt.

The absorption rate of technology into our lives has been increasing exponentially, seemingly in lockstep with Moore’s Law. Every day, we are introduced to more and more things that have technology embedded in them. The advantages that this technology offers will depend on who is judging it. For some, a given technology will be a perfect fit. For others, it will be like trying to squeeze into a high-tech shirt that makes us look like an overstuffed sausage.

Donald Trump, The Clickbait Candidate

Intellectually, I hate clickbait. But do I click on it? You bet. Usually before I stop to think. It hits me in the quick and dirty (in every sense of the word) part of my brain. Much as I know I should be better than this, I find myself clicking through more viscerally tantalizing slideshows than I would care to admit. Humans, and I count myself among them, are suckers for sensationalism.

So, I admit to human foibles. But in doing so, I stress that they’re something we should strive to overcome. Reason should rule the day. We should not embrace a future that’s built on the pushing of our collective hot buttons.

That’s why the current ascendency of one Mr. Trump is scaring the hell out of me.

Donald Trump is not stupid. He’s built his campaign to be one massive, ongoing A/B clickbait test. He floats Outrageous Remark A against Outrageous Remark B to see which generates the biggest response. He’s probing the collective psyche of America to see what goes viral. And he knows that virality cannot live in the middle of the road. It has to live in the extreme margins. In order to be sensational, you have to provoke senses. You have to push buttons. To get people to love you, you also have to get people to hate you. It was an inevitable evolution of politicking in the Age of the Internet.
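The A/B mechanics described above can be sketched in a few lines of code. This is a minimal illustration with made-up numbers, not anything any campaign actually ran: it compares the engagement rates of two message variants using a standard two-proportion z-test.

```python
import math

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """z-statistic for the difference between two engagement rates."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up numbers: Remark B draws more clicks per impression than Remark A.
z = two_proportion_z(120, 10_000, 180, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 means the difference is unlikely to be noise
```

Run enough of these head-to-head tests and the winning variant, by construction, is the one that provokes the strongest reaction. That is the whole point of the tactic.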

To this point, Trump’s tactics appear to be working. He’s outdistancing his Republican opponents by increasing margins (the latest has him doubling Jeb Bush’s support, at 32% vs. 16%). He’s even closing in on Hillary Clinton, trailing by just 6% in a recent poll. Trump’s sledgehammer-subtle attack on the quick and dirty shortcuts of our brains seems to be triumphing over any rational appeal to the slow and reasoned loops of logic.

But is this really how we want our leaders to be chosen?

In 1858, America was edging closer to the ideological precipice of the Civil War. It was a time when it was easy to ignite hair-trigger passions. And the country was captivated by one senatorial race in particular – in the state of Illinois. There, incumbent Stephen A. Douglas was running against a little-known lawyer who had served one largely unremarkable term in Congress. His name was Abraham Lincoln. As part of the campaign, Douglas agreed to debate Lincoln on what was the only real issue of the election: the future of slavery. Prior to the debates, popular opinion had it that Douglas would eviscerate Lincoln.

The series of seven debates was spread around the state over a period of 56 days. The stakes were profound. Over 14% of the US population was black, and of them, almost 90% were slaves. The future of the union turned on the thorny question of the legality of slavery. No matter what side of the issue you were on, whatever came out of your mouth was guaranteed to be provocative.

Each debate was three hours in length. The first speaker spoke for 60 minutes, the other candidate had 90 minutes to respond, and the first speaker had an additional 30 minutes as a rejoinder. In total, that was 21 hours of usually eloquent political debate. The full text of the speeches was published almost verbatim in the nation’s newspapers (papers usually fixed the grammatical errors of whichever candidate they were supporting, while leaving the opponent’s remarks in rough form). Lincoln got off to a rough start, but hit his stride midway through the debates. By the final two debates, in Quincy and Alton, almost everyone who was at all objective felt that Lincoln was the clear winner. He ended up losing the senatorial race to Douglas, but emerged as the national champion of abolitionists. The momentum from those debates eventually carried him into the presidency two years later.

In these debates, Lincoln managed to do something extraordinary. He reframed the slavery debate – moving it from a question of social equality to one of legal liberty. This sidestepped some of the fiercely held beliefs and allowed for a more rational examination of the question. Beliefs are the bedrock of the quick and dirty mechanisms of our mind. It’s relatively easy to connect with someone’s beliefs. You just have to know the right buttons to push. It’s much more difficult to encourage people to think, as Lincoln did, and push them to question their beliefs. Beliefs act as bulwarks against open and rational consideration.

By the way, if you’re not familiar with the term, a bulwark is a great wall built to keep things out. Like, for example, a great wall on the US/Mexican border.

Is Brand Strategy a Myth?

On one side of the bookshelf, you have an ever-growing pile of historic business best sellers, with promising titles like In Search of Excellence, 4+2: What Really Works, Good to Great and Built to Last. Essentially, they’re all recipes for building a highly effective company. They are strategic blueprints for success.

On the other side of the bookshelf, you have books like Phil Rosenzweig’s “The Halo Effect.” He trots out a couple of sobering facts. In a rigorous study, Marianne Bertrand of the University of Chicago and Antoinette Schoar of MIT isolated and quantified the impact of a leader on the performance of a company. The answer, as it turned out, was 4%. That’s right: on average, even if you have a Jack Welch at the helm, it will only make about a 4% difference to the performance of your company. Four percent is not insignificant, but it’s hardly the earth-shaking importance we tend to credit to leadership.

The other fact? What if you followed the instructions of a Jim Collins or a Tom Peters? What if you transformed your company’s management practices to emulate those of the winning case studies in these books? Surely that would make a difference? Well, yes – kind of. Here, the number is 10. In a study by Nick Bloom of the London School of Economics and Stephen Dorgan of McKinsey, the goal was to test the association between specific management practices and company performance. There was an association. In fact, it explained about 10% of the total variation in company performance.

These are hard numbers for me to swallow. I’ve always been a huge believer in strategy. But I’m also a big believer in good research. Rosenzweig’s entire book is dedicated to poking holes in much of the “exhaustive” research we’ve come to rely on as the canonical collection of sound business practices. He doesn’t disagree with many of the resulting findings. He goes as far as saying they “seem to make sense.” But he stops short of giving them a scientific stamp of endorsement. The reality is, much of what we endorse as sound strategic thinking comes down to luck and the seizing of opportunities. Business is not conducted in a vacuum. It’s conducted in a highly dynamic, competitive environment. In such an environment, there are few absolutes. Everything is relative. And it’s these relative advantages that dictate success or failure.

Rosenzweig’s other point is this: saying that we just got lucky doesn’t make a very good corporate success story. Humans hate unknowns. We crave identifiable agents for outcomes. We like to assign credit or blame to something we understand. So, we make up stories. We create heroes. We identify villains. We rewrite history to fit into narrative arcs we can identify with. It doesn’t seem right to say that 90% of company performance is due to factors we have no control over. It’s much better to say it came from a well-executed strategy. This is the story that is told by business best sellers.

So, it caught my eye the other day when I saw that ad agencies might not be very good at creating and executing on brand strategies.

First of all, I’ve never believed that branding should be handled by an agency. Brands are the embodiment of the business. They have to live and breathe at the core of that business.

Secondly, brands are not “created” unilaterally – they emerge from that intersection point where the company and the market meet. We as marketers may go in with a predetermined idea of that brand, but ultimately the brand will become whatever the market interprets it to be. Like business in general, this is a highly dynamic and unpredictable environment.

I suspect that if we ever found a way to quantify the impact of brand strategy on the ultimate performance of the brand, we’d find that the number would be a lot lower than we thought it would be. Most of brand success, I suspect, will come down to luck and the seizing of opportunities when they arise.

I know. That’s probably not the story you wanted to hear.

Ode to a Grecian Eurozone

I’d like to comment on the Greek debt crisis. But I don’t know anything about it. Zip. Or, as they say in Athens, μηδέν. I do, however, know how to say zero in Greek, thanks to Google Translate. At least for the next few minutes. I also happen to know rather a lot right now about the Tour de France, how to wire RV batteries, how to balance pool chemicals, how to write obituaries and most of the plotlines for the Showtime series Homeland. I certainly know more about all those things than the average person. Tomorrow, I’ll probably know different stuff. And I will retain almost nothing. But if you ask me what in the world is happening right now, I’ll likely draw a blank. I’d say it’s all Greek to me, but a certain MediaPost columnist already stole that line. Damn you, Bob Garfield!

I’m not really sure if I’m concerned about this. After all, I’m the one who has chosen not to watch the news for a long time. My various information sources feed me a steady diet of information, but it’s all been predetermined based on my interests. I’m in what they call a “filter bubble.” I’ve become my own news curator and somewhere along the line, I’ve completely filtered out anything to do with the Greek economy. It’s because I’m not really interested in the Greek economy, but I’m thinking maybe I should be.

(Incidentally, am I the only one who finds it a bit ironic that the word “economy” comes from – you guessed it – the Greek words for “house” and “management”?)

The problem is that I have a limited attention span. My memory capacity is a little more voluminous, but there are definite limits to that as well. To make matters worse, Google is making me intellectually lethargic. I don’t try as hard to remember stuff because I don’t have to. Why learn how to count to 10 in Greek when I can just look it up when I need to? I’m not alone in this. We’re all going down the same blind-cornered path together. Sooner or later, we’ll all run into a major crisis we never saw coming. And it’s because we’ve all been looking in different places.

Forty years ago, to be well informed, you had to pay attention to mainstream news sources. It was the only option we had. We all got fed the same diet of information. Some of us retained more than others, but we all dined at the same table. Our knowledge capacity was first filled from these common news sources. Then, after that, we’d fill whatever nooks and crannies were left with our unique interests. But we all, to some extent, shared a common context. Knowledge may not have been deep, but it was definitely broad.

Now, if I choose to learn more about the Greek economy, I certainly have plenty of opportunities to do so. But I’d be starting with a blank slate. It would take some work to get up to speed. So I have to decide whether it’s worth the effort to inform myself. Is the return worth the investment? Something has to tip the balance to make it important enough to learn more about whatever it is the Greeks are referendumming (referendering?) about. And in the meantime, there will be a lot of other things competing for that same limited supply of information-gathering attention. Tomorrow, for instance, it might become really important for me to find out how close BC is to legalizing pot, or what the wildfire hazard is in northern Saskatchewan, or what July’s weather is like in Chiang Mai. All of these things are relatively easy to find, but I have to reserve enough retention capacity to use the information once I find it. Information may want to be free, but the resources required to utilize it deplete our limited stores of cognitive ability.

Perhaps we’re saving more of our attention for on-demand information requirements. Or maybe we’re just filtering out more of what we used to call news. Whatever the cause, I think we’re losing our common cultural context, bit by byte. A community is defined by what it has in common, and the more technology allows us to pursue our individual interests, the more we surrender the common narratives that used to bind us.

Feed Up with Feedback Requests

Sorry, Google. I realize this is my last chance to tell you about my experience. But you see, you’re in a long line of companies that are also desperate for the juicy details of my various consumer escapades. Best Western, Ford, Kia, Home Depot, Apple, Samsung – my inbox is completely clogged with pleas for the “dets” of my transactional interactions with them. I’ve never been more popular – or frustrated.

I appreciate the idea of customer follow-up. I really do. But as company after company jumps on the customer feedback bandwagon, poor ordinary mortals like myself don’t have a hope in hell of keeping up. It could be a full-time job just filling out surveys and rating every aspect of my life on a scale that runs from “abysmal” to “awesome.” The irony is, these customer feedback requests are actually having the opposite effect. Even if my interactions with the brand are satisfactory, the incessant nagging to find out if I “like them, I really like them” is beginning to piss me off. In the quest to quantify brand affinity, these companies are actually eroding it. Oops! Talk about unintended consequences.

So, if we accept the fact that knowing what our customers think about us is a good thing, and we also accept the fact that our customers have better things to do with their lives than fill out post-purchase surveys, we have to find a more elegant way to get the job done.

First of all, customer feedback should be part of a full customer relationship continuum. It should be just one customer touch point, not the customer touch point. You have to earn the credibility that gives you the right to ask for my feedback. Too many companies don’t bother gauging satisfaction “in the moment.” If you don’t care enough to ask if I’m happy when I’m right in front of you, why should I believe that you’ll pay any attention to my survey? Yet companies jam requests for feedback on their customers without doing the spadework required to build a relationship first.

Worse, because compensation is increasingly being tied to feedback results, you get the “please say you’ll love me” pleading on the sales floor. See if this sounds familiar: “You’ll be receiving a survey from head office asking me how I’ve done. I don’t get a bonus unless you give me top marks in each category. So if there’s anything I can do better, please tell me now.” There are so many things that are just plain wrong with this that I don’t know where to start. It’s smarmy and disingenuous. It also puts the customer in a very awkward position. When it’s happened to me, I just murmur something like, “No, you’ve been great,” and run with all speed to the nearest exit.

The next thing we have to realize is that not all purchases are created equal. Remember the risk/reward matrix I talked about in last week’s column about how our brains process pricing information? While it applies to our motivational balance going into a purchase, it also provides some clues to the emotional landscape that exists post-purchase. If the purchase was in the low risk/low reward quadrant, like the home improvement supplies I picked up at Home Depot this weekend, it’s a task that has been crossed off my to-do list. It’s done. It’s over. The last thing I want to do is prolong that task by filling out a survey about it. But if it’s something that falls into the high risk/high reward quadrant, such as a major vacation, then I’m probably more apt to invest some time to give you feedback. The rule of thumb: the higher the degree of risk or reward, the more likely I am to fill out a survey.

The final thing to remember about customer surveys is that you’re capturing extremes. The people who fill out surveys are usually the ones that either hate you or love you. So you get a very skewed perspective on how you’re doing. What you’re missing is the vast middle of your market that may not be sufficiently motivated to toss you either a brick or a bouquet.

I’m all for getting to know your customers better. But it has to be part of a total approach. It begins with simple things, like actually listening to them when you’re engaging with them.

How Our Brains Process Price Information

We have a complex psychological relationship with pricing. A new brain-scanning study out of Harvard and Stanford starts to pick apart the dynamics of that relationship.

Uma R. Karmarkar, Baba Shiv and Brian Knutson wanted to see how we evaluate a potential purchase when the price is the first piece of information we get, as opposed to the last. They used both fMRI scanning and behavioral tracking to see how the study participants responded. Participants were given $40 to spend and then presented with a number of sample offers. In all cases, the price represented an attractive bargain on the product featured. But one group was given the price first, and the second group was given the price last.

There was another critical difference in the evaluation process as well. In the first phase of the study, participants were shown products that they would like to buy, and in the second phase, they were shown products that they would have to buy. The difference between the two was how they activated the reward center of our brain – the nucleus accumbens. I’ve been talking for years about the importance of understanding the balance of risk and reward in our purchase decisions. This study provides a little more understanding about how our brain processes those two factors.

In the first phase, participants were shown a variety of products that they would consider rewarding. These would fall into the first quadrant of the risk/reward matrix I introduced in my column from five years ago. The researchers were paying particular attention to two different parts of the brain: the nucleus accumbens and the medial prefrontal cortex. For a layman’s analogy, think of you and a five-year-old walking down the toy aisle in a department store. The nucleus accumbens is the five-year-old who starts chanting, “I want it. I want it. I want it.” The medial prefrontal cortex is the adult who decides if they’re actually going to buy it. In the study, the researchers found that the sequence in which these two parts of the brain “lit up” depended on whether or not you saw the price first. If you saw the product first, the nucleus accumbens started its chant: “I want it.” If you saw the price first, the medial prefrontal cortex kicked into action and started evaluating whether the offer represented a good bargain. In the case of the reward products, although the sequence varied, the actual purchase behavior didn’t. In most cases, participants still ended up making the purchase, whether price was presented first or last.

But things changed when the researchers tried a variety of products that fell into the second quadrant of the risk/reward matrix: low risk and low reward. These are the everyday items we have to buy. In the study, they included things like a water filtration pitcher, a pack of AA batteries, a USB drive and a flashlight. There was nothing here that was likely to get the nucleus accumbens chanting.

Now, it should be noted that this follow-up study did not include the fMRI scanning, but by tracking purchasing behaviors we can make some pretty educated guesses as to what’s happening in the respective brains of our participants. Here, presenting prices first resulted in a significant increase in actual purchases over instances when price was presented last. If price comes first, we can imagine that the prefrontal cortex is indicating that it’s a good bargain on a needed product. But if a relatively boring product is presented first for evaluation to the nucleus accumbens, there’s little to excite the reward center.

An important caveat to this part of the study comes with knowing that the prices presented represented significant savings on the products. After the simulated purchases, participants were asked to indicate a price they would be willing to pay for the product. When the price was the lead, the named prices tended to be a little lower, indicating that if you are going to lead with price, especially for quadrant two products, you’d better make sure you’re offering a true bargain.

If anything, this study provides further proof of the value of knowing a prospect’s mental landscape. What are the risk and reward factors that will be motivating them? Will the medial prefrontal cortex or the nucleus accumbens be calling the shots? What priming effects might an early introduction of price introduce into the process?

When I wrote about the risk/reward matrix five years ago, one commenter said “a simple low-high risk/low-high reward graph is not very useful for driving just in time and location based offers, discounts, etc.” I respectfully disagree. While more sophisticated models are certainly possible, I think even a simple 2x2 matrix that helps map out the decision factors in play with a purchase would be a significant step forward. And this isn’t about driving real-time variations on offers. It’s about understanding the fundamentals of the buyer’s decision process. There’s nothing wrong with simplicity, especially if it drives greater usage.
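To make that simplicity concrete, the 2x2 matrix can be expressed in a few lines of code. The scores, threshold and labels below are my own illustrative assumptions, not anything from the column or the study:

```python
def quadrant(risk, reward, threshold=0.5):
    """Map risk and reward scores (0-1) onto the four cells of a 2x2 matrix.

    The 0-1 scale and the 0.5 cutoff are hypothetical; the matrix itself is
    just a conceptual sorting of purchases, not a calibrated model.
    """
    r = "high risk" if risk >= threshold else "low risk"
    w = "high reward" if reward >= threshold else "low reward"
    return f"{r}/{w}"

# A pack of AA batteries vs. a major vacation, with made-up scores:
print(quadrant(risk=0.1, reward=0.2))  # low risk/low reward
print(quadrant(risk=0.9, reward=0.9))  # high risk/high reward
```

Even something this crude forces you to ask, for every purchase, which quadrant the buyer is in, and that question alone shapes how you present price, product and follow-up.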

A Eulogy for “Kathy” – The First Persona

My column last week on the death of the persona seemed to find a generally agreeable audience. But prior to tossing our cardboard cutouts of “Sally the Soccer Mom” in the trash bin, let’s just take a few minutes to remind ourselves why personas were created in the first place.

Alan Cooper – the father of usability personas – had no particular methodology in mind when he created “Kathy,” his first persona. Kathy was based on a real person Cooper had talked to during his research for a new project management program. Cooper found himself with a few hours on his hands every day while his early-’80s computer chugged away, compiling the latest version of his program. He would use the time to walk around a golf course close to his office and run through the design in his head. One day, he struck up an imaginary dialogue with “Kathy,” a potential customer who was requesting features based on her needs. Soon, he was deep in his internal discussion with Kathy. His first persona was a way to get away from the computer and cubicle and get into the skin of a customer.

There are a few points here that are important to note. “Kathy” was based on input from a real person. The creation of “Kathy” had no particular goal, other than to give Cooper a way to imagine how a customer might use his program. It was a way to make the abstract real, and to imagine that reality through the eyes of another person. In the end, we realize that the biggest goal of a persona is just that: to imagine the world through someone else’s eyes.

As we transition from personas to data modeling, it’s essential to keep that aspect alive. We have to learn how to live in someone else’s skin. We have to somehow take on the context of their world and be aware of their beliefs, biases and emotions. Until we do this, the holy grail of the “Market of One” is just more marketing hyperbole.

I think the persona started its long decline towards death when it transitioned from a usability tool to a marketing one. Personas were never intended to be a slide deck or a segmentation tool. They were just supposed to be a little mental trick to allow designers to become more empathetic – to slip out of their own reality and into that of a customer. But when marketers got their hands on personas, they did what marketers tend to do. They added the gloss and gutted the authenticity. At that moment, personas started to die.

So, for all the reasons I stated last week, I think personas should be allowed to slip away into oblivion. But if we do so, we have to find a way to understand the reality of our customers on a one to one basis. We have to find a better way to accomplish what personas were originally intended to do. We have to be more empathetic.

Because humans are humans, and not spreadsheets, I’m not sure we can get all the way there with data alone. Data analysis forces us to put on another set of lenses – ones that analyze, not empathize. Those lenses help us to see the “what” but not the “why.” It’s the view of the world that Alan Cooper would have had if he had never left his cubicle to walk around the Old Del Monte golf course, waving his arms and carrying on his internal dialogue with “Kathy.” The way to empathize is to make connections with our customers – in the real world, where they live and play. It’s using qualitative methods like ethnographic research to gain insights that can then be verified with data. Personas may be dead, but qualitative research is more important than ever.

The Persona is Dead, Long Live the Person

First, let me go on record as saying up to this point, I’ve been a fan of personas. In my past marketing and usability work, I used personas extensively as a tool. But I’m definitely aware that not everyone is equally enamored with personas. And I also understand why.

Personas, like any tool, can be used both correctly and incorrectly. When used correctly, they can help bridge the gap between the left brain and the right brain. They live in the middle ground between instinct and intellectualism. They provide a human face to raw data.

But it’s just this bridging quality that tends to lead to abuse. On the instinct side, personas are often used as a shortcut to avoid quantitative rigor. Data-driven people typically hate personas for this reason. Often, personas end up as fluffy documents and life-sized cardboard cutouts with no real purpose. It seems like a sloppy way to run things.

On the intellectual side, because quant people distrust personas, they leave themselves squarely on the data side of the marketing divide. They can understand numbers; people, not so much. This is where personas can shine. At their best, they give you a conceptual container with a human face to put data into. They provide a richer but less precise context that allows you to identify, understand and play out potential behaviors that data alone may not pinpoint.

As I said, because personas are intended as a bridging tool, they often remain stranded in no man’s land. To use them effectively, the practitioner should feel comfortable living in this gap between quant and qual. Too far one way or the other and it’s a pretty safe bet that personas will either be used incorrectly or be discarded entirely.

Because of this potential for abuse, maybe it’s time we threw personas in the trash bin. I suspect they may be doing more harm than good to the practice of marketing. Even at their best, personas were meant as a more empathetic tool to allow you to think through interactions with a real live person in mind. But in order to make personas play nice with real data, you have to be very diligent about continually refining your personas based on that data. Personas were never intended to be placed on a shelf. But all too often, this is exactly what happens. Usually, personas are a poor and artificial proxy for real human behaviors. And this is why they typically do more harm than good.

The holy grail of marketing would be to somehow give real-time data a human face. If we could find a way to bridge left-brain logic and right-brain empathy in real time to discover insights that were grounded in data but centered in the context of a real person’s behaviors, marketing would take a huge leap forward. The technology is getting tantalizingly close to this now. It’s certainly close enough that it’s preferable to the much-abused persona. If – and this is a huge if – personas are used absolutely correctly, they can still add value. But I suspect that too much effort is spent on personas that end up as documents on a shelf and pretty graphics. Perhaps that effort would be better spent trying to find the sweet spot between data and human insights.

Mad Men: 2065

So, Don Draper is now history. Well, actually, he’s always been history. He started and finished as a half-century look back at what advertising was. Part of the appeal of Mad Men was the anthropological quaintness of the whole thing – “Can you believe they used to do that?” We, smug in our political correctness, can watch an episode secure in the knowledge that the misogynistic, substance-abusive, racist world of Sterling Cooper and Partners is long gone. The world, and with it, advertising, have come a long way!

But, one wonders, what would happen if a similar premise were launched in 2065? What about advertising now would look similarly unacceptable to viewers then?

Draper’s world was the world of the creative spark igniting the big idea. It was the world of the catchy jingle and meme-worthy slogans. The Don Drapers of the world could do no wrong great enough to tarnish the glow of their ability to blow away a client in a pitch or snag a Clio. Creative gods stood firmly astride their kingdoms on Madison Avenue.

Now, of course, we know better. Those were simpler times. Clients, and consumers, are not nearly that naïve. Today, we demand quantitative data and testing to back up our creative inspirations. It’s not just about Big Ideas. Today, advertising is also about Big Data.

But, 50 years from now, will our current preoccupation with data look anachronistic or prescient to that future audience? Are we exhibiting some equally entertaining naiveté? Will the pendulum swing back to the big idea – or will some other alternative present itself? Will data profiling, targeting and programmatic buying look as quaint then as a corny jingle and a three-martini lunch look to us now?

Advertising in the era of Don Draper had gone through its own evolution. At the turn of the century, thanks to the Industrial Revolution, a flood of new products entered the market. Advertising’s first job was to make consumers aware of new offerings, opening new markets in the process. Its primary goal was to inform.

But, by the ’50s and ’60s, mass media had made consumers aware of most product categories. Advertising’s job became to persuade consumers to purchase products they already knew existed. Its primary goal was to persuade. Market share, rather than market expansion, became the end goal. Hence the era of the big idea. You don’t need a big idea to inform, but you do need one to persuade.

Today, however, with the expanding capabilities of technology and micro-manufacturing fueling a new revolution of innovation, we may be coming back to a time where awareness is the primary concern. Advertising’s job seems to be to navigate increasingly complex filters to create awareness in increasingly targeted audiences. The era of branding that found its legs in the era of Don Draper already seems to be morphing into something much different than what we’ve known previously. Who knows what that will look like in another 50 years?

The thing about history is that it gives you the intellectual distance required to recognize how silly we once were. The greater the distance, the safer we feel in laughing at ourselves. In the case of advertising, 50 years seems to be an adequate buffer to feel pretty smug with our historical hindsight. Of course, if somehow you could be transported back to 1965 and talk to the average creative director at a big agency, it’s doubtful they would appreciate being enlightened about their ignorance.

So, if we project that forward to today, it makes you wonder. What are the things we do now that our grandchildren will be laughing at in 50 years?

The Mother of All Disruption

Once again fellow Online Spin author Tom Goodwin has piqued my interest. He starts to unwrap a tremendously thorny problem in his column of last Thursday – Time to Think about Regulation for Disruption. Today, I’d like to take this question up one level – do we have to rethink government entirely?

Government is almost entirely a reactive business. Even far-sighted, historic documents such as the Constitution of the United States and the Magna Carta were reactions to the untenable circumstances that preceded them. And these are the exceptions. The vast majority of governing involves a highly bureaucratic and excruciatingly slow process that attempts to respond to emerging breaches in the unspoken code of fairness that our society tries to live by. Realistically, from the time the need for a new law is recognized to the time a bill is passed, months or even years can pass.

Months or years were, practically speaking, adequate in the world we once knew. But today, that is no longer the case. In that time, complex ecosystems can establish themselves around the breach in question, and, as Tom points out, entire industries may have been decimated in the process. This is the reality of disruption.

In a world that seeks order and governance, this is a bad thing. But, now that we have unleashed the technological Kraken, is that a world we can reasonably expect? Slowly but surely we are dismantling every aspect of our hierarchical society and replacing it with a horizontal network. Hierarchies can’t work horizontally. Something has to give.

Disruptions are a characteristic of networked structures. In order for networks to work, each component of that network has to be given the freedom to act. If the action of an individual resonates with other parts of the network, the actions are picked up and amplified. Each individual act has the potential to become a disruption – with corresponding consequences. Everything becomes accelerated in a network.

Government is built on the ideological foundation of a hierarchy. The word “government” means “to steer.” The assumption is that our society is capable of being steered. This, in turn, assumes that our society all wants to go in the same direction. But if we enforce these restrictions on a network, networks cease to work. Yes, we quell the negative disruptions, but we also eliminate the positive ones.

The United States of America is one of the least restrictive societies on the planet. The founding fathers drafted their articles to enshrine that freedom. You (as a Canadian, I have to say “you”) have managed to balance the practical necessities of government with the lack of restrictions typical of a market economy. Markets naturally emerge from networks. Because the U.S. treasures freedom and innovation, it was inevitable that it would emerge as the testing ground for the impacts of technological advances. You are the canary in the coal mine of massive disruption.

Tom urges lawmakers to become more proactive. But historically speaking, that’s just not the way government works. It’s like riding a cow in the Kentucky Derby and wondering why you can’t keep up. I just don’t think that our current hierarchical system of government is up to the job. It’s a great system, with a ton of democratic checks and balances, but it was built for a different era – one built along vertical lines.

The final issue is one of enforcement. Even if laws are passed to deal with emerging disruptions, it’s becoming almost impossible to enforce them. If lawmakers are scrambling to keep up with society, law enforcers have capitulated entirely. We just can’t afford to enforce the laws we already have on the books.

So, if this is the problem, what is the answer? I think, perhaps, it lies in the very same properties of networks. Government and laws became necessary to avoid abuses of power. Power comes from hierarchies. As societies level out, shared concepts of fairness become increasingly relevant. We all have universal concepts of fairness. Abuses of what we consider to be fair are generally dealt with quickly and effectively at the network level. Networks tend to police themselves, as long as there is a common understanding of what is acceptable and what is not. In short, we have to think of regulation in terms of market and network dynamics, not hierarchical governance.

I admit this is tough to wrap your head around. In a world of disruptions, this is the Mother of All Disruption. But symptomatically speaking, it appears that our historic notion of government is ailing. As frightening as it may be to contemplate, we should start thinking about what may replace it.