The Wave Form of Complex Strategy

I’ve been thinking about waves a lot lately. As I said recently to a group of marketing technologists, nature doesn’t plan in straight lines. Nature plays out in waves. As soon as you start looking for oscillations, you see them everywhere. Seasons, our brains, the economy – if complexity lurks there, chances are there is a corresponding wave.

So how do waves tie into my two recent columns (Part One and Part Two) about agency relationships? Simply this – like most complex things, our corporate strategy should also plot itself against a wave-like cycle. And in that cycle, there is a place for both external partnerships and internal execution.

Let me give you two examples of the ubiquity of waves.

Remember how I talked about Bayesian Strategy? Again, it’s a wave, or, if you’d prefer, a loop (which is simply a wave plotted in a different form). It is a process of setting a frame, opening that frame to external validation and then updating that frame based on our newly perceived reality. This approach to strategy borrows from the work done on how we make sense of the world, which is also a loop, or a wave.

Alex “Sandy” Pentland’s “Science of Great Teams” also embodies its own wave:

“Successful teams, especially successful creative teams, oscillate between exploration for discovery and engagement for integration of the ideas gathered from outside sources. At the MIT Media Lab, this pattern accounted for almost half of the differences in creative output of research groups.”


Alex “Sandy” Pentland

The thing about waves is that they require very different approaches at the peaks and valleys of the wave. The oscillation is caused by this dynamic tension. The act of gathering input is very different from the act of synthesizing and acting on that input. And it’s very difficult to do both at the same time. Again, Pentland found this in his observation of effective teams: “Exploration and engagement, while both good, don’t easily coexist, because they require that the energy of team members be put to two different uses.”

Increasingly, in complex situations, we have to incorporate wave planning into our strategic approach. And when it comes to marketing, this will likely include a wave that winds itself through working with an external partner to gather the value that comes from their external perspective and in creating an internal “sense-making” discipline with an embedded marketing team. This will require a clear understanding of control and authority transference at the appropriate times. Like the Exploration/Engagement cycle of Pentland’s teams, both are necessary but they shouldn’t necessarily run in parallel.

I’ve found in the past that most of the value that can come from a strong external partnership gets burned off in turf wars and discounting outside information and advice because it doesn’t come from “inside”. Even when this information is accepted, it’s subsumed into internal dialogues and documentation, losing whatever insight it once offered.

Similarly, the partner loses precious cycles trying to stay up to speed with the internal directional course changes that inevitably happen. The problem comes when both these processes try to co-exist and run along the same straight line. The result is a rapidly zig-zagging line that tries to stay the course but loses any energy it might have had in constantly readjusting itself to meet “straight line” strategic objectives.

I believe the right answer to the in-house/agency debate is not an “Either/Or” but rather a wave-aware “And.”

The Case for Strong External Marketing Partnerships


Sherry Turkle

We like to spend time with others who agree with what we have to say. In her book, Reclaiming Conversation, Sherry Turkle says this leads to us living in a bubble – in this case, a bubble of agreement. While soothing to our own sensibilities, this can be a dangerous path to walk down. It leads to dangerous biases in perception like Group Think and Information Cascades. It doesn’t give us a true picture of what the world is really like.

Last week, I said that a shorter “sense-making” cycle is one reason why moving advertising and marketing in-house might be a better way to go. But what if those sense-making cycles lead to a skewed view of the world because of perceptual distortion? What if it leads to us seeing the world not as it is, but as we wish it was? Today, as promised, I want to look at the other side of the question – the advantages that can come from having strong external partnerships.

As I said last week, Bayesian Strategy relies on three principles:

  • Strategic planning is a continuous and iterative process
  • Strategic plans are nothing more than hypotheses that are then subject to validation through empirical data
  • The span of the loop between the setting of the strategic frame and the data that validates it should be kept as short as possible.
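The second and third principles amount to a loop of hypothesis and revision, which maps loosely onto textbook Bayesian updating. As a purely illustrative sketch – the probabilities, the binary supporting/contradicting evidence model, and the hypothesis itself are invented for this example, not drawn from any real planning process – each pass through the loop revises our confidence in a strategic frame:

```python
# Illustrative only: one strategic hypothesis, updated as market
# evidence arrives. All probabilities are invented for the example.
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    total_evidence = numerator + p_evidence_if_false * (1 - prior)
    return numerator / total_evidence

# Hypothesis: "our new positioning resonates with the market."
belief = 0.50  # the initial strategic frame (the prior)

# Each cycle's data either supports or undercuts the frame.
# True = supporting signal, False = contradicting signal.
for supports in [True, True, False, True]:
    if supports:
        belief = bayes_update(belief, 0.7, 0.4)
    else:
        belief = bayes_update(belief, 0.3, 0.6)
    print(round(belief, 3))
```

The shorter the loop, the more of these updates you get per planning horizon – which is exactly why the third principle argues for keeping the span between frame and data as short as possible.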

While moving more functions – including marketing – in-house helps with the last of these, it can lead to problems with the second principle: empirical validation.

Prolonged ideological homogeneity is never a good thing. Yet human nature craves it. So, from Socrates on down, we have created rational frameworks that force us to consider divergent thoughts. Democracy is built on such a framework. But over time, most organizations naturally move towards a shared opinion of the world – and that opinion usually starts at the top. It’s what Avinash Kaushik calls the HIPPO Syndrome – The Highest Paid Person’s Opinion.

Agreement bubbles expand due to confirmation bias. Even if we pay lip service to validating our opinions with empirical data, what we count as data depends on what we believe. We look for evidence that confirms our beliefs. We can deny we do it, we can chastise ourselves for doing it, but the fact is, it’s human nature. In the end, we’ll still do it, because we’re programmed to do so.

One way to reliably poke our “agreement bubbles” is to build robust mechanisms to both encounter and embrace ideas from outside the bubble. Remember a few months ago, when I wrote that cultures in which higher percentages of atheists are found also tend to be more innovative? The same factors are at work here. Those cultures have more ideological divergence. More perspectives are considered. The result is almost always a more accurate view of the world. Everyone wants to believe they are “right”, but what is “right” – or as close as is possible – is a synthesis of many different opinions and beliefs.

In this case – especially with something as vital to strategy as marketing – a strong external partnership can force us to consider our agreement bubbles. This is where an agency can bring new views to the table. But the agency and the client have to realize that this is where the value of these partnerships lies. They have to embrace this role and build the trust required to introduce external perspectives into the strategic sense-making cycle.

With two sides of the argument now sketched out, we’ll look next week at how the agency partnership of the future might look.

The Case for Bringing Marketing In-House

“We were just writing a lot of checks to agencies, but digital marketing is now in our brand DNA.”
Blake Cahill – Philips Global Head of Digital

When we talk about disruptions in marketing, one of the elephants in the room is the increasing demand to bring marketing in-house. Companies like Philips are bringing more and more marketing functions in-house. Coming from an ex-agency guy, this will sound either blasphemous or disingenuous, but I suspect that it might be the right way to go. I’ll tell you why. It has a lot to do with the evolution of strategy.

In the past, we did two things when we planned strategy. We planned in relatively straight lines and we planned over long time frames – a minimum of 5 years was not unusual.

Here’s how it would play out. Executives would go through their strategic planning exercise, which may or may not include getting input from the internal and external marketers. Strategic plans would be formed and then broken down into departmental directives. Department heads – including marketing – would then execute against the plan, with periodic progress reviews scheduled. The entire loop, from input to executional plans, could easily span several months or even a year or more.

The extended timeline is just one of the issues with this approach to strategy. The other problem is that this assumes that strategic planning is only something executives can do. The strategic frame is only set at the highest levels of the organization. And it’s the executive’s prerogative to either consider or completely ignore any input from their direct reports. Even if they do consider it, this feedback is likely several steps removed from the source – namely – the market.

I’ve written before about the concept of Bayesian Strategy. There are three basic foundations to this approach:

  • Strategic planning is a continuous and iterative process
  • Strategic plans are nothing more than hypotheses that are then subject to validation through empirical data
  • The span of the loop between the setting of the strategic frame and the data that validates it should be kept as short as possible.

With Bayesian strategy, the corporation needs to maintain a number of acutely aware “sensing” interfaces that provide constant data about the corporation’s current “reality”:

  • The internal “reality” – especially in more qualitative areas that might fall outside typical KPIs – like morale, satisfaction, communication effectiveness, etc.
  • The external “reality” – What’s happening in the market? What are customers’ perceptions? What is the competition doing?

These “sensing” interfaces create the frame for the organization. As such, they’re integral to the setting and updating of strategy. Just as our brain depends on our senses to define our sense of what’s real, the organization depends on these interfaces. And when it comes to the “external” reality, no department is in a better position to make sense of the world than marketing. The distance between marketing and the management of the company should be as short as possible. This is very difficult to achieve when you rely on external partners for that marketing.

When a company like Philips brings marketing in-house, it’s more than just a cost-saving or consolidation effort. It’s bringing the function of marketing as close as possible to the core brand. It’s not only giving it a seat at the strategic table but making it one of the key contributors to that strategy. Like I said, it’s a move that makes a lot of sense.

But there is another side to this story, and that has to do with perspective. I’ll look at the flip side of this argument next Tuesday.

The Future of the Workplace

I noticed a post a few weeks back that said many companies are abandoning their sprawling suburban campuses and are moving back to the city. I found this interesting, because where we work, like so many things in our lives, seems to be in the midst of disruption.


Frederick Winslow Taylor

The psychology of the workplace is now a thing. It never used to be. In fact, my youngest daughter is focusing on exactly that as she pursues her post-grad thesis. In the Frederick Winslow Taylor-induced hangover that most of corporate America has been trying to get over in the past several decades, workers were considered machinery. Which was a step forward. Prior to that, they were considered grist for the mill. At least Taylor recognized that well-maintained machinery worked better than neglected machinery.

But there have been a significant number of studies looking at how the psychology of the individual contributes to the corporate bottom line. And some interesting paradoxes are emerging. Many of these deal with the nature of the workplace.

We used to think of all workplaces as factories. They were built where land was relatively cheap. This led to the whole concept of the suburban campus. But we spend a lot of time at work. We should be happy there. And our work life should not be out of sync with the rest of our lives. So being exiled to the corporate hinterlands of Blandeville, Connecticut or Nondescript, New Jersey may not fit very well with our life plans anymore. We want workplaces that are close to where we choose to live. We want an integrated work-life balance, not an artificially divided one.

The location of our office isn’t the only thing being disrupted. Should we even go to the office at all? Telecommuting has been explored as a viable option by a number of companies.

When I was CEO of my own company we tried our own telecommuting experiment. The rationale is pretty compelling: if you just need a computer and a connection to work, why bear the expense of all the trappings of a formal office? Additionally, it allowed us to recruit in cities where we didn’t have an office. Finally, there was little doubt the majority of our telecommuting employees were happier with the new arrangement.

For us as employers, however, the results were mixed. When our company was acquired, the new owners ended the telecommuting experiment. It was not a popular decision with our employees. I initially fought against it, but eventually, I came around and supported the requirement to share a physical space. This was a few years before Marissa Mayer brought the same hammer down on the telecommuting employees of Yahoo. The infamous memo was sent at Mayer’s behest by Yahoo’s Head of HR, Jackie Reses on February 22, 2013. Here is an excerpt that provides context for the decision:

“That is why it is critical that we are all present in our offices. Some of the best decisions and insights come from hallway and cafeteria discussions, meeting new people, and impromptu team meetings. Speed and quality are often sacrificed when we work from home.”

We found the same thing. While employees loved telecommuting and were generally disciplined in ensuring we got full value from them, we missed the collaboration and creativity that comes from chance encounters and serendipitous discussions. One could make a strong argument that telecommuting might be more efficient in terms of productivity, but an increasing number of studies show that effectiveness is often sacrificed.

Like most things in the sphere of human behavior, I think the disruption of the workplace is subject to the pendulum effect. The starting point was the faceless beige cubicle satirized in Dilbert. As this started to change, we swung too far over to the other side, embracing the geographically unlimited possibilities of a connected workplace. But we found that something was sacrificed in the transition. The best answer likely falls somewhere between these two extremes.

I have talked before about the research done by MIT’s Alex “Sandy” Pentland. He found that the most effective teams have two distinct phases they go through – exploration and engagement. Innovation and creativity come from exploration. Productivity comes from engagement. I suspect that telecommuting might work well for engagement. But exploration requires some type of common ground – literally. For example, Pentland found something as simple as all employees taking coffee breaks at the same time led to a significant increase in team effectiveness.

However the workplace may evolve in the future, I believe we’re learning that some essential element of teamwork still requires us being in the same place at the same time, or, as John F. Kennedy once said, “breathing the same air.”

Disruption 101

We Online Spinners are talking a lot about disruption. Dave Morgan has been talking about disruption in the Advertising and Marketing Technology space. I’ve been looking at disruption in other areas, including academia. Cory Treffiletti, Kaila Colbin and Maarten Albarda have all looked at various aspects of disruption. A quick look back at the past few months’ Spin columns shows that well over half of them deal with disruption in one way or another.

Maybe it’s time we did a primer on the idea of disruption.

Disruption is what happens when something stable becomes unstable. That’s kind of a “duh, obviously” statement, but there are some very important concepts lurking in there.

When an environment is stable, it allows for the development of extensive but fragile ecosystems. In a corporate sense, this allows for the development of very complicated supply chains, with several “value niches” emerging along that chain. The more complicated the chain, the higher the potential for profit. Each link adds another level of complication, allowing someone to squeeze a little more profit from the end consumer.

In addition to extensive ecosystems, stable environments also allow some members of those ecosystems to achieve significant scale. Things are predictable and this allows organizations to grow, embed processes and systems, thereby improving efficiency and profitability. Often, one organization can establish itself at several levels along the supply chain, maximizing its profit potential.

In our physical world, stability is generally a by-product of friction. The higher the degree of friction – or what economist Ronald Coase called “transaction costs” – the more stable the market becomes. Barriers to entry are higher. Competitive factors are dampened. Capital becomes the main predictor of success.

Then – everything changes. We get hit with instability.

In our current case, the disruption we’re experiencing is caused by the removal of friction. Technology is reducing transaction costs in a huge swath of industries.

Technology is an interesting catalyst. We think that technology changes behaviors. I don’t believe so. I think technology enables behaviors to change, in that it allows its users to do something they already wanted to do, but couldn’t because of some obstacle. It allows for an attractive alternative that didn’t previously exist. That technology is usually offered to the broadest base of users available and this triggers the disruption, which starts from the ground up. Typically, technology also removes the friction that enabled those delicate hierarchical supply chains to form and flourish.

When the disruption begins and the incumbent ecosystem is threatened, the first casualties are the most fragile members of that ecosystem. These are usually the smaller niche players that rely on the bigger hosts that make up the ecosystem. The bigger hosts can survive longer and often swallow up the first casualties in an attempt to shore up their defenses. They will also often make a half-hearted attempt to respond to the disruption by adopting the technology and going after the disruptors. This never works. Disruption is not in their genetic makeup. Their priority is always protecting the status quo, because that’s where their profit lies.

As disruption forever alters the environment, eventually the previous ecosystem withers and dies. A new (temporary) stability emerges – along with a new ecosystem – built on the foundation of the previous disruption and the entire cycle starts again.

The Collateral Damage of Disruption

Not all the stories of disruption are of the “David vs. Goliath” variety. Sometimes they are more of the “David vs. Goliath vs. Innocent Bystanders” ilk.

Stewart Wills reminded me of this last week when I was writing about Alexandra Elbakyan and the Elsevier vs. Sci-Hub case. It’s easy to take aim at Elsevier. After all, they’re a very big $4.2-billion target. It’s just too easy to demonize them. But they’re not the only academic publisher in the world.

“Siding with this particular self-styled “Robin Hood” may seem like a no-brainer (and a good, easy-to-tell story), but everyone seems so interested in focusing on big bad Elsevier that they miss a lot of important other affected parties in the picture.”

Wills pointed me to a posting from Caldera Publishing Solutions, a consulting firm that caters to smaller academic publishers. This post refutes my statement of last week that Elsevier is the only one being harmed by the actions of pirates like Elbakyan. In fact, there is an extended chain of bystanders that threatens to be washed away by the tsunami of disruption bearing down on the academic world. For example, there are “dozens and dozens” of society journals that use huge publishers like Elsevier as a clearinghouse. Behind much of the research in the Sci-Hub library, you’ll find non-profit societies, which means that this is “less of a story of Robin Hood robbing from the town’s greedy sheriff, and more a story of Robin Hood stealing from the town’s hospitals and charities.”

The post draws an analogy to a disruptive wave that first broke 17 years ago now: Napster and illegal file sharing. Given that we now have close to two decades of hindsight in this particular case, it might be useful to do a post-mortem on Napster’s impact on the music industry.

I’m not sure if you happened to watch the Grammys, but if you did, you saw Neil Portnow, president of the National Academy of Recording Arts and Sciences, deliver a plea against streaming music services. The problem, said Mr. Portnow, is these services have commoditized music to the degree that royalties amount to fractions of a cent for each play of a song. That may be fine if you’re Rihanna or Sam Smith, but not great if you’re a struggling independent artist.

The problem with the plea is the same tactical error the Academy has made since the first such sermon, delivered by then-president Michael Greene at the 2002 Grammys – it was delivered in the wrong church. It’s very hard to feel sorry for the music industry when the most obvious examples – the artists in the audience – are all multi-millionaires drowning under the weight of their own bling. Portnow might be right when he says music may no longer be a viable career, but it’s hard to swallow that message when delivered in the midst of such excess.

But did Napster, and the subsequent removal of friction from the music industry, truly wreak the damage that NARAS keeps warning us about? The fact is, we now have access to far more music than we did in 1990. We can discover new music more readily. Artists can now self-produce and distribute. They can even use Songkick to launch their own tours, or Kickstarter to fund a new album. Will they all get rich? No. But they have a better chance than they did two decades ago, when the only path to stardom led directly through the big (and cutthroat) business of music publishing. Napster, and its technological descendants, did what disruption is supposed to do. They cleaned up the market, creating direct connections between producers and consumers.

As Stewart Wills reminded me, there are unintended consequences of disruption. One of them is that when the supply chain begins to be violently shaken from below, as was the case with the music industry, the earliest victims are typically the small and fragile members of the ecosystem that depend on a bigger host. These tend to either fall off or become absorbed into the more robust survivors. That’s why you don’t find many corner record stores anymore.

But then again, good blacksmiths or door-to-door milkmen are also damned hard to find.

The Face of Disruption

If you ask publishing giant Elsevier, Alexandra Elbakyan is a criminal – a pernicious pirate.

If you ask the Lifeboat Foundation, or blogger P.Z. Myers, or millions of students around the world, Alexandra Elbakyan is a hero.

Labels can be tricky things, especially in a world of disruption.

Ms. Elbakyan certainly doesn’t look like a criminal. You would walk right past her on a campus quad and think nothing of it. She looks pretty much what you would expect a post-grad neuroscience student from Kazakhstan to look like.

But her face is the face of disruption. And she’s at the receiving end of a lawsuit launched by Elsevier that, if you were to take it seriously, would be worth several billion dollars.

Just over a year ago, I wrote a column about the academic journal racket. The work of thousands of researchers is published by Elsevier and others and remains locked behind hugely expensive paywalls. Elbakyan, as a post-grad research student at a university that couldn’t afford to pay the licensing fees to gain access to these journals, got frustrated. In a letter she wrote in response to the lawsuit, she elaborated on this frustration:

“When I was a student in Kazakhstan University, I did not have access to any research papers. These papers I needed for my research project. Payment of 32 dollars is just insane when you need to skim or read tens or hundreds of these papers to do research. I obtained these papers by pirating them.”

Elbakyan was not alone in this piracy.

“Later I found there are lots and lots of researchers (not even students, but university researchers) just like me, especially in developing countries. They created online communities (forums) to solve this problem.”

“…to solve this problem.” There, in a nutshell, is the source of disruption. Elbakyan thought there had to be a more efficient way to facilitate this communal piracy and turned to technology, launching the Sci-Hub search portal in 2011. Relying on access keys donated by academics at institutions with subscriptions to research publishers, Sci-Hub bypasses the paywall and locates the paper a researcher is looking for. It then delivers the paper and saves a copy to LibGen, a library of “pirated” papers that will continue to be freely available to future researchers. The LibGen database now has over 48 million papers available.

Is Elbakyan guilty of piracy? Absolutely – as it’s defined by the law. She makes no bones about the fact. She uses the term repeatedly in her own letter of defense.

But, in that letter, Alexandra Elbakyan also appeals to a higher law – the law of fairness. She is not stealing from the authors of that research, who receive no compensation for their work from the publisher. When Elsevier claims “irreparable harm,” the only harm that can be identified is to its own business model. There is no harm to academics, who are becoming increasingly hostile to the business practices of publishers like Elsevier. There is certainly no harm to fellow researchers, who now have open access to knowledge, helping them in their own work. And there is no harm to the public, who can only benefit from the more open sharing of knowledge amongst academics. The only one hurt here is Elsevier.

According to the 2014 annual report of RELX (the parent company of Elsevier), the company raked in £2,944 million ($4.23 billion US) from its various subscription businesses. The Scientific, Technical and Medical division (the same division that Elbakyan “irreparably harmed”) had revenues of £2,048 million ($2.94 billion US) and a tidy little operating profit of £787 million ($1.13 billion US).

Poor Elsevier.

The question that should be asked here is not whether Elsevier’s business model has been harmed, but rather, does it deserve to live? According to that same annual report, they “help scientists make new discoveries, lawyers win cases, doctors save lives and executives forge commercial relationships with their clients.”

Actually, no.

Elsevier does none of those things. The information they deal in does those things. And that same information is finding a way to be free, thanks to people like Alexandra Elbakyan. Elsevier is just the middleman being cut out of the supply chain through technology.

The American legal system will undoubtedly side with Elsevier. The law, as it is currently written, defends the right of a corporation to do business, whether or not people like you and me deem that business ethical. But ultimately, we rely on our laws to be fair, and what is fair depends on the context of our society. That context can be changed through the forces of disruption.

Sometimes, disruption comes in the guise of a young post grad student from Kazakhstan.

Is Amazon Creating a Personalized Store?

There was a brief Amazon-related flurry of speculation last week. Apparently, according to a podcast posted by Wharton, Amazon is planning on opening 300 to 400 brick-and-mortar stores.

That’s right. Stores – actual buildings – with stuff in them.

What’s more, this has been “on the books” at Amazon for a while. Amazon CEO Jeff Bezos was asked by Charlie Rose in 2012 if they would ever open physical stores. “We would love to, but only if we can have a truly differentiated idea,” Bezos replied. “We want to do something that is uniquely Amazon. We haven’t found it yet, but if we can find that idea … we would love to open physical stores.”

With that background, the speculation makes sense. If Amazon is pulling the trigger, they must have “found the idea.” So what might that idea be?

Amazon does have a test store in their own backyard of Seattle. What they have chosen to do there, in a footprint about a tenth the size of the former Barnes and Noble store that was there, is present a “highly curated” store that caters to “local interests.”

Most of the speculation about the new Amazon experiment in “back-to-the-future” retail centers around potential new supply chain management technology or payment methods. But there was one quote from Amanda Nicholson, professor of retail practice at Syracuse University’s Whitman School of Management, that caught my attention: she said that space represents “a test” to see if Amazon can create “a new kind of experience” using data analytics about customers’ preferences.

This becomes interesting if we spend some time thinking about the purchase journey we typically take. What Amazon has done brilliantly online is remove friction from two steps in that journey: filtering options and conducting the actual transaction. For certain kinds of purchases, this is all we need. If we’re buying a product that doesn’t rely on tactile feedback, like a digital file or a book, Amazon has connected all the dots required to take us from awareness to purchase.

But that certainly doesn’t represent all potential purchases. That could be the reason that online purchases only represent 9% of all retail. There are many products that require an “experience” between the filtering of options available to us and the actual purchase. These things still require the human “touch” – literally. Up to now, Amazon has remained emotionally distant from these types of purchases. But perhaps a new type of retail location could change that.

Let me give you an example. If you’re a cyclist (like me) you probably have a favorite bike shop. Bike stores are not simply retail outlets. They are temples of bike worship. Bike shops are usually independent businesses run by people who love to talk about their favorite rides, the latest bikes or pretty much anything to do with cycling. Going to a bike store is an experience.

But Trek, one of the largest bike manufacturers in the world, also recognized the efficiency of the online model. In 2015, they announced the introduction of Trek Connect, their attempt to find a happy middle ground between practical efficiency and emotional experience. Through Trek Connect, you can configure and order your bike online, but pick it up and have it serviced at your local bike shop.

However, what Amazon may be proposing is not simply about the tactile requirements of certain types of purchases. What if Amazon could create a personalized real world shopping experience?

Right now, there is a gap between our online research and filtering activity and our real-world experiential activity. Typically, we shortlist our candidates, gather required information, often in the form of a page printed off from a website, and head down to the nearest retail location. There, the handoff typically leaves a lot to be desired. We have to navigate a store layout that was certainly not designed with our immediate needs in mind. We have to explain what we want to a floor clerk who seems to have at least a thousand other things they’d rather be doing. And we are not guaranteed that what we’re looking for will even be in stock.

But what if Amazon could make the transition seamless? What if they could pick up all the signals from our online activity and create a physical “experiential bubble” for us when we visited the nearest Amazon retail outlet?

Let me go back to my bike purchasing analogy by way of example. Let’s say I need a new bike because I’m taking up triathlons. Amazon knows this because my online activity has flagged me as an aspiring triathlete. They know where I live and they have a rich data set on my other interests, which includes my favored travel destinations. Amazon could take this data and, under the pretext of my picking up my bike, create a personalized in-store experience for me, including a rich selection of potential add-on sales. With Amazon’s inventory and fulfillment prowess, it would be possible to merchandise a store especially for me.

I have no idea if this is what Amazon has “in store” for the future, but the possibility is tantalizing.

It may even make me like shopping.


A New Way to Determine Corporate Value

Last week, I talked about the trend of “hyper” expectations and corporate valuations. Peter Fader, a marketing professor at the Wharton School, commented, “This is why we need to replace the guesswork of tech valuation with the more rigorous, valid, and operational notion of ‘customer-based corporate valuation.’”

I had a chance to look at Professor Fader’s paper. Essentially, he proposes a new model for the valuation of subscription-based businesses based on a calculation of customer lifetime value that uses publicly available information. While interesting in its own right, there is a fundamental shift of thinking here that I believe should be explored further.

There are a few standard equations that are used to calculate the value of a firm. If the firm is public, its value is essentially its market capitalization: share price multiplied by shares outstanding. And that share price is determined by activity in the market – the activity of shareholders. And that activity is dependent on analysts who pass judgment on companies based on projected return to shareholders. At every turn, our entire system of business finance is very heavily weighted towards ownership, which makes sense in a market-based economy. Buyers and sellers determine value.

But what Fader et al. are proposing brings another essential stakeholder into the equation – the customer. It’s amazing to me that all the valuation equations we use to determine the value of a corporation don’t involve any direct measure of that corporation’s customers. Sure, we include things like profit, revenue and free cash flow – and none of these things would exist without customers – but we never actually attempt to determine the value of a customer. Fader starts the process with the estimation of that value. That simple paradigmatic shift yields a very different view of the world.
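To make that shift concrete, here is a minimal sketch of the idea in Python. This is not Fader’s actual model – it uses the textbook infinite-horizon lifetime value formula, and every number in it is hypothetical – but it shows how a valuation can start from the customer rather than from cash flow alone:

```python
def customer_lifetime_value(annual_margin, retention_rate, discount_rate):
    """Textbook infinite-horizon CLV: margin * r / (1 + d - r).

    annual_margin: profit contributed by one customer per year
    retention_rate: probability the customer stays another year (r)
    discount_rate: annual discount rate for future cash flows (d)
    """
    return annual_margin * retention_rate / (1 + discount_rate - retention_rate)


def customer_based_valuation(n_customers, annual_margin, retention_rate, discount_rate):
    """Value the customer base as (number of customers) x (value per customer)."""
    clv = customer_lifetime_value(annual_margin, retention_rate, discount_rate)
    return n_customers * clv


# Hypothetical subscription business: 1M customers, $100 annual margin each,
# 90% retention, 10% discount rate.
print(customer_based_valuation(1_000_000, 100.0, 0.90, 0.10))
```

Note how the two drivers that analysts rarely model directly – retention (a proxy for loyalty) and per-customer margin – do all the work here, which is exactly the reframing the customer-based approach argues for.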

For example, if we are to determine the value of a company through the lifetime value of its customers, we have to look at that company in a much different way than the typical financial analyst does. We have to look at things like customer loyalty, brand affinity and the likelihood that a company will gain new market share through the disruption of markets. Last week, I used Amazon as an example. Here is a company that has been tremendously disruptive. It has essentially created a new marketplace and, in the process, upended retail as we know it. One would expect this to be taken into account when trying to determine the value of Amazon.

The problem is that things like customer loyalty and brand affinity are emotions. Emotions are not things that are easily quantified. It’s much easier to measure things like quarterly earnings and discounted free cash flow. Most of these things depend on using the past to predict the future. They also rely on the firm’s ability to prognosticate. Typically, all the heavy lifting of factoring in the fuzziness of things like future customer value is left to the company. If a company misses its projections, it is penalized by the analysts, resulting in a decrease of share price.

Ultimately, the gap between how we have historically determined the value of companies and how we might in the future comes down to a matter of our ability to determine what may come to pass. We strive for perfect predictability. We want to place our bets based on solid information and analysis. But, in a disruptive marketplace, this desire for predictability may ultimately sink us. Customers will always determine the value of a company and in a marketplace where transactional and switching costs are both plunging, those customers have the ability to switch buying behaviors instantly. The old saying, “No one ever got fired for buying IBM” has not been true for at least three decades.

Like it or not, if we want to get a true picture of the value of a company, we’re going to have to use some guesswork. And, most importantly, we’re going to have to make sure we include customers in whatever equation we’re using.

 

Nobel Intentions, Ignoble Consequences

It was 20 years ago that I discovered the Internet. According to the International Telecommunication Union, that put me in select company. There were only 77 million users of the Internet by the end of 1996. That represented a little more than 1% of the world’s population. 66% of those were in the US, likely due to access restrictions in other areas. I know I logged on to the Web as soon as I could. I had actually been online with Compuserve for a few years prior to that, but it was in 1996 that the first ISP opened in the Canadian city I live in. I was one of the first to set up an account.

Three years later, I changed my business to focus exclusively on online marketing. We became one of the fastest growing companies in Canada. Eleven years after start-up (or, more accurately, realignment) we sold that company.

Things moved rather quickly after I first went online. At least, I thought they did. But compared to the growth of other start-ups – say, Google for instance – I was a very little fish in a very big pond.

The Nobel Survey

In 2001, Cisco conducted a survey of past Nobel Prize winners. By then, Internet usage had mushroomed. Half a billion people – almost 9% of the world’s population – were online. The Internet appeared to be a real thing. The question asked was, “Where will the Internet take us over the next 20 years?”

The Laureates were mostly optimistic in their replies. Here’s a quick summary:

  • 87% said the Internet would improve education.
  • 93% felt it would provide greater access to libraries, information and teachers.
  • 74% saw the coming of virtual classrooms by 2020.
  • 82% said it would accelerate innovation.
  • 83% felt it would improve productivity.
  • 72% believed it would improve quality of life and provide more economic opportunity to people in less developed countries.
  • 93% saw it improving communications with people in other countries.
  • 76% predicted a breaking down of borders.

On the negative side, 65% feared it would violate personal privacy, 51% saw it increasing alienation and 44% felt it would lead to greater political or economic inequity.

15 Years Later…

I think you could safely put a check beside every single box on the Nobel Laureate wish list. In fact, as optimistic as these predictions seemed just 15 years ago, they seem conservative in hindsight. Online classrooms have been a reality for a few years and education is undergoing a massive reformation. In 2011, 10 years after the survey was conducted, McKinsey estimated that 10% of GDP growth in developed countries was directly attributable to the Internet. And the fact that almost half the world now has Internet access speaks to the role it plays in communication across cultures.

But none of the laureates predicted a gut punch to the cab drivers of the world. No one foresaw the short-sheeting of the traditional hospitality industry. And there was not a peep about new forms of investment predation that would be measured in microseconds.

The Biggest Can of WD-40 Ever

All the benefits of the Internet – and all the negative consequences – come from the same common factor: the elimination of friction. Economist Ronald Coase rightly identified friction – or, in his terminology, “transaction costs” – as the reason corporations exist. Until very recently, geographic distance introduced friction into pretty much every aspect of our society. It took physical resources to overcome friction. Physical resources required capital. Capital could most efficiently be raised and controlled by corporations.

The Internet enabled a new type of connection. It was agnostic to physical distance. But, more importantly, it was a peer-to-peer connection. There was no hierarchy to the Internet. Hierarchies depend on friction. As soon as that friction is removed, the hierarchies begin to fall apart. They are no longer required.

All the good things that were predicted in 2001 came from a removal of friction. But so did all the bad. In this case, the word “regulation” can often be substituted for “friction.” Regulation is just another form of hierarchical control.

I’ve been “online” for 20 years now. It has certainly accelerated every aspect of my life – most of it positively, some of it negatively. But one thing’s for certain. Going backwards is not an option.