Twitch – Another Example of a Frictionless Market

Twitch just sold for $1 billion. That’s not really news. We’ve become inured to the never-ending stream of tech acquisitions that instantly transform entrepreneurial techies into some of the richest people on the planet. No, what’s interesting about Twitch is what we find if we slow down long enough to think about how this particular startup managed to create $1 billion in value.

A billion dollars is a lot of money. If we looked back just 50 years, a billion dollars in assets would make a company number 40 on the Fortune 500. If Twitch were somehow teleported back to 1964, it would rank just eight slots under Procter and Gamble (assets worth $1.15 billion) and three slots above Sunoco (assets of $0.88 billion). Coca-Cola would be left in the dust with a mere $485 million in assets. Today a half billion dollars is chump change in Silicon Valley terms.

This becomes more amazing when you consider that Twitch is only 3 years old. And it really started as an accident.

Remember EDtv? Probably not. It was a pretty forgettable 1999 movie (based on a 1994 Quebec film called Louis 19, King of the Airwaves) starring Matthew McConaughey. The idea was that Ed would be followed by cameras 24 hours a day, 7 days a week, making his life a reality TV show. 1998’s The Truman Show had a similar theme (albeit with better reviews). Anyway, the point made in both movies was that an average life, if televised, could be entertaining enough to make people watch. In 2006, Emmett Shear and Justin Kan decided to test the premise. They launched Justin.tv. Soon they invited others to simulcast their lives as well.

What Kan and Shear did, although they probably weren’t intending to at the time, was create a platform that allowed anyone to be a real-time broadcaster with zero transactional costs. They created a perfect market for live TV. Last week I talked about AirBnB, TripAdvisor and VRBO.com creating a more perfect market for tourism. The key characteristic of a perfect market is that barriers to entry are reduced to zero, turning the market into an emergent sandbox from which new things tend to pop up. And that’s exactly what happened with Twitch.

Shear and Kan found that one group in particular embraced the idea of livecasting – gamers. They could communicate with other gamers, but they could also show off their mad gaming skills. Using the Justin.tv platform, Twitch was launched for the gaming industry in 2011. And thanks to Twitch, gaming has become a spectator sport – at a massive scale.

Twitch’s “stars” – like 30-year-old Tessa Brooks, who goes by “Tessachka” and broadcasts an average of 42 hours of programming a week – post their schedules so that their audiences can tune in. Twitch has about 55 million viewers per month who consume over 16 billion minutes of video programming. According to SocialBlade.com, this month, “Riotgames” is the top ranked Twitch broadcaster, with almost a million followers and over 18 million channel views.

Again, those are big numbers. A network show that pulls in 18 million viewers would be number 5 in the Nielsen ratings. And while Netflix’s House of Cards or Orange is the New Black may have made waves at the Emmys, The Atlantic estimates that only 2–3 million people watch a newly posted episode in the first week. On a good week, Riotgames could blow that away without twitching a trigger finger.

Twitch not only created a platform that generates audiences, it also generated a marketplace. Where there are eyeballs, there’s revenue potential. Twitch gives its gamers a cut of the advertising revenue. I couldn’t find numbers on how lucrative this could be, but I suspect Justin may be able to quit his day job.

Like I said, the Twitch story is interesting, but what is vastly more interesting is the market dynamics that it has unleashed. Amazon’s $1 billion bid is not for the technology. It’s for the community and the market that comes with that community. When it comes to leveraging the potential of zero transactional cost markets, Amazon knows a thing or two. And one of the things it knows is that in frictionless markets, if you can navigate the turbulence, tremendous value can be created in an amazingly short time. Say, for instance, $1 billion in just 3 years. It took Procter and Gamble 127 years to be worth that much.

Technology is Moving Us Closer to a Perfect Market

I have two very different travel profiles. When I travel on business, I usually stick with the big chains, like Hilton or Starwood. The experience is less important to me than predictability. I’m not there for pleasure; I’m there to sleep. And, because I travel on business a lot (or used to), I have status with them. If something goes wrong, I can wave my Platinum or Diamond guest card around and act like a jerk until it gets fixed.

But, if I’m traveling for pleasure, I almost never stay in a chain hotel. In fact, more and more, I stay in a vacation rental house or apartment. It’s a little less predictable than your average Sheraton or Hampton Inn, but it’s almost always a better value. For example, if I were planning a last-minute getaway to San Francisco for Labor Day weekend, I’d be shelling out just under $400 for a fairly average hotel room at the Hilton by Union Square. But for about the same price, I could get an entire 4-bedroom house that sleeps 8 just two blocks from Golden Gate Park. And that was with just a quick search on AirBnB.com. I could probably find a better deal with the investment of a few minutes of my time.

Travel is just one of the markets that technology has made more perfect. And when I say “perfect” I use the term in its economic sense. A perfect market has perfect competition, which means that the barriers to entry have been lowered and most of the transactional costs have been eliminated. The increased competition lowers prices to a sustainable minimum. At that point, the market reaches a state called Pareto optimality, in which no participant’s position can be improved without making another participant worse off.

Whether a perfect market is a good thing or not depends on your perspective. If you’re a long-term participant in the market and your goal is to make the biggest profit possible, a perfect market is the last thing you want. If you’re a new entrant to the market, it’s a much rosier story – any shifts that take the market closer to a Pareto Optimal will probably be to your benefit. And if you’re a customer, you’re in the best position of all. Perfect markets lead inevitably to better value.

Since the advent of VRBO.com and, more recently, AirBnB.com, the travel marketplace has moved noticeably closer to being perfect. Sites like these, along with travel review aggregators like TripAdvisor.com, have significantly reduced the transaction costs of the travel industry. The first wave was the reduction of search costs. Property owners were able to publish listings in a directory that made it easy to search and filter options. Then, the publishing of reviews gave us the confidence we needed to stray beyond the predictably safe territory of the big chains.

But, more recently, a second wave has further reduced transaction costs for independent vacation property owners. I was recently talking to a cousin who rents his flat in Dublin through AirBnB, which takes all the headaches of vacation property management away in return for a cut of the action. He was up and running almost immediately and has had no problem renting his flat during the weeks he makes it available. He found the barriers to entry to be essentially zero. A cottage industry of property managers and key exchange services has sprung up around the AirBnB model.

What technology has done to the travel industry is essentially turn it into a Long Tail business model. As Chris Anderson pointed out in his book The Long Tail, Long Tail markets need scale-free networks. Scale-free networks only work when transaction costs are eliminated and entry into the market is free of friction. When this happens, the Power Law distribution stays in place, but the tail becomes longer. The Long Tail of Tourism now includes millions of individually owned vacation properties. For example, AirBnB has almost 800 rentals available in Dublin alone. According to Booking.com, that’s about 7 times the total number of hotels in the city.

Another thing that happens is that, over time, the Tail becomes fatter. More business moves from the head to the tail. The Pareto Principle states that in Power Law distributions, 20% of the participants capture 80% of the business. Online, the ratio is closer to 72/28.
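The head-versus-tail arithmetic above can be sketched with a toy power-law model. This is illustrative only: the Zipf-like distribution and the exponent values are assumptions, not measured AirBnB data. A steeper exponent concentrates sales in the head; flattening it, as lower transaction costs do, fattens the tail.

```python
# Toy sketch: what share of total sales goes to the top 20% of sellers
# when sales follow a Zipf-like power law? (Illustrative only; the
# exponent values here are assumptions, not measured marketplace data.)
def head_share(n_sellers, exponent, head_fraction=0.2):
    # Seller at rank r books sales proportional to 1 / r^exponent.
    sales = [1 / rank ** exponent for rank in range(1, n_sellers + 1)]
    head = sales[: int(n_sellers * head_fraction)]
    return sum(head) / sum(sales)

# At exponent 1.0 the head's share lands near the classic 80/20 split;
# a flatter curve (a longer, fatter tail) shifts share toward the tail.
classic = head_share(1000, 1.0)
flatter = head_share(1000, 0.8)
```

In this sketch, moving from an exponent of 1.0 to 0.8 drops the head’s share from roughly four-fifths toward the 72/28 territory the column mentions.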

These shifts in the market are more than just interesting discussion topics for economists. They mark a fundamental change in the rules of the game. Markets that are moving towards perfection remove the advantages of size and incumbency and reward nimbleness and adaptability. They also, at least in this instance, make life more interesting for customers.

When Are Crowds Not So Wise?

Since James Surowiecki published his book “The Wisdom of Crowds”, the common wisdom is – well – that we are commonly wise. In other words, if we average the knowledge of many people, we’ll be smarter than any of us would be individually. And that is true – to an extent. But new research suggests that there are group decision dynamics at play where bigger crowds may not always be better.

A recent study by Iain Couzin and Albert Kao at Princeton suggests that in real world situations, where information is more complex and spotty, the benefits of crowd wisdom peak in groups of 5 to 20 participants and then decrease after that. The difference comes in how the group processes the information available to them.

Surowiecki uses the famous example of Sir Francis Galton’s 1907 observation of a contest where villagers were asked to guess the weight of an ox. While no individual correctly guessed the weight, the average of all the guesses came in just one pound short of the correct number. But this example has one unique characteristic that would be rare in the real world – every guesser had access to the same information. They could all see the ox and make their guess. Unless you’re guessing the number of jellybeans in a jar, this is almost never the case in actual decision scenarios.
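The ox contest can be sketched as a toy model. The assumptions here are mine, not Galton’s data: each villager guesses the true weight (1,198 pounds in Galton’s account) plus an independent error of around 100 pounds. Averaging cancels those independent errors, which is why the crowd beats any individual.

```python
import random

random.seed(7)
TRUE_WEIGHT = 1198  # pounds, the ox's actual weight in Galton's account

# Toy model: each villager's guess is the true weight plus an
# independent Gaussian error (the 100-lb error size is an assumption).
guesses = [TRUE_WEIGHT + random.gauss(0, 100) for _ in range(800)]

# Individual guesses miss by ~100 lbs on average, but the crowd's
# average error shrinks roughly with the square root of the crowd size.
crowd_estimate = sum(guesses) / len(guesses)
```

Run this and the crowd estimate lands within a few pounds of the truth, even though individual guesses routinely miss by a hundred or more. The catch, as the next paragraphs explain, is that this only works when the errors are independent.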

Couzin and Kao say this information “patchiness” is the reason why accuracy tends to diminish as the crowd gets bigger. In most situations, there is commonly understood and known information, which the researchers refer to as “correlated information.” But there is also information that only some of the members of the group have, which is “uncorrelated information.” To make matters even more complex, the nature of uncorrelated information will be unique to each individual member. In real life, this would be our own experience, expertise and beliefs.  To use a technical term, the correlated information would be the “signal” and the uncorrelated information would be the “noise.” The irony here is that this noise is actually beneficial to the decision process.

In big groups, the collected “noise” gets so noisy it becomes difficult to manage and so it tends to get ignored. It drowns itself out. The collective focuses instead on the correlated information. In engineering terms this higher signal-to-noise ratio would seem to be ideal, but in decision-making, it turns out a certain amount of noise is a good thing. By focusing just on the commonly known information, the bigger crowd over-simplifies the situation.

Smaller groups, in contrast, tend to be more random in their makeup. The differences in experiences, knowledge, beliefs and attitudes, even if not directly correlated to the question at hand, have a better chance of being preserved. They don’t get “averaged” out like they would in a bigger group. And this “noise” leads to better decisions if the situation involves imperfect information. Call it the averaging of intuition, or hunches. In a big group, the power of human intuition gets sacrificed in favor of the commonly knowable. But in a small group, it’s preserved.
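A rough Monte Carlo sketch can show this mechanism. It is loosely inspired by the Kao–Couzin setup, but every number here is an invented parameter, not a value from their paper: each member votes either by one widely shared (“correlated”) cue that is only 55% reliable, or by a private (“uncorrelated”) cue that is independently 75% reliable.

```python
import random

def group_accuracy(group_size, trials=20000, p_shared=0.6,
                   q_shared=0.55, q_private=0.75, seed=1):
    """Fraction of trials in which a majority vote picks the right option.

    Each member votes by the widely shared ("correlated") cue with
    probability p_shared, otherwise by a private ("uncorrelated") cue.
    The shared cue is drawn once per trial; private cues are independent.
    All parameter values are illustrative assumptions.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        shared_cue_right = rng.random() < q_shared
        votes = 0
        for _ in range(group_size):
            if rng.random() < p_shared:
                votes += shared_cue_right  # everyone sees the same cue
            else:
                votes += rng.random() < q_private  # independent judgment
        if votes * 2 > group_size:  # strict majority wins
            correct += 1
    return correct / trials

small, large = group_accuracy(5), group_accuracy(101)
```

In this toy setup the large group’s accuracy collapses toward the shared cue’s 55% reliability, because the block of correlated votes almost always decides the outcome, while the small group preserves enough independent “noise” to beat it.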

In the world of corporate strategy, this has some interesting implications. Business decisions are almost always complex and involve imperfectly distributed information. This research seems to indicate that we should carefully consider our decision-making units. There is a wisdom of crowds benefit as long as the crowd doesn’t get too big. We need to find a balance where we have the advantage of different viewpoints and experiences, but this aggregate “noise” doesn’t become unmanageable.

Have the Odds Caught Up with Apple?

Google has just surpassed Apple as the most valuable brand in the world. In diving deeper on this, there are several angles one could take. If you live at the intersection of brand and technology marketing, as I have for the last several years, this is noteworthy on many levels. One, for instance, is the dramatic shift in Millward Brown’s assigned brand values for the two companies – with Google soaring 40 percent, and Apple plunging 20 percent. According to Millward Brown’s Brandz™ Study, Google’s brand is worth $158 billion, up from $113 billion last year. And the post-Jobs Apple is down to $147 billion from last year’s number-one-ranked $185 billion. Combined, that’s an $83 billion swap in valuations. Apple was one of the few brands to actually lose ground in this year’s report.

I personally find this interesting because of some recent research I’ve been doing on corporate strategy for an upcoming book. It comes as a surprise to no one reading this column that I’m a big believer in corporate strategy. But in my research, I’ve been forced to admit that strategy is a little-understood and over-hyped concept. Actually, let me clarify that – strategy as it’s taught in most MBA programs is little understood and over-hyped. Executives and consultants pull matrices and strategic frameworks out of thin air, and injudiciously apply them to any and all situations. With all due deference to the Michael Porters, Peter Druckers, Jim Collins and Tom Peters of the world, I suspect the world of corporate strategy is more complex than 5 universal steps, a four-box matrix or simple models illustrated with a few circles and arrows. The mistaken assumption with all this is that all strategic wisdom must flow from top to bottom.

Let’s go back to Apple and Google. Apple, under Jobs, was a traditional hierarchy. More than this, it was a hierarchy ensconced in an ivory tower. Due, no doubt, to the considerable hubris of Mr. Jobs, Apple believed that all good things had to be laboriously squeezed out of their own design process and mercilessly tweaked to perfection.

Google, on the other hand, fully embraces the concept of a market to drive innovation. Notice I say “a market”, not “the market.” Here, I refer to markets as a tool, not an entity. The distinction is important. Markets are built to facilitate exchanges. They use valuation mechanisms (such as pricing) to protect fairness and introduce equilibrium in the market. In their most ideal form, markets allow any member of the marketplace to contribute and be judged on the value of their contribution, not their status. In Google’s case, the 20% free time rule, Google Labs and their experimentation with prediction markets all use market dynamics to drive both innovation and corporate strategy. Markets allow for a Darwinian approach to strategy, pulling it up from the bottom rather than driving it down from the top. And, as evolutionary biologist Leslie Orgel liked to say, “Evolution is cleverer than you are.”

But there are trade-offs. Bottom-up approaches to strategy need some mechanism to pick winners and losers. There needs to be the corporate equivalent of natural selection. This, again, is where markets can help. Without robust and definitive selection tools, the bottom-up organization can vacillate endlessly, never making any headway. Also, management of execution in bottom-up organizations can be a much more challenging balancing act. Dictatorships might not be a lot of fun for the “dictatees” but you can definitely get the trains running on time.

Here’s one last thing to keep in mind. Every time we trot out Apple in the era of Steve Jobs as an example of anything to do with corporations, we tend to forget that in the normal distribution of visionary talent, Jobs was an extreme outlier. He was a once-in-a-generation anomaly. You can’t build a corporate strategy around the hope that you have a Steve Jobs on the executive payroll. Sooner or later, the odds will catch up to you.

Will Apple’s brand value bounce back in 2015? Perhaps. But in the dynamically complex market that is today’s reality, I’d be placing my bets on organizations that have learned to adapt and evolve in complexity.

Today, Spend Some Time in Quadrant Two

First published April 17, 2014 in Mediapost’s Search Insider

Last week, I ranted, and it was therapeutic — for me, at least. Some of you agreed that the social media landscape was littered with meaningless crap. Others urged me to “loosen up and take a chill pill,” intimating that I had slipped across the threshold of “grumpy old man-itis.” Guilty, I guess, but there was a point to my rant. We need to spend more time with important stuff, and less time with content that may be popular but trivial.

Hey, I’m the first to admit that I can be tempted into wasting gobs of time with a tweet like: “Prom season sizzles with KFC chicken corsages.” This is courtesy of Guy Kawasaki. Guy’s Twitter feed is a fire hose of enticing trivia. And the man (with the team that supports him) does have a knack of writing tweets with irresistible hooks. Come on. Who could resist checking out a fried chicken corsage?

But here’s the problem. Online is littered with fried chicken corsages. No matter where we turn, we’re bombarded by these tasty little tidbits of brain candy. Publishers have grown quite adept at stringing these together, leading us from trivial link to trivial link. Personally, I’m a sucker for Top Ten lists. But after succumbing to the temptation for “just a second” I find myself, 20 minutes later, having accomplished nothing other than learning what the 10 Biggest Reality Show Blunders were, or where the 10 Most Extravagant Homes in the U.S. happen to be.

Entertaining? Absolutely.

Useful? Doubtful.

Important?  Not a chance.

We need to set aside time for important stuff. A few decades ago, I happened to read Stephen Covey’s “First Things First,” which introduced a concept I still try to live by to this day. Covey called it the Urgent/Important matrix. It’s a simple two-by-two matrix with four quadrants:

1 – Urgent and Important – for example, a fire in your kitchen.

2 – Not Urgent but Important – long-term planning.

3 – Urgent but Not Important – interruptions.

4 – Not Important and Not Urgent – time-wasters.

Covey’s Advice? Better balance your time in these quadrants. Quadrant One takes care of itself. We can’t ignore these types of crises. But we should try to minimize the distractions that fall into Quadrant Three and cut down the time we spend in Quadrant Four. Then, we should move as much of this freed-up time as possible into Quadrant Two.

Covey’s Quadrants are more applicable than ever to the online world. I suspect most of us spend the majority of our time in the online equivalents of Quadrant Three (responding to emails or other instant forms of messaging that aren’t really important) or Quadrant Four (online time wasters). We probably don’t spend much time in Quadrant Two (which I’ll abbreviate to Q2). In fact, in writing this column, I tried to find a quick guide to finding important stuff online. I have a few places I like to go, which I’ll share in a moment, but despite the vast potential of online as a Q2 resource, it doesn’t seem that anyone is making it easy to filter for “importance.” As I said in my last column, we have filters for popularity and recency, but I couldn’t find anything helping me track down Q2 candidates.

So, here is my contribution to helping you set aside more quality Q2 time:

Amazon Kindle and DevonThink: Reading thought-provoking books is my favorite Q2 activity. I try to set aside at least an hour a day to read. Anytime someone suggests a book or I find one referenced, I immediately download it from Kindle and add it to the queue. Then, as I read, I use Kindle’s highlight feature to create a summary of the important ideas. Afterwards, I copy my highlighted notes into DevonThink, a tool that helps track and archive notes and resources for future reference.

Scientific American & Science Daily: I’m a science geek. I love learning about the latest advances — in particular, new discoveries in the areas of psychology and neuroscience. When I find an interesting article, I again save it to DevonThink.

Google Scholar and Questia: Every so often, I dive into the world of academia to find research done in a particular area, usually related to a blog post or column idea. Google Scholar usually unearths a number of publicly available papers on most topics. And, if you share my predilection for academic research, a subscription to Questia is worth considering.

Big Think, weforum.org and TED: Looking for big ideas — world-changing stuff? These three sites are the places to find them.

HBR, Wired, The Atlantic and The Economist: Another favorite topic of mine is corporate strategy — particularly how organizations have to adapt to a rapidly evolving environment. I find sites like these great for giving me a sense of what’s happening in the world of business.

Hey, it may not be a fried chicken corsage, but these aren’t bad ways to spend an hour or two a day.

 

Now, That’s a Job Description I Could Get Behind!

First published February 20, 2014 in Mediapost’s Search Insider

I couldn’t help but notice that last week’s column, where I railed against the marketer’s obsession with tricks, loopholes and pat sound bites, got a fair number of retweets. The irony? At least a third of those retweets twisted my whole point – that six seconds (or any arbitrary length of message) isn’t the secret to getting a prospect engaged. The secret is giving them something they want to engage with.


As anyone who has been unfortunate enough to spend some time with me when I’m in a particularly cynical mood about marketing can attest, I go a little nuts with this “Top Ten Tricks” or “The Secret to…” mentality that seems pervasive in marketing. I’m pretty sure that anyone who retweeted last week’s column with a preface like “Does your advertising engage your consumer in 6 seconds or less? If not, you’re likely losing customers” didn’t bother to actually read past the first paragraph. Maybe not even the first line.

And that’s the whole problem. How can we expect marketers to build empathy, usefulness and relevance into their strategy when many of them have the attention span of a small gnat? As my friend Scott Brinker likes to say when it comes to marketers misbehaving, “This is why we can’t have nice things.”

Marketing – good marketing – is not easy but it’s also not a black box. It’s not about secrets or tricks or one-off tactics. It’s about really understanding your customers at an incredibly deep level and then working your ass off to create a meaningful engagement with them. Trying to reduce marketing to anything less than that is like trying to breeze your way through 50 years of marriage by following the Top 3 Tricks to get lucky this Friday night.

Again, this is about meaningful engagements. And when I say meaningful, it’s the customer that gets to decide what’s meaningful. That’s what’s potentially so exciting about breakthroughs like the Oreo Super Bowl campaign. It’s the opportunity to learn what’s meaningful to prospects and then to shift and tailor our responses in real time. Until now, marketing has been “Plan, Push and Pray.” We plan our attack, we push out our message and we pray it finds its target and that they respond by buying stuff. If they don’t buy stuff, something went wrong, probably in the planning stage. But that is an awfully long feedback loop.

You’ll notice something about this approach to marketing. The only role for the prospect is as a consumer. If they don’t buy, they don’t participate.  This comes as a direct result of the current job description of a marketer: Someone who gets someone else to buy stuff. But what if we rethink that description? Technology that enables real time feedback is allowing us to create an entirely new relationship with customers. What would happen if we redefined marketing along these lines: To understand the customer’s reality, focusing on those areas where we can solve their problems and improve that reality?

And as much as that sounds like a pat sound bite, if you really dig into it, it’s far from a quick fix. This is a way to make a radically different organization. And it moves marketing into a fundamentally different role. Previously, marketing got its marching orders from the CEO and CFO. Essentially, they were responsible for moving the top line ever northward. It was an internally generated mandate – to increase sales.

But what if we rethink this? What if the entire organization’s role is to constantly adapt to a dynamic environment, looking for advantageous opportunities to improve that environment? And, in this redefined vision, what if marketing’s role was to become the sense-making interface of the company? What if the CMO’s job was to consistently monitor the environment, create hypotheses about how to best create adaptive opportunities and then test those hypotheses in a scientific manner?

In this redefinition of the job, Big Data and Real Time Marketing take on significantly new qualities, first as a rich vein of timely information about the marketplace and secondly as a never ending series of instant field experiments to provide empirical backing to strategy.
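One of those “instant field experiments” might, statistically, look like the sketch below: a two-proportion z-test comparing conversion rates for two message variants. The campaign numbers are invented for illustration.

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is variant B's conversion rate different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal CDF: 2 * (1 - Phi(|z|)).
    p_value = 1 - erf(abs(z) / sqrt(2))
    return p_b - p_a, p_value

# Hypothetical campaign: 120/6000 conversions for A, 160/6000 for B.
lift, p_value = ab_test(120, 6000, 160, 6000)
```

With these made-up numbers the lift is real at conventional significance levels, which is exactly the kind of empirical backing for strategy the paragraph above describes: run the variant, measure, keep what wins.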

Now, marketing’s job isn’t to sell stuff, it’s to make sense of the market and, in doing so, help define the overall strategic direction of the company. There are no short cuts, no top ten tricks, but isn’t that one hell of a job description?

How Can Humans Co-Exist with Data?

First published February 6, 2014 in Mediapost’s Search Insider

Last week, I talked about our ability to ignore data. I positioned this as a bad thing. But Pete Austin called me on it, with an excellent counterpoint:

“Ignoring Data is the most important thing we do. Only the people who could ignore the trees and see the tiger, in real-time, survived to become our ancestors.”

Too true. We’re built to subconsciously filter and ignore vast amounts of input data in order to maintain focus on critical tasks, such as avoiding hungry tigers. If you really want to dive into this, I would highly recommend Daniel Simons and Christopher Chabris’s “The Invisible Gorilla.” But, as Simons and Chabris point out, with example after example of how our intuitions (which we use as filters) can mislead us, this “inattentional blindness” is not always a good thing. In the adaptive environment in which we evolved, it was pretty effective at keeping us alive.  But in a modern, rational environment, it can severely inhibit our ability to maintain an objective view of the world.

But Pete also had a second, even more valid point:

“What you need to concentrate on now is “curated data”, where the junk has already been ignored for you.”

And this brought to mind an excellent example from a recent interview I did as background for an upcoming book I’m working on.  This idea of pre-filtered, curated data becomes a key consideration in this new world of Big Data.

Nowhere are the stakes higher for the use of data than in healthcare. It’s what led to the publication of a manifesto in 1992 calling for a revolution in how doctors made life and death decisions. One of the authors, Dr. Gordon Guyatt, coined the term “evidence-based medicine.” The rationale is simple here. By taking an empirical approach to not just diagnosis but also to the best prescriptive path, doctors can rise above the limitations of their own intuition and achieve higher accuracy. It’s data-driven decision-making, applied to health care. Makes perfect sense, right? But even though evidence-based medicine is now over 20 years old, it’s still difficult to consistently apply at the level of the individual doctor and patient.

I had the chance to ask Dr. Guyatt why this was:

“Essentially after medical school, learning the practice of medicine is an apprenticeship exercise and people adopt practice patterns according to the physicians who are teaching them and their role models and there is still a relatively small number of physicians who really do good evidence-based practice themselves in terms of knowing the evidence behind what they’re doing and being able to look at it critically.”

The fact is, a data driven approach to any decision-making domain that previously used to rely on intuition just doesn’t feel – well – very intuitive. It’s hard work. It’s time consuming. It, to Mr. Austin’s point, runs directly counter to our tiger-avoidance instincts.

Dr. Guyatt confirms that physicians are not immune to this human reliance on instinct:

“Even the best folks are not going to do it – maybe the best folks – but most folks are not going to be able to do that very often.”

The answer in healthcare, and likely the answer everywhere else where data should back up intuition, is the creation of solid data based resources, which adhere to empirical best practices without requiring every single practitioner to do the necessary heavy lifting. Dr. Guyatt has seen exactly this trend emerge in the last decade:

“What you need is preprocessed information. People have to be able to identify good preprocessed evidence-based resources where the people producing the resources have gone through that process well.”

The promise of curated, preprocessed data is looming large in the world of marketing. The challenge is that, unlike medicine, where data is commonly shared and archived, in the world of marketing much of the most important data stays proprietary. What we have to start thinking about is a truly empirical, scientific way to curate, analyze and filter our own data for internal consumption, so it can be readily applied in real world situations without falling victim to human bias.

What’s Apple’s Plan for 2014?

First published January 2, 2014 in Mediapost’s Search Insider

When new markets open, value chains first build up, then across. Someone first creates a vertically integrated experience, and then the market opens up as free competition drives efficiency. This is the challenge that currently lies ahead of Apple.

Apple has been the acknowledged master at creating seamless vertically integrated experiences. They did it with the personal computer. They did it with music. They did it with mobile. They did it with tablets. The advantage of working within a closed value chain is that you control every aspect of the experience. You can make sure that everyone plays nice with each other.

The challenge is that at some point, as adoption heats up, you simply cannot scale fast enough to meet market demand. Open competition drives horizontal competition, which drives down prices. The lack of control up and down the chain introduces some short-term user pain, but eventually the dynamics of an open market overcome this and the advantages of having several companies working on an opportunity outweigh the disadvantages.

Apple loves early markets. Or, at least, they have in the past. Under Jobs, they had a knack for creating an elegantly integrated experience that was carefully crafted from top to bottom within the walls of Cupertino. The vision and obsession with detail that defined the Jobs era was a potent combination when it came to building vertical experiences. Somehow, Apple was able to open new markets over and over again, seemingly at will. They were able to bridge Geoffrey Moore’s “Chasm” – by making new experiences painless enough for the front end of the adoption bell curve. As adoption rode up the curve, markets turned from vertical to horizontal, driving a decline in margins and prices. This is where Apple tended to kick out and look for the next wave to catch.

But that was then, and this is now. As mentioned, Apple doesn’t do very well when markets turn horizontal. They depend on high margins. Only once, with the Mac, were they able to come back and stake out a respectable claim in a horizontal market. And they almost disappeared in the process. So many circumstances would have to align to repeat that trick that I doubt they’re eager to go down the same path with the iPhone or iPad.

In the year-end summaries, many are talking about a seeming anomaly: despite Android’s massive market-share dominance over iOS (81% vs. 12.9%, according to a recent Forbes article), it’s Apple that’s ringing up the holiday sales with mobile shoppers (23% vs. Android’s paltry 5%). This becomes more understandable when you put it in the context of a vertical market that is becoming horizontal. Shopping experiences are still much less painful on iOS. And, you have a user base that is much more comfortable with mobile ecommerce because they’re on the leading edge of the adoption curve. They’ve had a mobile device for a number of years now. Android users, in general, tend to be further back on the curve. As the benefits of Darwinian competition redefine the mobile marketplace along more horizontal lines, those ecommerce numbers will revert to a more natural balance, but it will take some time.

As this inevitable change in the marketplace happens, the question then becomes, “What does Apple do next?” Can they find the next wave? And, if they do, does an Apple without Jobs still have what it takes to create the vertical experience that can open up a new market? There are plenty of opportunities – the two most notable ones being connected entertainment devices (the much-rumored new generation of Apple TV) and wearable technology (iWatches, etc).

Apple has always been known for keeping their cards close to their chest. In 2014, it remains to be seen whether they have anything amazing up their sleeve.

A Tale of Two Research Philosophies

First published December 19, 2013 in Mediapost’s Search Insider

They only sit about five miles apart physically. One’s in Palo Alto, the other’s in Mountain View. But when it comes to how R&D is integrated into an organization’s strategy, there is significantly more distance between Xerox’s PARC and Google.

Xerox Alto computer

I recently visited both locations on the same day. PARC, of course, is the legendary research wing that created the graphical user interface, the personal computer, object-oriented programming, the mouse, Ethernet and the laser printer. It was at PARC that Steve Jobs saw the interface that would eventually form the OS foundation for the Macintosh. Every time we touch the technology that today we take for granted, we should give thanks to the many people who have called the unassuming campus on Coyote Hill Road home.

But in 1970, when PARC was founded, there was a different attitude towards R&D. Research required isolation and distance from the regular business rhythms of the mother ship. Xerox could not have put more distance between its head office, in Rochester, N.Y., and its new research arm, 3,000 miles away. When it came to innovation, the choice of location was fortuitous. PARC, together with HP and other Silicon Valley pioneers, tapped into the stream of talent that was coming out of Stanford. In fact, PARC is located on land leased from Stanford. It soon became an innovation hotbed, thanks to the visionary leadership of Bob Taylor, who headed up the Computer Science division. But Xerox’s track record of bringing its own innovations to the market was dismal. As great as the physical distance was between PARC and the executive wing of Xerox in upstate New York, the philosophical distance was several times greater.

Google’s research efforts, under the leadership of Peter Norvig, are taking a much different direction, likely due to lessons learned from PARC and others. Research is embedded in the ever-expanding Google campus that currently sprawls along Amphitheatre Parkway and Charleston Road. There is a free flow of traffic and communication between current product engineering teams (many riding brightly colored Google bikes) and those working on longer-term projects. The distance between “today” and “tomorrow” is minimized at every opportunity.

Norvig commented on this in a recent interview with me:

We don’t have a separate research entity whose job is to be isolated from the rest of the company and think about the future. Rather, everybody’s job, regardless of their job title, is to make our products better or invent a new product. So the distinction between being a researcher versus an engineer is not how academic you are, it’s not how forward-thinking you are  — whether you’re looking at this year or next year or the year after. It’s more in terms of the area that you work in. If you work in core search or in core distributed computer systems, then your title’s going to be software engineer, even if you’re a Nobel Prize-winning professor.

Google has taken a hybrid approach to research, in which even long-term projects are developed at production scale, minimizing the risk of projects failing during the technology transfer phase. Norvig touched on this in a recent article:

Elaborate research prototypes are rarely created, since their development delays the launch of improved end-user services. Typically, a single team iteratively explores fundamental research ideas, develops and maintains the software, and helps operate the resulting Google services — all driven by real-world experience and concrete data. This long-term engagement serves to eliminate most risk to technology transfer from research to engineering.

This was exactly the trap that PARC ran into, when some of the most innovative advances in the history of computing failed to significantly contribute to Xerox’s bottom line.  Google has thrown the doors open for internal research teams to access the full power of complete data sets and production scale systems while espousing the practice of agile development. The goal is to ensure that all innovation that happens at Google is not too far removed from the goal of either diversifying Google’s revenue stream with new products, or contributing to existing ones.

The Emerging Data Ecosystem

First published December 13, 2013 in Mediapost’s Search Insider

Data is ubiquitous, and that is true pretty much everywhere. It was certainly true at the Search Insider Summit, where every panel and presentation talked about data. And not just any data — this was “Big Data.” But what exactly is Big Data — just more data? Or is there a fundamental shift happening here?

I believe there is. When I think about Big Data, I think about an emerging data ecosystem, where the explosion of available data will exponentially increase the complexity of the ecosystem. This is not just more data, but a different environment that will require different strategies.

Typically, the data we currently use is either first-party data — the data that emerges as part of our business process — or structured third-party data, available from a rapidly growing number of data vendors. This is probably what most people think of when they think of Big Data. But I don’t consider data in this form a departure from the data we’re used to using. There’s more of it, true, but the process is already identified. It just needs to be scaled to deal with increased volumes.

Let me use one example from the recent Search Insider Summit. The Weather Company has recently launched a new division called Weather FX, aimed at using its vast store of weather data to build predictive models that help companies add weather-based variables to their own data sets. For example, ad targeting can now be weather-sensitive, ramping up campaigns and changing messaging based on predicted changes in weather patterns. While pretty impressive, this is a relatively straightforward use of data. The data feeds are well structured and have been “predigested” by Weather FX to make them easy to implement.
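To make the idea concrete, here is a minimal sketch of what weather-sensitive ad targeting can look like on the advertiser’s side. Weather FX’s actual API is proprietary and not shown here; the `Forecast` fields, thresholds and multipliers below are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Forecast:
    """A simplified, hypothetical forecast record (not the Weather FX schema)."""
    temp_c: float       # forecast temperature in Celsius
    precip_prob: float  # probability of precipitation, 0.0 to 1.0

def adjust_campaign(base_bid: float, forecast: Forecast) -> dict:
    """Scale an ad bid and select a creative based on a weather forecast."""
    bid = base_bid
    creative = "default"
    if forecast.precip_prob > 0.6:
        bid *= 1.5          # rain likely: push rainy-day messaging harder
        creative = "rainy_day"
    elif forecast.temp_c > 28:
        bid *= 1.3          # heat wave: promote cold drinks, sunscreen, etc.
        creative = "heat_wave"
    return {"bid": round(bid, 2), "creative": creative}
```

In a real campaign, the forecast would come from a structured feed and the multipliers would be learned from historical sales data rather than hard-coded, but the decision logic is the same shape: predicted weather in, bid and message adjustments out.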

Big Data, at least in my interpretation, is a different beast altogether. Here, data is messy, often unstructured, hard to find and in raw form. To further complicate matters, it lives in disparate silos that often have no market-facing interface. It’s an organic ecosystem that bears more than a passing similarity to how we think of natural resources. This data needs to be identified, nurtured and harvested (or mined, if you’d prefer).

It’s this data that will lead to a true view of Big Data, a world of vast data nodes that require significant development before they can be used. Think of how the world was a century and a half ago, when a lot of raw stuff — wood, minerals, water, crops, livestock — lay scattered about our planet. At the time, there was little in the way of established manufacturing and distribution chains that transformed that raw stuff into consumable products. Over time, the chain emerged, but a lot of logistical challenges had to be addressed along the way. The same is true, I believe, for data.

But there’s another challenge with Big Data: It’s not always clear how to use it. It needs a framework. You can’t dump a ton of various metals and a couple barrels of oil into a big black box, shake it and expect a Ford Focus to drop out. You need to have a pretty clear idea of what your expected outcome is. And you need to have a long chain that moves your raw material towards your end product. In the early days of creating physical goods, these chains were often verticalized within a single organization, but as the ecosystem evolved, the markets became more horizontal. I would expect the same pattern to emerge in the data ecosystem.

If you create a conceptual framework within which to use data, you can determine which data is required and how that data will be used. You can pick your data sources, identify the gaps, and line up the resources required to address those gaps. Often, because we’re in the earliest stages of this process, we will need to explore, guess and iteratively test before the data will provide value.
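That gap-finding step is mechanical enough to sketch. The snippet below, a toy illustration with invented source and field names, maps each field a framework requires to the sources that can supply it, and flags the fields nothing covers:

```python
def find_gaps(required_fields: set, sources: dict) -> dict:
    """Map each required field to the sources that supply it; flag unmet fields.

    sources: {source_name: set of fields that source provides}
    """
    coverage = {
        field: [name for name, fields in sources.items() if field in fields]
        for field in required_fields
    }
    gaps = sorted(field for field, srcs in coverage.items() if not srcs)
    return {"coverage": coverage, "gaps": gaps}

# Hypothetical inventory: a CRM and a weather feed, neither of which
# captures in-store foot traffic.
sources = {
    "crm": {"customer_id", "purchase_history"},
    "weather_feed": {"precip_prob", "temp"},
}
result = find_gaps({"customer_id", "temp", "foot_traffic"}, sources)
# result["gaps"] == ["foot_traffic"] -- a data source still to be found or built
```

The point isn’t the code; it’s that a framework turns “we need more data” into a concrete shopping list of sources and named gaps.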

This definition of Big Data requires new rules and strategies. It requires a commitment to mining raw data and integrating it in useful ways. It will mean dynamically adapting to the continuing data explosion. It will require blood, sweat and tears. This is not a “plug and play” exercise. When I think of Big Data, that’s what I think about.