A Tale of Two Research Philosophies

First published December 19, 2013 in Mediapost’s Search Insider

They sit only about five miles apart physically. One’s in Palo Alto, the other’s in Mountain View. But when it comes to how R&D is integrated into an organization’s strategy, there is significantly more distance between Xerox’s PARC and Google.

Xerox Alto computer

I recently visited both locations on the same day. PARC, of course, is the legendary research wing that created the graphical user interface, the personal computer, object-oriented programming, the mouse, Ethernet and the laser printer. It was at PARC that Steve Jobs saw the interface that would eventually form the OS foundation for the Macintosh. Every time we touch the technology that today we take for granted, we should give thanks to the many people who have called the unassuming campus on Coyote Hill Road home.

But in 1969, when PARC was first created, there was a different attitude towards R&D. Research required isolation and distance from the regular business rhythms of the mother ship. Xerox could not have put more distance between its head office, in Rochester, N.Y., and its new research arm, 3,000 miles away. When it came to innovation, the choice of location was fortuitous. PARC, together with HP and other Silicon Valley pioneers, tapped into the stream of talent that was coming out of Stanford. In fact, PARC is located on land leased from Stanford. It soon became an innovation hotbed, thanks to the visionary leadership of Bob Taylor, who headed up the Computer Science division. But Xerox’s track record of bringing its own innovations to the market was dismal. As great as the physical distance was between PARC and the executive wing of Xerox in upstate New York, the philosophical distance was several times greater.

Google’s research efforts, under the leadership of Peter Norvig, are taking a much different direction, likely due to lessons learned from PARC and others. Research is embedded in the ever-expanding Google campus that currently sprawls along Amphitheatre Parkway and Charleston Road. There is a free flow of traffic and communication between current product engineering teams (many riding brightly colored Google bikes) and those working on longer-term projects. The distance between “today” and “tomorrow” is minimized at every opportunity.

Norvig commented on this in a recent interview with me:

We don’t have a separate research entity whose job is to be isolated from the rest of the company and think about the future. Rather, everybody’s job, regardless of their job title, is to make our products better or invent a new product. So the distinction between being a researcher versus an engineer is not how academic you are, it’s not how forward-thinking you are  — whether you’re looking at this year or next year or the year after. It’s more in terms of the area that you work in. If you work in core search or in core distributed computer systems, then your title’s going to be software engineer, even if you’re a Nobel Prize-winning professor.

Google has taken a hybrid approach to research, in which even long-term projects are developed at production scale, minimizing the risk of projects failing during the technology transfer phase. Norvig touched on this in a recent article:

Elaborate research prototypes are rarely created, since their development delays the launch of improved end-user services. Typically, a single team iteratively explores fundamental research ideas, develops and maintains the software, and helps operate the resulting Google services — all driven by real-world experience and concrete data. This long-term engagement serves to eliminate most risk to technology transfer from research to engineering.

This was exactly the trap that PARC ran into, when some of the most innovative advances in the history of computing failed to significantly contribute to Xerox’s bottom line. Google has thrown the doors open for internal research teams to access the full power of complete data sets and production-scale systems while espousing the practice of agile development. The aim is to ensure that innovation at Google never strays too far from either diversifying Google’s revenue stream with new products or contributing to existing ones.

The Emerging Data Ecosystem

First published December 13, 2013 in Mediapost’s Search Insider

Data is ubiquitous, and that is true pretty much everywhere. It was certainly true at the Search Insider Summit, where every panel and presentation talked about data. And not just any data — this was “Big Data.” But what exactly is Big Data — just more data? Or is there a fundamental shift happening here?

I believe there is. When I think about Big Data, I think about an emerging data ecosystem, where the explosion of available data will exponentially increase the complexity of the ecosystem. This is not just more data, but a different environment that will require different strategies.

Typically, the data we currently use is either first-party data — the data that emerges as part of our business process — or structured third-party data, available from a rapidly growing number of data vendors. This is probably what most people think of when they think of Big Data. But I don’t consider data in this form a departure from the data we’re used to using. There’s more of it, true, but the process is already identified. It just needs to be scaled to deal with increased volumes.

Let me use one example from the recent Search Insider Summit. The Weather Company has recently launched a new division called Weather FX, aimed at using its vast store of weather data to create predictive models that help companies add weather-based variables to their own data sets. For example, ad targeting can now be weather-sensitive, ramping up campaigns and changing messaging based on predicted changes in weather patterns. While pretty impressive, this is a relatively straightforward use of data. The data feeds are well structured and have been “predigested” by Weather FX to make them easy to implement.
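
To make the mechanics concrete, here is a minimal sketch of what a weather-sensitive campaign rule might look like. The feed fields, thresholds and multipliers are hypothetical stand-ins of my own, not Weather FX’s actual product; the point is simply that a well-structured forecast feed reduces weather targeting to a lookup and a bid adjustment.

```python
# A hypothetical weather-conditioned bid rule. The forecast dict stands in
# for a structured feed of the kind Weather FX sells; the field names and
# multipliers below are illustrative assumptions, not a real API.

BASE_BID = 1.50  # dollars per click

def weather_bid_multiplier(forecast: dict) -> float:
    """Scale the bid up or down based on tomorrow's forecast."""
    multiplier = 1.0
    if forecast["precip_probability"] > 0.7:
        multiplier *= 1.4    # e.g., umbrellas, food delivery
    if forecast["high_temp_f"] > 90:
        multiplier *= 1.25   # e.g., air conditioners, cold drinks
    if forecast["severe_weather_alert"]:
        multiplier *= 0.5    # pause most discretionary messaging
    return multiplier

forecast = {"precip_probability": 0.8, "high_temp_f": 72,
            "severe_weather_alert": False}
print(f"Adjusted bid: ${BASE_BID * weather_bid_multiplier(forecast):.2f}")
```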

Big Data, at least in my interpretation, is a different beast altogether. Here, data is messy, often unstructured, hard to find and in raw form. To further complicate matters, it lives in disparate silos that often have no market-facing interface. It’s an organic ecosystem that bears more than a passing similarity to how we think of natural resources. This data needs to be identified, nurtured and harvested (or mined, if you’d prefer).

It’s this data that will lead to a true view of Big Data, a world of vast data nodes that require significant development before they can be used. Think of how the world was a century and a half ago, when a lot of raw stuff — wood, minerals, water, crops, livestock — lay scattered about our planet. At the time, there was little in the way of established manufacturing and distribution chains that transformed that raw stuff into consumable products. Over time, the chain emerged, but a lot of logistical challenges had to be addressed along the way. The same is true, I believe, for data.

But there’s another challenge with Big Data: It’s not always clear how to use it. It needs a framework. You can’t dump a ton of various metals and a couple barrels of oil into a big black box, shake it and expect a Ford Focus to drop out. You need to have a pretty clear idea of what your expected outcome is. And you need to have a long chain that moves your raw material towards your end product. In the early days of creating physical goods, these chains were often verticalized within a single organization, but as the ecosystem evolved, the markets became more horizontal. I would expect the same pattern to emerge in the data ecosystem.

If you create a conceptual framework within which to use data, you can determine which data is required and how that data will be used. You can pick your data sources, identify the gaps, and marshal the resources required to address them. Often, because we’re in the earliest stages of this process, we will need to explore, guess and iteratively test before the data will provide value.

This definition of Big Data requires new rules and strategies. It requires a commitment to mining raw data and integrating it in useful ways. It will mean dynamically adapting to the continuing data explosion. It will require blood, sweat and tears. This is not a “plug and play” exercise. When I think of Big Data, that’s what I think about.

360 Degrees of Separation

First published December 5, 2013 in Mediapost’s Search Insider

In the past two decades or so, a lot of marketers have talked about gaining a 360-degree view of their customers. I’m not exactly sure what this means, so I looked it up. Apparently, for most marketers, it means having a comprehensive record of every touch point a customer has had with a company. Originally, it was the promise of CRM vendors: anyone in an organization, at any time, could pull up a complete customer history.

So far, so good.

But like many phrases, it’s been appropriated by marketers and its meaning has become blurred. Today, it’s bandied about in marketing meetings, where everyone nods knowingly, confident in the fact that they are firmly ensconced in the customer’s cranium and have all things completely under control. “We have a 360-degree view of our customers,” the marketing manager beams, and woe to anyone that dares question it.

But there are no standard criteria that you have to meet before you use the term. There is no rubber-meets-the-road threshold you have to climb over. No one knows exactly what the hell it means. It sure sounds good, though!

If a company is truly striving to build as complete a picture of its customers as possible, it probably defines 360 degrees as the total scope of a customer’s interaction with the company. This would follow the original CRM definition. In marketing terms, it would mean every marketing touch point, and would hopefully extend through the customer’s entire relationship with that company. This would be 360 degrees as defined by Big Data.

But is it actually 360 degrees? If we envision this as a Venn diagram, we have one 360-degree sphere representing the mental model of customers, including all the things they care about. We have another 360-degree sphere representing the footprint of the company and all the things they do. What we’re actually looking at then, even in an ideal world, is where those two spheres intersect. At best, we’re looking at a relatively small chunk of each sphere.

So let’s flip this idea on its head. What if we redefine 360 degrees as understanding the customer’s decision space? I call this the Buyersphere. The traditional view of 360 degrees is from the inside looking out, from the company’s perspective. The Buyersphere moves the perspective to that of the customer, looking from the outside in. It expands the scope to include the events that lead to consideration, the competitive comparisons, the balancing of buying factors, interactions with all potential candidates and the branches of the buying path itself.  What if you decide to become the best at mapping that mental space?  I still wouldn’t call it a 360-degree view, but it would be a view that very few of your competitors would have.

One of the things that I believe is holding Big Data back is that we don’t have a frame within which to use it. Peter Norvig, chief researcher for Google, outlined 17 warning signs in experimental design and interpretation. One was the lack of a specific hypothesis; another was the lack of a theory. You need a conceptual frame from which to construct a theory, and then, from that theory, you can decide on a specific hypothesis for validation. It’s this construct that helps you separate signal from noise. Without the construct, you’re relying on serendipity to identify meaningful patterns, and we humans have a nasty tendency to mistake noise for patterns.
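
That tendency is easy to demonstrate. The sketch below is my own hedged illustration, not anything from Norvig’s list: it scans pure random noise for “relationships” among many pairs of variables, and with enough comparisons and no prior hypothesis, a handful of pairs will look impressively correlated by chance alone.

```python
# Mistaking noise for patterns: correlate many pairs of pure-noise series
# and count how many cross an arbitrary "interesting" threshold by chance.
import random

random.seed(42)

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# 40 series of random noise, 30 observations each: no real patterns exist.
series = [[random.gauss(0, 1) for _ in range(30)] for _ in range(40)]

hits = []
for i in range(len(series)):
    for j in range(i + 1, len(series)):
        r = corr(series[i], series[j])
        if abs(r) > 0.4:  # "that looks like a pattern!"
            hits.append((i, j, round(r, 2)))

pairs = len(series) * (len(series) - 1) // 2
print(f"{len(hits)} 'patterns' found in {pairs} comparisons of pure noise")
```

Without a theory that says which correlations ought to exist, every one of those hits looks like insight.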

If we look at opportunities for establishing a competitive advantage, redefining what we mean by understanding our customers is a pretty compelling one. This is a construct that can provide a robust and testable space within which to use Big Data and other, more qualitative, approaches. It’s relatively doable for any organization to consolidate its data to provide a fairly comprehensive “inside-out” view of customers’ touch points. Essentially, it’s a logistical exercise. I won’t say it’s easy, but it is doable. But if we set our goal a little differently, working to achieve a true “outside-in” view of our company, that sets the bar substantially higher.

360 degrees? Maybe not. But it’s a much broader view than most marketers have.

Evolutionary Hotspots in Marketing

First published November 21, 2013 in Mediapost’s Search Insider

The Páramos Ecosystem

The Páramos are remarkable places: grasslands that sit above the tree lines in the Andes, some 10,000 feet above sea level. What makes them remarkable are the things that grow and live there — like Espeletia uribei, which looks like a huge palm tree, but is actually an overgrown member of the daisy family.

The Páramos just happen to be the place on earth where evolution happens the fastest. There are other places where species evolve quickly, including Darwin’s Galápagos Islands, but scientists believe the Páramos are the hottest of the evolutionary hotspots.

The reason for this supercharged speciation is the climate, which makes them a very tough place to call home.  They’re located at the equator, so they get sunshine year round. But the elevation introduces harsh temperatures and extreme ultraviolet exposure. Also, the weather can change in a heartbeat. A few minutes can mark the difference between sunshine, mist and full-on storms.  This constant adaptive stress has resulted in biodiversity not seen anywhere else on the planet.

In biology, evolution is measured by the rate of mutation. In the business world, mutation equates to innovation. A new idea introduces a wild card into the competitive environment, just as a genetic mutation introduces a wild card into nature. It disrupts the status quo, either positively or negatively. That’s why it’s important for organizations to embrace failure. Openness to error encourages innovation, driving the competitive evolution of the company. Successful innovations can be game-changers, as long as you create a framework to identify unsuccessful innovations before they do irreparable damage.

So if we accept that corporate evolution is a good thing, and we want to increase our mutation/innovation rate, then it makes sense to seek our own organizational “Páramos.” These will be departments or divisions where volatility is the norm, rather than the exception. Stability is the enemy of innovation. Typically, these will be areas that require rapid reaction to external forces and adaptation to new environmental factors. Much as we like to mythologize the lone genius toiling away in an ivory tower or R&D lab, the history of innovation shows that it most often comes from far messier, more organic sources.

In the Páramos, it’s the harsh, unpredictable climate that drives evolution. In a company, it’s the instability of the competitive marketplace that drives the forces of innovation. So it makes sense that the hotspots will be those areas of the organization that have the most exposure to that marketplace. Front-line touch points with customers, head-to-head contact with competitors and real-world usage of your products or services are the externalities you’ll be looking for. That makes sales, marketing and customer service prime candidates for becoming your own Páramos.

The challenge is to enable innovation at this level. Typically, innovation in an organization is constrained (and unfortunately, often choked to death) by bureaucratic frameworks that build in “top-down” governance from executives who are traditionally miles away from the “Páramos” in the org chart. This is exactly the wrong approach. Mechanisms should be developed to encourage “bottom-up” innovation in these identified hotspots, with appropriate guidelines for identifying successful opportunities as quickly as possible, allowing organizations to fast-track the winners and cut their losses on the losers. These hotspots can become the strategic radar of the organization.

Darwin’s “dangerous idea” has completely changed biology. Currently, it’s causing everyone from psychologists to economists to rethink their respective fields. In the future, don’t be surprised if it has a similar impact on marketing and corporate strategy.

Yahoo Under the Mayer Regime

First published November 7, 2013 in Mediapost’s Search Insider

OK, it has a new logo. The mail interface has been redesigned. But according to a recent New York Times piece, Yahoo still doesn’t know what it wants to be when it grows up. Marissa Mayer seems to be busy, with a robust hiring spree, eight new acquisitions, 15 new product updates, a nice 20% bump in traffic and a stock price that’s been consistently heading north. But all this activity hasn’t seemed to coalesce into a discernible strategy — from the outside, anyway.

It’s probably because Mayer is busy rebuilding the guts of the organization. Cultures are notoriously difficult things to change. In any organization where a major change in direction is required, you will have to deal with several layers of inertia — and, even more challenging, momentum heading the wrong way. In the piece, design guru Don Norman agrees: “The major changes she has made are not what the logo looks like or a new Yahoo Mail. The major changes are what the company looks like internally. She’s revitalizing the inside of the company, and what everyone sees on the surface are just little ripples.”

To be fair, Yahoo has been an organization lacking a clear direction for a long, long time. I remember speaking at the Sunnyvale campus years ago, when Yahoo was still being remade into a media property, under the direction of Terry Semel. There were entire departments (including the core search team) that felt cut adrift. Since then, the strategic direction of Yahoo has resembled that of a Roomba vacuum, plowing forward until it senses an obstacle, then heading off in an entirely new direction.

What was interesting about the recent Times piece was the marked contrast to the rumors and kvetching coming from Mayer’s old digs: Google. There, the big news seems to be the ultra-secret party barge anchored in San Francisco Bay. And a Quora thread entitled “What’s the Worst Part about Working at Google?” paints a picture of a frat house that has yet to wake up and realize the party’s over:

  • Overqualified people working at menial jobs.
  • Frustration at not being able to contribute anything meaningful in an increasingly bureaucratic environment.
  • Engineers with egos outstripping their skills.
  • Bottlenecks preventing promotion.
  • A permanent “party” atmosphere that makes it difficult to get any actual work done.

But perhaps the most telling comment came from someone who spent seven years at Google, who said that all the meaningful innovation comes from an exceedingly small group, headed by Larry and Sergey. The rest of the Googlers are just along for the ride:

Here’s something to ponder.  The only meaningful organic products to come out of Google were Search and then AdSense.  (Android — awesome, purchased.  YouTube — awesome, purchased, etc. Larry and/or Sergey were obviously intimately involved in both.  Maps – awesome, purchased. Google Plus is a flop for all non-Googlers globally, Chrome browser is great, but no direct monetization (indirectly protects search), the world has passed the Chrome OS by… etc. ) Fast-forward 14 years, and the next big thing from Google, I bet, will be Google Glass, and guess who PMd it.  Sergey Brin.  Tiny number of wave creators, huge number of surfers.

So we have Google, still surfing a wave that started 15 years ago, and Yahoo struggling to get in position to catch the next one. For both, the challenge is a fundamental one: How do you effect change in a massive organization and get thousands of employees contributing in a meaningful way? Ironically, it may turn out that Marissa Mayer has a significant advantage here. If you’re bright, ambitious and looking to do something meaningful with your career, what would be more appealing: trying to shoehorn your way into an already overcrowded house party, or the opportunity to roll up your sleeves and resurrect one of the Web’s great brands?

Whom Would You Trust: A Human or an Algorithm?

First published October 31, 2013 in Mediapost’s Search Insider

I’ve been struggling with a dilemma.

Almost a year ago, I wrote a column asking if Big Data would replace strategy. That started a several-month journey for me, during which I’ve been looking for a more informed answer to that query. It’s a massively important question that’s playing out in many arenas today, including medicine, education, government and, of course, finance.

In marketing, we’re well into the era of big data. Of course, it’s not just data we’re talking about. We’re talking about algorithms that use that data to make automated decisions and take action. Some time ago, MediaPost’s Steve Smith introduced us to a company called Persado, which takes an algorithmic approach to copy testing and optimization. As an ex-copywriter turned performance marketer, I wasn’t sure how I felt about that. I understand the science of continuous testing, but I have an emotional stake in the art of crafting an effective message. And therein lies the dilemma. Our comfort with algorithms seems to depend on the context in which we’re encountering them and the degree of automation involved.

Let me give you an example from Ian Ayres’ book “Super Crunchers.” There’s a company called Epagogix that uses an algorithm to predict the box-office appeal of unproduced movie scripts. Producers can retain the service to help them decide which projects to fund. Epagogix will also help producers optimize their chosen scripts to improve box-office performance. The question here is, do we want an algorithm controlling the creative output of the movie industry? Would we be comfortable taking humans out of the loop completely and seeing where the algorithm eventually takes us?

Now, you may counter that we could include feedback from audience responses. We could use social signals to continually improve the algorithm, a collaborative filtering approach that uses the power of Big Data to guide the film industry’s creative process. Humans are still in the loop in this approach, but only as an aggregated sounding board. We have removed the essentially human elements of creativity, emotion and intuition. Even with the most robust system imaginable, are you comfortable with us humans taking our hands off the wheel?

Here’s another example from Ayres’ book. There is substantial empirical evidence showing that algorithms are better at diagnosing medical conditions than clinical practitioners. In a 1989 study by Dawes, Faust and Meehl, an algorithmic diagnostic rule set was consistently more reliable than actual clinical doctors. They then tried a combination, where doctors were made aware of the algorithm’s outcomes but remained the final judges. Again, doctors would have been better off going with the results of the algorithm. Their second-guessing increased their margin of error significantly.
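
The kind of rule set that study describes can be startlingly simple. The sketch below is a generic illustration in the spirit of Dawes’s “improper linear models,” not the actual instrument from the 1989 paper: standardize a few predictors, sum them with unit weights, and apply a fixed cutoff. The unsettling finding is that letting the expert override that cutoff tended to make predictions worse, not better.

```python
# A generic unit-weighted linear prediction rule, in the spirit of Dawes's
# "improper linear models." Predictor names, population norms and the
# cutoff are illustrative assumptions, not the 1989 study's actual rule.

# Hypothetical population mean and standard deviation for each predictor.
NORMS = {
    "test_a": (50.0, 10.0),
    "test_b": (100.0, 15.0),
    "symptom_count": (3.0, 2.0),
}

def zscore(value: float, mean: float, sd: float) -> float:
    return (value - mean) / sd

def risk_score(patient: dict) -> float:
    """Sum of unit-weighted z-scores; higher means higher predicted risk."""
    return sum(zscore(patient[k], *NORMS[k]) for k in NORMS)

def flag_for_followup(patient: dict, cutoff: float = 1.5) -> bool:
    """Apply the fixed cutoff -- no expert override."""
    return risk_score(patient) > cutoff

patient = {"test_a": 65, "test_b": 118, "symptom_count": 6}
print(round(risk_score(patient), 2), flag_for_followup(patient))
```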

But, even knowing this, would you be willing to rely completely on an automated algorithm the next time you need medical attention? What if there was no doctor involved at all, and you were diagnosed and treated by an algo-driven robot?

There is also mounting (albeit highly controversial) evidence showing that direct instruction produces better learning outcomes than traditional exploratory teaching methods. In direct instruction, scripted automatons could easily replace the teacher’s role. Test scores could provide self-optimizing feedback loops. Learning could be driven by algorithms and delivered at a distance. Classrooms, along with teachers, could disappear completely. Is this a school you’d sign your kid up for?

Let’s stoke the fires of this dilemma a little. In a frightening TED talk, Kevin Slavin talks about how algorithms rule the world and offers a few examples of how algorithms have gotten it wrong in the past. Pricing algorithms on Amazon priced an out-of-print book called “The Making of a Fly” at a whopping $23.6 million. Surprisingly, there were no sales. And in financial markets, where we’ve largely abdicated control to algorithms, those same algorithms spun out of control in 2012 no fewer than 18,000 times. So far, these instances have been identified and corrected in milliseconds, but there’s always a Black Swan chance that one time, they’ll crash the economy just for the hell of it.
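
The mechanism behind that $23.6 million book is worth spelling out, because it shows how two locally sensible rules can compound into global nonsense. In the widely reported reconstruction of the incident, one seller’s bot priced just under its competitor while the other priced well above, banking on its seller rating. The sketch below replays that loop; treat the multipliers and starting price as illustrative rather than verified.

```python
# Two repricing bots feeding on each other's output. Each rule is locally
# reasonable; together they compound the price exponentially. The
# multipliers echo those reported in the "Making of a Fly" incident, but
# should be read as illustrative.

price_a = price_b = 35.54  # a plausible used-book starting price

for day in range(1, 61):
    price_a = price_b * 0.9983    # bot A: undercut the competitor slightly
    price_b = price_a * 1.270589  # bot B: price above A, trusting its rating
    if day % 10 == 0:
        print(f"day {day}: A=${price_a:,.2f}  B=${price_b:,.2f}")
```

Each pass through the loop raises both prices by about 27%; left unattended for a couple of months, the number climbs into the tens of millions.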

But lest we humans feel too smug, let’s remember this sobering fact: 20% of all fatal diseases are misdiagnosed. In fact, misdiagnosis accounts for about one-third of all medical error. And we humans have no one to blame for that but ourselves.

As I said – it’s a dilemma.

Beware Confirmation Bias

First published September 5, 2013 in Mediapost’s Search Insider

Most marketing testing is disproportionately biased towards the positive. We test to find winners. But in the process, we often cut losers off without a second glance. And this can be dangerously myopic.

I’ve talked in the past about taking a Bayesian approach to strategy. The more I explore this idea, the better I like it. But it comes with some challenges – the biggest being that we’re not Bayesian by nature. In fact, there’s a cognitive bias roughly the size of a good-sized cow barn that often leaves us blind to the true state of affairs. In psychological circles, it’s called Confirmation Bias, and in a comprehensive academic review in 1998, Raymond Nickerson stated its potential negative impact: “If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration.”

Here’s the thing. We love to be right. We hate to be wrong. So we will go to extraordinary lengths to make sure that we’re proven correct. And we won’t even know we’re doing it. Our brain, working surreptitiously in the background, doesn’t alert us to how biased we actually are. The many tricks that go along with Confirmation Bias usually play out subconsciously.

If we try to be good little Bayesians, we have to embrace alternative ideas of all shapes and sizes, whether or not they agree with our current view of things. In fact, we should be prepared to rip our current view apart, as it’s in the disproving and rebuilding of hypotheses that the truth is eventually found.
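
For anyone who wants the mechanics, here is a minimal sketch of the Bayesian update being described, with made-up numbers for a marketing test. The discipline lies in the likelihoods: you have to ask how probable the evidence would be if your hunch were wrong, which is exactly the question Confirmation Bias steers us away from asking.

```python
# A minimal Bayes' rule update with illustrative numbers.
# Hypothesis H: "variant B genuinely outperforms variant A."
# Evidence E: B wins this week's test.

prior_h = 0.30           # initial belief that B is genuinely better
p_e_given_h = 0.80       # chance B wins the week if it really is better
p_e_given_not_h = 0.45   # chance B wins the week anyway (noise, luck)

def update(prior: float, p_e_h: float, p_e_not_h: float) -> float:
    """Posterior P(H|E) via Bayes' rule."""
    p_e = p_e_h * prior + p_e_not_h * (1 - prior)
    return p_e_h * prior / p_e

posterior = update(prior_h, p_e_given_h, p_e_given_not_h)
print(f"P(H) moves from {prior_h:.2f} to {posterior:.2f} after one win")

# Ignoring the second likelihood (how often B would win even if it were no
# better) is the formal face of Confirmation Bias: the update then looks
# far more decisive than the evidence warrants.
```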

Here’s where things go wrong in most market testing. We usually test to prove our hunches right. We go in with a favored option and try to build a case for it. We may deny it, but we all do it. That means that the less favored alternatives usually get short shrift. And it’s often in one of these alternatives that the optimal choice may be found. The more there is at stake in the test, the more susceptible we are to Confirmation Bias.

Here is the rogue’s gallery of typical Confirmation Bias tricks:

Favored Hypothesis Information Seeking and Interpretation – As I said, we tend to seek information that supports our favored hypothesis, and avoid information that would contradict it. In the Bayesian view, this is equivalent to ignoring likelihood ratios.

Preferential Treatment of Evidence Supporting Existing Beliefs – Even if we somehow collect unbiased information, we will tend to focus on the information that supports our favored view. It gets “over-weighted” in analysis.

Looking for Positive Cases – This is the classic trap of testing only for winners and ignoring the losers. Often, the losers can tell us more about the true state of affairs.

The Primacy Effect – We tend to pay more attention to the first information we look at, which can bias analysis of any subsequent information.

Belief Persistence – Even when the evidence mounts that our original hunch is wrong, we can be incredibly inventive in twisting evidentiary frameworks to provide continuing support. Along with this is another bias called the “Sunk Cost Fallacy.” The more we have invested in our original hunch (e.g., a major multimillion-dollar campaign launched based on it), the more tenacious we are in holding on to it.

Going back a few columns to Philip Tetlock’s Hedgehogs and Foxes: Tetlock found that Foxes make much better natural Bayesians. They are more open to updating their beliefs. The big takeaway here? Keep an open mind.

The Marketing Classic Few Marketers Have Ever Read

First published August 22, 2013 in Mediapost’s Search Insider

It may be the best book you’ll ever read on marketing, but you won’t find it in the marketing section of Amazon. They have it variously filed in three different categories: Politics and Social Sciences, Technology and Textbooks. The book is Everett Rogers’ “Diffusion of Innovations,” and you should add it to your reading list.

The book is a comprehensive review of how new ideas spread and take hold in our society, and although it was first written in the ’60s (it’s currently in its fifth edition), the findings are as fresh and relevant as ever. Its relevance to marketing is immediate and tangible. After all, what else is marketing but promoting the adoption and diffusion of new things?

Rogers traces almost a century of diffusion research to see how everything from new high-yield corn varieties to birth control was adopted in various cultures. While there are not a lot of examples purely from the consumer marketplace, the generalized observations beg to be applied to marketing campaigns pushing new (and hopefully improved) products.

Consider these five innovation-specific variables that affect how quickly a new idea is adopted:

1)   Relative advantage – How much of a true advantage does the new innovation offer over what is currently being used? Rogers offers an important caveat here: “The receiver’s perceptions of the attributes of an innovation – not the attributes as classified by experts or change agents, affect its rate of adoption.”

2)   Compatibility – How well does the innovation fit into the framework of the customer’s current situation? Is it an incremental innovation, easily added, or a discontinuous innovation, requiring significant pain on the part of the user to adopt?

3)   Complexity – What is the learning curve that comes bundled with the innovation? The steeper the curve, the slower the rate of adoption.

4)   Trialability – Is it possible to try the product firsthand to determine the relative advantage (see #1)?

5)   Observability – Being the herd animals we are, adoption is sometimes a matter of “monkey see, monkey do.”

These factors may seem fundamental, but every day new “innovative” products are turned loose on the market, there to wither and die, simply because one or several of these check boxes remain unchecked.

Rogers also spends significant time looking at the social dynamics of diffusion and adoption, including the role of early adopters, change agents, influencers, mass communication channels and interpersonal persuasion. I found amazingly close correlations with the findings of my own research into buying behaviors in the B2B world.

At the risk of oversimplifying this seminal work, Rogers found that adoption balances at the intersection of risk and reward. Risk stalls adoption, reward drives it forward, and clarity of communicating this risk/reward balance in a relevant way is either the catalyst or the inhibitor that determines how steep the adoption curve is.

This is a textbook, so expect a small investment of effort to wade through the rather academic delivery, but if you persevere (and to be fair, I’ve suffered through much worse in other books) you’ll come away with perhaps the clearest summation of marketplace dynamics ever put in print.

Maybe We Need More Skin in the Game

First published August 15, 2013 in Mediapost’s Search Insider

I think our world — or, more specifically, our marketplace — is a little too abstract. We — and by we, I mean the marketers, the suppliers to the market — live too far removed from the market itself: the consumers of the supplied goods.

It’s a point touched on by Nassim Nicholas Taleb in his most recent book, “Antifragile.” Marketers and manufacturers, he suggests, don’t have enough skin in the game to keep them honest. They’re too far removed from accountability. There are too many protective buffers between them and the consequences of their actions.

The law is supposed to provide the accountability — but let’s face it, when it comes to enforcing accountability in the marketplace, we’re a long way from the Code of Hammurabi (one of the first legal codes known), where sloppy workmanship exacted a pretty definite penalty: If a builder has built a house for a man, and has not made his work sound, and the house he built has fallen, and caused the death of its owner, that builder shall be put to death.

Or, consider whether the actions of the captain of the Exxon Valdez would have been different had he been answerable to a law like this: If a man has hired a boat and boatman, and loaded it with corn, wool, oil, or dates, or whatever it be, and the boatman has been careless, and sunk the boat, or lost what is in it, the boatman shall restore the boat which he sank, and whatever he lost that was in it.

The world was a smaller and more intimate place back then. You couldn’t hide behind corporate lawyers, malpractice insurance and legal loopholes. If you screwed up, chances are you’d lose an eye, a hand or even your life. If you built a bridge that collapsed, you might as well have been under the bridge, because your fate would be the same.

Now, I’m not sure we’re ready to return to the brutal simplicity of an “eye for an eye” legal code, but it does bring up a rather thorny issue: If there are few or no consequences for shoddy or unethical work, what keeps us honest? There’s nothing like skin in the game to provide some pretty compelling motivation for ethical business practices. And there’s nothing like a consequence-free pass to encourage fast and loose corporate behavior.

The good news, I suppose, is that technology is once again making the world a little more intimate. McLuhan’s Global Village is coming to pass, and the unethical of the world are increasingly being held accountable for their actions. In fact, the speed at which this is happening is confounding the legal systems of many a nation, as vigilantism and frontier justice are increasingly springing up, unchecked by due process and judicial oversight.

I avoid trying to predict the future, but fairness and accountability are hardwired into us, so I suspect that as technology allows us to identify those responsible in the most egregious cases, we will be moved to demand action. We will force the market to have more skin in the game, as our opinions and beliefs, in aggregate, will define that market.

Reengineering Hiring

First published August 9, 2013 in Mediapost’s Search Insider

In all my years in business, the one thing I found consistently difficult was hiring good people. We spent a lot of time honing our screening skills, but I sometimes suspect we would have been just as far ahead by flipping a coin.

Over time, we found we achieved pretty good success rates with our lower-level hires, but the one area where consistent success eluded me was in our management recruitment. It seemed that the more senior the position, the worse our track record was. We had a few outright disasters.

When it comes down to it, hiring someone is making a prediction. You examine the evidence and try to foresee if that person will perform at an acceptable level in the position you have vacant. And, as I said in my last column, we humans don’t tend to be very good at making predictions. The more there is at stake in the position to be filled, the worse the consequences if our predictions are faulty. In looking at our past management hires, I realize that it wasn’t that our predictive powers were any less effective; it was just that the pain of being wrong was more acute.

So, it was with some reassurance and more than a dollop of schadenfreude that I learned that Google has had exactly the same problem. That’s right, Google — the same company that has a zillion brilliant engineers working on every problem known to mankind. But those engineers have to come from somewhere, right? Someone has to hire them. And there, ay, there’s the rub!

In a recent interview in the New York Times, Laszlo Bock, senior vice president for people operations at Google, confessed that Google has tweaked, and, in some cases, massively overhauled its recruitment process.  Take, for example, Google’s famous early predilection for college G.P.A.s. According to Bock, based on actual performance, “G.P.A.s are worthless as a criteria for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation. Google famously used to ask everyone for a transcript and G.P.A.s and test scores, but we don’t anymore, unless you’re just a few years out of school. We found that they don’t predict anything.”

Google has also slowly backed away from its ironclad requirement that every hire have a degree. Bock revealed, “The proportion of people without any college education at Google has increased over time as well. So we have teams where you have 14% of the team made up of people who’ve never gone to college.”

Sometimes, interviewers fall into the trap of overplaying their own cleverness and “expertise.” We spend more time trying to stroke our own ego by staging an impromptu show of power during the interview than in really listening to what the interviewee is saying. Google found that tricks like brainteasers, while they may make the interviewer feel clever, are worthless in screening out duds. The much less flashy but tried-and-true list of standardized behavioral questions (“Give me an example of when you…”) is a far better predictive indicator.

Finally, Bock admits that screening for leadership positions is the most difficult challenge, because leadership is something that defies easy definitions. “We’ve found that leadership is a more ambiguous and amorphous set of characteristics than the work we did on the attributes of good management, which are more of a checklist and actionable.” So you can ask questions, probing for effective leadership, but because leading people tends to fall into the category of ill-defined problems, you often have to do the best job you can in the hiring process, and then track performance religiously. In this case, “slow to hire, quick to fire” is a good principle to follow.

I found Bock’s last words, on the role of Big Data in management decisions, including those involving people’s performance, revealing: “Big Data — when applied to leadership — has tremendous potential to uncover the 10 universal things we should all be doing. But there are also things that are specifically true only about your organization, and the people you have and the unique situation you’re in at that point in time. I think this will be a constraint to how big the data can get because it will always require an element of human insight.”