Now, That’s a Job Description I Could Get Behind!

First published February 20, 2014 in Mediapost’s Search Insider

I couldn’t help but notice that last week’s column, where I railed against marketers’ obsession with tricks, loopholes and pat sound bites, got a fair number of retweets. The irony? At least a third of those retweets twisted my whole point – that six seconds (or any arbitrary length of message) isn’t the secret to getting a prospect engaged. The secret is giving them something they want to engage with.

As anyone who has been unfortunate enough to spend some time with me when I’m in a particularly cynical mood about marketing can attest, I go a little nuts over this “Top Ten Tricks” or “The Secret to…” mentality that seems pervasive in marketing. I’m pretty sure that anyone who retweeted last week’s column with a preface like “Does your advertising engage your consumer in 6 seconds or less? If not, you’re likely losing customers” didn’t bother to actually read past the first paragraph. Maybe not even the first line.

And that’s the whole problem. How can we expect marketers to build empathy, usefulness and relevance into their strategy when many of them have the attention span of a small gnat? As my friend Scott Brinker likes to say when it comes to marketers misbehaving, “This is why we can’t have nice things.”

Marketing – good marketing – is not easy but it’s also not a black box. It’s not about secrets or tricks or one-off tactics. It’s about really understanding your customers at an incredibly deep level and then working your ass off to create a meaningful engagement with them. Trying to reduce marketing to anything less than that is like trying to breeze your way through 50 years of marriage by following the Top 3 Tricks to get lucky this Friday night.

Again, this is about meaningful engagements. And when I say meaningful, it’s the customer who gets to decide what’s meaningful. That’s what’s potentially so exciting about breakthroughs like the Oreo Super Bowl campaign. It’s the opportunity to learn what’s meaningful to prospects and then to shift and tailor our responses in real time. Until now, marketing has been “Plan, Push and Pray.” We plan our attack, we push out our message and we pray it finds its target and that they respond by buying stuff. If they don’t buy stuff, something went wrong, probably in the planning stage. But that is an awfully long feedback loop.

You’ll notice something about this approach to marketing. The only role for the prospect is as a consumer. If they don’t buy, they don’t participate. This comes as a direct result of the current job description of a marketer: Someone who gets someone else to buy stuff. But what if we rethink that description? Technology that enables real-time feedback is allowing us to create an entirely new relationship with customers. What would happen if we redefined marketing along these lines: To understand the customer’s reality, focusing on those areas where we can solve their problems and improve that reality?

And as much as that sounds like a pat sound bite, if you really dig into it, it’s far from a quick fix. This is a way to build a radically different organization. And it moves marketing into a fundamentally different role. Previously, marketing got its marching orders from the CEO and CFO. Essentially, it was responsible for moving the top line ever northward. It was an internally generated mandate – to increase sales.

But what if we rethink this? What if the entire organization’s role is to constantly adapt to a dynamic environment, looking for advantageous opportunities to improve that environment? And, in this redefined vision, what if marketing’s role was to become the sense-making interface of the company? What if it was the CMO’s job to consistently monitor the environment, create hypotheses about how best to create adaptive opportunities and then test those hypotheses in a scientific manner?

In this redefinition of the job, Big Data and Real Time Marketing take on significantly new qualities: first, as a rich vein of timely information about the marketplace; and second, as a never-ending series of instant field experiments to provide empirical backing for strategy.
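
To make those “instant field experiments” concrete, here’s a minimal sketch of how a single messaging change could be treated as a hypothesis test. Everything in it – the variant names, the traffic, the conversion counts – is invented for illustration.

```python
# A minimal sketch: treat a messaging change as a field experiment and check
# whether the observed lift is bigger than chance, using a two-proportion
# z-test. All numbers and variant names below are hypothetical.
from math import sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic comparing the conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothesis: the new messaging (variant B) lifts the conversion rate.
z = two_proportion_z(conv_a=480, n_a=12_000, conv_b=545, n_b=12_000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> the lift is significant at the 5% level
```

The specific statistic matters less than the principle: every message pushed out in real time can come back as evidence for or against a strategic hypothesis, on a feedback loop measured in hours rather than quarters.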

Now, marketing’s job isn’t to sell stuff; it’s to make sense of the market and, in doing so, help define the overall strategic direction of the company. There are no shortcuts, no top ten tricks, but isn’t that one hell of a job description?

How Can Humans Co-Exist with Data?

First published February 6, 2014 in Mediapost’s Search Insider

Last week, I talked about our ability to ignore data. I positioned this as a bad thing. But Pete Austin called me on it, with an excellent counterpoint:

“Ignoring Data is the most important thing we do. Only the people who could ignore the trees and see the tiger, in real-time, survived to become our ancestors.”

Too true. We’re built to subconsciously filter and ignore vast amounts of input data in order to maintain focus on critical tasks, such as avoiding hungry tigers. If you really want to dive into this, I would highly recommend Daniel Simons and Christopher Chabris’s “The Invisible Gorilla.” But, as Simons and Chabris point out, with example after example of how our intuitions (which we use as filters) can mislead us, this “inattentional blindness” is not always a good thing. In the adaptive environment in which we evolved, it was pretty effective at keeping us alive.  But in a modern, rational environment, it can severely inhibit our ability to maintain an objective view of the world.

But Pete also had a second, even more valid point:

“What you need to concentrate on now is “curated data”, where the junk has already been ignored for you.”

And this brought to mind an excellent example from a recent interview I did as background for an upcoming book I’m working on.  This idea of pre-filtered, curated data becomes a key consideration in this new world of Big Data.

Nowhere are the stakes higher for the use of data than in healthcare. It’s what led to the publication of a manifesto in 1992 calling for a revolution in how doctors made life-and-death decisions. One of the authors, Dr. Gordon Guyatt, coined the term “evidence-based medicine.” The rationale is simple. By taking an empirical approach to not just diagnosis but also to the best prescriptive path, doctors can rise above the limitations of their own intuition and achieve higher accuracy. It’s data-driven decision-making, applied to health care. Makes perfect sense, right? But even though evidence-based medicine is now over 20 years old, it’s still difficult to consistently apply at the level of the individual doctor and patient.

I had the chance to ask Dr. Guyatt why this was:

“Essentially after medical school, learning the practice of medicine is an apprenticeship exercise and people adopt practice patterns according to the physicians who are teaching them and their role models and there is still a relatively small number of physicians who really do good evidence-based practice themselves in terms of knowing the evidence behind what they’re doing and being able to look at it critically.”

The fact is, a data-driven approach to any decision-making domain that previously relied on intuition just doesn’t feel – well – very intuitive. It’s hard work. It’s time consuming. It, to Mr. Austin’s point, runs directly counter to our tiger-avoidance instincts.

Dr. Guyatt confirms that physicians are not immune to this human reliance on instinct:

“Even the best folks are not going to do it – maybe the best folks – but most folks are not going to be able to do that very often.”

The answer in healthcare, and likely the answer everywhere else where data should back up intuition, is the creation of solid, data-based resources that adhere to empirical best practices without requiring every single practitioner to do the necessary heavy lifting. Dr. Guyatt has seen exactly this trend emerge in the last decade:

“What you need is preprocessed information. People have to be able to identify good preprocessed evidence-based resources where the people producing the resources have gone through that process well.”

The promise of curated, preprocessed data is looming large in the world of marketing. The challenge is that, unlike medicine, where data is commonly shared and archived, in the world of marketing much of the most important data stays proprietary. What we have to start thinking about is a truly empirical, scientific way to curate, analyze and filter our own data for internal consumption, so it can be readily applied in real world situations without falling victim to human bias.

Never Underestimate the Human Ability to Ignore Data

First published January 30, 2014 in Mediapost’s Search Insider

It’s one thing to have data. It’s another to pay attention to it.

We marketers are stumbling over ourselves to move to data-driven marketing. No one would say that’s a bad thing. But here’s the catch: data-driven marketing is all well and good when it’s a small-stakes game – optimizing spend, targeting, conversion rates, etc. If we gain a point or two on the topside, so much the better. And if we screw up and lose a point or two – well – mistakes happen, and as long as we fix it quickly, no permanent harm done.

But what if the data is telling us something we don’t want to know? I mean – something we really don’t want to know. For instance, our brand messaging is complete BS in the eyes of our target market, or they feel our products suck, or our primary revenue source appears to be drying up or our entire strategic direction looks to be heading over a cliff? What then?

This reminds me of a certain CMO of my acquaintance who was a “Numbers Guy.” In actual fact, he was a numbers guy only if the numbers said what he wanted them to say. If not, then he’d ask for a different set of numbers that confirmed his view of the world. This data hypocrisy generated a tremendous amount of bogus activity in his team, as they ran around grabbing numbers out of the air and massaging them to keep their boss happy. I call this quantifiable bullshit.

I think this is why data tends to be used to optimize tactics, but why it’s much more difficult to use data to inform strategy. The stakes are much higher and even if the data is providing clear predictive signals, it may be predicting a future we’d rather not accept. Then we fall back on our default human defense: ignore, ignore, ignore.

Let me give you an example. Any human who functions even slightly above the level of brain-dead has to accept the data that says our climate is changing. The signals couldn’t be clearer. And if we choose to pay attention to the data, the future looks pretty damn scary. Best-case scenario – we’re probably screwing up the planet for our children and grandchildren. Worst-case scenario – we’re definitely screwing up the planet and it will happen in our lifetime. And we’re not talking about an increased risk of sunburn. We’re talking about the potential end of our species. So what do we do? We ignore it. Even when flooding, drought and ice storms without historic precedent are happening in our back yards. Even when Atlanta is paralyzed by a freak winter storm. Nothing about what is happening is good news, and it’s going to get worse. So, damn the data, let’s just look the other way.

In a recent poll by the Wall Street Journal, out of a list of 15 things that Americans believed should be top priorities for President Obama and Congress, climate change came out dead last – behind pension reform, Iran’s nuclear program and immigration legislation. Yet, if we look at the data that the UN and the World Economic Forum collect, quantifying the biggest threats to our existence, climate change is consistently near the top, both in terms of likelihood and impact. But it’s really hard to do something about it. It’s a story we don’t want to hear, so we just ignore the data, like the aforementioned CMO.

As we get access to more and more data, it will be harder and harder to remain uninformed, but I suspect it will have little impact on our ability to be ignorant. If we don’t know something, we don’t know it. But if we can know something, and we choose not to, that’s a completely different matter. That’s embracing ignorance. And that’s dangerous. In fact, it could be deadly.

The Emerging Data Ecosystem

First published December 13, 2013 in Mediapost’s Search Insider

Data is ubiquitous, and that is true pretty much everywhere. It was certainly true at the Search Insider Summit, where every panel and presentation talked about data. And not just any data — this was “Big Data.” But what exactly is Big Data — just more data? Or is there a fundamental shift happening here?

I believe there is. When I think about Big Data, I think about an emerging data ecosystem, where the explosion of available data will exponentially increase the complexity of the ecosystem. This is not just more data, but a different environment that will require different strategies.

Typically, the data we currently use is either first-party data — the data that emerges as part of our business process — or structured third-party data, available from a rapidly growing number of data vendors. This is probably what most people think of when they think of Big Data. But I don’t consider data in this form a departure from the data we’re used to using. There’s more of it, true, but the process is already identified. It just needs to be scaled to deal with increased volumes.

Let me use one example from the recent Search Insider Summit. The Weather Company has recently launched a new division called Weather FX, aimed at taking the vast amount of weather data it has to create predictive models to help companies add weather-based variables to their own data sets. For example, ad targeting can now be weather-sensitive, ramping up campaigns and changing messaging based on predicted changes in weather patterns. While pretty impressive, this is a relatively straightforward use of data. The data feeds are well structured and have been “predigested” by Weather FX to make them easy to implement.
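
To picture what a “predigested” feed like this buys you, here’s a hedged sketch of weather-sensitive campaign logic. The forecast structure and the rules are hypothetical – the column doesn’t describe Weather FX’s actual interface – but the general shape is the point: a structured weather signal drives bid and creative decisions.

```python
# A hedged sketch of weather-sensitive ad targeting. The Forecast feed and the
# rules below are hypothetical; only the general idea (structured, pre-digested
# weather signals adjusting campaigns) comes from the column.
from dataclasses import dataclass

@dataclass
class Forecast:
    region: str
    condition: str      # e.g. "snow", "heatwave", "rain"
    probability: float  # forecast confidence, 0..1

def adjust_campaign(base_bid: float, forecast: Forecast) -> tuple:
    """Return a (bid, creative) pair tuned to the predicted weather pattern."""
    rules = {
        "snow": (1.5, "winter-tire creative"),
        "heatwave": (1.3, "air-conditioning creative"),
        "rain": (1.2, "umbrella creative"),
    }
    multiplier, creative = rules.get(forecast.condition, (1.0, "default creative"))
    # Scale the bid boost by how confident the forecast is.
    bid = base_bid * (1 + (multiplier - 1) * forecast.probability)
    return round(bid, 2), creative

print(adjust_campaign(2.00, Forecast("Denver", "snow", 0.8)))  # (2.8, 'winter-tire creative')
```

The heavy lifting – collecting, cleaning and modeling the weather data – has already been done upstream, which is exactly what makes this a straightforward use of data.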

Big Data, at least in my interpretation, is a different beast altogether. Here, data is messy, often unstructured, hard to find and in raw form. To further complicate matters, it lives in disparate silos that often have no market-facing interface. It’s an organic ecosystem that bears more than a passing similarity to how we think of natural resources. This data needs to be identified, nurtured and harvested (or mined, if you’d prefer).

It’s this data that will lead to a true view of Big Data, a world of vast data nodes that require significant development before they can be used. Think of how the world was a century and a half ago, when a lot of raw stuff — wood, minerals, water, crops, livestock — lay scattered about our planet. At the time, there was little in the way of established manufacturing and distribution chains that transformed that raw stuff into consumable products. Over time, the chain emerged, but a lot of logistical challenges had to be addressed along the way. The same is true, I believe, for data.

But there’s another challenge with Big Data: It’s not always clear how to use it. It needs a framework. You can’t dump a ton of various metals and a couple barrels of oil into a big black box, shake it and expect a Ford Focus to drop out. You need to have a pretty clear idea of what your expected outcome is. And you need to have a long chain that moves your raw material towards your end product. In the early days of creating physical goods, these chains were often verticalized within a single organization, but as the ecosystem evolved, the markets became more horizontal. I would expect the same pattern to emerge in the data ecosystem.

If you create a conceptual framework within which to use data, you can determine which data is required and how that data will be used. You can pick your data sources, identify the gaps, and add resources as required to address those gaps. Often, because we’re in the earliest stages of this process, we will need to explore, guess and iteratively test before the data will provide value.

This definition of Big Data requires new rules and strategies. It requires a commitment to mining raw data and integrating it in useful ways. It will mean dynamically adapting to the continuing data explosion. It will require blood, sweat and tears. This is not a “plug and play” exercise. When I think of Big Data, that’s what I think about.

Evolutionary Hotspots in Marketing

First published November 21, 2013 in Mediapost’s Search Insider

The Páramos Ecosystem

The Páramos are remarkable places: grasslands that sit above the tree lines in the Andes, some 10,000 feet above sea level. What makes them remarkable are the things that grow and live there — like Espeletia uribei, which looks like a huge palm tree, but is actually an overgrown member of the daisy family.

The Páramos just happen to be the place on earth where evolution happens the fastest.  There are other places where species evolve quickly, including Darwin’s Galápagos Islands, but scientists believe the Páramos are the hottest of the evolutionary hot spots.

The reason for this supercharged speciation is the climate, which makes them a very tough place to call home.  They’re located at the equator, so they get sunshine year round. But the elevation introduces harsh temperatures and extreme ultraviolet exposure. Also, the weather can change in a heartbeat. A few minutes can mark the difference between sunshine, mist and full-on storms.  This constant adaptive stress has resulted in biodiversity not seen anywhere else on the planet.

In biology, evolution is measured by the rate of mutation. In the business world, mutation equates to innovation. A new idea introduces a wild card into the competitive environment, just as a genetic mutation introduces a wild card into nature. It disrupts the status quo, either positively or negatively. That’s why it’s important for organizations to embrace failure. Openness to error encourages innovation, driving the competitive evolution of the company. Successful innovations can be game-changers, as long as you create a framework to identify unsuccessful innovations before they do irreparable damage.

So if we accept that corporate evolution is a good thing, and we want to increase our mutation/innovation rate, then it makes sense to seek our own organizational “Páramos.” These will be departments or divisions where volatility is the norm, rather than the exception. Stability is the enemy of innovation. Typically, these will be areas that require rapid reaction to external forces and adaptation to new environmental factors. Much as we like to mythologize the lone genius toiling away in an ivory tower or R&D lab, the history of innovation shows that it most often comes from far messier, more organic sources.

In the Páramos, it’s the harsh, unpredictable climate that drives evolution. In a company, it’s the instability of the competitive marketplace that drives the forces of innovation. So it makes sense that the hotspots will be those areas of the organization that have the most exposure to that marketplace. Front-line touch points with customers, head-to-head contact with competitors and real-world usage of your products or services are the externalities you’ll be looking for. That makes sales, marketing and customer service prime candidates for becoming your own Páramos.

The challenge is to enable innovation at this level. Typically, innovation in an organization is constrained (and unfortunately, often choked to death) by bureaucratic frameworks that build in “top-down” governance from executives who are traditionally miles away from the “Páramos” in the org chart. This is exactly the wrong approach. Mechanisms should be developed to encourage “bottom-up” innovation in these identified hotspots, with appropriate guidelines for identifying successful opportunities as quickly as possible, allowing organizations to fast-track the winners and cut their losses on the losers. These hotspots can become the strategic radar of the organization.

Darwin’s “dangerous idea” has completely changed biology. Currently, it’s causing everyone from psychologists to economists to rethink their respective fields. In the future, don’t be surprised if it has a similar impact on marketing and corporate strategy.

Whom Would You Trust: A Human or an Algorithm?

First published October 31, 2013 in Mediapost’s Search Insider

I’ve been struggling with a dilemma.

Almost a year ago, I wrote a column asking if Big Data would replace strategy. That started a months-long journey for me, looking for a more informed answer to that query. It’s a massively important question that’s playing out in many arenas today, including medicine, education, government and, of course, finance.

In marketing, we’re well into the era of big data. Of course, it’s not just data we’re talking about. We’re talking about algorithms that use that data to make automated decisions and take action. Some time ago, MediaPost’s Steve Smith introduced us to a company called Persado, which takes an algorithmic approach to copy testing and optimization. As an ex-copywriter turned performance marketer, I wasn’t sure how I felt about that. I understand the science of continuous testing, but I have an emotional stake in the art of crafting an effective message. And therein lies the dilemma. Our comfort with algorithms seems to depend on the context in which we’re encountering them and the degree of automation involved.

Let me give you an example, from Ian Ayres’ book “Super Crunchers.” There’s a company called Epagogix that uses an algorithm to predict the box-office appeal of unproduced movie scripts. Producers can retain the service to help them decide which projects to fund. Epagogix will also help producers optimize their chosen scripts to improve box-office performance. The question here is, do we want an algorithm controlling the creative output of the movie industry? Would we be comfortable taking humans out of the loop completely and seeing where the algorithm eventually takes us?

Now, you may counter that we could include feedback from audience responses. We could use social signals to continually improve the algorithm, a collaborative filtering approach that uses the power of Big Data to guide the film industry’s creative process. Humans are still in the loop in this approach, but only as an aggregated sounding board. We have removed the essentially human elements of creativity, emotion and intuition. Even with the most robust system imaginable, are you comfortable with us humans taking our hands off the wheel?
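
For anyone who hasn’t bumped into the term, here’s a toy illustration of collaborative filtering – predicting one viewer’s reaction from the reactions of viewers with similar tastes. The names and ratings are invented, and a real recommendation system is far more sophisticated, but the mechanism is the same.

```python
# A toy illustration of collaborative filtering, invented for this column:
# predict how a viewer would rate a film concept from the ratings of viewers
# with similar tastes. Names and ratings are made up.
ratings = {
    "ann":  {"heist": 5, "romcom": 2, "biopic": 4},
    "ben":  {"heist": 4, "romcom": 1, "biopic": 5},
    "cara": {"heist": 1, "romcom": 5},
}

def similarity(a: dict, b: dict) -> float:
    """Crude similarity: 1 / (1 + mean absolute rating difference on shared titles)."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    return 1 / (1 + sum(abs(a[t] - b[t]) for t in shared) / len(shared))

def predict(user: str, title: str) -> float:
    """Similarity-weighted average of other viewers' ratings for the title."""
    num = den = 0.0
    for other, their_ratings in ratings.items():
        if other != user and title in their_ratings:
            w = similarity(ratings[user], their_ratings)
            num += w * their_ratings[title]
            den += w
    return num / den if den else 0.0

print(round(predict("cara", "biopic"), 2))  # 4.5 -- a weighted blend of ann's 4 and ben's 5
```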

Here’s another example from Ayres’ book. There is substantial empirical evidence that shows algorithms are better at diagnosing medical conditions than clinical practitioners. In a 1989 study by Dawes, Faust and Meehl, an algorithmic diagnostic rule set was consistently more reliable than actual clinical doctors. They then tried a combination, where doctors were made aware of the outcomes of the algorithm but were the final judges. Again, doctors would have been better off going with the results of the algorithm. Their second-guessing increased their margin of error significantly.
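
For context, the “algorithmic rule set” in studies like this is usually nothing exotic: typically a fixed, weighted checklist applied the same way to every case. Here’s a minimal sketch of the idea; the predictors, weights and cutoff are invented for illustration and are not taken from the 1989 study.

```python
# A minimal sketch of a simple actuarial rule: a fixed, weighted checklist
# applied identically to every case. Predictors, weights and the cutoff are
# invented for illustration, not drawn from the Dawes, Faust and Meehl study.
def risk_score(findings: dict, weights: dict) -> float:
    """Sum the weights of the findings that are present."""
    return sum(w for name, w in weights.items() if findings.get(name, False))

WEIGHTS = {"abnormal_test_a": 2.0, "symptom_b": 1.0, "risk_factor_c": 1.5}
CUTOFF = 2.5  # refer for further workup at or above this score

patient = {"abnormal_test_a": True, "symptom_b": True, "risk_factor_c": False}
score = risk_score(patient, WEIGHTS)
print(score, "refer" if score >= CUTOFF else "monitor")  # 3.0 refer
```

The power isn’t in the math – it’s in the consistency. The rule never gets tired, never gets overconfident and never second-guesses itself.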

But, even knowing this, would you be willing to rely completely on an automated algorithm the next time you need medical attention? What if there was no doctor involved at all, and you were diagnosed and treated by an algo-driven robot?

There is also mounting (albeit highly controversial) evidence showing that direct instruction produces better learning outcomes than traditional exploratory teaching methods. In direct instruction, scripted automatons could easily replace the teacher’s role. Test scores could provide self-optimizing feedback loops. Learning could be driven by algorithms and delivered at a distance. Classrooms, along with teachers, could disappear completely. Is this a school you’d sign your kid up for?

Let’s stoke the fires of this dilemma a little. In a frightening TED talk, Kevin Slavin talks about how algorithms rule the world and offers a few examples of how algorithms have gotten it wrong in the past. Amazon’s pricing algorithms priced an out-of-print book called “The Making of a Fly” at a whopping $23.6 million. Surprisingly, there were no sales. And in financial markets, where we’ve largely abdicated control to algorithms, those same algorithms spun out of control in 2012 no fewer than 18,000 times. So far, these instances have been identified and corrected in milliseconds, but there’s always a Black Swan chance that one time, they’ll crash the economy just for the hell of it.

But before we humans feel too smug, let’s remember this sobering fact: 20% of all fatal diseases are misdiagnosed. In fact, misdiagnosis accounts for about one-third of all medical error. And we humans have no one to blame for that but ourselves.

As I said – it’s a dilemma.

Beware Confirmation Bias

First published September 5, 2013 in Mediapost’s Search Insider

Most testing of marketing is disproportionately biased towards the positive. We test to find winners. But in the process, we often cut losers off without a second glance. And this can be dangerously myopic.

I’ve talked in the past about taking a Bayesian approach to strategy. The more I explore this idea, the better I like it. But it comes with some challenges – the biggest being that we’re not Bayesian by nature. In fact, there’s a cognitive bias roughly the size of a good-sized cow barn that often leaves us blind to the true state of affairs. In psychological circles, it’s called Confirmation Bias, and in a comprehensive academic review in 1998, Raymond Nickerson stated its potential negative impact, “If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration.”

Here’s the thing. We love to be right. We hate to be wrong. So we will go to extraordinary lengths to make sure that we’re proven correct. And we won’t even know we’re doing it. Our brain, working surreptitiously in the background, doesn’t alert us to how biased we actually are. The many tricks that go along with Confirmation Bias usually play out subconsciously.

If we try to be good little Bayesians, we have to embrace alternative ideas of all shapes and sizes, whether or not they agree with our current view of things. In fact, we should be prepared to rip our current view apart, as it’s in the disproving and rebuilding of hypotheses that the truth is eventually found.

Here’s where things go wrong in most market testing. We usually test to prove our hunches right. We go in with a favored option and try to build a case for it.  We may deny it, but we all do it. That means that the less favored alternatives usually get short shrift. And it’s often in one of these alternatives that the optimal choice may be found. The more that there is at stake in the test, the more susceptible we are to Confirmation Bias.

Here is the rogue’s gallery of typical Confirmation Bias tricks:

Favored Hypothesis Information Seeking and Interpretation – As I said, we tend to seek information that supports our favored hypothesis, and avoid information that would contradict it. In the Bayesian view, this is equivalent to ignoring likelihood ratios (there’s a short worked example after this list).

Preferential Treatment of Evidence Supporting Existing Beliefs – Even if we somehow collect unbiased information, we will tend to focus on the information that supports our favored view. It gets “over-weighted” in analysis.

Looking for Positive Cases – This is the classic trap of testing only for winners and ignoring the losers. Often, the losers can tell us more about the true state of affairs.

The Primacy Effect – We tend to pay more attention to the first information we look at, which can bias analysis of any subsequent information.

Belief Persistence – Even when the evidence mounts that our original hunch is wrong, we can be incredibly inventive in twisting evidentiary frameworks to provide continuing support. Along with this is another bias called the “Sunk Cost Fallacy.” The more we have invested in our original hunch (e.g., a major multimillion-dollar campaign launched on the strength of it), the more tenacious we are in holding on to it.
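
To see why ignoring likelihood ratios matters, here’s a minimal worked example of the Bayesian update underneath all of this: posterior odds equal prior odds times the likelihood ratio of each new piece of evidence. The numbers are purely illustrative.

```python
# A minimal sketch of Bayesian updating with likelihood ratios. Confirmation
# bias amounts to treating disconfirming evidence as if its likelihood ratio
# were 1.0 (i.e., ignoring it). All numbers are illustrative.
def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Posterior odds = prior odds x likelihood ratio of the new evidence."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds: float) -> float:
    return odds / (1 + odds)

prior_odds = 1.0         # we start 50/50 on our favored campaign concept
lr_confirming = 3.0      # the evidence we went looking for
lr_disconfirming = 0.25  # the evidence we're tempted to wave away

honest = update_odds(update_odds(prior_odds, lr_confirming), lr_disconfirming)
biased = update_odds(prior_odds, lr_confirming)  # disconfirming evidence ignored

print(f"honest posterior: {odds_to_probability(honest):.2f}")  # 0.43
print(f"biased posterior: {odds_to_probability(biased):.2f}")  # 0.75
```

Count both pieces of evidence and the favored hypothesis is close to a coin flip; quietly drop the disconfirming piece and we’re 75% sure we were right all along.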

Going back a few columns to Philip Tetlock’s Hedgehogs and Foxes: Tetlock found that Foxes make much better natural Bayesians. They are more open to updating their beliefs. The big takeaway here? Keep an open mind.

The Marketing Classic Few Marketers Have Ever Read

First published August 22, 2013 in Mediapost’s Search Insider

It may be the best book you’ll ever read on marketing, but you won’t find it in the marketing section of Amazon. They have it variously filed in three different categories: Politics and Social Sciences, Technology and Textbooks. The book is Everett Rogers’ “Diffusion of Innovations,” and you should add it to your reading list.

The book is a comprehensive review of how new ideas spread and take hold in our society, and although it was first written in the 60s (it’s currently in its fifth edition), the findings are as fresh and relevant as ever. Its relevance to marketing is immediate and tangible. After all, what else is marketing but promoting the adoption and diffusion of new things?

Rogers traces almost a century of diffusion research to see how everything from new high-yield corn varieties to birth control was adopted in various cultures. While there are not a lot of examples purely from the consumer marketplace, the generalized observations beg to be applied to marketing campaigns pushing new (and hopefully improved) products.

Consider these five innovation-specific variables that affect how quickly a new idea is adopted:

1)   Relative advantage – How much of a true advantage does the new innovation offer over what is currently being used? Rogers offers an important caveat here: “The receiver’s perceptions of the attributes of an innovation – not the attributes as classified by experts or change agents, affect its rate of adoption.”

2)   Compatibility – How well does the innovation fit into the framework of the customer’s current situation? Is it an incremental innovation, easily added, or a discontinuous innovation, requiring significant pain on the part of the user to adopt?

3)   Complexity – What is the learning curve that comes bundled with the innovation? The steeper the curve, the slower the rate of adoption.

4)   Trialability – Is it possible to try the product firsthand to determine the relative advantage (see #1)?

5)   Observability – Being the herders we are, adoption is sometimes a matter of “monkey see, monkey do.”

These factors may seem fundamental, but every day new “innovative” products are turned loose on the market, there to wither and die, simply because one or several of these check boxes remain unchecked.

Rogers also spends significant time looking at the social dynamics of diffusion and adoption, including the role of early adopters, change agents, influencers, mass communication channels and interpersonal persuasion. I found amazingly close correlations to the findings of my own research into buying behaviors in the B2B world.

At the risk of oversimplifying this seminal work, Rogers found that adoption balances at the intersection of risk and reward. Risk stalls adoption, reward drives it forward, and clarity of communicating this risk/reward balance in a relevant way is either the catalyst or the inhibitor that determines how steep the adoption curve is.

This is a textbook, so expect a small investment of effort to wade through the rather academic delivery, but if you persevere (and to be fair, I’ve suffered through much worse in other books) you’ll come away with perhaps the clearest summation of marketplace dynamics ever put in print.

Maybe We Need More Skin in the Game

First published August 15, 2013 in Mediapost’s Search Insider

I think our world — or, more specifically, our marketplace — is a little too abstract. We — and by we, I mean the marketers, the suppliers to the market — live too far removed from the market itself: the consumers of the supplied goods.

It’s a point touched on by Nassim Nicholas Taleb in his most recent book, “Antifragile.” Marketers and manufacturers, he suggests, don’t have enough skin in the game to keep them honest. They’re too far removed from accountability. There are too many protective buffers between them and the consequences of their actions.

The law is supposed to provide the accountability — but let’s face it, when it comes to enforcing accountability in the marketplace, we’re a long way from the Code of Hammurabi (one of the first legal codes known), where sloppy workmanship carried a pretty definite penalty: If a builder has built a house for a man, and has not made his work sound, and the house he built has fallen, and caused the death of its owner, that builder shall be put to death.

Or, consider whether the actions of the captain of the Exxon Valdez would have been different had he been answerable to a law like this: If a man has hired a boat and boatman, and loaded it with corn, wool, oil, or dates, or whatever it be, and the boatman has been careless, and sunk the boat, or lost what is in it, the boatman shall restore the boat which he sank, and whatever he lost that was in it.

The world was a smaller and more intimate place back then. You couldn’t hide behind corporate lawyers, malpractice insurance and legal loopholes. If you screwed up, chances are you’d lose an eye, a hand or even your life. If you built a bridge that collapsed, you might as well have been under the bridge, because your fate would be the same.

Now, I’m not sure we’re ready to return to the brutal simplicity of an “eye for an eye” legal code, but it does bring up a rather thorny issue: If there are few or no consequences for shoddy or unethical work, what keeps us honest? There’s nothing like skin in the game to provide some pretty compelling motivation for ethical business practices. And there’s nothing like a consequence-free pass to encourage fast and loose corporate behavior.

The good news, I suppose, is that technology is once again making the world a little more intimate. McLuhan’s Global Village is coming to pass, and the unethical of the world are increasingly being held accountable for their actions. In fact, the speed at which this is happening is confounding the legal systems of many a nation, as vigilantism and frontier justice are increasingly springing up, unchecked by due process and judicial oversight.

I avoid trying to predict the future, but fairness and accountability are hardwired into us, so I suspect that as technology allows us to identify those responsible in the most egregious cases, we will be moved to demand action. We will force the market to have more skin in the game, as our opinions and beliefs, in aggregate, will define that market.

Reengineering Hiring

First published August 9, 2013 in Mediapost’s Search Insider

In all my years in business, the one thing I found consistently difficult was hiring good people. We spent a lot of time honing our screening skills, but I sometimes suspect we would have been just as far ahead by flipping a coin.

Over time, we found we achieved pretty good success rates with our lower-level hires, but the one area where consistent success eluded me was in our management recruitment. It seemed that the more senior the position, the worse our track record was. We had a few outright disasters.

When it comes down to it, hiring someone is making a prediction. You examine the evidence and try to foresee whether that person will perform at an acceptable level in the position you have vacant. And, as I said in my last column, we humans don’t tend to be very good at making predictions. The more there is at stake in the position to be filled, the worse the consequences if our predictions are faulty. In looking at our past management hires, I realized that it wasn’t that our predictive powers were any less effective; it was just that the pain of being wrong was more acute.

So, it was with some reassurance and more than a dollop of schadenfreude that I learned that Google has had exactly the same problem. That’s right, Google — the same company that has a zillion brilliant engineers working on every problem known to mankind. But those engineers have to come from somewhere, right? Someone has to hire them. And there, ay, there’s the rub!

In a recent interview in the New York Times, Laszlo Bock, senior vice president for people operations at Google, confessed that Google has tweaked, and, in some cases, massively overhauled its recruitment process.  Take, for example, Google’s famous early predilection for college G.P.A.s. According to Bock, based on actual performance, “G.P.A.s are worthless as a criteria for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation. Google famously used to ask everyone for a transcript and G.P.A.s and test scores, but we don’t anymore, unless you’re just a few years out of school. We found that they don’t predict anything.”

Google has also slowly backed away from its ironclad requirement that every hire have a degree. Bock revealed, “The proportion of people without any college education at Google has increased over time as well. So we have teams where you have 14% of the team made up of people who’ve never gone to college.”

Sometimes, interviewers fall into the trap of over-playing their own cleverness and “expertise.” We spend more time trying to stroke our own ego by staging an impromptu show of power during the interview than in really listening to what the interviewee is saying. Google found that tricks like brainteasers, while they may make the interviewer feel clever, are worthless in screening out duds. The much less flashy but tried-and-true list of standardized behavioral questions (“Give me an example of when you…”) is a far better predictive indicator.

Finally, Bock admits that screening for leadership positions is the most difficult challenge, because leadership is something that defies easy definitions. “We’ve found that leadership is a more ambiguous and amorphous set of characteristics than the work we did on the attributes of good management, which are more of a checklist and actionable.” So you can ask questions, probing for effective leadership, but because leading people tends to fall into the category of ill-defined problems, you often have to do the best job you can in the hiring process, and then track performance religiously. In this case, “slow to hire, quick to fire” is a good principle to follow.

I found Bock’s last words, on the role of Big Data in management decisions, including those involving people’s performance, revealing: “Big Data — when applied to leadership — has tremendous potential to uncover the 10 universal things we should all be doing. But there are also things that are specifically true only about your organization, and the people you have and the unique situation you’re in at that point in time. I think this will be a constraint to how big the data can get because it will always require an element of human insight.”