Minding the Gap: How Amazon Mastered the Market by Being Physical

This week, two would-be challengers to Amazon’s e-tail crown were humbled in one fell swoop. When Walmart pulled its products off Google Express, Amazon’s position as the undisputed owner of online sales was further consolidated.

When Google introduced Express in 2013 and then expanded the delivery service to the primary US metro areas in 2014, they were aiming directly at Amazon’s Prime service. But in the past five years, Prime has flourished and Express – well – appears to be expiring. It may join a growing list of other shuttered Google projects: Google Plus, Google Glass, Google Wave, Google Buzz – you get the idea.

Walmart, for its part, has certainly grown its online sales – thanks to a buying spree to beef up its online marketplace – but according to the most recent numbers I could find (July of 2018), Amazon owns 50% of all retail e-commerce sales, compared to just 3.7% for Walmart. What is probably even more discouraging for the Big Box from Bentonville is that Amazon’s year-over-year growth kept pace with Walmart’s, so Walmart wasn’t able to make up any lost ground.

Why is Amazon dominating? In my humble opinion, this is not about technology or online platforms. This is about what happens on your doorstep. Amazon knows the importance of the Customer Moments of Truth.

The First Moment of Truth, as laid out in 2006 by former Procter & Gamble CEO A.G. Lafley, is the moment a customer chooses a product over competitors’ offerings.

The Second Moment of Truth is when the customer makes the purchase and gets their hands on the product for the first time.

The Third Moment of Truth is when the customer shares their experience through feedback or – today – through social media.

Since Lafley first defined these moments of truth, a few others have been added that I will get to in a minute, but let’s focus on Moments One and Two for now. Remember, a marketplace is really just a connection between producers and consumers. It is the home of Moments One and Two – especially Moment Two. This is where Amazon is re-imagining the marketplace.

Amazon has out-“Walmarted” Walmart at its own game. It has been all about logistics and consumer convenience in the Second Moment of Truth. Amazon has assembled a potent consumer offer that is very difficult to compete against – one based on making the gap between Moment One and Moment Two as seamless as possible.

That brings us to another addition to those Moments of Truth – the “Actual” Moment of Truth – as defined by Amit Sharma, CEO and founder of Narvar. According to Sharma, this is the gap in online retail between when you hit the buy button and when the package hits your doorstep. Sharma has some street cred in this department: he helped engineer Walmart’s next-generation supply chain before heading to Apple in 2010, where he oversaw the shipping and delivery experience.

Why is this gap important? Because it is the black hole of customer intent – a pause button that has to be hit between purchase and physical fulfillment. It’s this gap that Amazon has grabbed as their own.

Google hasn’t been able to do the same. Why? Because Google failed to connect the physical and digital worlds. Amazon did. They reinvented the marketplace. And they did it through branded fulfillment. That was the genius of Amazon – getting brown boxes with the ubiquitous Amazon smile logo onto your doorstep. Yes, they also ushered in the long tail of product selection, but that is ephemeral ground to defend. It’s their branding of the moment of delivery that has made Amazon the most valuable brand in the world. And now they can extend that into new areas – seemingly at will. This is not so much a pivot as a sprawl. It’s a digital land grab.

The final Moment of Truth is the ZMOT – the Zero Moment of Truth, defined by Jim Lecinski, who was with Google at the time. According to Jim, the Zero Moment of Truth is “the precise moment when they (the customers) have a need, intent or question they want answered online.” This is – and will continue to be – Google’s wheelhouse. But it remains firmly anchored in the digital world, far on the other side of the Actual Moment of Truth.

For Amazon, winning in online retail is all about Minding the Gap.

A Year on the Inside

(Note: This is the column this week for MediaPost’s Media Insider Column, which I write for every Tuesday. The references in this post are to that publication)

In the past 12 months, what have your Media Insiders been talking about? I wondered that myself, so I did a tally. I grouped all of last year’s columns into 10 broad categories. Here, roughly speaking, are the topics we covered in 2018:

Disruption in Our Biz

At a whopping 63 columns, this was by far the most popular topic, accounting for a full 25% of all the Media Insider columns written last year. Authorship was pretty much split among all the insiders, including yours truly.

Editorial angles included disruption in TV ad buying, the future of the ad holding company and agencies, marketer distrust of their agencies, the rise of digital “frenemies” and the very nature of the relationship between advertisers and their market. We may have taken different approaches, but we all had this viewpoint in common with Stephen Stills when he wrote this song lyric for Buffalo Springfield: “Something’s happening here. What it is ain’t exactly clear.”

How Technology is Changing Us

The second most popular topic looked at disruption of a different sort: how tech is rewiring humans. This, of course, is my favorite topic, but I wasn’t alone. Thirty-seven columns were written on this issue, making up 15% of all the Media Insider columns last year. The vast majority of these were cautionary in tone, worrying that tech may be leading us down a dystopian path.

Politically Charged Tech

The third most popular theme? No real surprise here. It was about the overlap of tech — especially social media — and politics. We collectively penned 34 columns on this topic, making up 13% of the total editorial calendar. The interesting aspect of this — for me, anyway — was the question of whether the relationship was simply correlational or causal. Did tech take us to where we are today? Or was it simply the channel we used to bitch about it?

Privacy and Data Concerns

Coming in as a close contender for the top three spots was the whole personal privacy mess, featuring the long-running Facebook debacle. The various security breaches, exposés of Russian hacking, the Cambridge Analytica scandal and Facebook’s consistently abysmal behavior were on our collective minds, generating 30 columns and making up 12% of all Media Insider content. Facebook may have been the poster child of this particular theme, but the question of data privacy goes beyond that to a much more fundamental question: Who should own our data?

Marketing Strategy and Execution

Rounding out the top five was probably the most helpful topic of the bunch: How the hell should you market anyway? In 2018, 27 columns were written on the topic, representing 11% of all columns which ran. A hat tip to fellow Insider Cory Treffiletti here, who wrote most of those.

New Consumer Tech

As you can see, the top five topics were mostly negative in nature, mainly concerned with worrying about what was happening. The next two topics were a little more starry-eyed, starting with dreaming about a richer tech future. Eighteen times in 2018 we wrote about new consumer tech (making up 7% of all columns), including voice-enabled, AR, VR and AI. While we sometimes hit a negative note, most of the time we adopted a “Gee Whiz” enthusiasm about what this new tech could bring.

New MarTech

The number seven theme had us putting on our marketer’s hat and enthusing about how tech will improve marketing. We wrote about this 13 times, representing 5% of the content. Again, while we realize that this is one of the contributing factors to disruption in our business, we remained overwhelmingly positive about the possibilities. We also saw consolidation of this market in our collective crystal balls.

A Glimpse Inside Our Personal Worlds

Tied for the 7th spot — with 13 columns — was a bit of a catchall category I called personal insights. The topics were varied, but they all touched on who we were as humans and how we saw the world. Often we used our own experiences as our narrative devices.

The Evolution of Entertainment and Content Publishing

The Insiders occasionally mused – 11 times last year, to be exact – about how the very notion of entertainment and content publishing was changing. Again, we were monitoring another disruptive trend. How we are reinventing the way we consume video – thanks to streaming and binge-watching – was the most popular topic in this category, but we also wondered about the future of the printed word.

The New Definition of Branding

Finally, we wondered what will become of the notion of branding in an increasingly polarized, digitally mediated marketplace. This was our topic for seven columns last year, making up 3% of the total Insider pie. We saw the continuing rise of brand activists, slacktivists and overtly political brand messaging. In short, we saw branding mirror what was happening in the real world.

As a sample of where our heads are at, the tally held no real surprises. It showed that we Insiders, just like everyone else, are trying to make sense of an increasingly nonsensical world and industry. We feel the earth moving under our feet, often in seismic jolts. We worry about the future. We remain cautiously optimistic about the promise of technology in general. We get mad when corporations behave badly. And we use our own lives to help frame our perspective of the world where we live and work.

It will be interesting to see what we write about in 2019.

Dear Facebook. It’s Not Me, It’s You

So, let’s say, hypothetically, one wanted to break up with Facebook. Just how would one do that?

I heard one person say that swearing off Facebook was a “position of privilege.” It was an odd way of putting it, until I thought about it a bit. This person was right. Much as I’d like to follow in retired tech journalist Walter Mossberg’s footsteps and quit Facebook cold turkey, I don’t think I can. I am not in that position. I am not so privileged.

This in no way condones Facebook and its actions. I’m still pretty pissed off about that. I suspect I might well be in an abusive relationship. I have this suspicion because I looked it up on Mentalhealth.net, a website offered by the American Addictions Centers. According to them, an abusive relationship is

“where one thing mistreats or misuses another thing. The important words in this definition are “mistreat” and “misuse”; they imply that there is a standard that describes how things should be treated and used, and that an abuser has violated that standard.

For the most part, only human beings are capable of being abusive, because only human beings are capable of understanding how things should be treated in the first place and then violating that standard anyway.”

That sounds bang on when I think about how Facebook has treated its users and their personal data. And everyone will tell you that if you’re in an unhealthy relationship, you should get out. But it’s not that easy. And that’s because of Metcalfe’s Law. Originally applied to telecommunication networks, it also applies to digitally mediated social networks. Metcalfe’s Law states that the value of a telecommunications network is proportional to the square of the number of connected users of the system.

The example often used is a telephone. If you’re the only person with one, it’s useless. If everyone has one, it’s invaluable. Facebook has about 2.3 billion users worldwide. That’s one out of every three people on this planet. Do the math. That’s a ton of value. It makes Facebook what they call very “sticky” in Silicon Valley.
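To make “do the math” concrete, here is a back-of-the-envelope sketch under Metcalfe’s proportionality. The constant of proportionality and the one-million-user comparison network are my own illustrative assumptions, not anything Metcalfe (or Facebook) actually specifies:

```latex
% Metcalfe's Law: network value grows with the square of connected users.
% V = k n^2, where k is an unknown constant of proportionality.
V \propto n^{2}
\qquad\Longrightarrow\qquad
\frac{V_{\text{Facebook}}}{V_{\text{1M-user network}}}
  = \left(\frac{2.3\times 10^{9}}{1\times 10^{6}}\right)^{2}
  \approx 5.3\times 10^{6}
```

In other words, on this logic Facebook isn’t roughly 2,300 times more valuable than a hypothetical million-user network. It’s millions of times more valuable. That’s the stickiness.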

But it’s not just the number of users that makes Facebook valuable. It’s also the way they use it. Facebook has always intended to become the de facto platform for broad-based social connection. As such, it is built on “weak ties” – those social bonds, defined by Mark Granovetter almost 50 years ago, which connect scattered nodes in a network. To go back to the aforementioned “position of privilege” comment, the privilege in this case is a lack of dependence on weak ties.

 

My kids could probably quit Facebook. At least, it would be easier for them than it would be for me. But they are also not at the stage of their lives where weak ties are all that important. They use other platforms, like Snapchat, to communicate with their friends. It’s a channel built for strong ties. If they do need to bridge weak ties, they escalate their social postings, first to Instagram, then – finally – to their last resort: Facebook. It’s only through Facebook that they’ll reach parents, aunts, cousins and grandmas all at once.

It’s different for me. I have a lifetime of accumulated weak ties that I need to connect with all the time. And Facebook is the best way to do it. I connect with various groups, relatives, acquaintances and colleagues on an as-needed basis. I also need a Facebook presence for my business, because it’s expected by others who need to connect with me. I don’t have the privilege of severing those ties.

So, I’ve decided that I can’t quit Facebook. At least, not yet. But I can use Facebook differently – more impersonally. I can use it as a connection platform rather than a channel for personal expression. I can make sure as little of my personal data falls into Facebook’s hands as possible. I don’t need to post what I like, how I’m feeling, what my beliefs are or what I do daily. I can close myself off to Facebook, turning this into a passionless relationship. From now on, I’ll consider it a tool –  not a friend, not a confidante, not something I can trust – just a way to connect when I need to. My personal life is none of Facebook’s business – literally.

For me, it’s the first step in preventing more abuse.

The Strange Polarity of Facebook’s Moral Compass

For Facebook, 2018 came in like a lion and went out like a really pissed-off Godzilla with a savagely bad hangover after the Mother of All New Year’s Eve parties. In other words, it was not a good year.

As Zuckerberg’s 2018 shuddered to its close, it was disclosed that Facebook and Friends had opened our personal data kimonos for any of their “premier” partners. This was in direct violation of their own data privacy policy, which makes it even more reprehensible than usual. This wasn’t a bone-headed fumbling of our personal information. This was a fully intentional plan to financially benefit from that data in a way we didn’t agree to, hide that fact from us and then deliberately lie about it on more than one occasion.

I was listening to a radio interview about this latest revelation, and one of the analysts – social media expert and author Alexandra Samuel – mused about when it was that Facebook lost its moral compass. She has been familiar with the company since its earliest days, having had the opportunity to talk to Mark Zuckerberg personally. In her telling, Zuckerberg is an evangelist who had lost his way, drawn to the dark side by the corporate curse of profit and greed.

But Siva Vaidhyanathan – the Robertson Professor of Modern Media Studies at the University of Virginia – tells a different story. And it’s one that seems much more plausible to me. Zuckerberg may indeed be an evangelist, although I suspect he’s more of a megalomaniac. Either way, he does have a mission. And that mission is not opposed to corporate skullduggery. It fully embraces it. Zuckerberg believes he’s out to change the world, while making a shitload of money along the way. And he’s fine with that.

That came as a revelation to me. I spent a good part of 2018 wondering how Facebook could have been so horrendously cavalier with our personal data. I put it down to corporate malfeasance. Public companies are not usually paragons of ethical efficacy. This is especially true when ethics and profitability are diametrically opposed to each other. This is the case with Facebook. In order for Facebook to maintain profitability with its current revenue model, it has to do things with our private data we’d rather not know about.

But even given the moral vacuum that can be found in most corporate boardrooms, Facebook’s brand of hubris in the face of increasingly disturbing revelations seems off-note – out of kilter with the normal damage control playbook. Vaidhyanathan’s analysis brings that cognitive dissonance into focus. And it’s a picture that is disturbing on many levels.


Siva Vaidhyanathan

According to Vaidhyanathan, “Zuckerberg has two core principles from which he has never wavered. They are the founding tenets of Facebook. First, the more people use Facebook for more reasons for more time of the day the better those people will be. …  Zuckerberg truly believes that Facebook benefits humanity and we should use it more, not less. What’s good for Facebook is good for the world and vice-versa.

Second, Zuckerberg deeply believes that the records of our interests, opinions, desires, and interactions with others should be shared as widely as possible so that companies like Facebook can make our lives better for us – even without our knowledge or permission.”

Mark Zuckerberg is not the first tech company founder to have a seemingly ruthless god complex and a “bigger than any one of us” mission. Steve Jobs, Bill Gates, Larry Page, Larry Ellison; I could go on. What is different this time is that Zuckerberg’s chosen revenue model runs completely counter to the idea of personal privacy. Yes, Google makes money from advertising, but the vast majority of that is delivered in response to a very intentional and conscious request on the part of the user. Facebook’s gaping vulnerability is that it can only be profitable by doing things of which we’re unaware. As Vaidhyanathan says, “violating our privacy is in Facebook’s DNA.”

Which all leads to the question, “Are we okay with that?” I’ve been thinking about that myself. Obviously, I’m not okay with it. I just spent 720 words telling you so. But will I strip my profile from the platform?

I’m not sure. Give me a week to think about it.

Is Google Politically Biased?

As a company, the answer is almost assuredly yes.

But are the search results biased? That’s a much more nuanced question.

Sundar Pichai testifying before Congress

In trying to answer that question last week, Google CEO Sundar Pichai tried to explain how Google’s algorithm works to the House Judiciary Committee (which is kind of like God explaining how the universe works to my sock, but I digress). One of the catalysts for this latest appearance was another one of President Trump’s ranting tweets, which intimated something was rotten in the Valley of the Silicon:

“Google search results for ‘Trump News’ shows only the viewing/reporting of Fake News Media. In other words, they have it RIGGED, for me & others, so that almost all stories & news is BAD. Fake CNN is prominent. Republican/Conservative & Fair Media is shut out. Illegal? 96% of … results on ‘Trump News’ are from National Left-Wing Media, very dangerous. Google & others are suppressing voices of Conservatives and hiding information and news that is good. They are controlling what we can & cannot see. This is a very serious situation-will be addressed!”

Granted, this tweet is non-factual, devoid of any type of evidence and verging on frothing at the mouth. As just one example, let’s take the 96% number that Trump quotes in the above tweet. That came from a very unscientific straw poll done by one reporter on a far-right-leaning site called PJ Media. In effect, Trump did exactly what he accuses Google of doing – he cherry-picked his source and called it a fact.

But what Trump has inadvertently put his finger on is the uneasy balance that Google tries to maintain as both a search engine and a publisher. And that’s where the question becomes cloudy. It’s a moral precipice that may be clear in the minds of Google engineers and executives, but it’s far from that in ours.

Google has gone on the record asserting that its algorithm is apolitical. But based on a recent interview with Google News head Richard Gingras, there is some wiggle room in that assertion. Gingras stated,

“With Google Search, Google News, our platform is the open web itself. We’re not arbiters of truth. We’re not trying to determine what’s good information and what’s not. When I look at Google Search, for instance, our objective – people come to us for answers, and we’re very good at giving them answers. But with many questions, particularly in the area of news and public policy, there is not one single answer. So we see our role as [to] give our users, citizens, the tools and information they need – in an assiduously apolitical fashion – to develop their own critical thinking and hopefully form a more informed opinion.”

But – in the same interview – he says,

“What we will always do is bias the efforts as best we can toward authoritative content – particularly in the context of breaking news events, because major crises do tend to attract the bad actors.”

So Google does boost news sites that it feels are reputable, and it’s these sites – like CNN – that typically dominate in the results. Do reputable news sources tend to lean left? Probably. But that isn’t Google’s fault. That’s the nature of the open web. If you use it as your platform, you build in any of its inherent biases. And the minute you filter further on top of that platform, you leave yourself open to accusations of editorializing.

There is another piece to this puzzle. The fact is that searches on Google are biased, but that bias is entirely intentional. The bias in this case is yours. Search results have been personalized so that they’re more relevant to you. Things like your location, your past search history, the way you structure your query and a number of other signals will be used by Google to filter the results you’re shown. There is no liberal conspiracy. It’s just the way the search algorithm works. In this way, Google is prone to the same type of filter-bubble problem that Facebook has. In another interview, Tim Hwang, director of the Harvard-MIT Ethics and Governance of AI Initiative, touches on this:

“I was struck by the idea that whereas those arguments seem to work as late as only just a few years ago, they’re increasingly ringing hollow, not just on the side of the conservatives, but also on the liberal side of things as well. And so what I think we’re seeing here is really this view becoming mainstream that these platforms are in fact not neutral, and that they are not providing some objective truth.”

The biggest challenge here lies not in the reality of what Google is or how it works, but in what our perception of Google is. We will never know the inner workings of the Google algorithm, but we do trust in what Google shows us. A lot. In our own research some years ago, we saw a significant lift in consumer trust when brands showed up on top of search results. And this effect was replicated in a recent study that looked at Google’s impact on political beliefs. This study found that voter preferences can shift by as much as 20% due to biased search rankings – and that effect can be even higher in some demographic groups.

If you are the number one channel for information, if you manipulate the ranking of that information in any way and if you wield the power to change a significant percentage of minds based on that ranking – guess what? You are the arbiter of truth. Like it or not.

The Psychology Behind My Netflix Watchlist

I live in Canada – which means I’m going into hibernation for the next 5 months. People tell me I should take up a winter activity. I tell them I have one. Bitching. About winter – specifically. You have your hobbies – and I have mine.

The other thing I do in the winter is watch movies. And being a with-it, tech-savvy guy, I have cut the cord and get my movie fix through not one but three streaming services: Netflix, Amazon Prime and Crave (a Canadian service). I’ve discovered that the psychology of Netflix is fascinating. It’s the Paradox of Choice playing out in streaming time. It’s the difference between what we say we do and what we actually do.

For example, I do have a watch list. It has somewhere around a hundred items on it. I’ll probably end up watching about 20% of them. The rest will eventually go gentle into that good Netflix Night. And according to a recent post on Digg, I’m actually doing quite well. According to the admittedly small sample chronicled there, the average completion rate is somewhere between 5 and 15%.

When it comes to compiling viewing choices, I’m an optimizer. And I’m being kind to myself. Others, less kind, refer to it as obsessive behavior. This refers to the satisficing/optimizing spectrum of decision making. I put an irrational amount of energy into the rationalization of my viewing options. The more effort you put into decision making, the closer you are to the optimizing end of the spectrum. If you make choices quickly and with your gut, you’re a satisficer.

What is interesting about Netflix is that it defers the Paradox of Choice. I dealt with this in a previous column. But I admit I’m having second thoughts. Netflix’s watch list provides us with a sort of choosing purgatory... a middle ground where we can save according to the type of watcher we think we are. It’s here where the psychology gets interesting. But before we go there, let’s explore some basic psychological principles that underpin this Netflix paradox of choice.

Of Marshmallows and Will Power

In the 1960s, Walter Mischel and his colleagues conducted the now famous Marshmallow Test, a longitudinal study that spanned several years. The finding (which is currently in some doubt) was that children who – when they were quite young – had the willpower to resist immediately taking a treat (the marshmallow) put in front of them, in return for a promise of a greater treat (two marshmallows) in 15 minutes, would later do substantially better in many aspects of their lives: education, careers, social connections, their health. Without getting into the controversial aspects of the test, let’s just focus on the role of willpower in decision making.

Mischel talks about a hot and cool system of making decisions that involve self-gratification. The “hot” is our emotions and the “cool” is our logic. We all have different set points in the balance between hot and cool, but where these set points sit in each of us depends on willpower. The more willpower we have, the more likely it is that we’ll delay an immediate reward in return for a greater reward sometime in the future.

Our ability to rationalize and expend cognitive resources on a decision is directly tied to our willpower. And experts have learned that our willpower is a finite resource. The more we use it in a day, the less we have in reserve. Psychologists call this “ego depletion.” And a loss of willpower leads to decision fatigue. The more tired we become, the less our brain is willing to work on the decisions we make. In one particularly interesting example, parole boards are much more likely to let prisoners go either first thing in the morning or right after lunch than they are as the day wears on. Making the decision to grant a prisoner his or her freedom is a decision that involves risk. It requires more thought. Keeping them in prison is a default decision that – cognitively speaking – is a much easier choice.

Netflix and Me: Take Two

Let me now try to rope all this in and apply it to my Netflix viewing choices. When I add something to my watch list, I am making a risk-free decision. I am not committing to watch the movie now. Cognitively, it costs me nothing to hit the little plus icon. Because it’s risk free, I tend to be somewhat aspirational in my entertainment foraging. I add foreign films, documentaries, old classics, independent films and – just to leaven out my selection – the latest audience-friendly blockbusters. When it comes to my watch list additions, I’m pretty eclectic.

Eventually, however, I will come back to this watch list and will actually have to commit 2 hours to watching something. And my choices are very much affected by decision fatigue. When it comes to instant gratification, a blockbuster is an easy choice. It will have lots of action, recognizable and likeable stars, a non-mentally-taxing script – let’s call it the cinematic equivalent of a marshmallow that I can eat right away. All my other watch list choices will probably be more gratifying in the long run, but more mentally taxing in the short term. Am I really in the mood for a European art-house flick? The answer probably depends on my current “ego-depletion” level.

This entire mental framework presents its own paradox of choice to me every time I browse through my watchlist. I know I have previously said the Paradox of Choice isn’t a thing when it comes to Netflix. But I may have changed my mind. I think it depends on what resources we’re allocating. In his book The Paradox of Choice, Barry Schwartz cites Sheena Iyengar’s famous jam experiment. In that instance, the resource was the cost of jam. But if we’re talking about two hours of my time – at the end of a long day – I have to confess that I struggle with choice, even when it’s already been shortlisted to a pre-selected set of potential entertainment choices. I find myself defaulting to what seems like a safe choice – a well-known Hollywood movie – only to be disappointed when the credits roll. When I do have the willpower to forego the obvious and take a chance on one of my more obscure picks, I’m usually grateful I did.

And yes, I did write an entire column on picking a movie to watch on Netflix. Like I said, it’s winter and I had a lot of time to kill.

 

Why Disruption is Becoming More Likely in the Data Marketplace

Another week, another breach. Five hundred million records were hacked from Marriott, making it the second-largest data breach in history, behind Yahoo’s breach of 3 billion user accounts.

For now. There will probably be a bigger breach. There will definitely be a more painful breach. And by painful, I mean painful to you and me.  It’s in that pain – specifically, the degree of the pain – that the future of how we handle our personal data lies.

Markets innovate along paths of least resistance. Market development is a constantly evolving dynamic tension between innovation and resistance. If there is little resistance, markets will innovate in predictable ways from their current state. If this innovation leads to pushback from the market, we encounter resistance. When markets meet significant resistance, disruption occurs, opening the door for innovation in new directions to get around the resistance of the marketplace. When we talk about data, we are talking about a market where value is still in the process of defining itself. And it’s in the definition of value where we’ll find the potential market resistance for data.

Individual data is a raw resource. It doesn’t have value until it becomes “Big.” Personal data needs to be aggregated and structured to become valuable. This creates a dilemma for us. Unless we provide the raw material, there is no “big” data possible. This makes it valuable to others, but not necessarily to ourselves.

Up to now, the value we have exchanged our privacy for has been convenience. It’s easier for us to store our credit card data with Amazon so we can enable one-click ordering. And we feel this exchange has been a bargain. But it remains an asymmetrical exchange. Our data has no positive value to us, only negative. We can be hurt by our data, but other than the aforementioned exchange for convenience, it doesn’t really help us. That is why we’ve been willing to give it away for so little. But once it’s aggregated and becomes “big,” it has tremendous value to the people we give it to. It also has value to those who wish to steal it from the parties we have entrusted it to. The irony here is that whether that data is in the “right” hands or the “wrong” ones, it can still mean pain for us. The differentiator is the degree of that pain.

Let’s examine the potential harm that could come from sharing our data. How painful could this get? Literally every warning we write about here at MediaPost has data at the core. Just yesterday, fellow Insider Steven Rosenbaum wrote about how the definition of warfare has changed. The fight isn’t for land. War is now waged for our minds. And data is used to target those minds.

Essentially, sharing our data makes us vulnerable to being targeted. And the outcome of that targeting can range from simply annoying to life-alteringly dangerous. Even the fact that we refer to it as targeting should raise a red flag. There’s a reason why we use a term typically associated with a negative outcome for those on the receiving end. You’re very seldom targeted for things that are beneficial to you. And that’s true no matter who’s the one doing the targeting. At its most benign, targeting is used to push some type of messaging – typically advertising – to you. But you could also be targeted by Russian hackers in an attempt to subvert democracy. Most acutely, you could be targeted for financial fraud. Or blackmail. Targeting is almost never a good thing. The degree of harm can vary, but the cause doesn’t. Our data – the data we share willingly – makes targeting possible.

We are in an interesting time for data. We have largely shrugged off the pain of the various breaches that have made it to the news. We still hand over our personal data with little to no thought of the consequences. And because we still participate by providing the raw material, we have enabled the development of an entire data marketplace. We do so because there is no alternative without making sacrifices we are not willing to make. But as the degree of personal pain continues to get dialed up, all the prerequisites of market disruption are being put in place. Breaches will continue. The odds of us being personally affected will continue to climb. And innovators will find solutions to this problem that will be increasingly easy to adopt.

For many, many reasons, I don’t think the current trajectory of the data marketplace is sustainable. I’m betting on disruption.