As promised, I’m picking up the thread from last week’s column on why we seem okay with trading privacy for convenience. The simple – and most plausible – answer is that we’re really not being given a choice.
As MediaPost Senior Editor Joe Mandese pointed out in a very on-point comment, what is being created is a transactional marketplace where offers of value are exchanged for information:
“Like any marketplace, you have to have your information represented in it to participate. If you’re not ‘listed’ you cannot receive bids (offers of value) based on who you are.”
Amazon is perhaps the most relevant example of this. Take Alexa and Amazon Web Services (AWS). Alexa promises to “make your life easier and more fun.” But this comes at a price. Because Alexa is voice activated, it’s always listening. That means the privacy of anything we say in our homes has been ceded to Amazon through its terms of service. The same is true for Google Assistant and Apple’s Siri.
But Amazon is pushing the privacy envelope even further as it tests its new in-home delivery service, Amazon Key. In exchange for the convenience of having your parcels delivered inside your home when you’re away, you literally give Amazon the keys to your home. Your front door gets a smart lock that can be opened via the remote servers of AWS. Opt in to this and suddenly you’ve given Amazon the right not only to listen to everything you say in your home but also to enter it whenever they wish.
How do you feel about that?
This becomes the key question: how do we feel about the convenience-for-privacy exchange? It turns out that our answer depends in large part on how that question is framed. In a 2015 study conducted by the Annenberg School for Communication at the University of Pennsylvania, researchers gathered responses from participants probing their sensitivity around the trading of privacy for convenience. Here is a sampling of the results:
- 55% of respondents disagreed with the statement: “It’s OK if a store where I shop uses information it has about me to create a picture of me that improves the services they provide for me.”
- 71% disagreed with: “It’s fair for an online or physical store to monitor what I’m doing online when I’m there, in exchange for letting me use the store’s wireless internet, or Wi-Fi, without charge.”
- 91% disagreed that: “If companies give me a discount, it is a fair exchange for them to collect information about me without my knowing.”
Here, along the spectrum of privacy pushback, we start to see what the real problem is. We’re willing to exchange private information, as long as we’re aware of all that is happening and feel in control of it. But that, of course, is unrealistic. We can’t control it. And even if we could, we’d soon learn that the overhead required to do so is unmanageable. It’s why Vint Cerf said we’re going to have to learn to live with transparency.
Again, as Mr. Mandese points out, we’re really not being given a choice. Participating in the modern economy requires us to ante up personal information. If we choose to remain totally private, we cut ourselves off from a huge portion of what’s available. And we are already at the point where the vast majority of us really can’t opt out. We all get pissed off when we hear of a security breach à la the recent Equifax debacle. Our privacy sensitivities are heightened for a day or two and we give lip service to outrage. But unless we go full-on Old Order Amish, what are our choices?
We may rationalize the trade-off by saying the private information we’re exchanging for services is not really that sensitive. But that’s where the potential threat of Big Data comes in. Gather enough seemingly innocent data and you can soon start predicting, with startling accuracy, the aspects of our lives that we are sensitive about. We run headlong into the Target Pregnant Teen dilemma. And that particular dilemma becomes thornier as the walls break down between data silos and your personal information becomes a commodity on an open market.
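The mechanics behind that kind of inference can be sketched in a few lines. To be clear, this is a toy illustration, not Target’s actual model: the category names, weights, and threshold below are all invented. The point it demonstrates is the one above: no single data point is revealing, but a weighted combination of innocuous signals can cross a sensitive threshold.

```python
# Toy sketch of inference from innocuous data. All signal names and
# weights are invented for illustration; real retail models are far
# more sophisticated, but the aggregation principle is the same.

PREGNANCY_SIGNALS = {
    "unscented lotion": 0.25,
    "calcium supplements": 0.20,
    "cotton balls (bulk)": 0.15,
    "large tote bag": 0.10,
}

def pregnancy_score(purchases):
    """Sum the weights of any matching purchases, capped at 1.0."""
    score = sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in purchases)
    return min(score, 1.0)

# Individually, each item is unremarkable; together they add up.
basket = ["unscented lotion", "calcium supplements", "cotton balls (bulk)"]
print(round(pregnancy_score(basket), 2))  # 0.6
```

A real system would learn its weights from millions of shopping histories rather than hard-coding them, which is exactly why the predictions get startlingly accurate.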
The potential risk of trading away our privacy escalates over time – it’s the frog-in-boiling-water syndrome. It starts innocently but can soon develop into a scenario that would keep almost anyone up at night in a paranoid cold sweat. Say the data is used for targeting – singling us out of the crowd for the purpose of selling stuff to us. Or – in the case of governments – for seeing whether we have a proclivity for terrorism. Perhaps that isn’t so scary if Big Brother is benevolent and looking out for our best interests. But what if Big Brother becomes a bully?
There is another important aspect to consider here, one that may have dire unintended consequences. When our personal data is used to make the world more convenient for us, some algorithm must “filter” that world, removing whatever it determines to be irrelevant or uninteresting to us. Essentially, the entire physical world is “targeted” to us. And this can go horribly wrong, as we saw in the last presidential election. Increasingly we live in a filtered “bubble” determined by things beyond our control. Our views get trapped in an echo chamber and our perspective narrows.
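That filtering step can also be sketched in toy form. The topics, stories, and ranking rule below are invented for illustration; real feed algorithms are vastly more complex. But even this crude version shows the narrowing: anything the user has never engaged with is silently dropped, so the feed converges on what they already click.

```python
# Minimal filter-bubble sketch: rank stories by how often the user has
# clicked that topic, and drop topics never clicked at all. All topics
# and story titles here are invented for illustration.

from collections import Counter

STORIES = [
    ("politics-left", "Op-ed A"), ("politics-right", "Op-ed B"),
    ("science", "Study C"), ("sports", "Match D"),
    ("politics-left", "Op-ed E"), ("science", "Study F"),
]

def personalized_feed(stories, click_history, k=3):
    """Keep only topics the user has clicked, ranked by click frequency."""
    topic_counts = Counter(click_history)
    relevant = [(topic, title) for topic, title in stories
                if topic_counts[topic] > 0]
    relevant.sort(key=lambda story: topic_counts[story[0]], reverse=True)
    return relevant[:k]

# A user who has only ever clicked left-leaning politics sees nothing else:
feed = personalized_feed(STORIES, ["politics-left", "politics-left"])
print(feed)  # only the two "politics-left" stories survive the filter
```

Each click feeds back into the next ranking, so the echo chamber is self-reinforcing: the fewer dissenting stories shown, the fewer can ever be clicked.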