A Troubling Prognostication

It’s that time of year again. My inbox is jammed with pitches from PR flacks trying to get some editorial love for their clients. In all my years of writing, I think I have actually taken the bait maybe once or twice. That is an extremely low success rate. So much for targeting.

In early January, many of the pitches offer either reviews of 2019 or predictions for 2020.  I was just about to hit the delete button on one such pitch when something jumped out at me: “The number-one marketing trend for 2020 will be CDPs: customer data platforms.”

I wasn’t surprised by that. It makes sense. I know there’s a truckload of personal data being collected from everyone and their dog. Marketers love platforms. Why wouldn’t these two things come together?

But then I thought more about it — and immediately had an anxiety attack. This is not a good thing. In fact, this is a catastrophically terrible thing. It’s right up there with climate change and populist politics as the biggest world threats that keep me up at night.

To close out 2019,  fellow Insider Maarten Albarda gave you a great guide on where not to spend your money. In that column, he said this: “Remember when connected TVs, Google Glass and the Amazon Fire Phone were going to provide break-through platforms that would force mass marketing out of the box, and into the promised land of end-to-end, personalized one-on-one marketing?”

Ah, marketing nirvana: the Promised Land! The Holy Grail of personalized marketing. A perfect, friction-free direct connection between the marketer and the consumer.

Maarten went on to say that social media is one of the channels you shouldn’t be throwing money into, saying, “It’s also true that we have yet to see a compelling case where social media played a significant role in the establishment or continued success of a brand or service.”

I’m not sure I agree with this, though I admit I don’t have the empirical data to back up my opinion. But I do have another, darker reason why we should shut off the taps providing the flow of revenue to the usual social suspects. Social media based on an advertising revenue model is a cancerous growth — and we have to shut off its blood flow.

Personalized one-to-one marketing — that Promised Land —  cannot exist without a consistent and premeditated attack on our privacy. It comes at a price we should not be prepared to pay.

It depends on us trusting profit-driven corporations that have proven again and again that they shouldn’t be trusted. It is fueled by our darkest and least admirable motives.

The ecosystem that is required to enable one-to-one marketing is a cesspool of abuse and greed. In a pristine world of marketing with players who sport shiny ideals and rock-solid ethics, maybe it would be okay. Maybe. Personally, I wouldn’t take that bet. But in the world we actually live and work in, it’s a sure recipe for disaster.

To see just how subversive data-driven marketing can get, read “Mindf*ck” by Christopher Wylie. If that name sounds vaguely familiar to you, let me jog your memory. Wylie is the whistleblower who first exposed the Cambridge Analytica scandal. An openly gay, liberal, pink-haired Canadian, he seems an unlikely candidate to be the architect of the data-driven “Mindf*ck” machine that drove Trump into office and the Brexit vote over the 50% threshold.

Wylie admits to being blinded by the tantalizing possibilities of what he was working on at Cambridge Analytica: “Every day, I overlooked, ignored, or explained away warning signs. With so much intellectual freedom, and with scholars from the world’s leading universities telling me we were on the cusp of ‘revolutionizing’ social science, I had gotten greedy, ignoring the dark side of what we were doing.”

But Wylie is more than a whistleblower. He’s a surprisingly adept writer who has a firm grasp on not just the technical aspects, but also the psychology behind the weaponization of data. If venture capitalist Roger McNamee’s tell-all exposé of Facebook, “Zucked,” kept you up at night, “Mindf*ck” will give you screaming night terrors.

I usually hold off jumping on the year-end prognostication bandwagon, because I’ve always felt it’s a mug’s game. I would like to think that 2020 will be the year when the world becomes “woke” to the threat of profit-driven data abuse — but based on our collective track record of ignoring inconvenient truths, I’m not holding my breath.

'Twas the Night Before the Internet

Today, just one day before Christmas, my mind swings to the serendipitous side. I don’t know about you, but for me, 2019 has been a trying year. While you would never know it by the collection of columns I’ve produced over the past 12 months, I have tried to find the glimpses of light in the glowering darkness.

Serendipity Sidetrack #1: “Glowering” is a word we don’t use much anymore. It refers to someone who has a dark, angry expression on their face. As such, it’s pretty timely and relevant. You’d think we would use it more.

One of my personal traditions during the holidays is to catch one of the fourteen billion airings of “It’s a Wonderful Life.” Yes, it’s quintessentially Capraesque. Yes, it’s corny as hell. But give me a big seasonal heaping helping of Jimmy Stewart, Donna Reed and that “crummy little town” known as Bedford Falls.

Serendipity Sidetrack #2: The movie “It’s a Wonderful Life” is based on a 1939 short story by Philip Van Doren Stern. He tried to get it published for several years with no success. He finally self-published it and sent it to 200 friends as a 24-page Christmas card. One of these cards ended up on the desk of an executive at RKO Pictures, who convinced the studio to buy the rights in 1943 as a vehicle for its star Cary Grant.

That movie never got made and the project was shelved for the rest of World War II. After the war, director Frank Capra read the script and chose it as his first Hollywood movie after making war documentaries and training films.

The movie was panned by critics and ignored by audiences. It was a financial disaster, eventually leading to the collapse of Capra’s new production company, Liberty Films. One other stray tidbit: during the scene at the high school dance where the gym floor opens over the pool (which was shot at Beverly Hills High School), Mary’s obnoxious date Freddie is played by an adult Carl “Alfalfa” Switzer, from the “Our Gang” series.

But I digress. This seasonal ritual got me thinking along “what if” lines. We learn what Bedford Falls would be like if George Bailey had never been born. But maybe the same narrative machinery could be applied to another example: What would Christmas (or your seasonal celebration of choice) be like if the Internet had never happened?

As I pondered this, I realized that there’s really only one aspect of the internet that materially impacts what the holidays have become. These celebrations revolve around families, so if we’re going to look for changes wrought by technology, we have to look at the structure and dynamics of the family unit.

Serendipity Sidetrack #3: Christmas was originally not a family-based celebration. It became so in Victorian England thanks to Queen Victoria, Prince Albert and Charles Dickens. After the marriage of the royal couple, Albert brought the German tradition of the Christmas tree to Windsor Castle. Pictures of the royals celebrating with family around the tree firmly shifted the holiday towards its present warm-hearted family center.

In 1843, Dickens added social consciousness to the party with the publication of “A Christmas Carol.” The holiday didn’t take its detour towards overt consumerism until the prosperity of the 1950s.

But back to my rapidly unraveling narrative thread: What would Christmas be like without the Internet?

I have celebrated Christmas in two different contexts: the first in my childhood, and the second with my own wife and family.

I grew up with just my immediate family in rural Alberta, geographically distant from aunts, uncles and cousins. For dinner there would be six of us around the table. We might try to call an aunt or uncle who lived some 2,000 miles away, but usually the phone lines were so busy we couldn’t get through.

The day was spent with each other and usually involved a few card games, a brief but brisk walk and getting ready for Christmas dinner. It was low-key, but I still have many fond memories of my childhood Christmases.

Then I got married. My wife, who is Italian, has dozens and dozens and dozens of relatives within a stone’s throw in any direction. For us, Christmas is now a progressive exercise to see just how many people can be crammed into the same home. It begins at our house for Christmas morning with the “immediate” family (remember, I use the term in its Italian context). The head count varies between 18 and 22 people.

Then, we move to Christmas dinner with the “extended” family. The challenge here is finding a house big enough, because we are now talking 50 to 75 people. It’s loud, it’s chaotic — and I couldn’t imagine Christmas any other way.

The point here is how the Internet has shifted the nature of the celebration. In my lifespan, I have seen two big shifts, both to do with the nature of our personal connections. And like most things with technology, one has been wonderful while the other has been troubling.

First of all, thanks to the Internet, we can extend our family celebrations beyond the limits of geography. I can now connect with family members who don’t live in the same town.

But, ironically, the same technology has been eroding the bonds we have with the family we are physically present with. We may be in the same room, but our minds are elsewhere, preoccupied with the ever-present screens in our pockets or purses. In my pre-Internet memories of Christmas, we were fully there with our families. Now, this is rarely the case.

And one last thought. I find — sadly — that Christmas is just one more occasion to be shared through social media. For some of us, it’s not so much who we’re with or what we’re doing, but about how it will look in our Instagram post.

The Ruts of Our Brain

We are not – by nature – open-minded. In fact, as we learn something, the learning creates neural pathways in our brain that we tend to stick to. In other words, the more we learn, the deeper the ruts get.

Our brains are this way by design. At its core, the brain is an energy-saving device. If there are two options open to it, one requiring more cognitive processing and one requiring less, the brain will default to the less resource-intensive option.

This puts expertise into an interesting new perspective. In a recent study, researchers from Cold Spring Harbor Laboratory, Columbia University, University College London and the Flatiron Institute found that when mice learn a new task, the neurons in their brains actually change as they move from being a novice to an expert. At the beginning, as they’re learning the task, the required neurons don’t “fire” until the brain makes a decision. But as expertise is gained, those same neurons start responding before they’re even needed. It’s essentially Hebbian theory (named after psychologist Donald Hebb) in action: the neurons that fire together eventually wire together.
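To put a rough number on “fire together, wire together,” here’s a toy Hebbian weight update in Python. It’s my own illustrative simplification, not the model used in the mouse study: the connection between two units grows a little every time they happen to be active at the same moment, so units that reliably fire together end up far more strongly wired than units that fire independently.

```python
import random

def hebbian_weight(correlation, trials=1000, eta=0.01):
    """Toy Hebbian rule: strengthen a connection whenever both units fire together."""
    weight = 0.0
    for _ in range(trials):
        pre = random.random() < 0.5           # presynaptic unit fires half the time
        if random.random() < correlation:
            post = pre                        # postsynaptic unit follows it...
        else:
            post = random.random() < 0.5      # ...or fires on its own
        weight += eta * pre * post            # "fire together, wire together"
    return weight

# Correlated units end up with a much stronger connection than independent ones.
print(f"correlated units:  {hebbian_weight(0.9):.1f}")
print(f"independent units: {hebbian_weight(0.0):.1f}")
```

Run it a few times and the correlated pair always wins. That strengthened connection is the rut.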

We tend to think of experts as bringing a well-honed subset of intellectual knowledge to a question. And that is true, as long as the question is well within their area of expertise. But the minute an expert ventures outside of their “rut,” they begin to flounder. In fact, even when they are in their area of expertise but are asked to predict where that path may lead in the future – beyond their current rut – their expertise doesn’t help them. In 2005, psychologist Philip Tetlock published “Expert Political Judgment” – a book showing the results of a 20-year-long study on the prediction track record of experts. It wasn’t good. According to a New Yorker review of the book, “Human beings who spend their lives studying the state of the world…are poorer forecasters than dart-throwing monkeys.”

Why? Well, just like those mice in the above-mentioned study, once we have a rut, our brains like to stick to the rut. It’s just easier for us. And experts have very deep ruts. The deeper the rut, the more effort it takes to peer above it. As Tetlock found, when it comes to predicting what might happen in some area in the future, even if you happen to be an expert in that area, you’d probably be better off flipping a coin than relying on your brain.

By the way, for most of human history, this has been a feature, not a bug. Saving cognitive energy is a wonderful evolutionary advantage. If you keep doing the same thing over and over, eventually the brain pre-lights the neuronal path required, saving itself time and energy. The brain is directing anticipated traffic at faster than the speed of thought. And it’s doing it so well, it would take a significant amount of cognitive horsepower to derail this action.

Like I said, in a fairly predictable world of cause and effect, this system works. But in an uncertain world full of wild-card complexity, it can be crippling.

Complex worlds require foxes, not hedgehogs. This analogy also comes from Tetlock’s book. According to an old Greek saying, “The fox knows many things but the hedgehog knows just one thing.” To that I would add: the fox knows a little about many things, but the hedgehog knows a lot about one thing. In other words, the hedgehog is an expert.

In Tetlock’s study, people with “fox” qualities had a significantly better track record than “hedgehogs” when it came to predicting the future. Their brains were better able to take the time to synthesize the various data inputs required to deal with the complexity of crystal-balling the future, because they weren’t barrelling down a pre-ordained path that had been carved by years of accumulated expertise.

But it’s not just expertise that creates these ruts in our brains. The same pattern plays out when we look at the role our beliefs play in how open-minded we are. The stronger the belief, the deeper the rut.

Again, we have to remember that this tendency of our brains to form well-travelled grooves over time has been crafted by the blind watchmaker of evolution. But that doesn’t make it any less troubling when we think about the limitations it imposes in a more complex world. This is especially true when new technologies deliberately leverage our vulnerability in this area. Digital platforms ruthlessly eliminate the real estate that lies between perspectives. The ideological landscape in which foxes can effectively operate is disappearing. Increasingly we grasp for expertise – whether it’s on the right or left of any particular topic – with the goal of preserving our own mental ruts.

And as the ruts get deeper, foxes are becoming an endangered species.

Why Quitting Facebook is Easier Said than Done

Not too long ago, I was listening to an interview with a privacy expert about… you guessed it, Facebook. The gist of the interview was that Facebook can’t be trusted with our personal data, as it has proven time and again.

But when asked if she would quit Facebook completely because of this — as tech columnist Walt Mossberg did — the expert said something interesting: “I can’t really afford to give up Facebook completely. For me, being able to quit Facebook is a position of privilege.”

Wow!  There is a lot living in that statement. It means Facebook is fundamental to most of our lives — it’s an essential service. But it also means that we don’t trust it — at all.  Which puts Facebook in the same category as banks, cable companies and every level of government.

Facebook — in many minds, anyway — became an essential service because of Metcalfe’s Law, which states that the effect of a network is proportional to the square of the number of connected users of the system. More users = disproportionately more value. Facebook has Metcalfe’s Law nailed. It has almost two and a half billion users.
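As a rough sketch of what “proportional to the square” means in practice (the numbers and the constant of proportionality below are purely illustrative choices of mine, not anything Facebook publishes):

```python
# Metcalfe's Law: a network's value grows with the square of its user count.
# Purely illustrative; k is an arbitrary constant of proportionality.
def metcalfe_value(users, k=1.0):
    return k * users ** 2

for users in (1_000_000, 100_000_000, 2_500_000_000):
    print(f"{users:>13,} users -> relative value {metcalfe_value(users):.1e}")

# Doubling the user base quadruples the theoretical value of the network.
print(metcalfe_value(2_000_000) / metcalfe_value(1_000_000))  # -> 4.0
```

Two and a half billion users, squared, is a staggeringly large number, which is exactly why the network feels impossible to walk away from.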

But it’s more than just sheer numbers. It’s the nature of engagement. Thanks to a premeditated addictiveness in Facebook’s design, its users are regular users. Of those 2.5 billion users, 1.6 billion log in daily, and 1.1 billion of them log in daily from a mobile device. That means roughly 15% of all the people in the world are constantly — addictively — connected to Facebook.

And that’s why Facebook appears to be essential. If we need to connect to people, Facebook is the most obvious way to do it. If we have a business, we need Facebook to let our potential customers know what we’re doing. If we belong to a group or organization, we need Facebook to stay in touch with other members. If we are social beasts at all, we need Facebook to keep our social network from fraying away.

We don’t trust Facebook — but we do need it.

Or do we? After all, we Homo sapiens have managed to survive for 99.9925% of our collective existence without Facebook. And there is mounting research that indicates going cold turkey on Facebook is great for your mental health. But like all things that are good for you, quitting Facebook can be a real pain in the ass.
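That suspiciously precise 99.9925%, by the way, is just back-of-the-envelope math, assuming roughly 200,000 years for our species and about 15 years of Facebook:

```python
# Back-of-the-envelope: what share of Homo sapiens' existence predates Facebook?
# Assumes ~200,000 years for the species and ~15 years of Facebook (2004-2019).
species_years = 200_000
facebook_years = 15
print(f"{1 - facebook_years / species_years:.4%}")  # -> 99.9925%
```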

Last year, New York Times tech writer Brian Chen decided to ditch Facebook. This is a guy who is fully conversant in tech — and even he found that making the break was much easier said than done. Facebook, in its malevolent brilliance, has erected some significant barriers to exit for users who do try to make a break for it.

This is especially true if you have fallen into the convenient trap of using Facebook’s social sign-in on sites rather than juggling multiple passwords and user IDs. If you’re up for the challenge, Chen has put together a 6-step guide to making a clean break of it.

But what if you happen to use Facebook for advertising? You’ve essentially sold your soul to Zuckerberg. Reading through Chen’s guide, I’ve decided that it’s just easier to go into the Witness Protection Program. Even there, Facebook will still be tracking me.

By the way, after six months without Facebook, Chen did a follow-up on how his life had changed. The short answer is: not much, but what did change was for the better. His family didn’t collapse. His friends didn’t desert him. He still managed to have a social life. He spent a lot less on spontaneous online purchases. And he read more books.

The biggest outcome was that advertisers “gave up on stalking” him. Without a steady stream of personal data from Facebook, Instagram thought he was a woman.

Whether you’re able to swear off Facebook completely or not, I wonder what the continuing meltdown of trust in Facebook will do for its usage patterns. As in most things digital, young people seem to have intuitively stumbled on the best way to use Facebook. Use it if you must to connect to people when you need to (in their case, grandmothers and great-aunts) — but for heaven’s sake, don’t post anything even faintly personal. Never afford Facebook’s AI the briefest glimpse into your soul. No personal affirmations, no confessionals, no motivational posts and — for the love of all that is democratic — nothing political.

Oh, one more thing. Keep your damned finger off of the like button, unless it’s for your cousin Shermy’s 55th birthday celebration in Zihuatanejo.

Even then, maybe it’s time to pick up the phone and call the ol’ Shermeister. It’s been too long.

The Hidden Agenda Behind Zuckerberg’s “Meaningful Interactions”

It probably started with a good intention. Facebook – aka Mark Zuckerberg – wanted to encourage more “Meaningful Interactions”. And so, early last year, Facebook engineers started making some significant changes to the algorithm that determined what you saw in your News Feed. Here are some excerpts from Zuck’s post to that effect:

“The research shows that when we use social media to connect with people we care about, it can be good for our well-being. We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos — even if they’re entertaining or informative — may not be as good.”

That makes sense, right? It sounds logical. Zuckerberg went on to say how they were changing Facebook’s algorithm to encourage more “Meaningful Interactions.”

“The first changes you’ll see will be in News Feed, where you can expect to see more from your friends, family and groups.

As we roll this out, you’ll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard — it should encourage meaningful interactions between people.”


Fast-forward almost two years, and we now see the outcome of that good intention: an ideological landscape with a huge chasm where the middle ground used to be.

The problem is that Facebook’s algorithm naturally favors content from like-minded people. And surprisingly, it doesn’t take a very strong preference for like-minded company to create a highly polarized landscape. This shouldn’t have come as a surprise. American economist Thomas Schelling showed us almost 50 years ago just how easily segregation can happen.

The Schelling Model of Segregation was created to demonstrate why racial segregation was such a chronic problem in the U.S., even given repeated efforts to desegregate. The model showed that even when we’re pretty open minded about who our neighbors are, we will still tend to self-segregate over time.

The model works like this. A grid represents a population with two different types of agents: X and O. The square an agent occupies represents where they live. If the agent is satisfied, they will stay put. If they aren’t satisfied, they will move to a new location. The variable here is the level of satisfaction, determined by what percentage of their immediate neighbours are the same type of agent as they are. For example, the level of satisfaction might be set at 50%, meaning an X agent needs at least 50% of its neighbours to also be of type X. (If you want to try the model firsthand, Frank McCown, a computer science professor at Harding University, has created an online version.)

The most surprising thing that comes out of the model is that this threshold of satisfaction doesn’t have to be set very high at all for extensive segregation to happen over time. You start to see significant “clumping” of agent types at percentages as low as 25%. At 40% and higher, you see sharp divides between the X and O communities. Remember, even at 40%, that means that Agent X only wants 40% of their neighbours to also be of the X persuasion. They’re okay being surrounded by up to 60% Os. That is much more open-minded than most human agents I know.
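To make the mechanics concrete, here’s a minimal sketch of the model in Python. The grid size, the share of empty squares and the 40% threshold are my own illustrative choices rather than McCown’s parameters, and unsatisfied agents simply jump to a random empty square each round:

```python
import random

SIZE = 30             # grid is SIZE x SIZE, wrapping at the edges
EMPTY_FRACTION = 0.1  # share of vacant squares agents can move into
THRESHOLD = 0.40      # an agent wants >= 40% of its neighbours to share its type
STEPS = 50

def make_grid():
    """Randomly seed the grid with 'X' agents, 'O' agents and some empty squares."""
    def cell():
        r = random.random()
        if r < EMPTY_FRACTION:
            return None
        return 'X' if r < (1 + EMPTY_FRACTION) / 2 else 'O'
    return [[cell() for _ in range(SIZE)] for _ in range(SIZE)]

def like_share(grid, row, col):
    """Fraction of occupied neighbouring squares holding the same type of agent."""
    agent = grid[row][col]
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            neighbour = grid[(row + dr) % SIZE][(col + dc) % SIZE]
            if neighbour is not None:
                total += 1
                same += (neighbour == agent)
    return same / total if total else 1.0

def step(grid):
    """Move every unsatisfied agent to a random empty square; return how many were unhappy."""
    unhappy = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if grid[r][c] is not None and like_share(grid, r, c) < THRESHOLD]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    random.shuffle(empties)
    for r, c in unhappy:
        if not empties:
            break
        er, ec = empties.pop()
        grid[er][ec], grid[r][c] = grid[r][c], None
        empties.append((r, c))
    return len(unhappy)

grid = make_grid()
for _ in range(STEPS):
    if step(grid) == 0:  # everyone is satisfied; the clusters have settled in
        break

shares = [like_share(grid, r, c) for r in range(SIZE) for c in range(SIZE)
          if grid[r][c] is not None]
print(f"average like-neighbour share: {sum(shares) / len(shares):.0%}")
```

Run it a few times and you’ll see the average like-neighbour share settle well above the 40% that any individual agent actually asked for: mild individual preferences, sharply clustered outcome.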

Now, let’s move the Schelling Model to Facebook. We know from the model that even pretty open-minded people will physically segregate themselves over time. The difference is that on Facebook, they don’t move to a new part of the grid, they just hit the “unfollow” button. And the segregation isn’t physical – it’s ideological.

This natural behavior is then accelerated by Facebook’s “Meaningful Interactions” algorithm, which filters on the basis of the people you have connected with, setting in motion an ever-tightening spiral that eventually restricts your feed to a very narrow ideological horizon. The resulting cluster then becomes a segment used for ad targeting. We can quickly see how Facebook both intentionally built these very homogeneous clusters by changing its algorithm and now profits from them by providing advertisers the tools to micro-target them.

Finally, after doing all this, Facebook absolves itself of any responsibility to ensure subversive and blatantly false messaging isn’t delivered to these ideologically vulnerable clusters. It’s no wonder comedian Sacha Baron Cohen just took Zuck to task, saying “if Facebook were around in the 1930s, it would have allowed Hitler to post 30-second ads on his ‘solution’ to the ‘Jewish problem.’”

In rereading Mark Zuckerberg’s post from two years ago, you can’t help but start reading between the lines. First of all, there is mounting evidence that disproves his contention that meaningful social media encounters help your well-being. It appears that quitting Facebook entirely is much better for you.

And secondly, I suspect that – just like his defence of running false and malicious advertising by citing free speech – Zuck has a not-so-hidden agenda here. I’m sure Zuckerberg and his Facebook engineers weren’t oblivious to the fact that their changes to the algorithm would result in nicely segmented psychographic clusters that would be like catnip to advertisers – especially political advertisers. They were consolidating exactly the same vulnerabilities that were exploited by Cambridge Analytica.

They were building a platform that was perfectly suited to subvert democracy.

The Tourification of Our World

Who wouldn’t want to be in Venice? Gondolas drift by with Italian gondoliers singing “O Sole Mio.” You sit at a café savoring your espresso as you watch Latin lovers stroll by hand in hand on their way to the Bridge of Sighs. The Piazza San Marco is bathed in a golden glow as the sun sets behind the Basilica di San Marco. The picture? Perfect.  

Again, who wouldn’t want to live in Venice?

The answer, according to the latest population stats, is almost everyone. The population of Venice is one third what it was in 1970.

The sharp-eyed among you may have noticed that I changed the sentence slightly in the second version. I replaced “be” with “live.”  And that’s the difference. Venice is literally the “nice place to visit but I wouldn’t want to live there.”  

A lot of people do visit, well over 5 million a year. But almost nobody lives there. The permanent population of Venice has shrunk to below 60,000.

Venice has become tourified. It’s a false front of a city, one built for those who are going to be there for 48 to 72 hours. In the process, everything needed to make it sustainable for those who want to call it home has been stripped out. It has become addicted to tourist dollars — and that addiction is killing it.

We should learn from Venice’s example. Sometimes, in trying to make a fantasy real, you take away the very things needed to let it survive.

Perfection doesn’t exist in nature. Imperfections are required for robustness. Yet, we are increasingly looking for a picture of perfection we can escape to.

The unintended consequences of this are troubling to think about.

We spent a good part of the last century devising new ways to escape. What was once an activity that lived well apart from our real lives has become increasingly entwined with those lives.

As our collective affluence has grown, we spend more and more time chasing the fantastical. Social media has accelerated this chase. Our feeds are full of posts from those in pursuit of a fantasy.

We have shifted our focus from the place we live to the “nice place to visit.” This distorts our expectations of what reality should be. We expect the tourist-brochure version of Venice without realizing that in constructing exactly that, we set in motion a chain of events resulting in a city that’s unlivable.

The rise of populist politics is the broken-mirror image of this. Many of us have mythologized the America we want — or Britain, or any of the other countries that have gone down the populist path. And myths are, by definition, unsustainable in the real world. They are vastly oversimplified pictures that allow us to create a story that we long for. It’s the same as the picture I painted of Venice in the first paragraph: a fantasy that can’t survive reality.

In our tendency to “tourify” everything, there are at least two unintended consequences: one for ourselves and one for our world.

For us, the need to escape continually draws our energies and attentions from what we need to do to save the world we actually live in, toward the mythologization of the world we think we want to live in. We ignore the inconvenient truths of reality as we pursue our imagined perfection.

But it’s the second outcome that’s probably more troubling. Even if we were successful in building the world we think we want, we could well find that we’ve built a bigger version of Venice, a place sinking under the weight of its own fantasy.

Sometimes, you have to be careful what you wish for.

Looking Back at a Decade That’s 99.44% Done

Remember 2010? For me that was a pretty important year. It was the year I sold my digital marketing business. While I would continue to work actively in the industry for another three years, things were never the same as they were in 2010. And – looking back – I realize that’s pretty well true for most of us. We were more innocent and more hopeful. We still believed that the Internet would be the solution, not the problem.

In 2010, two big trends were jointly reshaping our notions of being connected. Early in the year, former Morgan Stanley analyst Mary Meeker laid them out for us in her “State of the Internet” report. Back then, just three years after the introduction of the iPhone, internet usage from mobile devices hadn’t even reached double digits as a percentage of overall traffic. Meeker knew this was going to change, and quickly. She saw mobile adoption on track to be the steepest tech adoption curve in history. She was right. Today, over 60% of internet usage comes from a mobile device.

The other defining trend was social media. Even then, Facebook had about 600 million users, or just under 10% of the world’s population. When you had a platform that big – connecting that many people – you just knew the consequences would be significant. There were some pretty rosy predictions for the impact of social media.

Of course, it’s the stuff you can’t predict that will bite you. Like I said, we were a little naïve.

One trend that Meeker didn’t predict was the nasty issue of data ownership. We were just starting to become aware of the looming spectre of privacy.

The biggest Internet-related story of 2010 was WikiLeaks. Over the course of the year, Julian Assange’s site released a steady stream of leaked government documents, culminating in some 260,000 sensitive diplomatic cables passed to it by Chelsea Manning, a US soldier stationed in Iraq. According to the governments of the world, this was an illegal release of classified material, tantamount to an act of espionage. According to public opinion, this was shit finally rolling uphill. We revelled in the revelations. WikiLeaks and Julian Assange were taking it to the man.

That budding sense of optimism continued throughout the year. By December of 2010, the Arab Spring had begun. This was our virtual vindication – the awesome power of social media was a blinding light to shine on the darkest nooks and crannies of despotism and tyranny. The digital future was clear and bright. We would triumph thanks to technology. The Internet had helped put Obama in the White House. It had toppled corrupt regimes.

A decade later, we’re shell-shocked to discover that the Internet is the source of a whole new kind of corruption.

The rigidly digitized ideals of Zuckerberg, Page, Brin et al seemed to be a call to arms: transparency, the elimination of bureaucracy, a free and open friction-free digital market, the sharing economy, a vast social network that would connect humanity in ways never imagined, connected devices in our pockets – in 2010 all things seemed possible. And we were naïve enough to believe that those things would all be good and moral and in our best interests.

But soon, we were smelling the stench that came from Silicon Valley. Those ideals were subverted into an outright attack on our privacy. Democratic elections were sold to the highest bidder. Ideals evaporated under the pressure of profit margins and expanding power. Those impossibly bright, impossibly young billionaire CEOs of ten years ago are now testifying in front of Congress. The corporate culture of many tech companies reeks like a frat house on Sunday morning.

Is there a lesson to be learned? I hope so. I think it’s this. Technology won’t do the heavy lifting for us. It is a tool that is subject to our own frailty. It amplifies what it is to be human. It won’t eliminate greed or corruption unless we continually steer it in that direction. 

And I use the term “we” deliberately. We have to hold tech companies to a higher standard. We have to be more discerning of what we agree to. We have to start demanding better treatment and not be willing to trade our rights away with the click of an accept button. 

A lot of what could have been slipped through our fingers in the last 10 years.  It shouldn’t have happened. Not on our watch.