Machine Ignorance and Our Future

Data models and lack of judgment are steering us toward disaster.

As I read Avinash's latest newsletter, "Machine Intelligence, And Your Future!", I think about my own career path and how, in recent months, I have become much more concerned about the impending risks of data abuse at scale.

Earlier this year I spent days preparing for my Superweek "Hamel Hell" presentation, and even more for the Marketing Festival. I dug into fascinating stories of rogue machine learning experiments and other data horror stories, which I summed up in "The Doomsday Upon Us." Some attendees said I scared the shit out of them… The truth is, I was scared myself!

Before interviewing Christopher Wylie, I also read all I could, watched hours of testimony from Wylie and Zuckerberg, and realized how little has changed since Cambridge Analytica; how messed up and evil Facebook was/is; and how many other Cambridge Analytica-like operations are out there.

A quote from “The new dot-com bubble is here: it’s called online advertising” struck a chord:

There’s a Great Future in Data

“One word: data. There’s a great future in data.”

This adaptation of the famous "plastics" quote from The Graduate was spot on; I used it myself in some old presentations. That's what we were told, and what I believed. I, and so many of my fellow digital analysts, have had great careers so far.

But if we trust the Great Avinash, the future isn't so bright, or only partially so.

I share Avinash’s assessment that data and prediction-related jobs could very soon be replaced by machine learning and AI, and one has to aim for the “judgment” jobs to remain valuable.

Data Has No Judgment

What I'm really worried about, and what I witness almost daily, is a generalized lack of judgment on the part of marketers and analysts. A lack of judgment in how data is collected and how it is turned into some religious Holy Truth. How "the masses" have been conditioned to accept that their every action, every move and every mood is being tracked. After all, most people think they have nothing to hide, and since they are going to get ads anyway, better get "targeted" ads. You got it! Targeted: selected as an object of attention or attack…

The masses have been conditioned to accept that their every action, every move and every mood is being tracked.

Vendors with our "best interest" in mind and leading voices in our industry said we should pour all the data we could find into some big data swamp; hand the reins over to a superior intelligence; and it would automagically give us the weapons to deceive and manipulate target audiences at scale: squeeze a little more attention, gain a little more profit. Repeat until… until what?

As I was writing this article, I stumbled upon a LinkedIn thread (in French) about the integration of Google Cloud AutoML with Kaggle. Michael Albo, CEO of a local data science consulting firm and a community advocate, shared his concern that people without any real expertise in data science would be able to rival tried-and-true data scientists. Where some see an inevitable democratization of data science, others see the danger of using those very powerful tools as black boxes. If we accept that data, machine learning and AI ultimately serve the greater purpose of informing decisions which could affect real human beings, there is also a growing risk those systems will fail, mislead, and deceive. The term "machine learning" has two words, and I fear the "learning" aspect, which is an essential component of developing judgment, is too easily overlooked.

If we accept that data, machine learning and AI ultimately serve the greater purpose of informing decisions which could affect real human beings, there's also a growing risk those systems will fail, mislead, and deceive.

Dangerous Curve Ahead

I think the blind faith in data, those machines and models which dictate our daily lives, is dangerous. I think the hyper-targeted personalization at scale which can wreak havoc in democracies is dangerous. I think the risks of doing evil, without even realizing it, are becoming seriously dangerous.

I can't judge other industries, but the digital marketing & analytics space is particularly at risk of falling to the dark side. Despite the advent of GDPR, CCPA and ITP, all measures built to prevent abuse and protect consumers, vendors are engaged in a game of cat and mouse with regulators. As I write this, every single one of those vendors, be it analytics or ad networks, is seeking ways to collect more data more accurately. Some of them walk the edge of compliance, pretending to be transparent while trying very hard to circumvent the consumer's right to free will and respect. My fellow marketers and analysts are seeking ways to keep collecting as much data as possible while staying compliant, because they largely see data as a business right.

After all, hasn't marketing always been a game of persuasion? Yes, of course, except that people are now the targets of propaganda and manipulation with unprecedented accuracy and scale.

Despite a "cookie-less world" and Intelligent Tracking Prevention, which should seriously limit their ability to track you, not a single vendor will be seriously impaired. On the contrary, those initiatives have only forced the creation of more clever ways to collect even more data. Can you imagine a single vendor saying, "Our product isn't as good as before because of GDPR/CCPA/ITP"? Of course not; they will boast to their clients about their ability to track you better than ever before!

This Is Your Digital Life

The famous Facebook app “This Is Your Digital Life” was the vector by which Cambridge Analytica was able to collect details about millions of people.

Do you realize that right now, at this very moment, most ad networks possess more data than Cambridge Analytica ever had?

As a digital marketer or analyst, what's your limit? Where is the red line that would make you say, "No, I don't want to be a cog in the system"?

In many ways, I feel I have reached this breaking point.

For example, is it ethical for a financial organization to include ad network trackers on the pages where you view your financial details? A simple question with a fascinating range of answers!

For me, the dilemma is easily solved:

When you are signed in and consulting highly sensitive personal data, very tight data practices are essential, and sharing behavioural data with third-party ad networks isn't what I consider to be ethical.
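As a rough illustration of what "very tight data practices" could look like in a tag setup, here is a minimal sketch, assuming a hypothetical site where the signed-in state and the sensitive sections can be detected client-side. The isSignedIn and isSensitivePage helpers, the cookie name and the pixel URL are all illustrative assumptions, not any vendor's actual API:

```typescript
// A minimal sketch, assuming a hypothetical site: skip loading third-party
// ad/analytics tags while a signed-in user is viewing sensitive pages.
// isSignedIn(), isSensitivePage(), the cookie flag and the pixel URL are
// illustrative assumptions, not any real vendor's API.

type TagLoader = () => void;

function isSignedIn(): boolean {
  // Assumption: the app marks authenticated sessions with a cookie flag.
  return document.cookie.includes("session_id=");
}

function isSensitivePage(): boolean {
  // Assumption: sensitive areas live under /account or /statements.
  return /^\/(account|statements)(\/|$)/.test(window.location.pathname);
}

function loadThirdPartyTags(loaders: TagLoader[]): void {
  // In a sensitive, signed-in context, load no third-party tags at all.
  if (isSignedIn() && isSensitivePage()) {
    return;
  }
  loaders.forEach((load) => load());
}

// Usage: an ad-network pixel only ever loads outside sensitive contexts.
loadThirdPartyTags([
  () => {
    const script = document.createElement("script");
    script.src = "https://ads.example.com/pixel.js"; // placeholder URL
    script.async = true;
    document.head.appendChild(script);
  },
]);
```

In practice this gating would more likely live in a tag manager or consent platform than in hand-rolled code; the point is simply that loading third-party tags can be conditioned on the sensitivity of the context rather than fired everywhere by default.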

This is one of many stories I want to investigate further. There are other experts raising the flag on questionable ethics and malpractice, particularly in the "data-protection-priv" channel on #Measure Slack. There's an opportunity to collaborate and define best practices, methods and tools. There might be occasions, however, where we should cut through the red tape, stop avoiding the hard questions and call out those who are detrimental to an ethical and honest use of data. Over the next couple of weeks, I hope to share my take on several data and privacy-related stories, along with some recommended best practices, tools and services.

One final note: if you are a digital marketer or analyst who witnesses the evil within, please get in touch, in confidence.

Written by

All the world is made of faith, and trust, and pixie dust. Digital marketer & analyst with a strong interest in privacy and the ethical use of data.
