Personalization After Cambridge Analytica: Building Products People Actually Trust

In March 2018, the Cambridge Analytica story broke. By May, GDPR went into effect across Europe. In the span of two months, the conversation around personalization shifted from "how do we do more of it?" to "wait, should we be doing this at all?"

If you work in product, you felt the ground move.

Protesters gather in Parliament Square after the Cambridge Analytica revelations, March 29, 2018. Photo by Jwslubbock, CC BY-SA 4.0.

The trust crisis, in numbers

The scale of the Cambridge Analytica breach was staggering: 87 million Facebook profiles were harvested through a quiz app that most of those people never even used, because the app also scraped data from the friends of the people who installed it. That data was then repurposed for political micro-targeting without anyone's meaningful consent. When the news hit, the #DeleteFacebook campaign surged, and engagement on the platform dropped roughly 20% in the first full month after the story broke.

But the fallout went well beyond Facebook. A broader survey found that 86% of consumers now say they worry about how companies use their personal data. That's not a niche privacy-advocate crowd. That's nearly everyone.

I remember talking about it over dinner with friends that week. These are people who don't work in tech, who never think about data policies. And yet every one of them had an opinion. One friend had gone through her phone deleting apps she hadn't opened in months, just because she didn't trust what they might be collecting. That's when it hit me that this wasn't just an industry story. It had become personal for people in a way that data breaches usually don't.

Mark Zuckerberg at the F8 2018 Developer Conference — his first major public appearance after testifying before Congress on the Cambridge Analytica scandal. Photo by Anthony Quintano, CC BY 2.0.

GDPR changes the rules

GDPR went live on May 25th, and it broadened consent requirements dramatically. Companies now need explicit, informed consent before collecting personal data in cases that previously flew under the radar. The penalties are real: fines can reach 20 million euros or 4% of global annual revenue, whichever is higher.

For product teams, this isn't just a legal checkbox. It's a design constraint. Every data collection touchpoint, every cookie, every tracking pixel now needs to be rethought from the user's perspective.

Where product managers come in

Product managers sit at the intersection of user needs, business goals, and technical constraints. That makes them uniquely positioned to lead on trust, if they choose to.

The practical steps aren't glamorous, but they matter. Audit what data you actually collect and ask honestly whether you need all of it. Make consent flows clear and straightforward, not buried in dark patterns designed to trick people into opting in. Give users real control over their data, not a settings page that takes fifteen clicks to find.

The best framing I've heard came from a PM at a fintech company I met at a conference this summer. She said her team started treating privacy as a product feature rather than a compliance requirement. They put it on the roadmap, assigned it story points, and iterated on it like they would any other user-facing experience. The result? Their opt-in rates actually went up.

Personalization and privacy can coexist

Spotify is a good proof point here. They deliver deeply personalized experiences while being relatively transparent about how they use listening data. The key difference is that the personalization is built on behavioral data that users knowingly generate by using the product, not data scraped from a quiz app they took three years ago.

The era of collecting everything and asking questions later is over. But personalization itself isn't dead. The companies that earn trust in 2018 will be the ones building personalization that users actually opted into, with their eyes open.

Cambridge Analytica didn't kill personalization. It killed the version of it that people never agreed to.
