Opinions

After Cambridge Analytica, Brittany Kaiser has a solution for personal data privacy

Brittany Kaiser exposed Facebook’s biggest act of data misuse. It’s led her to find new ways to solve society’s pressing problem with personal data.

In 2018 she hit the headlines as the whistleblower exposing data privacy violations involving Cambridge Analytica and Facebook. Trained as a human rights lawyer, Brittany Kaiser has since testified at the UK Parliament’s fake news inquiry.

Kaiser has now established a foundation helping the next generation live safer digital lives and experience the benefits of technology, and has explored her experiences in her book, Targeted, and the Netflix documentary The Great Hack. I spoke with Brittany about how personal data use could shift from harming to saving society, and what businesses can do.

Brittany Kaiser, Co-Founder, Own Your Data Foundation

Susi O’Neill: How can legislation best protect our data rights?

Brittany Kaiser: There’s a dichotomy in the privacy community: Many say data rights should be human rights. But we’ve had human rights protecting privacy enshrined in law for decades. During my legal training, I was frustrated by how hard it is to uphold human rights law in court. People can’t easily get compensation for human rights abuses.

Privacy isn’t real – suing to protect it under human rights has never worked. A legal framework won’t help when data is the world’s most valuable asset class. It’s a multi-trillion dollar industry – we can’t stop it from functioning.

Access to data makes organizations more effective. We should legislate so people are protected from bad actors and abuse, but in a way that lets artificial intelligence, data science and the internet of things (IoT) help us solve the world’s greatest problems.

How can we change the status quo of organizations, rather than people, owning our data?

The framework I like to work with is data ownership. Property law is the most potent form of law. If someone damages your property, you’ll win compensation, even with fractional ownership. If I own 20 percent of the data Facebook holds about me, I should get 20 percent of the revenue it generates.

Data ownership as a business model could look a lot like Airbnb. If data is my property like my house, if you want to use it, I say how you can use it, for how long, and we agree a price before I hand over the keys. It should be transparent. Perhaps I give a license for six months or a year. Then I get compensated, either upfront or as a dividend based on use.
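To make that model concrete, here’s a minimal sketch in Python of what such a license might look like as a data structure. All names and figures are hypothetical, purely to illustrate the owner-sets-the-terms idea described above:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch of the "Airbnb for data" license described above:
# the owner decides what can be used, for what purpose, for how long, at what price.
@dataclass
class DataLicense:
    owner: str          # the person the data is about
    licensee: str       # the organization "renting" the data
    categories: list    # which data may be used, e.g. ["fitness", "diet"]
    purpose: str        # the agreed use, e.g. "diabetes research"
    start: date
    duration_days: int  # e.g. a six-month or one-year license
    price_usd: float    # agreed up front, or paid as a dividend on use

    def is_active(self, today: date) -> bool:
        """The 'keys' only work while the license term is running."""
        return self.start <= today <= self.start + timedelta(days=self.duration_days)

agreement = DataLicense(
    owner="alice", licensee="HealthResearchCo",
    categories=["fitness", "diet"], purpose="diabetes research",
    start=date(2021, 1, 1), duration_days=182, price_usd=40.0,
)
print(agreement.is_active(date(2021, 3, 1)))  # True: within the six-month term
```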

This solves the world’s most significant human rights problem: extreme poverty. Many supply chains have been damaged by the pandemic – the United Nations says 270 million people are at risk of starvation. If everyone owned their data, some could earn enough to feed themselves.

Mark Zuckerberg (Facebook CEO) says everyone’s data is only worth 17 US dollars a quarter. I think it’s worth far more – and that’s just Facebook data – but let’s work with this figure. Two billion people live on less than 2 US dollars a day, so 17 dollars every three months would make a huge difference. Data could solve poverty.
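As a rough sanity check on those figures, a back-of-the-envelope calculation using only the numbers quoted above:

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
data_value_per_quarter = 17         # USD, Zuckerberg's per-user figure
poverty_income_per_day = 2          # USD, the extreme-poverty threshold cited
quarterly_income = poverty_income_per_day * 91  # roughly 182 USD per quarter
print(f"{data_value_per_quarter / quarterly_income:.0%}")  # ~9% extra income
```

Even at Facebook’s conservative valuation, that works out to roughly a nine percent income boost for someone living at the threshold.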

It’s a bold and far-reaching idea that personal data could be turned into personal value. How can we improve privacy standards at the same time?

Right now, data protection and privacy legislation is an ideal. It’s new, and we’re in a pandemic during which some countries have granted exemptions, so laws haven’t been fully enforced. We’re heading in the right direction to provide more transparency and rights. The goal should be that, once people are educated, we have the legislation and technologies in place to secure our data. After that, no one should access it until we give consent.

And we consent because we’re incentivized. Now we have no incentive, so people shout about privacy. There’s no benefit except to open ourselves up to abuse. Business and government have exposed us to this. What if we flip the incentive structure? I could share my data with a diabetes research company, earn money and help cure diabetes. People would opt-in.

How would a personal data market work in practice?

Personal data is the world’s most valuable industry, but we can’t see or participate in it, even though our data is in these markets. We should control and decide which data markets to take part in and what to consent to. Like sharing your data with the government so they can help prevent a terrorist attack. Or with a pharma researcher, so medicines work better on women and minorities, rather than the 18-year-old white guys who sign up to medical trials for beer money. You could consent to smart city and IoT projects, so there are fewer traffic accidents.

You could use your time to do surveys or watch targeted content for a universal earned income. Most policymakers propose a universal basic income as a handout from the government. Income from your data is earned from its value – it’s not a handout. It could pay for a few groceries, or in some economies, for your apartment. A ‘data wallet’ could show the data you’re producing and opportunities to join different marketplaces or research projects. You decide which opportunities to participate in.

Thousands of companies are building solutions to parts of this problem – digital identity management, smart contracts and automated monetization like token rewards or digital money. Soon, we’ll see enterprise-level solutions that we can use reliably.

After your experience with the Cambridge Analytica scandal, you’ve talked about data being used as a weapon in democracy. What does this mean?

Data weaponization means using data to target people for purposes they didn’t consent to. It could be considered violent. Data has been used to target people who break social norms, to incite racial hatred and violence, and by political organizations to suppress voters.

That’s where it gets scary – data can be used for cyberwar. Not with physical weapons, but by undermining the way democracy functions and how we integrate peacefully. If you’re disturbing the peace with a data program, that’s a form of war.

We need to solve this problem in a more systemic way and put pressure on big tech companies, which are doing too little, too late.

We need to pass regulation that demands they invest. With the money they make, they could do far more to solve this problem. Zuckerberg says he’s got thousands of people working on it. Why not hundreds of thousands? It’s nothing compared to the effort put into shareholder profits.

What role can regulation and cyber standards play in safeguarding our data?

I’m doing legal work on the custodian responsibilities and standards expected of those holding our data. If they prioritize their fiduciary responsibilities to the people whose data they hold over obligations to shareholders, there will be more ethical practices. Right now, the greatest responsibility is consumer protection, but organizations have little incentive to protect your data.

Facebook has already been fined over 5 billion US dollars for failing to protect its customers. But it isn’t enough. We need stricter legal protocols. In 2019, Senator Elizabeth Warren introduced the Corporate Executive Accountability Act in the US Senate. It proposes that if a company allows a data breach or hack through negligence, its executives are criminally liable. It’s a vital step. If Zuckerberg or Sheryl Sandberg (Facebook COO) thought they were going to jail or lose their job, they’d have more incentive to protect people.

In both the Cambridge Analytica and Equifax scandals, it was gross negligence – not a failure of cybersecurity protocols, but a failure to follow them. For Equifax, 3,000 security certificates weren’t updated, then they got hacked. And their business is holding and sharing personal data. I was an expert witness in the case against them. A lawyer said not installing the update was like leaving the house with the doors open and lights on, then being surprised to get robbed.
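Expired or unmonitored certificates are exactly the kind of hygiene failure that basic automation can catch. Here’s a minimal sketch in Python – the host list is hypothetical, and a real deployment would pull it from asset management – showing how to check how long a server’s TLS certificate has left:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expiry(hostname: str, port: int = 443) -> int:
    """Connect to a host and return how many days its TLS certificate has left."""
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

# Hypothetical inventory; in practice, scan every certificate you own.
for host in ["example.com"]:
    remaining = days_until_cert_expiry(host)
    if remaining < 30:
        print(f"WARNING: {host} certificate expires in {remaining} days")
```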

We need to enforce higher cybersecurity standards for anyone who wants to hold personal data. If you’ve shown good practice and upheld those standards, then no, you shouldn’t be held criminally liable, as you’ve done everything you reasonably can. Some hackers are really good, and some vulnerabilities are too hard to predict.

What can governments and regulators do to help businesses comply?

I believe most businesses want to do the right thing. We need to make it easy and affordable for them. That’s one of the most important things about California’s Proposition 24: it established the first dedicated data protection agency in the US. Fifty staff will use their expertise to help companies comply, so it’s not a burden. We need more of this worldwide, particularly as many businesses and economies have taken a hit from the pandemic.

What advice would you give an employee if they see a potential data breach in their company?

Always raise the red flag internally first. Your company may not know it’s doing something wrong, or it may be the decision of one rogue employee. They could fix it. But if you keep raising flags and suddenly you’re being cut out of meetings, getting cold-shouldered, or you get fired, it’s time to go public. A whistleblowing lawyer is one path, or you could approach a data regulator directly. Hold the company to account to make sure it complies and can’t misuse its power.

What’s the philosophy of your Own Your Data Foundation?

I co-founded the Own Your Data Foundation in 2019, when I realized it would take time to get the right data protection legislation, and that the enterprise-level tech wasn’t yet available.

The sustainable way is to empower people to protect themselves. We deliver digital literacy training based on the Digital Intelligence (DQ) curriculum, which this year became an Institute of Electrical and Electronics Engineers (IEEE) global standard. Leading technology universities, government ministries, think tanks and Microsoft developed it, and it’s supported by UNICEF, the World Economic Forum and the OECD.

It begins by assessing your digital intelligence and giving you a DQ score – like IQ or EQ – to improve upon. Training topics include your data rights, cybersecurity protocols and media literacy. Tests include spotting fake news and phishing, showing emotional intelligence online, and protecting yourself and others from cyberbullying. We also explore mental and physical health and how to curb device addiction.

After I’d trained to be a human rights lawyer with Amal Clooney, my first paid job was at Amnesty International in London. Over ten years, they implemented a human rights education program with the education ministries of 13 countries.

I’m excited about the DQ program because the IEEE standard was only released this year. At the G20 Summit, the Kingdom of Saudi Arabia announced it is adopting it and will include it in its national education curriculum by 2022. In ten years, I hope every country will be teaching this.

What problem does the DQ curriculum help solve?

This year we’ve been thrown into a new digital life unprepared, with far more screen time and more platforms. Many people aren’t yet digitally literate. We have to educate fast. That’s why we’re delivering our courses low-cost, or free in some cases, with virtual classes and resources to make them accessible across the world.

DQ is essential. It should be part of everyone’s education – whether you’re a child, parent or worker. Older people are also vulnerable, as some are using this tech for the first time. Our foundation delivers train-the-trainer programs, supporting teachers to establish good habits early in young people.

We’re taught how to use a keyboard or type an email. But what if we were told this device records what you do – are you OK with that? Then you might change your behavior, like using a different search engine or having conversations about sensitive topics in person.

Many companies’ most significant cybersecurity risk is a staff member who doesn’t recognize a phishing attempt. At work, HR should deliver professional programs, training new staff to protect themselves and the organization.

To find out more about digital intelligence (DQ) training, contact the Own Your Data Foundation.

Comments represent the opinions of the interviewee.

About the author

Susi O’Neill is the Editor-in-Chief of Secure Futures and host of the business tech podcast Insight Story. She’s a seasoned creative who’s led business content programs for brands including EY, Mastercard and Unilever. Off the clock, she’s a musician who performs internationally on the theremin, the world’s first electronic instrument.