We know some people and businesses will, given the chance, use personal location data in unethical ways. When automotive cybersecurity experts Karamba Security posted a fake vehicle control unit online, they saw 25,000 attempted breaches in three days. Stalkers have used AirTag, Apple’s coin-sized device for finding lost keys or other property, to track victims.
Regulations like the EU’s General Data Protection Regulation (GDPR) define minimum standards of data ethics, but the case for going further is strong. What, then, can help tech businesses that want to stay ahead of the data exploitation curve by upholding the highest data ethics standards?
Learning to foresee the unforeseen
International non-profit the Locus Charter promotes a higher standard of responsibility for professionals and businesses when it comes to location data. It aims to avoid unforeseen privacy breaches by working out what can go wrong and pinpointing big problems before they’re exploited, and it champions the public interest by showing businesses how to prevent unintentional harm.
The charter is based on ten principles that balance moral behavior with economic benefit, like realizing opportunities and understanding impacts.
Building brand reputation beyond fear
Location data is one of many frontiers. As veterans of data policy in and out of government, the charter’s founders want to be part of all kinds of data discussions.
Automation challenging data ethics
“The most important focus in data ethics is exploring the power granted by the accumulation of data so everyone can recognize how that power can be used and misused,” Ben Hawes, Technology Policy Consultant and co-author of the Locus Charter, told me.
He continues, “There’s an assumption automated systems and digital technologies are always better. Economic pressures mean companies prefer the promise of greater productivity at all costs, which can be alarming when people are unaware of many potential data misuses, accidental or deliberate.”
With data markets intent on aggregating all the personal information they can for commercial targeting, and sometimes state monitoring, we must take data ethics seriously.
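The power of aggregation is easy to underestimate: location traces that look anonymous individually can become a fingerprint once pooled. The sketch below is an illustrative simulation on synthetic data (the dataset sizes and the `unicity` helper are assumptions for demonstration, not from any real study): it measures how often a handful of coarse (hour, place) points is enough to single out one user in a pool of a thousand.

```python
import random

random.seed(0)

# Synthetic example: each "user" visits up to 20 coarse (hour, cell) pairs.
# With 24 hours x 50 cells there are 1,200 possible points in total.
NUM_USERS, POINTS_PER_USER, HOURS, CELLS = 1000, 20, 24, 50

traces = {
    user: {(random.randrange(HOURS), random.randrange(CELLS))
           for _ in range(POINTS_PER_USER)}
    for user in range(NUM_USERS)
}

def unicity(k, trials=200):
    """Fraction of sampled users uniquely identified by k of their own points."""
    unique = 0
    for _ in range(trials):
        user = random.randrange(NUM_USERS)
        # Pick k points the attacker happens to know about this user.
        known = set(random.sample(sorted(traces[user]), k))
        # Count how many traces in the aggregated pool contain all k points.
        matches = sum(1 for trace in traces.values() if known <= trace)
        unique += (matches == 1)
    return unique / trials

for k in (1, 2, 3, 4):
    print(f"{k} known points -> {unicity(k):.0%} of users uniquely identified")
```

One known point matches many users, but by three or four points almost every sampled user is unique, which is why "we only store coarse data" is a weak defense once datasets are aggregated.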
“Some risks in business going digital crop up in all sectors,” said Hawes. “We often see ‘designer biases,’ where developers center their own experiences, leading to alienating people who don’t fit the same racial, economic or physical profile. Another issue is automation making back-end processes more complex until they’re opaque. There are also sector-specific issues.”
With the digital world’s size and potential, regulating the space is an uphill battle. New ways to exploit data are appearing faster than lawmakers can address them.
Is self-regulation enough?
Hawes doesn’t support a relaxed approach to data ethics. “It’s unwise to avoid oversight and critical regulation in the hope self-regulation may be enough, as it rarely is.”
The self-regulating model assumes the cut-throat world of international economics will somehow produce a gold standard of consumer protection on its own.
A ‘wild west’ model is attractive for short-term gains by freeing emerging markets from red tape, but loose protections don’t create conditions for long-term growth or stable markets, especially in an age of social pressure on brands to be ethical.
Can whitewashing be cleaned up?
With limited regulatory authority, a non-profit trade initiative is unlikely to pick fights with the companies it works with. Organizations like Locus may instead be better placed to approach data ethics from a collaborative angle.
Of course, some companies are only interested in the PR boost of appearing ethical, whitewashing their marketing because they don’t want to be the only ones following the rules in a competitive arena. They end up paying lip service without taking the issues seriously.
Hawes thinks despite this, there’s still benefit in working in good faith. “To get ethically reluctant companies to behave well with data, is it better to give them the benefit of the doubt or be critical and accusing? I think, better to have a company aspire to a good standard, even if it’s empty promises, because they can at least be held to those standards in some way.”
Less direct ways of influencing
A cooperative strategy may create the best conditions in a self-regulating atmosphere. When ethics groups collaborate with business, they influence thousands of staff members, creating internal pressure for higher standards in the boardroom.
It’s also worth championing an ethical company for consumers looking to do the right thing with their money. Though strategies like these can’t do it all, they have worked, motivating large companies to produce net-zero carbon plans and banks to create green investment options.
But data misuse is not the same as pollution. “It’s harder to see and analyze compared with other high-profile scandals, so it’s hard to translate into a common grievance that can unite people,” says Hawes.
Will something have to go badly wrong for the world to take it seriously? Hawes considers, “If authorities avoid developing solutions until a problem causes substantial harm, they’ll still need about a decade to develop tools that would be effective in this complex market. Governments should be working on this yesterday.”
In other words, have a fire extinguisher ready when playing with matches, rather than trying to invent one as your house burns down.
Reforming data use culture
The task of reforming fast-evolving data industries is daunting. Hawes’ recommended method involves slowly, steadily establishing ethical practice as the norm.
“The UK has the Digital Regulation Cooperation Forum for sharing information and approaches. It lets regulators gain knowledge of the risks data and digitalized business generate, so they can manage them sector by sector. We need a forum like that on an international scale,” says Hawes. “Understanding risks like explainability, transparency and bias creates a foundation for building standards that address the problem and brings stability to the data economy.”
This pressure can also come from outside industry, and often flavors the expectations that filter into business. “More public debate on how we can avoid driving digital inequalities would go a long way,” says Hawes. “This is particularly important in social media, which is the frontline of data ethics in many ways.”
While data privacy issues haven’t yet received the emphasis Hawes and his peers would like, they’re eager to change the culture. Businesses can choose to be part of the solution and be recognized as champions of ethical data use for their efforts.