Data regulation is fragmented. Different countries, regions, and even individual companies have their own policies, and they implement those policies with a level of subjectivity (and often deceptive design) that makes it very difficult for digital citizens to get a handle on when, why, how, and by whom their data is used.
At #LEAP22 we brought four thought leaders together for a conversation about privacy. We wanted to get to the bottom of why data issues are so complex, and what the future might look like. Three key points emerged:
Privacy must be by design: built into the fabric of our digital tech.
Data must shift from company-controlled to user-controlled, with the potential for users to monetise their own data too.
We need to change our approach to data privacy so that instead of looking at it nationally, we’re looking at it on a global scale.
The tech-centric approach
Both privacy by design and a shift in data control away from corporations and towards individual users fall under a tech-centric approach to data privacy. Instead of tech constantly taking our data away, it can be built to give control and power back to users.
Su Le (Chief Digital and Strategy Officer at NEOM Tech & Digital Company) said that when we think about data right now, “we start with protecting our citizens.” But that’s not enough anymore: “we’re really being asked to move to empowering our citizens.”
As Le pointed out, today’s digital users are more concerned than ever before about privacy, and they’re also sharing more data than ever before. So at NEOM, tech is being put at the forefront of addressing this privacy paradox: the company is using AI to give users a big picture view of what’s happening to their data, so they can begin to control it themselves without being overwhelmed by thousands of data sharing choices every day.
For Richard Benjamins (Chief AI and Data Strategist at Telefónica), privacy isn’t a tool to be added after data sharing is already underway, but something to be built into digital services from the start. “If you look into the future you can have two big views,” Benjamins said. “One is privacy is dead, get over it. Or, privacy becomes more user-controlled. And also user-monetised. I hope and I believe that is the way we’ll go – privacy by design.”
But if we zoom out a little, away from the tech itself, and take a human-centric view of our scattered experience of digital services and data in 2022, a deeper question arises. Are we ready for global data policy that can be applied across countries and cultures – and is that even possible?
Global privacy governance, grounded in value
According to Nnenna Nwakanma (Chief Web Advocate at World Wide Web Foundation), this is what we need to focus on now. “Where we work at national levels, it is now time to look at a global trust mechanism, and that I think is where we’ll be going,” she said.
Ivana Bartoletti (Global Chief Privacy Officer at Wipro) noted that the pandemic has raised important new possibilities when it comes to collective decision-making around data. As apps and contact-tracing have increased public knowledge of the spread of the COVID-19 virus, “I think something might have shifted in public opinion, because to an extent, we now realise that there’s nothing more valuable from a collective perspective than a piece of personal information.” Knowing that someone else has tested positive for the virus, for example, enables us to protect ourselves.
And Nwakanma agreed: “I am because we are,” she said. “I think the pandemic has proved it.”
But forming a trust mechanism or a governance system that makes sense for the entire world is anything but an easy task. It would have to be rooted in some kind of agreed logic – and for Nwakanma, that logic lies in the exchange of value.
“If I may use an imagery, a woman doesn’t give birth to her baby while being clothed. She accepts the nudity because she is going to give life. So the value of what is coming is enough for her to give her privacy.”
And this idea can be applied to data sharing. If users know why their data is being collected, clearly understand the value they’re getting in return, and can easily give or withdraw their consent, then we have the basis for a trust mechanism that everyone can (theoretically) get on board with. “The question is, what is the value you are giving me, for which I will give you my data?”
As a starting point, it does make sense. But there’s a long road to travel before we come close to a global standard for data privacy – not least because different governments benefit (or not) from varying degrees of trust from their citizens. While one country’s population might willingly trust their government’s promises about data, another’s might be held back by profound, multi-generational mistrust.
Today, more than 120 countries around the world already have data protection laws in place, largely guided by five principles: notice (advising users of policies), consent (offering the choice to opt in or out), access and participation, integrity and security, and enforcement (ensuring that services are subject to some form of regulation that compels compliance).
So we are already some of the way there. Countries have begun to look beyond their own borders to tap into a more interconnected approach to data privacy, and this suggests that a global mechanism is possible in the future. There’s willingness to cooperate, collaborate, and build a functional strategy for data sharing that works across borders. But how we’ll move to the next level is yet to become clear. And in the meantime, tech designers and individual users must take a proactive approach to understanding and harnessing their data.