Tech ethics: Are you asking the right questions?

Over the last two years at LEAP, we’ve had the privilege of meeting activists and thought leaders who are pushing technology towards ethical systems that are good for everyone. They’re asking the difficult questions that open up space for discussion about why tech products are created, who they’re helping (or harming), and how the industry can develop technology that has a truly positive impact.

So let’s look at three key areas of ethical concern for activists and regulatory bodies – and the questions your tech company can ask, to begin building a picture of how you can support the movement towards more ethical technology.

User privacy

Everyone (including the majority of tech users) understands the critical importance of privacy, as more and more of our lives are mediated by technology and the data it collects and stores.

Research published in the medical journal Frontiers in Surgery in 2022, for example, looked at the legal and ethical considerations of AI in healthcare, and highlighted a mismatch between the legal obligations doctors must meet around confidentiality and patient privacy, and the obligations that apply to technologists. In short, while doctors are required by law to be accountable for how they collect and store data, technologists don’t work under the same requirements – so there are potentially serious implications for patients when third-party data management systems are brought into healthcare settings.

But the ethics of user privacy reach far and wide, across every segment of the tech industry:

  • Research by Tableau found that 63% of global consumers think companies aren’t honest about how they use people’s personal information, and nearly half (48%) have stopped buying from certain companies because of privacy concerns.
  • Truata found that 60% of customers say they’d spend more money with a brand they trust to manage their personal data responsibly.
  • A 2022 survey by Cisco showed that companies see an average return of 1.8 times what they spend on improving privacy, and 92% agree they have a moral obligation to use consumer data transparently.

As Richard Benjamins (Chief AI and Data Strategist at Telefónica) said at LEAP 2022, privacy should never be an afterthought. It should be built into digital services and tech products from the start.

“If you look into the future,” he said, “you can have two big views. One is privacy is dead, get over it. Or, privacy becomes more user-controlled. And also user-monetised. I hope and I believe that is the way we’ll go – privacy by design.”
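
To make that ‘privacy by design’ idea a little more concrete, here’s a minimal sketch (ours, not Telefónica’s – the field names and consent flag are hypothetical) of the kind of data minimisation and consent check a service could build in at the point of collection, rather than bolt on later:

```python
import hashlib
from typing import Any, Dict

# Hypothetical example: only these fields are ever persisted (data minimisation).
ALLOWED_FIELDS = {"user_id", "country", "signup_date"}

def pseudonymise(user_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash before storage."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()

def prepare_for_storage(raw: Dict[str, Any], salt: str,
                        consent_to_analytics: bool) -> Dict[str, Any]:
    """Keep only whitelisted fields, pseudonymise the identifier, and drop
    optional analytics fields unless the user has explicitly opted in."""
    record = {k: v for k, v in raw.items() if k in ALLOWED_FIELDS}
    record["user_id"] = pseudonymise(raw["user_id"], salt)
    if consent_to_analytics:
        record["referrer"] = raw.get("referrer")
    return record

# Example usage with made-up data: the email address is hashed, the IP address
# is never stored, and the referrer is dropped because consent wasn't given.
print(prepare_for_storage(
    {"user_id": "alice@example.com", "country": "SA", "signup_date": "2024-01-15",
     "referrer": "newsletter", "ip": "203.0.113.7"},
    salt="per-deployment-secret",
    consent_to_analytics=False,
))
```

The point isn’t the specific code – it’s that decisions about what to keep, what to pseudonymise, and what requires consent are made before any data is stored, not retrofitted afterwards.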

Bias and discrimination

As AI has exploded into our daily lives, the potential harm caused by bias within AI systems has become a major concern.

  • In 2022, researchers at USC Viterbi School of Engineering studied the content of two AI databases, and identified bias in 38.6% of ‘facts’ used by AI.
  • In DataRobot’s latest State of AI Bias report, 54% of respondents said they’re ‘very’ or ‘extremely’ concerned about AI bias – and 81% called for government regulation to define and prevent that bias.
  • A report by the US National Institute of Standards and Technology (NIST) reminded us that AI bias isn’t just about data. The humans who develop AI systems, and the societal systems within which they work, also embed bias deeply into the way AI is programmed. AI is created within the context of the world around it – so if we don’t root out that bias, we risk fixing the limitations of today’s society into AI, where they could persist for a very long time.

But AI-powered tech isn’t the only area where bias and discrimination come into play. All technology services and products rely on data – whether they’re adapting to data in real time or using existing data to inform design and deployment decisions. And all data can hold biases that are hard to spot if you’re not looking for them.
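
As a simple illustration of what ‘looking for them’ can mean in practice (a hypothetical sketch, not a method drawn from any of the studies above), a basic disparity check compares outcome rates across groups in a dataset and flags gaps that deserve a closer look:

```python
from collections import defaultdict

# Hypothetical records: (group, outcome) pairs – e.g. simplified loan decisions.
records = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
]

totals = defaultdict(int)
positives = defaultdict(int)
for group, outcome in records:
    totals[group] += 1
    positives[group] += outcome

# Positive-outcome rate per group.
rates = {g: positives[g] / totals[g] for g in totals}

# Compare each group against the highest-rate group. A large gap doesn't prove
# discrimination on its own, but it's a signal that the data (or the process
# that produced it) needs a closer look.
baseline = max(rates.values())
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%} positive outcomes "
          f"({rate / baseline:.0%} of the top group)")
```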

When we interviewed Edosa Odaro (Chief Data Analytics Officer at Tawuniya), we asked what data diversity means to him.

He said:

“There are far too many viewpoints about diversity – with a lot of these causing either offence or unhelpful polarisation. Yet, I believe that something most of us can all agree on is that, regardless of where we stand on the problems of diversity, there is far more we can do to focus on techniques for achieving tangible solutions.”

And data is fundamental to creating these solutions. Why?

Because:

  1. “We need good data to measure and assess where we are – to be clear about our current situation and our starting point.”
  2. “We need to define in data our target state – our ‘what good looks like’ – and there is no one-size-fits-all as this differs from company to company; community to community; and even country to country.”
  3. “We need data and advanced analytics to test hypotheses, predict outcomes, create intelligent interventions – to ensure our desired target state.”

“The caveat,” Odaro added, “is that there really cannot be a single silver bullet solution, but it is absolutely essential to start with data and utilise data to enable tangible outcomes.”

Environmental impact

How we produce and dispose of technology – as well as how people use that tech – can have a serious environmental impact. As climate change becomes an urgent concern for communities and governments around the world, the tech industry has to look at its relationship with the planet at every level.

  • Pollution. Tech production pollutes air and water, and releases greenhouse gases from power plants, factories, and distribution operations. It’s certainly not the worst sector – but the UN Environment Programme estimates tech is responsible for about 3% of global greenhouse gas emissions.
  • Resource depletion. Producing and disposing of tech demands that natural resources are extracted from the earth – including metals, minerals, and fossil fuels. The raw materials commonly used in technology include cobalt, lithium, tantalum, gallium, niobium, indium, selenium, and zirconium.
  • E-waste. As tech companies rush to create newer, more advanced products, a huge amount of electronic waste is produced. Much of it is non-biodegradable, or contains toxic substances that can cause environmental harm if they’re not managed properly. According to data from The World Counts, we generate about 40 million tonnes of e-waste each year – which equates to throwing out around 800 laptops every second (see the quick check below).
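
A quick back-of-the-envelope check of that laptop comparison (assuming an average laptop weighs about 1.6 kg – our assumption, not a figure from The World Counts):

```python
# Rough sanity check of the "800 laptops per second" comparison above.
E_WASTE_TONNES_PER_YEAR = 40_000_000       # figure cited from The World Counts
SECONDS_PER_YEAR = 365 * 24 * 60 * 60      # about 31.5 million seconds
ASSUMED_LAPTOP_KG = 1.6                    # assumption: average laptop mass

kg_per_second = E_WASTE_TONNES_PER_YEAR * 1_000 / SECONDS_PER_YEAR
laptops_per_second = kg_per_second / ASSUMED_LAPTOP_KG
print(f"About {kg_per_second:.0f} kg of e-waste per second "
      f"- roughly {laptops_per_second:.0f} laptops per second")
```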

There’s no doubt that technology will be a huge part of our future. But there’s also no doubt that climate change is a huge part of our future – and the tech industry has to work hard to minimise its impact.

Asking questions can support ethical decision-making in tech

These are just three of many areas that tech ethicists are focused on right now – from the potential for global data governance to the future of human employment and labour, and a whole host of issues in between.

So what questions could tech entrepreneurs and innovators ask to expand their ethical perspective?

Here are our suggestions:

  • When you look at the ethical issues around your tech company, what’s the target state? Where do you want to be – and how could you achieve that?
  • Can you define your responsibility to your users’ safety and well-being, to the planet, to transparency, and to society?
  • In what ways could your tech have the biggest positive impact?
  • How can you recognise bias in yourself, your company, and your products – and then how can you reduce that bias?
  • What applications of your tech (or of other technologies within your company) will you choose not to pursue?
  • What’s the relationship between your ethics and your sales/profits?
  • In the relationship between you/your company/your products and your customers, who has the power – and why?

And finally, it’s important to remember that tech isn’t the only industry facing intense ethical pressure in the current geopolitical landscape. We’re living in uncertain times, so all industries have to take a close look at themselves and make decisions that integrate good ethics with good business.
