If you haven’t already, subscribe to join our community and receive weekly AI insights, updates, and interviews with industry experts.
If you spent years writing a novel, crafting each sentence with care, building ideas from experience and imagination (and endless cups of coffee), and then discovered that your life’s work might have become training data for an AI model – it’s reasonable to assume that you wouldn’t feel great about it.
That tension, between human creativity and machine learning, is back in the spotlight.
At the London Book Fair, thousands of authors have mounted an unusual protest: an empty book titled Don’t Steal This Book. Its only content is a list of contributors’ names – around 10,000 writers, including prominent figures such as Kazuo Ishiguro, Philippa Gregory, and Richard Osman.
It’s a symbolic pushback from the creative industry against the way generative AI systems are trained. In a report by The Guardian, campaign organiser Ed Newton-Rex argued that the AI industry has been “built on stolen work … taken without permission or payment”.
From the authors’ perspective, if AI companies use books to train models, they should seek permission and pay for that work.
This protest comes at a crucial moment in the UK.
Right now, the government is assessing proposals that could allow AI developers to train systems on copyrighted material unless creators explicitly opt out – a move that has sparked strong opposition from writers, musicians, and artists.
It’s a debate that has been building for a few years now. From musicians releasing silent albums in protest, to lawsuits over training datasets, the creative industries have been asking a fundamental question:
Who benefits when AI learns from human creativity?
But the answer isn’t as simple as ‘AI bad, artists good’. Because the same technologies that raise ethical concerns are also opening entirely new creative possibilities.
Many artists now have a more nuanced perspective on AI, and are already experimenting with it as a creative collaborator. Designers are finding they can use generative tools to prototype ideas in minutes, and writers are using AI as a brainstorming partner.
The tech itself isn’t inherently exploitative – but how we govern it is incredibly important.
So the real challenge is designing ethical frameworks that allow creativity and tech to thrive together.
In practice, that could mean frameworks for consent, licensing, and fair compensation – the permission and payment that authors are asking for.
In fact, the publishing industry has already started exploring collective licensing schemes that would allow AI companies to access large datasets legally while ensuring writers are paid. And that kind of model could transform a conflict into a partnership.
The ethical debate around AI didn’t start with generative art. Researchers and advocates have been raising these questions for years.
When we spoke to DeepFest speaker Mia Shah-Dand (CEO of Lighthouse3 and founder of Women in AI Ethics™), she emphasised that diverse perspectives are essential when designing AI systems.
“Technologies reflect the values and priorities of those funding and building them. The underrepresentation of women and other minority groups in the tech industry, especially in leadership roles, shows up as bias in AI models.”
She also pointed to researchers whose work has fundamentally influenced the field:
“Women are at the forefront of AI Ethics because many scholars and researchers who interrogated these systems found that they are not designed or optimised for us.”
And she offered a reminder that ethical AI isn’t just about avoiding harm:
“Investing in women and including their perspectives in AI will spur innovation, generate economic opportunities, and enable us to minimise harms while maximising benefits for all.”
The empty-book protest reflects a deeper turning point. AI is already changing the future of creativity, and the pressure is on to decide what the rules should be. We need to keep digging into the hard questions – about ownership, consent, and fair compensation.
We think history is encouraging here. Every major technological shift (from photography to synthesizers to digital publishing) has arrived with a rush of fear and resistance, before eventually being absorbed into the creative industries in useful ways – sometimes in ways no one expected.
AI could follow the same path. Creatives are, as you might expect, creative people. And when creative people gain access to new tools and opportunities, amazing things can happen.
But getting there will require thoughtful governance, fair economic models, and voices from across society – not just technologists.
How should the creative industries and AI developers work together on copyright and training data? Open this newsletter on LinkedIn and share your perspective in the comments.
Join us at DeepFest from 31 August – 3 September 2026 to hear from the people shaping the future of AI and its real-world impact.