Governments rarely apologise, but the Secretary of State, Peter Kyle MP, has expressed his regret.
He has committed to working collaboratively with all sides in the copyright debate to find solutions, which is exactly what UKAI has been calling for.
Credit where credit is due. We welcome the opportunity to move forward with the working groups and timelines that the Government has proposed. Rebuilding trust has never been more important.
The main problem always centred on the Government presenting a preference before their own copyright consultation had ended.
Now, they have said that they do not have a preferred position and are willing to review the consultation submissions before establishing a preference.
They want to actively engage with both the creative and AI industries in working groups to find solutions. These commitments are crucial.
UKAI is uniquely positioned to help with this collaborative approach. We went on a journey, listening to all sides, hosting roundtables, attending debates.
In the end, we realised that the interests of the UK’s AI industry are the same as the interests of the creative sectors: we all want UK businesses to be successful and we all recognise that the UK has unique qualities and assets that we need to value and protect.
The conflict has been unnecessary; it has wasted time and dissipated energy. Working together, we can achieve so much more. However, the first step will be to identify workable solutions, find compromise and build agreement.
The copyright debate centres on how generative AI businesses can access training data. Current copyright law protects rightsholders, requiring GenAI developers to obtain permission for any copyright materials they wish to access as training data. As a principle, this remains sound.
The Government has outlined four possible alternatives: do nothing; strengthen copyright; a broad text and data mining (TDM) exception; or a TDM exception with rights reservation (opt-out). Previously, they had preferred the opt-out position, meaning GenAI developers would have default access to materials unless copyright holders opted out. This caused deep concern across the creative industries.
Now, the Government has no stated preference and will review consultation findings while actively engaging all parties. In some ways, we are back to square one, but now, the Government understands the strength of support for copyright and genuinely seems willing to engage.
As we scope out the challenges ahead, we must consider two different timeframes: the past and the future.
We can determine what the future looks like and should actively consider the principles we want to uphold to build an industry that is responsible and fair. However, we have limited control over what has already happened. Much data has already been scraped by GenAI companies.
Some argue we should treat these past uses as historic cases, “forgiven” in an amnesty to secure future agreement and transparency. This will be one of the major debates we need to address, combining pragmatism with responsibility.
There are several major technical areas our working groups must address, including the following:
Detection: How can rightsholders tell when their copyrighted materials have been used as training data?
Attribution: When copyrighted materials have been used, how can this be attributed back to specific rightsholders?
Remuneration: How does a rightsholder get paid when their material is used by a GenAI company?
Transparency: How can we see how specific copyrighted material is used in training processes?
Accountability: How will rules and regulations be enforced?
I believe we can find technology-based solutions for detection, attribution, and remuneration. Our members are already building tools to address these areas.
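To make the detection challenge concrete, here is a minimal sketch of one possible approach, assuming a GenAI developer publishes a manifest of content fingerprints for its training data. The manifest file, directory names, and exact-hash approach are illustrative assumptions, not a description of any member's tool: the sketch simply compares SHA-256 fingerprints of a rightsholder's works against that manifest.

```python
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Exact fingerprint: the SHA-256 hash of the file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def detect_use(works_dir: Path, manifest_path: Path) -> list[str]:
    """Return the rightsholder's files whose fingerprints appear in a
    (hypothetical) training-data manifest listing one fingerprint per line."""
    manifest = set(manifest_path.read_text().split())
    return [
        str(p)
        for p in works_dir.rglob("*")
        if p.is_file() and fingerprint(p) in manifest
    ]


if __name__ == "__main__":
    # Hypothetical paths: a folder of the rightsholder's works and a
    # developer-published manifest of training-data fingerprints.
    for match in detect_use(Path("my_works"), Path("training_manifest.txt")):
        print(f"Possible training use detected: {match}")
```

Exact hashing only catches verbatim copies: works that were cropped, re-encoded, or excerpted before ingestion would need perceptual fingerprinting or watermarking, and without disclosure from developers there is no manifest to check in the first place. That gap is exactly why detection cannot be separated from the transparency and accountability questions above.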
However, transparency and accountability may require governmental and international agreement, and regulation. Whilst this is not simple, it is important.
We believe that building a responsible industry is the only way to build consumer confidence in AI.
Trust matters. Consumers choose products from companies they trust.
If we want the UK’s AI businesses to grow and win customers for their innovative, world-leading products, we need to work together to educate and engage the public. Building consumer trust is fundamental to economic growth.
UK businesses need to trust each other as well. We are partners and customers sharing the same resources, wanting the same economic success. This is why the polarised copyright debate was so disappointing. When we brought people together, we found more common ground than disagreement.
The creative sector uses technology extensively, and the technology sector is inherently creative. When we outlined challenges around detection, licensing, and attribution, people leaned in.
Humans want to find solutions: this truth is at the heart of the creativity that powers both creative industries and technology entrepreneurs.
Trust is also important for the Government. Getting this right could be the ‘big-bang’ moment for the UK’s AI industry: a once-in-a-generation opportunity to establish the UK as a global centre of responsible and ethical AI.
Just as the EU’s GDPR put Europe at the centre of global data regulation, leading in responsible AI could make the UK a global centre in the next wave of AI development.
This plays to our historic strengths in language, culture, education, law, and globally respected service industries.
The economic growth that the Government wants from AI is possible, but they need to better understand the UK’s domestic AI industry.
UK businesses are building services and tools on top of GenAI. To drive adoption, we require broader AI literacy and consumer trust.
Success relies upon building trust between all parties. If the Government is serious about finding solutions, UKAI is here to bring all sides together.
We are ready to support the Government, the creative industry, and our members to find workable solutions, compromise, and agreements. Let’s make the UK the global leader in responsible AI innovation.