In a recent report issued by the House of Lords Communications and Digital Committee, the comments of its chair, Baroness Stowell of Beeston, are summarised as follows:
The rapid advancement of Large Language Models (LLMs) is poised to shape society profoundly, much as the introduction of the internet did. The government’s approach to this development is crucial, and it must strike a balance between addressing potential risks and seizing opportunities. Caution against remote and improbable threats should not prevent the nation from capitalising on the potential AI goldrush. Drawing lessons from how earlier technology markets evolved, the report warns against market dominance by a few tech giants. The government is urged to avoid policies influenced by exaggerated AI doomsday predictions, ensuring that open-source development thrives and that innovative smaller players can participate.
Concerns arise about the widespread use of LLMs, with fears that they could facilitate malicious activities such as cyber-attacks and image manipulation for exploitation. The government is encouraged to focus on practical solutions rather than getting entangled in sensationalised doomsday scenarios. Swift action is advocated on the use of copyrighted material for LLM training, with the report emphasising that the ingestion of massive datasets does not excuse unauthorised usage without compensating rightsholders. The authors anticipate significant consequences in the coming years, urging the government to heed their concerns and take the steps necessary to maximise the emerging opportunities.
Read the full report ‘Large Language Models and Generative AI’.
In an overview, the Communications and Digital Committee said that the Government’s approach to Artificial Intelligence and large language models (LLMs) has become too focused on a narrow view of AI safety. The UK must rebalance towards boosting opportunities while tackling near-term security and societal risks. Otherwise it will fail to keep pace with competitors, lose international influence and become strategically dependent on overseas tech firms for a critical technology.
The report issues a stark warning about the “real and growing” risk of regulatory capture, as a multi-billion-pound race to dominate the market deepens. Without action to prioritise open competition and transparency, a small number of tech firms may rapidly consolidate control of a critical market and stifle new players, mirroring the challenges seen elsewhere in internet services.
The Design and Artists Copyright Society (DACS) issued the following statement:
“DACS welcomes the findings in the House of Lords’ report on Large Language Models and generative AI published today. The call on Government to “not sit on their hands” while LLM developers exploit the works of rightsholders is very timely, as more rightsholders seek guidance from the courts. Alongside general concerns about competition between open and closed models, market dominance and an international race to create the best marketplace for AI companies to establish their businesses, the report highlighted ethical and legal concerns about the unauthorised use of creators’ works for training purposes.
The report echoes findings from DACS’ AI survey that some tech firms are using copyrighted material without permission and reaping vast financial rewards, with no royalties going back to creators. The Lords’ assertion that the purpose of copyright is to reward creators for their efforts, prevent others from using works without permission, and incentivise innovation, in particular by creators, is paramount. DACS supports the view that the Government has a duty to act and cannot simply refer rightsholders to lengthy and costly court processes.”
ACID CEO Dids Macdonald OBE said:
“We are fully supportive of DACS’ statement and, whilst welcoming the emergence of AI and all its possibilities, we are very much aware of the challenges faced by IP creators within the Creative Industries when their work is used without permission. It is timely that the government offers clear guidance on this. Innovation cannot wait for court judgments, which move at a pace far too slow to keep up with the exponential growth of AI.
Decision makers will require nimble thinking translated into visionary futureproofing of creative work via carefully thought-out policy. Agile regulation is essential to establish the relationship between IP and generative AI. Whilst balancing the needs and expectations of consumer choice and the UK’s government ambition for digital growth, there must be safeguards to provide confidence to innovators and investors. Maybe it’s time for a UK IP regulator?”
For further reading, see previous ACID articles on AI here, here and here.