Advocating for Illustrators in the AI Regulatory Landscape – The AOI
The Association of Illustrators is dedicated to protecting the rights of illustrators amidst the UK government’s evolving AI regulations. We have been actively contributing to regulatory discussions through our work with the Creators’ Rights Alliance and the British Copyright Council, as well as submitting several written responses to government, to ensure that the voices of our members are heard. In this news piece, we bring you up to speed on three important AI regulatory updates.
Intellectual Property Office Statement on AI Code of Conduct
The Intellectual Property Office (IPO) has halted its efforts to create a set of rules, known as the AI Code of Conduct, due to disagreements among stakeholders. The responsibility has now been transferred to the Department for Science, Innovation and Technology (DSIT) and the Department for Culture, Media & Sport (DCMS), which will consult further with the creative and AI sectors.
While collaboration is essential, we must continue to advocate for the protection of our rights under existing Intellectual Property laws. We are disappointed that the IPO did not explicitly recognise the unauthorised use of our creative works to train AI systems as a violation of these laws.
There is, however, a consensus on the need for transparency. It is widely agreed that tech companies should clearly disclose which artworks they use to develop AI tools and to label AI-generated content. We urge continued dialogue toward establishing a code of conduct on these matters.
The House of Lords Report on Large Language Models
The House of Lords recently published its report following an inquiry into Large Language Models (LLMs) and their impact. The report highlights the pressing need for government intervention to ensure AI developers adhere to existing UK law. This includes the obligation to seek permission to use copyrighted work in AI training, as well as to offer fair remuneration to the copyright holders.
The report emphasises that copyright exists to reward creators for their work, prevent unauthorised usage, and encourage innovation. It criticises the Government for not effectively protecting these rights and calls on it to take action. The need for transparency is also highlighted, as it would allow creators to identify whether their work has already been used without permission.
A full summary of the report can be read here. It is a positive endorsement of the protections the AOI is campaigning for, and we encourage the Government to implement its recommendations.
Government Response to ‘Pro-innovation Approach to AI Regulation’ Consultation
The UK Government has issued its response to the ‘A pro-innovation approach to AI regulation’ consultation. The AOI is encouraged by the Government’s acknowledgement of the significance of human creativity in AI development. We also welcome its stance on transparency, advocating for AI developers to disclose the sources used to train their systems.
However, we believe the Government’s commitment requires much stronger reinforcement to ensure AI platforms and users comply with existing laws. The Government must articulate a clear commitment to hold AI developers accountable, including for past uses of copyrighted works. As the House of Lords report highlights, the creative industries unanimously call for compensation for such uses, which have occurred without consent or proper attribution, and in violation of UK copyright law.
To truly support the growth of our world-leading creative industries, the Government must move beyond expressions of intent and actively enforce regulations that prevent the continued undermining of copyright laws. Read the full Government response here.
The AOI will continue to monitor these developments and advocate for the rights and recognition of the illustration community. We welcome individual illustrators’ perspectives on this topic: please feel free to contact [email protected] with your views, questions, or insights.