AI-generated content in gaming
- Insight Article | 30 July 2025
- UK & Europe
- Tech & AI evolution
The gaming industry is rapidly changing as artificial intelligence (“AI”) becomes an increasingly common tool for developers.
Introduction
From creating procedurally generated worlds to crafting dynamic soundtracks and realistic non-player characters (“NPCs”), AI is reshaping how games are made and experienced. This shift promises greater creativity and efficiency but also raises thorny legal questions that many studios are only beginning to confront.
One of the most pressing challenges involves copyright: who owns content created or heavily influenced by AI? Platforms like Steam have started requiring developers to disclose how AI was used, reflecting growing concern about transparency and accountability. At the same time, courts and regulators remain uncertain about how existing laws apply to AI-generated content.
This article aims to unpack these issues by examining recent legal developments, industry policies, and best practices for managing the risks associated with AI in game development. For developers, publishers, and legal advisors alike, understanding this evolving landscape is critical to navigating the future of gaming without stumbling into costly disputes.
How AI is Used in Video Games
AI has become a cornerstone of modern game development, transforming how visual, auditory, and narrative elements are created and deployed. Tools like Recraft now enable developers to generate game assets - sprites, textures, environments - through simple text prompts, integrating seamlessly into engines such as Unity, Unreal Engine, and Godot. These efficiencies have enabled developers at all levels - from indie creators to established AAA studios - to produce high-quality content with greater speed and consistency.
Beyond asset generation, AI is being deployed to produce dynamic soundtracks, realistic NPC dialogue, and adaptive gameplay environments. For example, Activision Blizzard patented a machine learning approach designed to generate adaptive in-game music tailored to each player's unique behaviour, using real-time data from their movements, decisions, and playstyle to dynamically shape the soundtrack.[1] In Red Dead Redemption 2 and Assassin’s Creed: Odyssey, AI-driven NPCs follow realistic daily routines and dynamically respond to player behaviour, creating immersive, living worlds. In The Last of Us Part II and F.E.A.R., enemy NPCs demonstrate adaptive tactics, such as flanking and taking cover, adding depth to combat scenarios. Procedural generation tools have also enabled titles like Diablo and No Man’s Sky to deliver endless content by dynamically creating levels and encounters. Meanwhile, games such as Grand Theft Auto V leverage AI to enhance realism through advanced lighting, shadows, and real-time upscaling technologies like NVIDIA’s DLSS.[2]
These examples are merely a glimpse into the vast and rapidly evolving role of AI in game development. Despite its advantages, the fast-paced growth of AI has led developers and platforms to put risk-based policies in place, particularly to navigate the legal grey areas around copyright and ensure transparency with users.
Legal Framework: US, EU and UK
At the heart of the debate surrounding AI-generated content in the gaming industry is the question of whether such works are eligible for copyright protection. Under both United States and United Kingdom law, copyright is traditionally predicated on human authorship. In the United States, the Copyright Office has repeatedly affirmed that only works reflecting sufficient human authorship may qualify for protection[3], a standard reflected in Allen v. U.S. Copyright Office (2024), a challenge to the Office’s refusal to register an AI-generated image on the grounds that it lacked the requisite human creative input.[4] Most recently, in March 2025, the Court of Appeals for the District of Columbia Circuit confirmed in Thaler v. Perlmutter that while AI may serve as a tool in the creative process, the resulting work must ultimately be the product of human authorship to be eligible for copyright protection.[5]
Similarly, under European law, the Court of Justice of the European Union (“CJEU”) has held that copyright protection requires a work to reflect “the author’s own intellectual creation” - a standard that, in practice, excludes works generated solely by autonomous AI systems.[6]
In the United Kingdom, copyright protection is generally grounded in the requirement of human authorship, with the Copyright, Designs and Patents Act 1988 protecting “original literary, dramatic, musical or artistic works” as the product of human skill, judgment, or labour. The UK Intellectual Property Office (“IPO”) has consistently affirmed that works generated autonomously by AI systems, without meaningful human creative input, do not qualify for copyright protection - except under a unique provision: Section 9(3) of the Act, which currently allows “computer-generated works” created without direct human authorship to be protected, with the author deemed to be the person who made the necessary arrangements for the work’s creation.[7] Computer-generated works include AI-generated content produced by inputting basic prompts without further human involvement. This exception sets the UK apart from many other jurisdictions, though its application to contemporary AI-generated content is under review and its future remains uncertain. Consequently, while game developers must generally demonstrate clear and sufficient human creative contribution to secure copyright protection, there remains a narrow statutory pathway for purely computer-generated works in the UK.
The ongoing legal ambiguity creates operational risks for stakeholders. Developers using AI-generated assets risk disputes over ownership and enforcement. Where AI-generated assets do not qualify for copyright protection, developers may find that third parties reuse their work with little legal recourse. Infringement disputes can delay game releases, disrupt commercial timelines, or compromise competitive advantage. This challenge is further complicated by the lack of harmonised copyright standards across jurisdictions. Moreover, even where relevant laws exist, courts offer limited guidance, as few disputes involving AI-generated content have progressed to trial. This makes cross-border enforcement difficult and adds further legal uncertainty to an already complex landscape.
Legal Grey Area: Copyright Ownership of AI-Generated Content
The lack of clear copyright protection for purely AI-generated works and limited legal precedents mean stakeholders operate with heightened exposure to infringement claims and ownership disputes. By way of example, in SAG-AFTRA v. Llama Productions (2025)[8], the actors’ union filed an unfair labour practice charge against Llama Productions, a subsidiary of Epic Games. The union alleged that the company used AI technology to replicate the voice of Darth Vader in Fortnite without prior notice or collective bargaining, arguing this deprived union members of work and violated existing labour agreements.
While the immediate allegations relate to employment law, the implications extend to copyright and personality rights. As generative AI systems become increasingly capable of emulating human expression, the legal frameworks protecting performer identity, voice, and likeness are being tested in new ways. The case also highlights how disputes involving AI-generated content rarely stay confined to one legal domain. As questions over copyright, labour rights, and identity protection increasingly intersect, developers are left to navigate a legal landscape that offers little consistency or clear precedent.
Industry Policies: The Case of Steam
Commercial platforms have now begun to adopt their own enforcement mechanisms. Valve Corporation (“Valve”), the developer and operator of Steam, the world’s largest PC gaming platform, has recently updated its policies to address the rise of AI-generated content. Valve now requires developers who wish to publish their games on Steam to disclose how AI was used throughout the development process, distinguishing between two categories[9]:
- Pre-generated AI content: includes any art, code, or sound created with AI tools during development; and
- Live-generated AI content: refers to content created by AI tools while the game is running.[10]
Valve reviews pre-generated content to ensure it does not include illegal or infringing material, such as assets derived from copyrighted works owned by third parties. In the case of live-generated content, developers are required to demonstrate that appropriate safeguards are in place to prevent the creation of unlawful material.[11] These disclosures are made visible to players on the game’s Steam page, promoting transparency and informed choice. Valve has also introduced a reporting system that allows players to flag unlawful live-generated content in games.[12]
As a result, games containing AI-generated content may be rejected if the developer cannot prove ownership or sufficient rights to the underlying training data, highlighting the platform’s cautious approach to copyright and liability.[13]
This policy reflects broader industry concerns over liability and the potential for copyright infringement, effectively acting as a form of self-regulation by putting the onus on developers to prove ownership or meaningful human input - or risk their games being taken down. As of 2025, Valve’s disclosure requirements have already brought greater transparency to the platform. Nearly 8,000 games - roughly 7% of Steam’s library - now openly declare the use of generative AI, with around one in five new releases featuring disclosures primarily related to asset generation and audio content.[14]
Potential Solutions and Best Practices
As legal frameworks struggle to keep pace with generative AI, industry stakeholders are adopting a combination of contractual, technical, and compliance-oriented strategies to navigate this landscape:
- Human Authorship Requirements: Ensuring that AI-generated content is subject to significant human input can help satisfy copyright requirements and reduce legal risk, which is particularly important for stakeholders operating in the EU or the US. This may involve post-editing AI-generated art, combining AI outputs with human-authored components, or documenting human decision-making throughout the content creation process. This strategy aligns with the Copyright Office’s requirement of sufficient human authorship and strengthens claims to protectability.
- Platform and Regulatory Compliance: The EU Digital Services Act imposes transparency and due diligence obligations on developers who allow user-generated or AI-generated content in their games. These include notice-and-action procedures, proactive content moderation, and transparent labelling. Steam’s AI disclosure policy aligns with this regulatory trajectory, effectively setting a baseline for industry compliance.
- Documentation and Record Keeping: Following Valve’s lead, studios could adopt internal documentation standards that record when and how AI tools are used, the extent of human involvement, and the origin of training datasets. Embedding metadata into AI-generated files and retaining versioned development records can support copyright claims and serve as critical evidence in infringement or labour-related disputes (an illustrative sketch of such a record follows this list).
- Contract Clauses: Developers and publishers can include specific provisions in contracts to clarify ownership and licensing of AI-generated assets, include indemnity clauses for third-party infringement claims, or define acceptable uses of AI.
- Internal Governance and Risk Controls: Developers and publishers may consider establishing internal policies governing the use of AI, including ethical guidelines, audit mechanisms, and escalation procedures for disputed content. Aside from assisting with compliance, such policies promote accountability and build trust with players and creators.
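By way of illustration only, the short Python sketch below shows one hypothetical way a studio might capture the kind of record described under “Documentation and Record Keeping”: a small provenance entry saved as a JSON “sidecar” file alongside an AI-assisted asset, with a file hash so the record can later be tied to a specific version. The field names, tool names, and file paths are assumptions made for this example, not a requirement of any platform, regulator, or specific tool.

```python
# Hypothetical provenance record for an AI-assisted game asset.
# Field names and structure are illustrative assumptions only.
import hashlib
import json
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone
from pathlib import Path


@dataclass
class AIAssetRecord:
    asset_path: str            # path to the asset inside the project
    ai_tool: str               # name/version of the generation tool used
    prompt_summary: str        # short description of the prompt or inputs
    human_contribution: str    # what the human author changed or added
    training_data_basis: str   # licence/ownership basis claimed for the tool's training data
    created_utc: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    sha256: str = ""           # hash of the asset file, for later evidence


def write_sidecar(record: AIAssetRecord) -> Path:
    """Hash the asset and write a JSON sidecar file next to it."""
    asset = Path(record.asset_path)
    record.sha256 = hashlib.sha256(asset.read_bytes()).hexdigest()
    sidecar = asset.parent / (asset.name + ".ai-provenance.json")
    sidecar.write_text(json.dumps(asdict(record), indent=2))
    return sidecar


if __name__ == "__main__":
    # Illustrative usage only; the file name and values are placeholders.
    demo = Path("forest_tileset.png")
    demo.write_bytes(b"placeholder image data")
    print(write_sidecar(AIAssetRecord(
        asset_path=str(demo),
        ai_tool="image-generation tool vX.Y",
        prompt_summary="forest tileset, hand-painted style",
        human_contribution="repainted foliage, adjusted palette, composited layers",
        training_data_basis="vendor licence per tool terms (verify before release)",
    )))
```

Kept in version control next to the asset, records of this kind give a studio contemporaneous evidence of the human contribution and the claimed rights basis if a disclosure, dispute, or platform query arises later.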
Conclusion
The use of AI-generated content in gaming is reshaping the industry, offering new opportunities for creativity and efficiency. However, the creative boom comes with significant legal and policy challenges, particularly around copyright ownership and the rights of human creators. Industry leaders like Steam are setting precedents with disclosure requirements and cautious approval processes, while courts and regulators continue to grapple with the question of human authorship. To mitigate exposure and maintain trust, developers and publishers must adopt legally robust practices: ensuring meaningful human input, documenting AI use, securing rights to training data, and embedding these obligations into contracts. Ultimately, the sustainable use of AI in gaming will depend not just on technological innovation, but on the industry's willingness to build systems that respect human contribution, ensure traceability, and align with evolving legal standards.
[1] Open Data Science, “8 Modern Examples of Artificial Intelligence in Gaming” (July 2023)
[2] Inworld, “Best AI games 2024: Released and upcoming” (August 2023)
[3] Federal Register, “Copyright Registration Guidance: Works Containing Material Generated by Artificial Intelligence” (March 2023)
[4] Reuters, “Artist sues after US rejects copyright for AI-generated image” (September 2024)
[5] Stephen Thaler v Shira Perlmutter (2025) No. 23-5233 (Court of Appeals for the District of Columbia Circuit) accessed 20 July 2025
[6] Case C-5/08 Infopaq International A/S v Danske Dagblades Forening ECLI:EU:C:2009:465 accessed 10 July 2025
[7] Secretary of State for Science, Innovation and Technology, Copyright and Artificial Intelligence (Gov.uk, 17 December 2024) accessed 10 July 2025
[8] Babl, “SAG-AFTRA Files Labor Complaint Over AI Darth Vader Voice in Fortnite” (May 2025)
[9] PC Gamer, “Valve is scrutinizing games with AI assets on Steam, says avoiding copyright violation ‘is the developer’s responsibility’” (July 2023)
[10] Michalsons, “Unpacking Steam’s AI disclosure requirements for listed games” (January 2024)
[11] Game Developer, “Valve clarifies what AI content it will allow on Steam” (January 2024)
[12] Mystery Gamedev, “Steam’s New Generative AI Policy: Analysis & Predictions” (January 2024)
[13] The Decoder, “Valve rejects Steam games with AI content over copyright concerns” (July 2023)
[14] 80LV, “The Number of Steam Games Featuring Gen AI Has Increased Eightfold in a Year” (July 2025)