Game changer: Managing legal risks in the AI- and Web3-powered video gaming revolution (by Linklaters)
By Jennifer Calver (Counsel at Linklaters) with contributions from Linklaters’ global gaming team.
Video games have long spearheaded technological innovation, driving advancements that extend far beyond entertainment. The multibillion-pound games and interactive entertainment industry is now starting to harness the transformative power of generative AI (GenAI), bringing new efficiencies to game development and creating more immersive experiences for players.
Meanwhile, the growth of Web3 games built on blockchain infrastructure allows players to own, sell and trade their digital assets with other players – also driving more immersive gaming and community engagement across gaming platforms. As these technological innovations evolve and converge, so do the legal and reputational risks – and the regulatory frameworks developing in response. This presents a complex risk matrix that gaming companies and developers must navigate.
Players’ personal data will need to be protected
AI can be highly effective at profiling players and analysing gameplay, and it is helping developers create diverse and more adaptive non-player characters (NPCs) that blend naturally into the game environment, making interactions more dynamic and engaging.
GenAI is also revolutionising user-generated content, creating new features, genres and game mechanics. It can improve playability for experienced players, dynamically adjusting difficulty levels by adapting in real time to their game activity. This is made possible by collecting vast amounts of data and tracking various aspects of player behaviour, such as their actions, time spent on different activities, and interactions with both the game environment and other players.
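To illustrate the kind of processing involved, the minimal Python sketch below shows how a game loop might collapse tracked behavioural signals into a skill estimate and nudge difficulty towards it. The telemetry fields, formula and thresholds here are all hypothetical, and a real system would collect far richer (and more privacy-sensitive) data.

```python
from dataclasses import dataclass

@dataclass
class PlayerTelemetry:
    """Hypothetical behavioural signals a game might track per session.
    Once linked to a player account, each field is personal data."""
    deaths: int = 0
    wins: int = 0
    avg_reaction_ms: float = 500.0

def skill_score(t: PlayerTelemetry) -> float:
    """Collapse raw telemetry into a 0..1 skill estimate (toy formula)."""
    win_rate = t.wins / max(1, t.wins + t.deaths)
    reaction = max(0.0, min(1.0, (800 - t.avg_reaction_ms) / 600))
    return 0.6 * win_rate + 0.4 * reaction

def adjust_difficulty(current: float, t: PlayerTelemetry) -> float:
    """Nudge difficulty towards the estimated skill level each tick,
    keeping steps small so the adaptation feels natural to the player."""
    return current + 0.1 * (skill_score(t) - current)

# Example: a struggling player's difficulty eases off over a few ticks.
telemetry = PlayerTelemetry(deaths=9, wins=1, avg_reaction_ms=700)
difficulty = 0.5
for _ in range(5):
    difficulty = adjust_difficulty(difficulty, telemetry)
print(f"Adapted difficulty: {difficulty:.2f}")  # drifts down towards ~0.13
```

Even a toy loop like this makes clear why such features engage data protection law: the adaptation only works because the game continuously records identifiable behavioural data about each player.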
However, there are risks inherent in the nature and volume of personal data processed through video games (e.g. the name and surname details required for online account creation, and gaming habits), which engage privacy rights under the UK’s General Data Protection Regulation (UK GDPR). The Information Commissioner’s Office (ICO) – as the enforcer of the UK GDPR – has indicated that it will be particularly interested in any use of AI with children’s data.
To avoid significant fines (of up to 4% of global annual turnover) and the reputational damage that can arise from the misuse of players’ personal data, gaming companies and developers must ensure that AI tools comply with data protection laws and address any privacy risks right from the design phase. Undertaking a data protection impact assessment (DPIA) before deploying AI in-game can help provide a framework to identify, analyse and mitigate these risks.
Children’s behaviour online in focus
Aided by advancements in virtual reality (VR) and augmented reality (AR) hardware capabilities such as the Apple Vision Pro and Meta Quest 3, developers are exploring new, more immersive gameplay with GenAI-enhanced features. A key selling point of global VR games is that they allow players to feel as though they are in the same room, no matter where they actually are. However, the upsurge in immersive gameplay can lead to concerns about over-engagement and a negative impact on children.
AI has the potential to amplify “dark patterns” (manipulative, deceptive design practices that nudge players towards potentially harmful behaviours), such as excessive playing time or disproportionate use of microtransactions and loot boxes. AI algorithms may also be deployed to target individuals in real time and to facilitate the fine-tuning of dark patterns, enhancing their effectiveness.
Given that in-game spending has overtaken traditional revenue streams such as video game purchases and now represents the biggest share of the gaming industry’s revenues, gaming companies will be aware that in-game purchases, dark patterns and their related online harms are a growing concern for consumers, particularly parents of children who game.
And regulators are responding – in the UK, dark patterns and their effect on consumer decision-making are a priority area for the Competition and Markets Authority (CMA) in the context of its consumer protection powers. These powers are set to be transformed by the Digital Markets, Competition and Consumers (DMCC) Act 2024, expected to come into force this autumn. The DMCC will give the CMA the power to impose fines of up to 10% of global turnover for breaches of UK consumer protection laws (meaning that cases will no longer need to be brought by individuals through the courts), drastically increasing the potential costs of non-compliance.
Online safeguarding for children
The UK’s new Online Safety Act 2023 (OSA) aims to keep users, particularly children, “safe” online. As the substantive duties gradually come into force over the coming year or so, the OSA will require gaming services that allow users to interact with other users to take a range of measures. In particular, they will be required to put in place systems and processes to protect all users from illegal content and interactions and to protect under-18s from accessing harmful content. Ofcom has already launched consultations on how it will implement these duties in the OSA.
Ofcom has said that the types of gaming services that may be bound by the rules include those that allow users to interact by creating or manipulating avatars, objects and their environments and/or by using voice and text chat. Violations of the OSA can result in material fines of up to 10% of global annual turnover.
To comply with the OSA, gaming services will have to identify the risks of their users encountering illegal and/or harmful content and implement appropriate safeguards to protect players from these risks. Many gaming services already deploy AI solutions as part of these risk mitigation measures, and it is likely that these measures will need to be enhanced and adapted as the OSA comes into full effect. Ofcom will be launching a consultation later this year on how automated tools, including AI, can be used to proactively detect illegal and harmful content.
The ownership challenge in decentralised gaming
GenAI is empowering players to personalise their gameplay experience, allowing them to seamlessly generate and customise in-game digital assets such as weapons, skins and game environments in real time, and supporting monetisation models such as play-to-earn.
The Web3 vision of an open-source, decentralised internet built on public blockchains – supporting the transfer of such digital assets, including tokens and NFTs, between games via decentralised wallets – takes personalisation in gaming a step further.
However, the potential commercialisation of in-game player-generated items outside the environment in which they were created raises challenging technical and legal questions. NFTs represent unique digital assets, but owning an NFT usually does not mean owning the IP rights protecting the underlying asset. For instance, while a player might own an NFT of a game character, the publisher or game developer may retain the rights to the character’s design and functionality.
The use of GenAI in game development also raises IP-related challenges, notably around the protection and ownership of the outputs (i.e. the results generated by the AI).
Given that intellectual property rights are the main assets of video game companies, developers and publishers are encouraged to design IP policies to regulate the use of AI in game development and keep control over their games’ content. In addition, robust clearance processes – although increasingly complex to execute – must be implemented, given that AI outputs may infringe third-party IP rights.
Looking ahead – and beyond the UK
Regulators are keeping a watchful eye on whether AI in gaming is being used to increase engagement in a way that is unfair or detrimental to players, and whether games companies are taking steps to prevent users encountering content that is harmful to children. Meanwhile, thorny IP issues need to be addressed.
Designing and enforcing data protection, consumer protection and child safety strategies, as well as robust IP policies, is complicated in video gaming due to the worldwide distribution model typical of the industry. In each case the relevant analysis will vary from jurisdiction to jurisdiction. Video game companies must navigate the laws of many jurisdictions and take a risk-based approach to decisions.
Given the increasing potential for reputational harm and regulatory scrutiny facing video game companies and developers deploying innovative AI solutions, understanding the relevant risks – and the regulatory landscape evolving to address them – is more critical than ever.