How to mitigate legal risks when using generative AI in Europe
Published on 9th Apr 2023
Using generative AI can carry legal risks under a range of legal frameworks, but businesses can take steps to mitigate these risks
Generative artificial intelligence (AI) is transforming the way people learn and create. Used well, this technology has the potential to create content, products, and experiences that were once unimaginable. However, its rapid advancement has raised legal concerns, including issues of copyright infringement, data privacy, and liability. These challenges are not limited to any one type of business, as generative AI tools can be used in many settings. Yet making use of these new tools can come at a steep price if AI use runs afoul of legal requirements. What steps can companies take to leverage the power of generative AI while mitigating the associated legal risks?
IP and copyright infringement
The ability of generative AI to produce original content, such as music, images, and text, has created new challenges in intellectual property (IP) law. Companies must ensure that their use of AI-generated content does not infringe on the rights of copyright holders, and it is currently unclear to what extent the output of such models is protected by copyright.
To mitigate these risks, companies should carefully evaluate their use cases for generative AI and consider using dedicated AI models trained on data that has been lawfully obtained, with appropriate licenses in place. Lawsuits have already been filed alleging that images generated by AI models infringe the copyright in images contained in the training data.
Companies using content created by AI tools should consider establishing guidelines for the use of AI-generated content, since it may not be protected by copyright everywhere. This can present a particular issue if the output is central to the company's product, since it will be harder to take legal action against copycats and counterfeiters. The law is still developing on this point and the outcome may differ between jurisdictions. In the EU, a copyrightable work generally needs to be the (human) author's own intellectual creation, a condition not met by AI. The US Copyright Office has issued guidance stating that the output of generative AI tools is generally not protected, whereas UK copyright law does potentially protect computer-generated works where there is no human author, although this area is under review.
Data privacy and security
Data privacy is a critical issue when training, developing, and using AI tools. Generative AI tools carry high risks because of the vast amounts of data used to train them. There is a risk that personal data used to train these models was not processed lawfully, or that it could be extracted from the model by asking it the right questions, creating both privacy and security risks.
Any business developing or using generative AI will need to ensure that it is doing so in compliance with local laws such as the General Data Protection Regulation (GDPR) in the EU and the UK GDPR in the UK. The first step is to identify whether personal data (which is defined widely to include information relating to an identified or identifiable natural person) is being used at all.
Where personal data is used for development, this should be done for a specific purpose and under a specific legal basis. The personal data will need to be used in line with data protection principles, and special consideration will need to be given to how individuals could exercise their data rights; for example, whether it would be possible to provide an individual with access to the information held about them.
When AI is used to create outputs, these should be monitored for potential data leakage that could amount to a data breach. Equally, the fact that an individual has published information about themselves on social media does not necessarily mean it is lawful to use that information for other purposes (for example, to create a report on potential customers to target in an advertising campaign).
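On the practical side, a lightweight screening step can at least flag obvious personal data before text is fed into, or released from, a generative AI tool. The Python sketch below assumes a small set of illustrative regular-expression patterns and a simple block-and-review policy; it is an illustration only, not a compliance mechanism in itself.

```python
# Illustrative screening step: flag common personal-data patterns in text before it
# is sent to a generative AI tool or published as output. The pattern list and the
# blocking behaviour are assumptions for illustration only; pattern matching alone
# does not establish GDPR compliance.
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_phone": re.compile(r"(?:\+44|0)\d{9,10}\b"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}


def find_personal_data(text: str) -> dict:
    """Return matches for each illustrative personal-data pattern found in the text."""
    return {
        name: pattern.findall(text)
        for name, pattern in PII_PATTERNS.items()
        if pattern.search(text)
    }


def screen_before_use(text: str) -> str:
    """Raise if the text appears to contain personal data, so a human can review it."""
    hits = find_personal_data(text)
    if hits:
        raise ValueError(f"Possible personal data detected ({', '.join(hits)}); review before use.")
    return text


if __name__ == "__main__":
    try:
        screen_before_use("Summarise this complaint from jane.doe@example.com about order 1234.")
    except ValueError as err:
        print(err)
```

A check of this kind only catches patterns it has been told about; it can support, but never replace, an assessment of whether the data should be used at all.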
Contracts and confidentiality
Before implementing or permitting the use of any generative AI tool, companies should also check the terms under which the tool is provided. These terms may restrict how the output can be used or give the provider of the tool broad rights in anything used as a prompt or other input. This is particularly important if tools are used to translate, summarize, or modify longer internal documents, which, aside from personal data, may also contain information that the company would rather keep proprietary or confidential. Uploading such information to a third-party service could breach non-disclosure agreements and trigger serious liability risks.
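A simple internal control can back up such contractual review: for instance, a gate that checks documents for confidentiality markers before they are passed to any external tool. The Python sketch below assumes hypothetical marker strings and a placeholder upload function rather than any particular provider's API.

```python
# Illustrative gate: refuse to pass documents that carry internal confidentiality
# markers to a third-party generative AI service. The marker strings and the
# _upload_to_external_tool() stub are hypothetical placeholders, not a real API.
CONFIDENTIALITY_MARKERS = ("confidential", "internal only", "do not distribute")


def is_cleared_for_external_tools(document_text: str) -> bool:
    """Return False if the document contains an internal confidentiality marker."""
    lowered = document_text.lower()
    return not any(marker in lowered for marker in CONFIDENTIALITY_MARKERS)


def _upload_to_external_tool(document_text: str) -> str:
    # Placeholder for a call to an approved external service, made only once the
    # provider's terms on IP, confidentiality and data use have been reviewed.
    raise NotImplementedError("Connect to an approved, reviewed service here.")


def summarise_externally(document_text: str) -> str:
    """Only send the document out if it is not marked as confidential."""
    if not is_cleared_for_external_tools(document_text):
        raise PermissionError("Document is marked confidential; do not upload it to a third-party tool.")
    return _upload_to_external_tool(document_text)
```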
AI and sector-specific regulation
In addition to existing laws that apply to AI, international businesses should be aware that the EU is developing legislation specifically covering the use of AI. The current draft legislation creates obligations for companies based on the level of risk that an AI system poses. Where AI is used in a high-risk scenario, providers and users of these systems will need to do more to meet compliance requirements (and some applications are deemed to pose an unacceptable risk). In contrast, the UK has recently published a white paper stating that AI will not be subject to AI-specific regulation; instead, oversight will be left to sector-specific regulators.
How generative AI falls within either of these frameworks will depend on the context in which it is used. Any business planning to use generative AI to offer international products or services should consider the EU and UK positions early in development to mitigate the risk of potential fines or of having to redevelop that product or service.
Osborne Clarke comment
Generative AI offers tremendous potential for companies to innovate, streamline their operations, and increase their efficiency. However, businesses must be diligent in addressing the legal risks associated with the technology. By implementing, monitoring, and enforcing policies based on the guidelines outlined above, companies can harness the power of generative AI while mitigating potential legal pitfalls.