It’s no surprise that AI is becoming a regular issue in all types of commercial contracts, not just tech-focused ones. The ability of employees to use AI tools to help deliver services, especially in more creative industries, means that the use of AI needs to be addressed by both suppliers and customers.
What are the issues?
In both the UK and the US, there are still some legal uncertainties relating to AI-created works. These centre on who (if anyone) owns the IP in AI-generated output and whether that output infringes third-party rights.
These uncertainties cause a number of knock-on issues.
Providers of some AI tools deliberately disclaim responsibility for these issues – i.e. they give no protection in relation to infringement or ownership of IP rights – effectively ‘use our tool at your own risk’.
If you’re a service provider…
You need to be careful about which AI tools your employees are using and what they’re using them for. For example, if you’re a creative business developing content for your clients, those clients will probably want to own the IP in the content being developed. You may not be able to give them that ownership where the content is AI-created.
If you’re a technology company using your own AI tools, you will probably want to qualify elements of your services. For example, it is not unusual to see wording stating that all AI-generated outputs should be checked by a human.
If you’re a customer…
You’ll be concerned about the IP ownership issue detailed above and about any other qualifications a provider puts on its services. You may also be concerned about your content being used to train AI models and will want to cover this with restrictions in your contracts with service providers.
How to resolve?
For suppliers, you need an internal policy detailing which AI tools employees can and can’t use, based on a review of the tools’ terms and conditions and their respective strengths and weaknesses. You may also need to amend your own terms and conditions to take into account any AI-related issues.
For customers, make sure your contracts with service providers address any concerns around their use of AI – for example, restrictions on using your content to train models.
It’s a generalisation, but some of these issues can be avoided by using paid-for/enterprise versions of AI tools. These generally give more protection on IP issues, and some operate in closed environments, meaning, for example, that your data won’t be used to train the model.
If you have any questions on this topic, contact Ian Grimley.