ChatGPT might’ve broken the internet – and it could also own anything you create with it
Kim Kardashian would be jealous of the attention ChatGPT, the AI content creator from tech firm OpenAI, is getting. Every newspaper is writing about it, and it has made the front cover of Time magazine. The reality-TV star once wanted to “break the internet” with a picture of her curves, but the chatbot might’ve just usurped her.
With a multi-year, multibillion-dollar investment package from Microsoft in late January, quickly followed by the launch of its first subscription-based service, this is a company with massive latent potential, gearing itself up for rapid growth and profitability. “The next major wave of computing”, as it was described by Microsoft CEO Satya Nadella, is exciting and unnerving in equal measure and, no doubt, businesses will be considering the commercial opportunities such sophisticated content automation could open up for them.
Microsoft will be hoping it can level the playing field between Bing and Google. But ChatGPT has applications for a whole host of business functions, from creating marketing and social media content, to drafting technical manuals and, to keep us lawyers on our toes, even contracts.
Perhaps less well thought through, however, are the legal ramifications of deploying this type of technology – of which there are many.
AI-generated content somewhat blurs the lines when it comes to who owns a piece of work once it’s completed. Even a platform as sophisticated as ChatGPT relies on what it can find online and what is fed to it during use to learn from and base its output on. With a less sophisticated AI, there is even a risk that the system could copy something that came before.
This makes the question of who owns that content much more complex than with a bespoke piece of intellectual property created by a human – either as an individual or on behalf of an organisation.
In UK law, the first and most obvious answer is that the person or organisation that developed the AI system owns the intellectual property rights in the content it creates. However, if the AI is fine-tuned and trained using other datasets or inputs from another user, then that person would also have a claim.
It gets even more difficult to prove ownership if you are using someone else’s AI-as-a-service – if you’re one of those intrepid digital frontiersmen about to sign up to ChatGPT’s new £16-a-month subscription service, for example. The contract for use of such systems will be key to ensuring ownership of intellectual property created by the AI is clear.
Training an AI often requires massive quantities of data to be fed into the programme so it can digest, learn and hone its linguistic accuracy. These datasets might be protected by copyright and database rights, meaning that using them to train the AI without a licence could infringe the intellectual property rights of the database owner.
UK law is changing to make some commercial text and data mining easier, but it won’t be a catch-all. Businesses will need to be careful about the datasets they use to train AIs and seek permission where applicable.
There is also the risk of defamation – if the AI bases content on false and damaging information about a person or business – or of negligence claims if a business were to use the AI to provide information or advice to customers. If that advice turned out to be misinformed or inaccurate and led to loss or damage, the customer could well make a case for negligence.
Microsoft’s boss might be right, and we are entering an exciting era of computing. But we must also acknowledge that this brave new AI-enabled world has its pitfalls. There are countless legal grey areas with computer-generated content that, while not necessarily dealbreakers, need proper consideration by any business that wants to employ an AI copywriter instead of a human one.