An AI firm based in London has prevailed in a significant High Court case that examined the lawfulness of training machine learning systems on vast amounts of copyrighted data without permission.
Stability AI, whose directors include the Academy Award-winning director James Cameron, successfully defended itself against allegations from the photo agency Getty Images that it had infringed the global photo company's copyright.
Industry observers view the decision as a setback for copyright owners' exclusive right to profit from their creative output, with one senior lawyer warning that it demonstrates "Britain's secondary copyright regime is not adequately robust to safeguard its artists."
Court documents revealed that the agency's images were indeed used to train the company's system, which allows users to generate images from written prompts. Stability was, however, found to have infringed Getty's trademarks in some cases.
The presiding judge, Mrs Justice Joanna Smith, remarked that deciding where to strike the balance between the interests of the creative industries and the artificial intelligence sector was "of very real societal concern."
The photo agency originally sued the AI company for infringement of its intellectual property, alleging the firm was "entirely unconcerned to what they fed into the training data" and had scraped and copied millions of its photographs.
However, the agency had to withdraw its initial copyright claim because there was insufficient evidence that the training had taken place within the United Kingdom. Instead, it pressed on with a legal argument that Stability was still using copies of its image content within its systems, which it called the "core" of its business.
Highlighting the complexity of AI copyright disputes, Getty essentially contended that the firm's image-generation model, known as Stable Diffusion, amounted to an "infringing copy" because its development would have constituted copyright infringement had it been carried out in the UK.
The judge ruled: "An AI model such as Stable Diffusion which does not store or reproduce any copyright works (and has never done so) is not an 'infringing copy'." She declined to rule on the misrepresentation allegation and found in favour of some of Getty's arguments about trademark infringement relating to its watermarks.
In an official statement, the photo agency said: "We remain deeply concerned that even well-resourced companies such as ours face significant challenges in protecting their creative works given the absence of transparency requirements. We have spent millions of dollars to reach this stage with only one company, which we must now pursue in another venue."
"We urge governments, including the UK, to establish stronger transparency rules, which are essential to avoid costly legal battles and to enable creators to defend their interests."
The general counsel for the AI company said: "We are pleased with the court's decision on the remaining claims in this case. Getty's decision to voluntarily withdraw most of its claims at the close of trial left a subset of claims before the court, and this final ruling resolves the copyright concerns that were at the heart of the case. We are grateful for the time and effort the court has devoted to resolving the important issues in this case."
The ruling comes amid an ongoing debate over how the government should legislate on intellectual property and AI, with creators and authors, including several prominent figures, calling for stronger protection. Tech companies, meanwhile, are pushing for broad access to copyrighted material to help them build the most advanced and capable generative AI systems.
The government is currently consulting on IP and AI and has said: "Uncertainty over how our copyright system functions is impeding development for our AI and creative industries. That cannot continue."
Industry specialists following the issue say regulators are weighing whether to introduce a "text and data mining exception" into UK copyright law, which would allow copyrighted material to be used to train machine learning systems in the United Kingdom unless the owner opts their content out of such training.