Getty Images sues Stability AI in landmark copyright case over unauthorised image use


Getty Images has initiated a landmark copyright lawsuit against Stability AI, a British artificial intelligence firm, accusing it of using millions of copyrighted images without permission to train its generative AI software. The case, being heard in the UK High Court, has been described as a pivotal moment in the ongoing battle between creative industries and AI companies over intellectual property rights.

Stability AI allegedly utilised 12 million images and videos from Getty’s library to train its Stable Diffusion software. This technology enables users to create images from written commands, a process that Getty argues directly infringes on creators’ copyright. According to Getty, Stability made no effort to secure the necessary permissions or payments for the use of these works, including those protected by watermarks or classified as unsuitable material.

Getty claims that Stability’s software not only reproduces unauthorised images but also generates AI outputs emblazoned with Getty’s own trademarks. Examples cited in court include AI-generated visuals of public figures such as Jürgen Klopp, former Liverpool FC manager, and the late musician Kurt Cobain. The High Court was told that this unlicensed use constitutes copyright infringement, database rights infringement, and trademark violations.

The case has brought broader issues of copyright regulation and AI ethics into focus. Ministers have signalled interest in revising existing laws to facilitate investment in the UK’s AI sector, a move that the £126 billion creative industries warn could legitimise intellectual property breaches. Meanwhile, parliamentary tensions continue, with the House of Lords campaigning for stronger protections for creators.

For its part, Stability AI has defended its practices, stating that Getty views generative AI as an existential threat to its business model. Stability's legal argument hinges on the claim that most of the training took place outside the UK, placing it beyond the court's jurisdiction. The company argues that dismissing Getty's claims is essential to ensuring that generative AI tools remain available to UK users.

The outcome of this trial is poised to have significant policy ramifications. Should Getty succeed, it could hasten the development of stricter copyright protections, raising questions about the long-term growth of AI innovation within the UK. Conversely, a ruling in favour of Stability AI could intensify calls from content creators for a regulatory overhaul to safeguard intellectual property against AI-driven use.

The case is scheduled to run until late June, with a ruling expected by the end of the year. Legal experts anticipate that the verdict could influence not only UK intellectual property law but also the broader global dialogue around the intersection of creativity and artificial intelligence.

Post Disclaimer

The following content has been published by Stockmark.IT. All information utilised in the creation of this communication has been gathered from publicly available sources that we consider reliable. Nevertheless, we cannot guarantee the accuracy or completeness of this communication.

This communication is intended solely for informational purposes and should not be construed as an offer, recommendation, solicitation, inducement, or invitation by or on behalf of the Company or any affiliates to engage in any investment activities. The opinions and views expressed by the authors are their own and do not necessarily reflect those of the Company, its affiliates, or any other third party.

The services and products mentioned in this communication may not be suitable for all recipients. By continuing to read this website and its content, you agree to the terms of this disclaimer.
