Is Your Work Training AI Without Your Permission?
By Enrique Cheang, E.C.V. & Asociados

Platforms like Midjourney, ChatGPT, and Stable Diffusion rely on billions of creative works—photos, artworks, articles, and books—to train their systems to generate new content. While these AI tools appear innovative, they are often built on the labor and creativity of countless individuals.
Many artists have discovered that their work, or close imitations of their distinctive style, has surfaced in AI-generated content. One illustrator famously found AI-generated images that mimicked his signature style without his knowledge or approval.
A Legal Gray Zone
This widespread use of protected works for AI training raises serious copyright concerns. However, the law has yet to catch up with the rapid development of AI technologies.
A notable case involves Getty Images, which sued Stability AI for allegedly using its copyrighted photos without permission or licensing. Similar legal actions are emerging across the creative industries, as more professionals question the ethics and legality of AI training practices.
How Can You Check if Your Work Is Being Used?
There are a couple of ways to investigate:
-Use a search tool like “Have I Been Trained?” This website lets you search the large public image datasets used to train models such as Stable Diffusion and see whether your work has been included. (For creators comfortable with a little code, a scripted version of the same check is sketched after this list.)
-Review the terms of use on the platforms where you upload content. Some terms grant the platform broad permissions over your uploads, which may allow your work to be used for AI training without any direct notice to you.
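For those who prefer to check programmatically, the short Python sketch below shows one possible approach. It assumes you have already downloaded a metadata shard (a file listing image URLs and captions) from a publicly released training dataset such as LAION; the file name, the "url" column name, and the portfolio domain are placeholders you would replace with your own details.

import pandas as pd

# Hypothetical placeholders: your own site and one downloaded metadata
# shard (parquet format) from a publicly released training dataset.
YOUR_DOMAIN = "yourportfolio.example"
METADATA_FILE = "training_dataset_shard.parquet"

# Load only the URL column from the shard to keep memory use low.
metadata = pd.read_parquet(METADATA_FILE, columns=["url"])

# Keep the rows whose image URL points at your domain.
matches = metadata[metadata["url"].str.contains(YOUR_DOMAIN, case=False, na=False)]

print(f"{len(matches)} dataset entries reference {YOUR_DOMAIN}")
print(matches.head(20).to_string(index=False))

A non-empty result only tells you that URLs from your site appear in that particular shard; it is a starting point for further inquiry, not legal proof that your work was used or that any infringement occurred.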
The Ongoing Debate
This situation has sparked a major debate:
-Should AI companies be required to license or pay for the content they use to train their models?
-Or does such training qualify as fair use, on the grounds that the model transforms the original works into something new?
As the legal and ethical frameworks catch up, creators are increasingly advocating for transparency, consent, and compensation in the AI training process.