The advent of AI is a watershed moment for many industries.
And the question of how these platforms became so sophisticated is one that is heading to the courts.
That’s not shocking, given that so many of these generative AI platforms were trained on the works of acclaimed masters in fields ranging from photography to literature. Just how much, and how creators will be compensated, if at all, remains to be determined.
But figuring out whether AI has infringed on copyright is only half of the problem; the other half is how creators can protect themselves moving forward.
Some researchers at the University of Chicago in the United States think they’ve found the answer with a poison pill that protects your work from the copying eyes of AI platforms.
Called Nightshade, it “poisons” the data that AI models train on, corrupting the way they learn and generate content, Engadget reports.
Ben Zhao and his team also released a tool called Glaze that “subtly alters” the pixels of a digital work so that generative AI platforms see something different from what the image actually shows. The addition of Nightshade is another layer of protection for creators using Glaze, Engadget notes.
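To give a rough sense of the idea, here is a minimal sketch of an imperceptible pixel perturbation. To be clear, this is not Glaze’s actual algorithm — the real tool crafts targeted perturbations against specific AI models — and the `cloak` function and its `epsilon` parameter are purely illustrative assumptions:

```python
import random

def cloak(pixels, epsilon=2):
    """Return a copy of an RGB pixel grid with tiny bounded noise added.

    Illustrative stand-in only: each channel is nudged by at most
    +/- epsilon, far too little for a human to notice, but enough to
    change the image numerically. Glaze's real perturbations are
    optimized, not random.
    """
    return [
        [tuple(min(255, max(0, c + random.randint(-epsilon, epsilon)))
               for c in px)
         for px in row]
        for row in pixels
    ]

# A flat mid-gray 2x2 "artwork"
image = [[(128, 128, 128), (128, 128, 128)],
         [(128, 128, 128), (128, 128, 128)]]
cloaked = cloak(image)

# Every channel moves by at most epsilon
diffs = [abs(a - b)
         for r1, r2 in zip(image, cloaked)
         for p1, p2 in zip(r1, r2)
         for a, b in zip(p1, p2)]
print(max(diffs) <= 2)  # True
```

The point of the sketch is only that two images can be pixel-for-pixel different while looking identical to a person — the gap that tools like Glaze exploit to mislead the models scraping the work.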
So, the question is: Could this be abused? Zhao points out that any such abuse would need to happen at scale to have an effect; as such, these tools are envisioned as options for creators looking to protect their digital works now. Given the industry’s relative youth, such tools are few and far between, and they give creators at least some way to safeguard their work.
Would you protect your work with something like this? Let us know your thoughts on this and AI more broadly in the comments.
We have some more photography news for you at this link.