A major problem modern media is grappling with is the rise of fake news, fake images, and even fake video.
As you can imagine, generative AI has made these fakes more sophisticated than ever. Some platforms can even generate voices, music, and whatever else you might need to paint a convincing scene.
For news outlets such as the United Kingdom’s BBC, this is a serious problem: their institutional integrity is on the line, while the incentive to be the “first to report” something is omnipresent. Making sure whatever is published is accurate and, just as importantly, genuine is a core task for journalists in 2024.
CEO of BBC News Deborah Turness said in a blog post announcing the initiative, “At BBC News we know that trust is earned. When our audiences know not just what we know, but how we know it, they feel they can trust our journalism even more. That’s why we are proud to lead the way with a brand new feature that will allow consumers to see how we have checked and verified that the images we use are authentic. In a world of deep fakes, disinformation and distortion, this transparency is more important than ever.”
The Content Credentials standard is a “free” technical system for establishing the provenance of media, the BBC reports. It was developed by the Coalition for Content Provenance and Authenticity (C2PA), a partnership involving BBC Research & Development, Adobe, and Microsoft. Google recently joined, and OpenAI and Meta are expected to participate in the future.
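For readers curious about what “establishing provenance” means in practice, here is a minimal conceptual sketch in Python. It is not the real C2PA implementation (which embeds signed manifests verified against certificate chains); it simply binds a hash of the image bytes to publisher metadata with a signature, so any edit to the image breaks the credential. The key, function names, and manifest fields are all hypothetical.

```python
import hashlib
import hmac
import json

# Simplified illustration of the provenance idea behind Content Credentials.
# A real C2PA manifest is an embedded, certificate-backed structure; here we
# just make the metadata tamper-evident with an HMAC over the content hash.

SECRET_KEY = b"publisher-signing-key"  # hypothetical key held by the publisher


def create_manifest(image_bytes: bytes, source: str) -> dict:
    """Bind provenance metadata to a hash of the image and sign the result."""
    payload = {
        "content_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "source": source,
    }
    serialized = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(SECRET_KEY, serialized, hashlib.sha256).hexdigest()
    return payload


def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """Re-derive the hash and signature; any change to the image fails the check."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    serialized = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, serialized, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, manifest["signature"])
        and unsigned["content_sha256"] == hashlib.sha256(image_bytes).hexdigest()
    )


if __name__ == "__main__":
    photo = b"...raw image bytes..."
    manifest = create_manifest(photo, source="BBC News photo desk")
    print(verify_manifest(photo, manifest))            # True: image untouched
    print(verify_manifest(photo + b"edit", manifest))  # False: provenance broken
```

The real standard goes much further, recording an edit history and chaining credentials through each tool that touches the file, but the core promise is the same: the viewer can check where an image came from and whether it has been altered.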
Any thoughts you might have on deepfake imagery and the regulation of AI-generated content are welcome in the comments.
We have some other photography news for you to read at this link.