AI is taking over everything, and with Microsoft’s help it will soon be able to authenticate photos, too.
In other words, Microsoft’s new tech can help distinguish a real photograph from a heavily edited one.
Per Microsoft’s own description of the technology:
“Video Authenticator can analyze a still photo or video to provide a percentage chance, or confidence score, that the media is artificially manipulated. In the case of a video, it can provide this percentage in real-time on each frame as the video plays. It works by detecting the blending boundary of the deepfake and subtle fading or greyscale elements that might not be detectable by the human eye.”
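Microsoft hasn’t published Video Authenticator’s internals, but the idea of a per-frame confidence score can be sketched with a toy heuristic. Everything below is hypothetical: the placeholder “model” just flags frames whose contrast is suspiciously low, loosely echoing the “subtle fading or greyscale elements” the description mentions, and stands in for whatever classifier Microsoft actually uses.

```python
from typing import List

def manipulation_confidence(frame: List[List[int]]) -> float:
    """Return a toy 0-100 'confidence score' that a frame is manipulated.

    Hypothetical stand-in for a real deepfake detector: low pixel
    contrast is treated as a hint of fading/greyscale blending.
    """
    pixels = [p for row in frame for p in row]
    if not pixels:
        return 0.0
    contrast = max(pixels) - min(pixels)  # spread over the 0-255 range
    # Lower contrast -> higher suspicion in this toy heuristic.
    return round(100.0 * (1.0 - contrast / 255.0), 1)

def score_video(frames: List[List[List[int]]]) -> List[float]:
    """One score per frame, as the tool reports while a video plays."""
    return [manipulation_confidence(f) for f in frames]

# A crisp frame vs. a washed-out one (frames as 2x2 grayscale grids).
crisp = [[0, 255], [10, 245]]
faded = [[120, 135], [125, 130]]
print(score_video([crisp, faded]))  # -> [0.0, 94.1]
```

A real system would of course run a trained classifier over decoded video frames rather than a contrast check, but the output shape (a running percentage per frame) is the same.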
As the last line of the description details, the real power of this technology lies in combating the deepfake epidemic that many media analysts suspect will plague us in the coming years. Being able to immediately debunk this kind of thing will be useful for news outlets and anyone else in search of the truth. After all, there’s probably no greater “egg on your face” moment than a news outlet accepting a deepfake as genuine.
And to show the world how serious it is about this project, Microsoft will not release Video Authenticator to the public (to prevent bad actors from learning about and circumventing the technology). It will be available only through the AI Foundation’s Reality Defender 2020 (RD2020) initiative, PetaPixel reports.
What do you think of Microsoft’s AI-based photo authenticator? Let us know your thoughts in the comments below.
And be sure to look at some of our other photography news articles by clicking this link.
I can imagine the initial uses of this technology in journalism and forensics. I wonder, though, whether there are broader applications elsewhere in the public sector.