AI's impact on photography and videography is a growing concern, and if you thought the alarm was overblown, wait until you see what even relatively simple tools can produce when it comes to fake video and speech.
The Verge highlighted research led by Adobe Research, with collaborators from Stanford, Princeton, and the Max Planck Institute for Informatics, that examined how software can edit the literal words coming out of someone's mouth to make it look as if they said something else. Naturally, the group produced a video to illustrate the point, and the results are surprising. Fans of the film Apocalypse Now will notice how easily and seamlessly one of that film's famous quotes was changed using this software.
None of the technology demoed by this consortium is slated for consumer release, but that may not remain the case. After all, as The Verge points out, Adobe has previously shown off a tool called VoCo that promised to make editing recorded speech as easy as editing a picture.
The software works by analyzing the subject's speech and isolating its common sound patterns. It then links each of those component sounds with the facial expressions the speaker makes while saying them, combines the two into a 3D model, and imposes that model on the lower portion of the speaker's face to produce the appropriate mouth movements for the edited words.
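To make the idea concrete, here is a minimal conceptual sketch of that pairing step in Python. This is not the researchers' code; the phoneme labels, the tiny sound-to-mouth-shape table, and the function name are all illustrative assumptions. It only shows the core idea: once the software has learned which mouth shape (viseme) the speaker makes for each sound, it can look up a sequence of mouth shapes for any new words you want to put in their mouth.

```python
# Hypothetical "viseme library" learned from the source footage:
# each sound pattern (phoneme) maps to an identifier for a mouth
# shape captured when the speaker pronounced that sound.
viseme_library = {
    "AH": "open_jaw",
    "B": "closed_lips",
    "S": "teeth_together",
    "T": "tongue_tap",
}

def visemes_for_edit(new_phonemes):
    """Map the phonemes of an edited line to stored mouth shapes.

    In the real system each shape would drive a 3D model of the
    lower face, frame by frame; here we just return the sequence
    of shape identifiers."""
    sequence = []
    for phoneme in new_phonemes:
        if phoneme not in viseme_library:
            # The real system can only re-render sounds it has
            # observed the speaker making.
            raise KeyError(f"no captured viseme for phoneme {phoneme!r}")
        sequence.append(viseme_library[phoneme])
    return sequence

# Faking the word "bat": the lookup yields the mouth shapes to
# composite onto the speaker's face.
print(visemes_for_edit(["B", "AH", "T"]))
```

The point of the sketch is simply that the hard part is not the lookup but building the library and rendering the result seamlessly, which is what makes this research notable.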
Perhaps most frightening of all, a survey found that a majority of viewers were fooled by the resulting “deepfake” videos.
In a blog post discussing the research, the team wrote: “Although methods for image and video manipulation are as old as the media themselves, the risks of abuse are heightened when applied to a mode of communication that is sometimes considered to be authoritative evidence of thoughts and intents. We acknowledge that bad actors might use such technologies to falsify personal statements and slander prominent individuals.”
As always, we would love to know your thoughts on this technology and how it could impact society in the future. Let us know in the comments section below. [via The Verge]