One of the scourges of the modern era is the deep fake: media generated using AI that looks real but, as the name implies, is actually counterfeit.
This media can be used for everything from political attacks to propaganda.
On the one hand, it might seem like an innocuous, even comical phenomenon – as when a Hollywood film swaps one actor's face for another's – but according to many experts the underlying technology is dangerous if left unchecked.
Why? Primarily because, while we are only at the beginning of the AI revolution, once the technology is refined it might be nearly impossible for the human eye to distinguish what is real from what isn't. Couple that with technologies that mimic the human voice and you have an almost perfect storm for the integrity of media in the public eye.
US tech giant Intel thinks it has a solution, and it involves human blood – or, more specifically, how it flows through the face.
The technology is called FakeCatcher, the BBC reports, and it uses photoplethysmography (PPG) to monitor "changes in blood flow" – something that AI cannot fake (at least, for now).
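Intel hasn't published FakeCatcher's internals, but the basic PPG idea is simple enough to sketch: each heartbeat tints the skin very slightly, so the average colour of a real face region oscillates at heart-rate frequencies, while a synthesised face tends to lack a coherent pulse. A minimal illustration of that principle in Python (the function, band limits, and scoring are assumptions for demonstration, not Intel's actual method):

```python
import numpy as np

def ppg_band_ratio(frames, fps):
    """Crude PPG check: share of signal energy in the heart-rate band.

    Blood flow tints skin slightly with each pulse, so the mean
    green-channel value of a real face region oscillates at roughly
    0.7-4 Hz; a synthesised face tends to lack a coherent peak there.
    frames: iterable of HxWx3 uint8 RGB arrays, pre-cropped to skin.
    fps: frame rate of the video in frames per second.
    """
    trace = np.array([f[..., 1].mean() for f in frames], dtype=np.float64)
    trace -= trace.mean()                        # drop the DC offset
    spectrum = np.abs(np.fft.rfft(trace))        # magnitude spectrum
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)       # ~42-240 beats per minute
    return spectrum[band].sum() / (spectrum.sum() + 1e-9)
```

A high ratio hints at a genuine pulse; a low one is a red flag – though a production system like FakeCatcher presumably combines many such cues rather than thresholding a single number.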
Intel research scientist Ilke Demir explains: "[W]hen humans look at a point, when I look at you, it's as if I'm shooting rays from my eyes, to you. But for deep fakes, it's like googly eyes, they are divergent." Calculating this difference in traits produces a verdict on whether the video is fake or real.
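Demir's analogy suggests one measurable signal: how far apart the two eyes' estimated gaze directions are. A minimal sketch of that idea, assuming gaze vectors come from some upstream eye-tracking model (an illustration of the concept only, not FakeCatcher's actual feature set):

```python
import numpy as np

def gaze_divergence_deg(left_gaze, right_gaze):
    """Angle in degrees between the two eyes' estimated gaze vectors.

    When a real person fixates a point, both gaze directions converge
    on it; per Demir's "googly eyes" remark, deepfakes often show
    inconsistent, divergent directions instead.
    left_gaze, right_gaze: 3D direction vectors from a gaze estimator.
    """
    l = np.asarray(left_gaze, dtype=np.float64)
    r = np.asarray(right_gaze, dtype=np.float64)
    cos = np.dot(l, r) / (np.linalg.norm(l) * np.linalg.norm(r))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
```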
How accurate is FakeCatcher? So far it operates at 96%, the BBC writes, though that figure comes with a caveat: deep fakes are growing more complex, and estimates of what is real and what isn't remain relatively conservative – which is one way of saying everything is still at an early stage. Later iterations of FakeCatcher will hopefully improve in tandem with the deep fakes it is combating.
Any thoughts you might have on AI and deep fakes are welcome in the comments below.
We have some other photography news for you to read at this link.
[BBC]