Intel has announced what it claims to be the world’s first real-time deepfake detector, aptly named FakeCatcher. It analyses the subtle blood flow in video pixels to return results in milliseconds with a 96% accuracy rate.
FakeCatcher was designed by Ilke Demir, a senior staff research scientist in Intel labs, in collaboration with Umur Ciftci from the State University of New York at Binghamton. The technology uses Intel hardware and software and runs on a server with a web-based platform for interfacing.
What sets FakeCatcher apart is its approach: most deepfake detectors examine raw data for signs of inauthenticity, whereas FakeCatcher tracks blood flow in the skin. Our veins change colour subtly as blood is pumped through them. These “blood flow signals” are collected from across the face and translated into spatiotemporal maps using algorithms; deep learning then classifies the video as real or fake.
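Intel has not published FakeCatcher's internals, but the pipeline described above can be sketched in rough form. In the illustration below (the function name, grid size and green-channel heuristic are all our assumptions, not Intel's method), the face region of each frame is split into a grid of patches, and each patch's average green intensity over time forms one row of a spatiotemporal map that a deep-learning classifier could consume:

```python
import numpy as np

def spatiotemporal_map(face_frames, grid=(4, 4)):
    """Build a crude spatiotemporal map of blood-flow-like signals.

    face_frames: array of shape (T, H, W, 3) -- a cropped face region per frame.
    Returns an array of shape (grid_rows * grid_cols, T): one colour-intensity
    time series per face patch. A real system would use calibrated rPPG-style
    signals; this sketch simply averages the green channel, which is the most
    sensitive to blood-volume changes.
    """
    t, h, w, _ = face_frames.shape
    rows, cols = grid
    cell_h, cell_w = h // rows, w // cols
    signals = np.empty((rows * cols, t))
    for r in range(rows):
        for c in range(cols):
            # Green channel (index 1) of one face patch, averaged per frame.
            patch = face_frames[:, r*cell_h:(r+1)*cell_h, c*cell_w:(c+1)*cell_w, 1]
            signals[r * cols + c] = patch.mean(axis=(1, 2))
    # Centre each patch signal so a classifier sees relative variation only.
    signals -= signals.mean(axis=1, keepdims=True)
    return signals

# Synthetic demo: 30 frames of a 64x64 "face" crop.
frames = np.random.rand(30, 64, 64, 3)
stmap = spatiotemporal_map(frames)
print(stmap.shape)  # (16, 30)
```

In a full pipeline, the resulting map would be fed to a trained network that has learned how genuine pulse-driven colour variation differs from the inconsistent signals left behind by face-synthesis models.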
The teams working on the project used OpenVINO to run the AI models for face and landmark detection. Computer vision workloads were handled by OpenCV and Intel Integrated Performance Primitives, while Intel Deep Learning Boost, Advanced Vector Extensions 2 (AVX2) and Advanced Vector Extensions 512 (AVX-512) were used to optimise the inference and media blocks.
On the hardware side, the platform uses third-generation Intel Xeon Scalable processors, allowing it to run up to 72 detection streams simultaneously.
While Intel does claim the aforementioned 96% accuracy rate, the approach depends largely on detecting subtle changes happening under the skin, which can be a problem when working with low-quality videos. The company has not said which datasets were used to train the model, or whether a video must meet any specific requirements before the platform can check it.