As deepfake generation continues to evolve rapidly, producing some exceptionally convincing deepfakes (such as the viral 'Morgan Freeman' video), the average netizen may struggle to keep up.

Beyond facial abnormalities, anomalies in hand anatomy and video synchronisation emerge as potential red flags. AI image generators often falter in crafting realistic hands, while lip-syncing errors in videos may betray artificial origins.

However, this dual-use technology introduces a cat-and-mouse game, as tools detecting deepfakes inadvertently contribute to training AI models to evade detection.

In this landscape, employing AI image detectors and scrutinising source material become imperative, emphasising the need to look beyond the face to unmask the intricacies of AI-generated deception.

Pay attention to the face

Deepfakes typically involve facial transformations – that is, an alternate face transposed onto another person's body.

Christopher Cemper, founder of prompt library company AIPRM, says that examining fine skin textures and facial details is another important factor to consider.


“While the deepfake generation has rapidly advanced, fully photorealistic reproduction of complex human skin and the minute muscular motions around our eyes, noses and mouths remains extremely challenging,” adds Cemper.

It’s all in the hand

While the complexity of hand anatomy stumps even the best artists, AI image generators are likewise notorious for their inability to produce a realistic hand. Generated hands are often missing fingers or have one too many; sometimes they have joints, sometimes they don't.

A lack of spatial understanding in some AI models can result in unrealistic hand shapes and unnatural posing, especially in images of hands performing fine motor tasks, such as grasping small objects.

Look at lip-syncing 

In videos, deepfakes may struggle with accurate synchronisation. Check for lip-syncing errors, where the audio and the movement of the lips do not match.

There may be other audio anomalies such as unnatural pauses, glitches, or artefacts – or the speech may be stilted and off-pitch.
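For readers comfortable with a little scripting, one rough heuristic is to look for unusually long silent gaps in the soundtrack. The sketch below assumes the audio has already been extracted from the video (for example with ffmpeg), that the librosa library is installed, and that the filename and 0.75-second threshold are purely illustrative.

```python
# Rough heuristic: flag unusually long silent gaps in a clip's audio track.
# Assumes the audio has already been extracted from the video (e.g. with ffmpeg)
# and that librosa is installed; the 0.75 s threshold is an arbitrary example.
import librosa

def long_pauses(audio_path, top_db=30, max_gap_s=0.75):
    y, sr = librosa.load(audio_path, sr=None)          # load waveform at its native sample rate
    voiced = librosa.effects.split(y, top_db=top_db)   # [start, end] sample indices of non-silent spans
    gaps = []
    for prev, nxt in zip(voiced[:-1], voiced[1:]):
        gap = (nxt[0] - prev[1]) / sr                  # silence between consecutive spans, in seconds
        if gap > max_gap_s:
            gaps.append(round(float(gap), 2))
    return gaps

print(long_pauses("suspect_clip.wav"))  # e.g. [1.12, 0.9] -> pauses worth a closer listen
```

A clean clip is not proof of authenticity, but a string of oddly placed pauses is a cue to listen again more carefully.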

Try an AI image detector

There are plenty of free AI image detectors, including Everypixel Aesthetics and Illuminarty. These platforms use neural networks to analyse images for patterns consistent with AI generation.
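Those platforms are web tools, but the same idea can be sketched locally with an off-the-shelf image classifier. The example below assumes the Hugging Face transformers library is installed, and the model name is a placeholder for any classifier trained to separate real photos from AI-generated ones – it is not the API of either service named above.

```python
# Minimal sketch of running an AI-image classifier locally.
# The model id below is a placeholder: substitute any image-classification
# model trained to distinguish real photos from AI-generated ones.
from transformers import pipeline

detector = pipeline("image-classification", model="your-org/ai-image-detector")  # hypothetical model id
results = detector("suspect_image.jpg")

for r in results:
    # Each result is a dict like {"label": "artificial", "score": 0.97}
    print(f"{r['label']}: {r['score']:.2f}")
```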

Jamie Moles, senior technical manager at ExtraHop, says that there are already detection algorithms in place to catch some deepfakes, which work by scanning where the digital overlay connects to the actual face being masked.

“This tech is broadly called a ‘generative adversarial network’ (GAN), and these tools have reported 99% accuracy in catching deepfakes. The challenge is that GANs are used to train AI models to improve performance, meaning the tools that catch the deepfakes are used to train those same models to avoid being caught in future,” says Moles.
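The snippet below is a toy PyTorch sketch of that adversarial loop, not any production deepfake detector: a small discriminator learns to flag fakes, and its feedback is exactly what trains the generator to evade it. All layer sizes and names are illustrative.

```python
# Toy illustration of the adversarial loop behind GAN training:
# the "detector" (discriminator) learns to flag fakes, and its gradients
# are what teach the generator to slip past it.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, data_dim)              # stand-in for real face crops
    fake = G(torch.randn(32, latent_dim))         # generated samples

    # 1) Train the detector: real -> 1, fake -> 0
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # 2) Train the generator *against* that detector: push D(fake) towards 1
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()
```

The same dynamic plays out at scale: every improvement in the detector becomes a training signal for the forger.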

Check the source

Compare the video with known source material, such as other videos or images of the same person. Look for discrepancies in appearance, voice, and behaviour. 

If the source cannot easily be found, check the metadata of the file. 

Metadata is automatically embedded in images and videos by the camera at the time of capture, and some media editing programs also add it to files, so metadata can offer valuable insights into a video's origin.
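One way to inspect that metadata, assuming FFmpeg's ffprobe utility is installed and the filename below is just an example, is to dump the container tags as JSON:

```python
# Sketch of pulling embedded metadata from a video file with ffprobe
# (ships with FFmpeg; assumed to be on the PATH). Creation time, encoder
# and device tags, where present, hint at how and when the file was made.
import json
import subprocess

def video_metadata(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)

info = video_metadata("suspect_clip.mp4")
print(info["format"].get("tags", {}))   # e.g. creation_time, encoder, handler_name
```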

However, relying on metadata alone is not enough to identify deepfakes. Embedded metadata is easily lost or manipulated: saving a video in a different format or processing it through editing software may erase the original metadata, a file can be re-uploaded without its initial title, and creators can alter the metadata manually.