Chinese face-swap app ZAO, which allows users to swap faces with film and TV characters, has sparked a debate over user privacy.

ZAO uses AI to produce deepfake versions of well-known footage, enabling users to re-cast themselves in their favourite films. It became the most downloaded app in China’s iOS app store upon its launch last Friday.

The app was developed by Momo, the instant messaging company that also created the popular Chinese dating app Tantan.

Privacy concerns over ZAO face-swap app

One of the main concerns voiced by critics was the app's questionable privacy policy. One clause was found to grant the app's developers free, global rights to use any imagery created on the app, with no way for users to opt out of the agreement.

Following the backlash, however, the privacy policy was revised. Under the new terms, content created on ZAO cannot be used for other purposes without the user's prior consent.

Deleted content will also be completely wiped from ZAO's databases. However, because users are responsible for obtaining authorisation for the images they upload, ZAO has stressed that it is not responsible if someone's photo is used without their permission.

In a statement via Chinese microblogging website Weibo, ZAO said: “We thoroughly understand the anxiety people have towards privacy concerns.

“We have received the questions you have sent us. We will correct the areas we have not considered and require some time.”

Other social media platforms, such as WeChat, have also blocked the distribution of ZAO videos. Many have also highlighted that ZAO is a surprising development for China, where mass surveillance and facial recognition technology are already widespread.

The uproar has not completely spoiled the app's appeal, however, as it remains the top free download in China.

Privacy concerns surrounding such apps have been mounting recently. In July this year, users of viral sensation FaceApp raised similar worries about how their images were used and secured.

Read more: Deepfakes: AI video tool could make fake news easier to create and harder to spot