President Zelenskyy deepfake asks Ukrainians to ‘lay down arms’

A deepfake of President Zelenskyy calling on citizens to “lay down arms” was posted to a hacked Ukrainian news website and shared across social networks.

The deepfake purports to show Zelenskyy declaring that Ukraine has “decided to return Donbas” to Russia and that his nation’s efforts had failed.

Following an alleged hack, the deepfake first appeared on the website of Ukrainian news outlet TV24 before being shared across social networks, including Facebook and Twitter.

Nathaniel Gleicher, Head of Security Policy for Facebook owner Meta, wrote in a tweet:

“Earlier today, our teams identified and removed a deepfake video claiming to show President Zelensky issuing a statement he never did.

It appeared on a reportedly compromised website and then started showing across the internet.”

The deepfake itself is poor by today’s standards, with the fake Zelenskyy’s head appearing comically large and noticeably pixelated compared to the rest of his body.

It shouldn’t have fooled anyone, but Zelenskyy posted a video to his Instagram to call out the fake anyway.

“I only advise that the troops of the Russian Federation lay down their arms and return home,” Zelenskyy said in his official video. “We are at home and defending Ukraine.”

Earlier this month, the Ukrainian government posted a statement warning soldiers and civilians not to believe any videos of Zelenskyy claiming to surrender:

“Imagine seeing Vladimir Zelensky on TV making a surrender statement. You see it, you hear it – so it’s true. But this is not the truth. This is deepfake technology.

This will not be a real video, but created through machine learning algorithms.

Videos made through such technologies are almost impossible to distinguish from the real ones.

Be aware – this is a fake! The goal is to disorient, sow panic, disbelieve citizens, and incite our troops to retreat.”

Fortunately, this deepfake was easy to spot – despite more sophisticated fakes now often being impossible for humans to distinguish – and could actually help raise awareness of how such content is used to influence and manipulate.

Earlier this month, AI News reported on how Facebook and Twitter removed two anti-Ukraine disinformation campaigns linked to Russia and Belarus. One of the campaigns even used AI-generated faces for a fake “editor-in-chief” and “columnist” for a linked propaganda website.

Both cases in the past month show the danger of deepfakes and the importance of raising public awareness and developing tools for countering such content before it’s able to spread.
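One visible giveaway in this particular fake – the pixelated, low-resolution head pasted onto a sharper body – hints at how simple automated checks can flag crude deepfakes. The following is a minimal toy sketch, not an actual detection method used by any platform: it assumes a synthetic frame where the “face” region was upscaled from a tiny source, and compares a simple blockiness score between regions. All names and thresholds here are illustrative.

```python
import numpy as np

def blockiness(region, block=8):
    """Ratio of pixel jumps at block boundaries vs. within blocks.

    Upscaled (pixelated) content has strong jumps at block edges and
    almost no detail inside blocks, so this ratio is high; natural
    texture scores near 1. Purely illustrative heuristic.
    """
    col_jumps = np.abs(np.diff(region, axis=1))
    boundary = col_jumps[:, block - 1::block].mean()
    interior_cols = [c for c in range(col_jumps.shape[1]) if (c + 1) % block != 0]
    interior = col_jumps[:, interior_cols].mean()
    return boundary / (interior + 1e-9)

rng = np.random.default_rng(0)

# Synthetic "frame": natural texture everywhere...
frame = rng.normal(0, 1, (64, 64))

# ...except a "face" region upscaled 8x from a tiny 4x4 source,
# mimicking the pixelation seen in the fake video.
tiny = rng.normal(0, 1, (4, 4))
frame[:32, :32] = np.kron(tiny, np.ones((8, 8)))

face_score = blockiness(frame[:32, :32])
body_score = blockiness(frame[32:, 32:])

# A large mismatch between the two regions is a red flag.
suspicious = face_score > 3 * body_score
print(f"face={face_score:.2f} body={body_score:.2f} suspicious={suspicious}")
```

Real detection systems are far more involved, but the principle – looking for statistical inconsistencies between a manipulated region and the rest of the frame – underpins many forensic approaches.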

(Image Credit: President.gov.ua used without changes under CC BY 4.0 license)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo. The next events in the series will be held in Santa Clara on 11-12 May 2022, Amsterdam on 20-21 September 2022, and London on 1-2 December 2022.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.
