Andrzej Duda promises PLN 4,000 a day for everyone? Beware of deepfakes featuring the president

An AI-generated recording (deepfake) in which President Andrzej Duda allegedly promises a passive income of PLN 4,000 a day has been circulating on the Internet for several days. It is yet another artificial intelligence scam.

In recent months, the Internet has been flooded with fake photos and videos featuring well-known politicians, journalists and celebrities. These are so-called deepfakes: materials generated with increasingly popular AI (artificial intelligence) tools and disseminated primarily on social media.

Fraudsters use the image of well-known people to attract the attention of potential victims, gain their trust and then extort money. In such a fake recording, a celebrity may, for example, encourage viewers to make a risky investment or to install an application with embedded malware.

A deepfake of President Duda is circulating on the Internet. Don’t be fooled.

Recent victims of deepfakes include footballer Robert Lewandowski, journalist Monika Olejnik, actress Anna Mucha, TV presenter Elżbieta Jaworowicz, and even former president Aleksander Kwaśniewski. The incumbent president, Andrzej Duda, has now joined this group.

Now every citizen of our country can earn up to PLN 4,000 a day by investing any amount

– we hear in the fake recording. Although President Andrzej Duda does appear on screen and it is his voice coming from the speakers, the footage has been fabricated.

This is a classic “passive income” investment scam. The perpetrators, posing as an investment company, lure victims with the promise of profits. In reality, anyone who deposits money into the indicated account will not only earn nothing but will also lose the “invested” funds.

The “advertisement” allegedly featuring the president and encouraging investment has been circulating online for several days. What’s worse, as Sekurak notes, the spot has also appeared on YouTube, which may lend it credibility. What certainly does not make it credible is the rather crude production of the video itself: for example, the audio is clearly out of sync with the lip movements.

Unfortunately, it can be expected that as AI technology develops, such fake videos will become more “professional” and harder to distinguish from genuine recordings. That makes it all the more important to be cautious and to carefully verify any information we come across on the Internet, especially when our personal data or money is involved.

Source: Gazeta
