AI video call scam: Businessman transfers $610,000 to alleged friend

A scammer in China used artificial intelligence to change his appearance and pose as a friend of a businessman, from whom he managed to steal more than US$600,000, authorities reported.

The victim, identified only by his last name, Guo, explained that he received a video call in April from someone whose face and voice closely resembled those of a person close to him.

But that person was actually a scammer who used “artificial intelligence technology to modify his face”, according to an article published by a media outlet linked to the authorities of Fuzhou, in eastern China.

As a pretext, the scammer claimed that another friend urgently needed money to pay a security deposit for a tender.

In this way, he convinced Guo to transfer 4.3 million yuan (US$610,000) from his company’s bank account.

“During the video call, I was convinced that I recognized the face and voice of the person calling me, so I was not suspicious,” the businessman said, quoted in the article.

After making the payment, Guo sent a message to the friend whose identity had been impersonated. When that friend, who knew nothing about the matter, responded with confusion, the businessman realized his mistake and quickly called the police.

The police ordered the bank not to complete the transfer, and Guo was able to recover 3.4 million yuan (US$482,000), according to the article.

The perpetrators of the scam have not been identified.

The use of artificial intelligence, sometimes for malicious purposes, has raised concern around the world, especially since the US company OpenAI launched ChatGPT, a chatbot that can imitate human speech, in November.

The interface is not accessible in China, but ChatGPT is widely discussed on social media there, and Chinese tech giants are racing to develop similar tools.

Source: AFP

Source: Gestion
