Deepfake of the world's most popular YouTuber used to promote a scam


The world’s most popular YouTuber, MrBeast, and two BBC presenters have had their likenesses used to promote scams. Cybercriminals created fake videos with Artificial Intelligence (AI) in which the celebrities appear to urge viewers toward fraudulent purchases and investments. The videos circulated on TikTok and Facebook.

A video using the likeness of MrBeast appeared on TikTok this week, in which the fake “YouTuber” offered new iPhones for 2 dollars. Meanwhile, the likenesses of BBC presenters Matthew Amroliwala and Sally Bundock were used to promote a known scam: a video on Facebook showed the journalists “introducing” Elon Musk, the billionaire owner of X (formerly Twitter), who appeared to endorse an investment opportunity.

Deepfakes use artificial intelligence to create videos or photos of selected people that are strikingly similar to genuine footage. AI-generated videos make it possible to put almost any words into a person’s mouth, even words they have never said. The only prerequisite is access to a sufficiently large number of video clips featuring the person; the AI generates the new material based on these.

– Providing raw material for deepfakes is a threat associated with using social media that we rarely think about. It affects not only online creators and celebrities. Many of us have been publishing videos for years, and these can now be used to steal our likeness and fabricate statements we never made. In this way, our loved ones can be deceived and even our reputation destroyed – says Beniamin Szczepankiewicz, analyst at the ESET antivirus lab.

He adds: – Five years ago, detecting fabricated videos and images was relatively simple, but it has become much harder over time. It is now increasingly difficult to determine whether a given piece of material is the work of AI. It is therefore worth thinking seriously about your online activity, especially the publication of videos involving children. Deepfake creation tools are now within reach of any cybercriminal, and blackmail or fraud via video may soon be as common as the so-called “grandparent scam”. Voice cloning technology is developing very rapidly, so it is easy to imagine receiving a video in which a family member asks for help, or hearing the voice of a close relative asking for money. That is why it is so important to verify every call for help by contacting the person in question through a different communication channel.