Beware of AI-driven video calls and unknown contacts on WhatsApp

Artificial Intelligence has become an integral part of our lives, with numerous applications utilising the latest technology in different forms. Just as every coin has two sides, AI can be employed to drive development and also to deceive the credulous.

Update: 2023-12-15 06:20 GMT

AI-driven video calls


Deepfakes, a portmanteau of "deep learning" and "fake," use artificial intelligence (AI) and machine learning (ML) to create or manipulate audio and visual content, generating realistic but fabricated media. Images, audio, and video created with this technology look convincingly natural. The technique is used to replace or manipulate a person's appearance or voice, making them appear to do or say things they never did. Deepfakes have existed for some time now, but recent technological advancements have made them far more sophisticated.

Of late, deepfake technology has been misused to defame celebrities and to exploit their reputations to trap and loot the gullible.

In November 2023, a deepfake video of actress Rashmika Mandanna took India by surprise. The deepfake was based on an original video shot by an influencer: the woman's face in the influencer's video was replaced with Mandanna's using deepfake technology. Later, deepfake videos of actresses including Kajol and Katrina Kaif also went viral, causing them considerable distress.

In addition, deepfake technology is being used by fraudsters to run scams like the Laila Rao scheme, which surfaced in September 2023. Several innocent women were duped after making large investments in the hope of high returns.

Laila Rao was promoted through several social media posts claiming that she helps women achieve financial independence through her investment project.

Laila Rao is a fictitious character, brought to life as a lookalike of Indian TV actress Smriti Khanna, who in fact had no connection with the scam. The actress alerted her audience in a YouTube video, stating that her videos were being used in frauds.


Videos of several public figures, such as Jaggi Vasudev, promoting Laila Rao and her programs went viral on social media. Notably, all those videos were altered using deepfake technology: the original footage (of Jaggi Vasudev and actress Smriti Khanna) was edited and AI-generated audio was added.

Later, similar doctored videos of the likes of Jaggi Vasudev and others were used to promote various fake brands and persons. Recently, a deepfake video of renowned industrialist Ratan Tata luring individuals into an online betting scam also came to light; in it, he endorses an online betting coach and asks people to join a Telegram channel run by the coach, Amir Khan.

Two deepfake videos of Infosys founder NR Narayana Murthy, purportedly promoting a so-called investing platform named "Quantum AI", are being shared on social media amid increasing concern over deepfakes deceiving the public. One of the now-deleted videos on Facebook showed a morphed version of the software pioneer purportedly saying that he and billionaire Elon Musk are working on a project called "Quantum AI".

Meanwhile, other fraudsters are scamming people through video calls, especially on WhatsApp. Video calls are a convenient way to converse with family members, friends, or colleagues across the world, and using artificial intelligence, fraudsters in many countries are duping the gullible and looting their money.

In one such incident in northern China, a man fell victim to an AI-driven video call scam involving deepfake technology. With the help of AI-powered face-swapping technology, the scammer posed as the victim's close friend during a video call and persuaded him to transfer 4.3 million yuan (over Rs 5 crore).

In another case, a man from Kozhikode in Kerala was cheated of Rs 40,000 in an AI-based scam. Radhakrishnan received a WhatsApp video call from an unknown number. When he answered, the person on the other side looked like one of his former colleagues from Andhra Pradesh. To gain his trust, the caller even mentioned the names of some mutual friends. The caller asked for Rs 40,000 to help a relative in hospital, which the victim paid online. Only later did he discover that the call was not genuine.

So, if you receive a video of a celebrity urging you to invest in a person or an organization, or if you get a video call from an unknown number asking for monetary help, don't act immediately. Take your time and get the facts straight: research such investments online, call others, and confirm the caller's authenticity.

Observe the facial expressions and movements keenly. Deepfakes often have glitches and unnatural features, such as uneven eyes or hands with extra fingers.

If you get a call from an unknown number, be careful not to give out your personal details. Ask the caller to move the camera around, wave a hand, or hold up an object; deepfake face-swaps often break down during such movements, helping you judge the authenticity of the call.

Be vigilant so that you and the people around you are not victimized by these scams.
