Deepfake videos made using artificial intelligence pose a rising threat to content creators
About a month ago, Yamini Malhotra noticed followers tagging her in a video shared by an account, @rotahaler_. To the 30-year-old’s shock, artificial intelligence had been used to replace her face with that of actress Kajal Aggarwal in one of her videos. Malhotra had become the victim of a deepfake.
“Followers on Instagram just download my videos and repost them to increase their reach. This time, someone had used my body and Kajal Aggarwal’s face,” she told Mint.
The video quickly gained traction: more than 32,000 followers have shared it in the past four weeks. After this incident, several more unidentified Instagram accounts created deepfake videos of Malhotra, superimposing unknown women’s faces onto a video of her walking in a green saree.
Privacy invasion
“I cannot do anything and, at the most, send a copyright strike to Instagram. Whether they remove it or not is up to them,” says Malhotra, who has over 853,000 followers on Instagram, expressing concern over the invasion of her privacy and the tampering with her content. A copyright strike is a formal takedown request asking a platform to remove content that uses copyright-protected material without permission.
Deepfakes, realistic-looking fake videos created using artificial intelligence, are not only on the rise but are also becoming more sophisticated as AI and machine-learning technologies advance.
According to a 25 April survey on the impact of AI by computer security company McAfee, three in four Indians have recently come across deepfake content, and almost 64% of respondents feel AI has made it harder to spot online scams.
While some creators are leveraging the blurring line between real and synthetically generated content to gain more traction, the industry overall views deepfakes as a threat to content creators.
“Deepfakes are dangerous, and content creators need to license their identities to prevent misuse of their content,” said Anshul Khandelwal, chief technology officer of text-to-video generative AI platform Invideo, at an HT Smartcast event last week.
Deepfake pornography
Data collected in March 2024 by Twicsy, an Instagram engagement-enhancement platform, indicates that over 84% of influencers worldwide have been victims of deepfake pornography, with India the second-most affected country after the United States. “The unlawful fabrication of pornographic material targeting influencers using deepfake technology is worrisome….It poses severe consent, digital safety, and online harassment concerns,” the report noted. It also specifies that female influencers are the targets of deepfake pornography 90% of the time, compared with their male counterparts.
A lack of regulation in the content-creation space is one of the key gaps. “Because there is a lack of regulation and a boom in AI, there is a possible intellectual property threat,” cautioned Viraj Sheth, co-founder and chief executive officer of Monk Entertainment, a new-media company that represents digital talent.
Companies are taking measures to curb the misuse of deepfakes, but sector watchers are calling for more self-governance.
“Deepfake is nothing but a cybercrime today that has been evolving for the last ten years,” said Amiya Swarup, partner and marketing advisory leader at consulting firm EY.
Self-regulation is key
Creators need to self-regulate, given the lack of regulation around generative AI technology and content creation. “Ultimately, content responsibility and self-regulation of content reside within a brand. Hence, all the checks and controls on influencer selection become critical,” Swarup added.
Some firms, like Adobe, have taken steps to bring more checks and balances to content creation. Adobe’s Content Authenticity Initiative lets people see a piece of content’s provenance information when they run it through the initiative’s verification website, including which application was used to create it, which AI model it was made with, and who created it.
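Adobe’s initiative is built on the open C2PA (Coalition for Content Provenance and Authenticity) standard, which embeds cryptographically signed “Content Credentials” metadata in a file. As a minimal sketch only, here is roughly what programmatically checking those credentials could look like, assuming the open-source c2pa Python bindings; the read_file call is modeled on that package’s published examples, and the exact signature and manifest fields may differ across versions.

```python
# Sketch: inspecting C2PA "Content Credentials" embedded in a media file.
# Assumes the open-source c2pa Python bindings (pip install c2pa-python);
# treat the API and JSON layout below as illustrative, not authoritative.
import json
import c2pa

def inspect_credentials(path: str) -> None:
    try:
        # read_file returns the signed manifest store as a JSON string;
        # the second argument is a directory for extracted resources
        # such as thumbnails.
        manifest_json = c2pa.read_file(path, "/tmp/c2pa_resources")
    except Exception as err:
        # A missing or invalid manifest means there is no verifiable
        # provenance to display.
        print(f"No valid Content Credentials found: {err}")
        return

    store = json.loads(manifest_json)
    active_label = store.get("active_manifest", "")
    manifest = store.get("manifests", {}).get(active_label, {})

    # These fields roughly correspond to what a verification site
    # surfaces: the tool that produced the file, who signed it, and
    # any recorded edit or ingredient assertions.
    print("Claim generator:", manifest.get("claim_generator", "unknown"))
    print("Signed by:", manifest.get("signature_info", {}).get("issuer", "unknown"))
    for assertion in manifest.get("assertions", []):
        print("Assertion:", assertion.get("label"))

inspect_credentials("creator_video_frame.jpg")
```

A file that lacks credentials, or whose credentials fail signature validation, simply yields no provenance, which is itself a useful signal when vetting suspected deepfakes.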