Believe it or not, the world of deepfakes has gone way too deep!

South First reports the issue from the perspective of South Indian celebs and gives an insight into what deepfake is all about.

By Shashiprasad S M | Arjun Ramachandran | Manigandan KR | Prakash Pecheti

Published Nov 08, 2023 | 3:22 PM | Updated Nov 08, 2023 | 6:28 PM

Deepfake technology. (Representational image)

“With great power comes great responsibility!” — the adage from the popular Spider-Man series seems to fit perfectly with deepfake technology in the modern world.

Artificial intelligence is a boon when used for good purposes and a bane when misused, as in the case of actress Rashmika Mandanna, whose deepfake video became a serious cause of concern.

Following Rashmika’s case, celebrities and the who’s who of South Indian cinema have expressed deep concern over the widespread misuse of deepfake technology.

While the demand is rising for an effective legal framework to tackle the menace and to identify perpetrators for stringent punishment, South First reports the issue from a South-wide perspective, including an insight into what deepfake is all about.

Fake to deepfake

Ever heard the term “photoshopped”? Well, that’s a basic form of photo manipulation that even elderly people who are not tech-savvy can easily do on their smartphones.

Several images, right from US President Abraham Lincoln and Soviet dictator Joseph Stalin to Italian dictator Benito Mussolini, have been the subject of such manipulation, with the original photos altered.

Arguably, this is where it all started; the advanced version of faking it now goes by a new name: deepfake technology.

Akshay Kumar, a motion graphic designer who also runs a creative agency in Bengaluru with expertise in feature films, shares that hundreds of websites and mobile applications offer such services using AI technology.

“Earlier, to alter/manipulate a photo or a video, it required the knowledge of an expert who was well-versed in multiple software technologies. Now, it is done in just three simple steps on a website/app,” he tells South First.

Akshay Kumar explains that it takes just three steps: upload a quality image/video, choose a celebrity photo, and download the AI-generated video.

Related: Union government issues advisory to social media companies on Rashmika deepfake video

Data readily available

Varun, a video editor, points out that some websites charge based on the rendering and customised services.

“A certain amount of money like $10 or so is charged monthly for limited services. AI learns from the information/data available on the Internet. The AI makes use of the images of Prime Minister Narendra Modi or any other popular celebrity, videos, and even their voices,” he elucidates.

Varun adds that many such websites “do the job in just a few clicks.”

No more fun: Malayalam director Devan

In fact, deepfakes first emerged for harmless, entertainment purposes. The term deepfake is a combination of deep learning and fake. The technology runs on deep learning techniques such as generative adversarial networks (GANs).
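The adversarial idea behind GANs can be illustrated with a deliberately tiny, hypothetical sketch: a one-dimensional "generator" learns to mimic numbers drawn near 4.0, while a logistic "discriminator" tries to tell real samples from generated ones, each improving against the other. Real deepfake systems use deep neural networks trained on images and audio; this toy, with made-up parameters throughout, only shows the training loop's shape.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real data": numbers drawn near 4.0 (stand-in for real images).
DATA_MEAN, DATA_STD = 4.0, 0.5

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: a linear map from noise z to a sample, x = w*z + b.
w, b = 1.0, 0.0
# Discriminator: logistic regression, D(x) = sigmoid(a*x + c), output in (0, 1).
a, c = 1.0, 0.0

lr, batch, steps = 0.05, 64, 2000
for _ in range(steps):
    real = rng.normal(DATA_MEAN, DATA_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    s_real = sigmoid(a * real + c)
    s_fake = sigmoid(a * fake + c)
    a -= lr * np.mean(-(1 - s_real) * real + s_fake * fake)
    c -= lr * np.mean(-(1 - s_real) + s_fake)

    # Generator step: push D(fake) toward 1 (non-saturating GAN loss).
    s_fake = sigmoid(a * fake + c)
    grad_x = -(1 - s_fake) * a          # d(-log D(fake)) / d fake
    w -= lr * np.mean(grad_x * z)
    b -= lr * np.mean(grad_x)

# After training, generated samples should cluster near the real data mean.
fake_mean = float(np.mean(w * rng.normal(0.0, 1.0, 1000) + b))
print(f"generator mean after training: {fake_mean:.2f} (data mean {DATA_MEAN})")
```

The same tug-of-war, scaled up to convolutional networks and face datasets, is what makes a deepfake's output progressively harder to distinguish from real footage.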

Malayalam director Devan says, “AI and deepfake were all fun for a while — but only until these technologies started to be used with utter carelessness. I used AI technology in the movie Valatty (2023) to make dogs talk. Little did I know then that it would be used to deepfake human beings without their consent.”

He says he is relieved to learn that the Union government has asked all social media platforms to remove deepfake content within 24 hours. “But the impact is minimal since the damage would already be done in that time frame.”

Devan notes that beyond regulations, something should change within the culture. “People should be aware of the consequences when they sneak into other people’s privacy and use content without permission. Kids must be educated in the school itself. Otherwise, this will spread like a virus that no one will be able to contain.”

Meanwhile, actress Manjima Mohan took to X to share her deep concern about deepfake technology: “This is so scary! Immediate action should be taken! Or else this might become a bigger threat in the future!”

Related: Rashmika highlights the need to ‘address it as a community and with urgency’

A dangerous trend: Athulya Ravi

Tamil actress Athulya Ravi expressed her solidarity with Rashmika Mandanna on X.

She wrote: “The misuse of AI and making deepfake videos is becoming dangerous day by day! I know it’s inevitable but let’s raise awareness and work together to combat such misuse of technology! Whether it’s a celebrity or a common girl the impact will be the same! Don’t feel down and stay strong Rashmika!”

Also, singer Chinmayi Sripada expressed on her X account that she truly hopes there is a nationwide awareness campaign to educate the public about the dangers deepfakes pose to girls.

Creative too

While deepfake is a double-edged sword, the same technology has been put to great use for good reasons too.

Akshay Kumar cites advertisements designed by a leading optical retail chain, which used the technology in a campaign featuring Bollywood actor Ayushmann Khurrana.

It was the next level of marketing using deepfake technology. It was a real-time interaction between the actor and shoppers, based on a combination of artificial intelligence, facial recognition, and augmented reality.

That’s not all: a leading producer of chocolate cookies recently ran a creative AI-powered ad campaign featuring Bollywood Badshah Shah Rukh Khan.

The ad allowed anyone to upload an image, which was then used to place that person alongside the actor. The campaign’s pitch read: “Let’s put you in our ad with SRK! We know it’s your fantasy too.”

Also Read: Teaser of Sunny Wayne-Shane Nigam’s ‘Vela’ released

Tollywood reacts

Rashmika Mandanna’s deepfake video has clearly left ripples in the film industry.

Besides the fact that an artificial intelligence tool can pass off a person with a morphed face as real, it raises pertinent questions about the safety of individual rights in society.

Many Tollywood celebrities reacted to the video and expressed deep shock and worry over the misuse of artificial intelligence.

“It’s truly disheartening to see how technology is being misused and the thought of what this can progress to in the future is even scarier. Action has to be taken and some kind of law has to be enforced to protect people who have been and will be a victim of this. Strength to you,” wrote Akkineni Naga Chaitanya.

Don’t be silent: Mrunal Thakur

Meanwhile, Mrunal Thakur, who is starring in Nani’s upcoming film Hi Nanna, took to her Instagram handle and expressed her views on the deepfake video of Rashmika Mandanna.

“Shame on people who resort to such things, it shows there is no conscience left at all in such people. Thank you Rashmika Mandanna for speaking up about the issue we have seen glimpses of but a lot of chose to remain silent.”

Mrunal added, “Every day there are morphed images and edited videos of female actors floating around on the internet zooming into inappropriate body parts. Where are we headed as a community, and as a society? We may be actresses in the limelight but at the end of the day, each one of us is human. Why aren’t we talking about it? Don’t remain silent. Now is the time.”

Vijay Deverakonda on deepfake

Actor Vijay Deverakonda reacts to deepfake video of Rashmika Mandanna. (Instagram)

Telugu actor Vijay Deverakonda, on his Instagram story, shared an article about the government cracking down on deepfakes after the viral video of his Geetha Govindam (2018) co-star Rashmika Mandanna.

Further, he wrote: “Extremely important steps for the future. This shouldn’t happen to anyone. Also, an efficient accessible cyber wing for quick crackdowns and punishment will make people more secure.”

Deepfake technology has far-reaching consequences and has gone way too deep into modern society.

So, the next time you come across something unusual and hard to believe, something that looks like the possible work of an AI trying to fool you, cross-check it using available verification tools. Most importantly, do not forward it at any cost, and do not forget to report it to the cyber authorities.

Also Read: Anasuya upset over skewed write-ups about ‘Pushpa 2’