Published Aug 17, 2025 | 2:00 PM | Updated Sep 12, 2025 | 11:35 AM
Synopsis: Caste in Tamil Nadu is not a relic of public politics or sporadic violence—it is the intimate, everyday architecture of life. It begins within the household and extends into language, food, marriage, ritual, spatial segregation, and even the imagination of “respectability” or “honour.” At the micro level, caste is narrated through stories of ancestral “valour,” warnings about whom to befriend, subtle ridicule of other communities, and gendered instructions about marriage and sexuality.
On 27 July 2025, in Tamil Nadu’s Tirunelveli district, 26-year-old software engineer Kavin Selva Ganesh was hacked to death in broad daylight. His offence? Loving Subashini, a woman from the dominant Maravar caste.
Within hours, Instagram reels began flooding timelines—edited with mass-hero scores, blood-red text, and slow-motion visuals, they celebrated Subashini’s brother, Surjith, as a “brave protector” and “athlete who upheld honour.”
These videos amassed thousands of views before moderators intervened—and by then, the damage was done. This was not random online chatter. It was a coordinated digital glorification of murder, showing that social media in Tamil Nadu now does much more than record caste violence—it manufactures it.
Caste in Tamil Nadu is not a relic of public politics or sporadic violence—it is the intimate, everyday architecture of life. It begins within the household and extends into language, food, marriage, ritual, spatial segregation, and even the imagination of “respectability” or “honour.” At the micro level, caste is narrated through stories of ancestral “valour,” warnings about whom to befriend, subtle ridicule of other communities, and gendered instructions about marriage and sexuality.
Children absorb caste not only from explicit prohibitions but from who sits where, who eats what, and who is permitted in the kitchen.
This intimate socialisation builds into structured community life—caste-based associations, temples, and endowments—that have historically functioned as both cultural hubs and instruments of economic control. Even during the Dravidian social justice era, dominant castes rebranded caste pride as “heritage” and “tradition,” maintaining endogamy and resource control.
The digital age has done nothing to dismantle these structures. It has simply reconfigured them. Instagram, in particular, has become a curated safe space for caste identity performance—private accounts flaunt symbols, arrange marriage alliances, and circulate “pride” content away from outsiders’ scrutiny. Closed-group reels recycle casteist slogans, songs, and memes, reinforcing prejudice while recruiting new offline loyalists.
These echo chambers normalise hate as entertainment, especially among youth.
Caste in Tamil Nadu is not archaic—it is a living, self-renewing order, now fuelled by algorithms. Its endurance stems from its invisibility to beneficiaries and omnipresence to victims.
Instagram has become the new street corner where young men once gathered to talk politics and alliances. But this digital street corner has a global audience, no curfew, and no police presence unless a complaint is filed. Here, dominant-caste networks find “safe spaces” in coded visuals: hashtags and colour schemes (red–yellow, green–yellow, for Gounders, Thevars, and other castes) that are visible only to insiders and invisible to moderators and algorithms.
First, there is provocation. Reels glorify dominant-caste “heroes,” often recasting Tamil film villains, overlaying violent imagery with coded music. Amplification follows: algorithms reward engagement, pushing such content into sympathetic feeds.
Action then unfolds: offline violence, especially against Dalit youth in intercaste relationships, occurs amid a climate primed for it. Finally comes glorification: reels appear celebrating the perpetrators, with mugshots, temples, “mass” scores, or repurposed anti-caste film scenes reframing violence as honour.
Instagram’s content recommendation system is primarily driven by machine learning models designed to maximise user engagement.
The algorithm analyses individual user behaviour—including likes, comments, watch time, shares, and account follows—to create a personalised content feed that keeps users on the platform longer.
Crucially, this system rewards content that triggers strong emotional responses, such as outrage, pride, or anger, which caste-pride and caste-violence reels expertly provoke.
When users interact with caste-coded reels—characterised by caste-specific colour schemes, symbols, Tamil-caste slang, and snippets of casteist songs—the algorithm infers affinity and relevance.
This triggers a feedback loop: the more engagement such content receives, the higher it ranks not only in the user’s feed but also on the Explore page and in Reels recommendations. This reinforcement dramatically expands the content’s reach within caste-affiliated networks.
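To make the mechanics concrete, here is a minimal sketch, assuming a simple hand-weighted engagement scorer rather than Instagram’s actual ranking models: reels that provoke more reactions per impression rank higher, earn more impressions, and therefore keep climbing.

```python
# A rough sketch, not Instagram's actual code: rank reels by reactions per
# impression. High-arousal content earns more reactions, ranks higher, gets
# more impressions, and so keeps climbing -- the feedback loop described above.
from dataclasses import dataclass

@dataclass
class Reel:
    reel_id: str
    likes: int = 0
    comments: int = 0
    shares: int = 0
    avg_watch_seconds: float = 0.0
    impressions: int = 1  # avoid division by zero

def engagement_score(reel: Reel) -> float:
    """Per-impression engagement; the weights here are purely illustrative."""
    raw = (1.0 * reel.likes + 2.0 * reel.comments
           + 3.0 * reel.shares + 0.5 * reel.avg_watch_seconds)
    return raw / reel.impressions

def rank_feed(candidates: list[Reel]) -> list[Reel]:
    """Top-ranked reels get shown more, gather more reactions,
    and rank even higher on the next pass."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

In a real system the weights are learned rather than hand-set, but the incentive structure is the same: whatever provokes a reaction rises.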
Instagram’s models leverage collaborative filtering, recommending content that similar users engage with. Given caste pride’s social clustering, users within a caste group are algorithmically funnelled into “echo chambers,” where casteist content recirculates and intensifies. The algorithm’s implicit assumption of shared interests fails to account for the social harm of such clustering.
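A toy illustration of that clustering effect, assuming a basic user-to-user collaborative filter rather than Meta’s implementation: recommendations are drawn from the most similar users’ engagement histories, so a socially clustered group is fed its own content back.

```python
# A minimal, assumed illustration of why collaborative filtering builds echo
# chambers: each user is recommended whatever their nearest neighbours engaged
# with, so a tightly clustered group keeps recirculating its own reels.
import numpy as np

# Rows = users, columns = reels; 1 = engaged. Users 0-2 form one caste-affiliated
# cluster in this toy example; user 3 engages with unrelated content.
interactions = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
])

def recommend(user: int, k: int = 2) -> list[int]:
    """Recommend reels engaged with by the user's k most similar neighbours."""
    norms = np.linalg.norm(interactions, axis=1)
    sims = interactions @ interactions[user] / (norms * norms[user] + 1e-9)
    sims[user] = -1  # exclude the user themselves
    neighbours = np.argsort(sims)[-k:]
    scores = interactions[neighbours].sum(axis=0)
    scores[interactions[user] == 1] = 0  # skip already-seen reels
    return [int(i) for i in np.argsort(scores)[::-1] if scores[i] > 0]

print(recommend(0))  # [2] -- a reel circulating inside the same cluster
```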
Instagram’s automated systems for detecting harmful content rely largely on visual pattern recognition and keyword scanning. Tamil caste slang, cultural symbolism (e.g., red–yellow threads or flags), and repurposed film songs embedded in videos often evade detection because automated classifiers are predominantly trained on English data and broad global content. This linguistic and cultural blind spot allows casteist reels to bypass moderation filters.
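The blind spot is easy to demonstrate with a toy keyword filter, assumed here purely for illustration: a blocklist keyed to English terms never fires on Tamil-language captions, colour-coded emoji, or lyrics carried inside the video file itself.

```python
# A toy keyword filter (not any platform's real moderation pipeline) showing the
# blind spot described above: an English-only blocklist cannot match Tamil text,
# coded colour emojis, or audio embedded in the video.
BLOCKLIST_EN = {"hate", "kill", "attack"}   # placeholder English-only terms

def flag_caption(caption: str) -> bool:
    """Flag a caption only if it contains a blocklisted English token."""
    return any(token in BLOCKLIST_EN for token in caption.lower().split())

# A caste-pride caption built from Tamil text and colour-coded emoji signals
# contains none of those tokens, so it sails straight through.
coded_caption = "எங்கள் சாதி பெருமை 🔴🟡 #veeram"
print(flag_caption(coded_caption))   # False
```

Catching such content would require Tamil-language training data and analysis of audio, symbols, and imagery, not just text matching.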
Instagram Stories, and reels shared through them, disappear after 24 hours, minimising the window for content review and user reporting. This ephemerality reduces the likelihood of persistent evidence, making it difficult for human moderators or automated systems to take timely action.
The algorithm’s reward system encourages creators to escalate content intensity to gain followers, likes, and sponsorships. Provocative or violent caste-pride reels generate rapid engagement spikes, which the algorithm interprets as high-value signals, reinforcing the cycle of extremity.
Together, these algorithmic features create a digital environment where caste-pride content and the glorification of violence thrive unchecked, effectively making Instagram a safe space for caste-based hate and violence, normalised through amplification rather than challenge.
Upon a caste-based killing, the propaganda machine activates. Archival footage of the accused at festivals or rallies is turned into heroic content. Folk songs and dance numbers with veiled caste references are laid over images of weapons, blood, and sirens. Anti-caste films like Pariyerum Perumal and Mamannan get cynically remixed to valorise aggressors. This narrative lock-in ensures the killing is perceived as an act of honour before any alternative is heard.
The virulent caste-pride content and honour-killing glorifications circulating in Tamil Nadu’s digital spaces are far from isolated phenomena. Instead, they form part of a disturbing global pattern where social media platforms become accelerators of ethnic, religious, and caste-based hatred and violence.
This architecture of digital hate was tragically visible in Myanmar’s 2017 Rohingya genocide, where Facebook’s algorithms amplified inflammatory posts, enabling hate speech that spiralled into mass ethnic cleansing. The United Nations later identified Facebook’s platform as playing a “determining role” in enabling these atrocities.
Similarly, in Ethiopia’s 2021 conflict, Meta’s platforms were accused of facilitating ethnic violence between the Tigrayan and Amhara groups, with algorithms inadvertently promoting divisive and incendiary content. In the United States, social media platforms hosted and amplified militia organising ahead of the 2020 Capitol riots, culminating in real-world violence and loss of life.
Across these diverse contexts, the core mechanics remain consistent: platforms prioritise sensational, emotionally charged content because it maximises user engagement and profit. Hate, fear, and outrage spread faster and further than measured or neutral discourse. When moderation is slow, insufficient, or culturally blind, hateful content normalises within digital communities and translates into offline harm.
Tamil Nadu’s caste-hate reels fit squarely within this global pattern.
The caste divisions in Tamil society—ancient and deeply embedded—are weaponised online with the same technological tools and business models that fuel ethnic and religious violence worldwide.
The platforms’ algorithms do not distinguish caste as a protected category the way race or religion might be in other jurisdictions, allowing caste-based hate speech to flourish unchecked.
The coded language, symbolic imagery, and cultural cues used in caste-pride content mirror the subtle dog whistles seen in other ethnic hate speech online, creating tightly knit echo chambers that are algorithmically reinforced.
Moreover, the transnational nature of platforms means Tamil caste hate is no longer confined regionally; it is now digitally exportable, reaching diaspora communities and connecting caste groups across borders. This global digital architecture of hate thus forms an interconnected web where local prejudices are amplified by international platforms, contributing to a worldwide crisis of social media-enabled violence.
The statistics are harrowing. Evidence, an NGO, reported 195 known honour killings in Tamil Nadu over five years, almost all involving a dominant-caste woman and a Scheduled Caste or OBC man. Nationally, the Union Home Ministry cited 145 honour killings between 2017 and 2019, a figure likely undercounted, as many cases are recorded simply as homicide.
These crimes are rarely spontaneous. They are planned acts of social control, often backed by family and community consensus—and now, deeply embedded in digital performance.
Some creators resist. Parithabangal, a YouTube channel, released Society Paavangal, a satirical takedown of caste honour violence. Anti-caste audiences lauded it—but it was swiftly met with abuse, mass reporting, and police complaints. The asymmetry is stark: pro-caste content thrives; anti-caste voices remain fragile.
Compounding this is legal inertia. The Freedom of Marriage and Association and Prohibition of Crimes in the Name of Honour Bill, 2022, which would specifically criminalise honour killings, remains stalled. In its absence, such murders are routinely recorded as “family disputes.” Indian IT rules do not treat caste as a protected category: content targeting caste that would be restricted in many Western jurisdictions persists in India, and caste-coded slang, imagery, and music routinely evade moderation.
In today’s digital age, the weapon of violence no longer lies solely in the knife or the machete — it resides in the mobile phone we carry in our hands. This device, meant to connect us, inform us, and empower us, has been transformed into a potent tool of hate, a weapon that amplifies caste pride and legitimises murder in real time.
Each tap, each swipe, each shared reel fans the flames of a brutal cycle — where violence is not just committed but broadcast, celebrated, and immortalised online. The killers of 2025 do not hide in shadows; they perform their brutality for the camera, fully aware that their faces, names, and slogans will echo endlessly across digital platforms. The reel is no longer a mere recording — it is an active participant in the crime, a catalyst that turns private hate into public spectacle.
This self-sustaining ecosystem thrives on engagement, transforming spectators into accomplices and bystanders into enablers. Until we recognise the mobile phone as both a tool of connection and a weapon of destruction, and demand that platforms take responsibility for the content they amplify, this cycle of caste violence will only deepen, spreading like wildfire through digital networks.
The fight against caste violence today is not only on the streets but also in the algorithms, feeds, and reels that dominate our screens. Because the reel you are sharing is not an afterthought—it is part of the caste honour killing.
(Edited by Majnu Babu).