
Outrage as Child Sexual Abuse Material Continues to Flourish on Social Media, Even in 2024


All of us have, at some point in our lives, heard the phrase 'Children are the future and the pillar of the nation!'

But how well are children actually being cared for in this nation? Today we will not be talking about the various facilities provided to children in education and health, which promise a healthy future; the concern is the protection of the child itself, where cases of CSAM continue to rise.

Child Sexual Abuse Material (CSAM) is a severe and corrosive problem in today's digital world. Social media platforms are exploited by predators to share such material, to form networks of like-minded offenders, and to sell thousands, even lakhs, of these images and videos.

In a chilling and infuriating revelation, a recent investigation has shed light on the disturbing prevalence of child sexual abuse material (CSAM) being circulated on major social media platforms in India, including Instagram and Telegram.

How frightening! It's 2024, and still we find these trends haunting society and targeting its most vulnerable group. And how disappointing it is that platforms like Instagram and Telegram have become hubs for this trade. The scourge of paedophilia and the exploitation of children remains rampant, facilitated by the very platforms that claim to safeguard their users.

It is disappointing on many levels, but what infuriates me is that platforms like Meta, which claim to hold security, safety, and privacy so close, are now being used by abusers. Even after heightened awareness and numerous pledges from the responsible companies to crack down on such vile activities, the problem persists with alarming regularity.

Meta has faced repeated challenges on this front. Independent research from the Stanford Internet Observatory and the Canadian Centre for Child Protection revealed widespread CSAM distribution across its platforms, spanning Facebook groups and Instagram. Meta has previously taken action, including hiding 190,000 groups from searches and disabling tens of thousands of accounts; however, this isn't enough, and progress remains slow. The problem lies in the fact that Meta's platforms are slow to act on complaints and reports.

THE PATTERN THAT PERSISTS

Investigations, along with statements given by the abusers themselves, have revealed a consistent pattern used by the offenders.

A small snippet, known as a teaser, is posted on various Instagram pages under seemingly unrelated hashtags, which eventually places it on users' feeds. Once it reaches the intended user, these reels or videos point to links to Telegram channels, where child abuse videos are sold for anywhere between Rs. 40 and Rs. 5,000, making Instagram a breeding ground for this abhorrent content. The sheer scale of the problem is staggering, with countless profiles and posts devoted to the distribution of CSAM.

Telegram, an app widely used for accessing movies and videos online as an alternative to paid OTT platforms, hosts numerous private groups where explicit videos and images are sold.

This continued exploitation and circulation of child sexual abuse material on social media does not merely point to a failure of society, of human beings failing at humanity; on a larger scale, it is a systemic failure.

Any Instagram user knows how the algorithm works and how strictly the community rules are enforced, flagging any content deemed 'inappropriate' and taking it down within seconds. Yet CSAM circulates without any such hindrance, which shows that even with advanced AI and machine-learning tools designed specifically to detect and block such content, perpetrators continue to find ways to circumvent these safeguards.


The question is: what are these social media sites doing to eradicate this? The response from their end has been woefully inadequate. While platforms like Instagram and Telegram proudly declare themselves wholly committed to eradicating CSAM from their networks, their actions have fallen far short of their promises. Investigations have exposed the sluggishness of the responsible platforms, with explicit content often remaining accessible for extended periods.

These incidents raise significant concerns about the protection of children online. Child protection is addressed by numerous international treaties and national laws, yet enforcement remains patchy at best. The incidents have also sparked calls for stricter regulation and harsher penalties, both for the distributors of CSAM and for the platforms that enable them.

PROTECTION OF CHILDREN FROM SEXUAL OFFENCES (POCSO) ACT

In India, the Information Technology Act, 2000, and the Protection of Children from Sexual Offences (POCSO) Act, 2012, provide the legal framework for tackling online child exploitation.

At its heart, the POCSO Act protects children from sexual offences by mandating strict punishments and the compulsory reporting of such crimes. For the cases discussed here, it is complemented by the Information Technology Act, which addresses cyber offences, including the transmission of explicit content.

However, enforcement of these laws is hampered by bureaucratic inertia and the sheer scale of the problem. Beyond this failure lies the heartbreaking reality that CSAM continues to be uploaded at an alarming rate. In 2019, as per reports, over 19 lakh CSAM images and videos were circulated, the highest figure in the world.

Each of these images and videos represents the abuse and exploitation of a child, often with lifelong consequences. Survivors of such abuse are frequently haunted by the trauma and shame that follow. Action must be taken against such people, because behind every piece of CSAM stands a child whose life has been irrevocably damaged.

As has also been made clear, this is not a new issue; these problems have eroded society for a very long time. Let's look at one such case.

The Arrest of Neeraj Kumar Yadav and Kuljeet Singh Makan for CSAM Distribution

In one significant case, the Central Bureau of Investigation (CBI) arrested Neeraj Kumar Yadav, an engineer, and Kuljeet Singh Makan for their involvement in the sale and purchase of CSAM via social media platforms, notably Instagram. Both were charged under the Protection of Children from Sexual Offences (POCSO) Act, 2012, and relevant sections of the Information Technology Act.

The CBI, tasked with investigating severe crimes, coordinated with local law enforcement to detect and apprehend the suspects. The misuse of Instagram for distributing illegal content highlights the need for social media platforms to rigorously monitor, report, and remove such material in compliance with IT regulations.

This case underscores the persistent threat of CSAM online and the critical role of robust legal frameworks, technological surveillance, and public awareness. Enhanced collaboration between law enforcement agencies, social media platforms, and the public is essential to combating online child exploitation effectively.


Why Are We Still Stuck Here?

These incidents have ignited a wave of anger and frustration, but what else? No action can be taken and no change can be accomplished without the proper support of the administration. It is high time the authorities understood the urgent need for greater international cooperation to tackle the global nature of this problem. A more coordinated and collaborative approach is essential to dismantling these networks and bringing perpetrators to justice.

The time for excuses and half-measures is over.
