Govt issues warning to X, YouTube, and Telegram over failure to remove child sexual abuse material

Social media giants, currently protected by Section 79, may lose their immunity as they fail to remove harmful user content. Enforcing the Information Technology Intermediary Guidelines could land Facebook, YouTube, Instagram, and X in legal trouble in India.

The Indian government is reassessing the immunity currently granted to social media giants under the IT Act, as several platforms have allegedly ignored official notices regarding harmful content. The concerns centre on child abuse content, the spread of misinformation, religious incitement, and mismanaged algorithms.

Technology Giants Under Scrutiny

Large-scale tech companies that profit from transmitting third-party content currently enjoy safe-harbour protection under Section 79 of the IT Act. However, their failure to remove damaging user content from their platforms has prompted the authorities to reconsider. If provisions of the Information Technology Intermediary Guidelines are invoked, platforms including Facebook, YouTube, Instagram, and X (formerly Twitter) could face criminal liability under the Indian Penal Code.

Minister of State for IT and Electronics, Rajeev Chandrasekhar, told the Times of India that the government takes a “serious view” of platforms such as YouTube (owned by Google), X, and Telegram repeatedly breaching guidelines. The government plans to invoke Rule 7 of the IT Intermediary Guidelines, 2021, which could strip non-compliant platforms of their immunity.

YouTube’s Reply to Govt Notice

YouTube, the leading video streaming platform owned by Google, reiterated its unequivocal stance against the spread of child sexual abuse material (CSAM) on its site. Following a detailed examination of its platform, YouTube reported that it had found no evidence of such content.

“Our track record demonstrates our resolute commitment to combating child exploitation on YouTube. Our searches yielded no evidence of CSAM content, and no such material was forwarded to us by regulatory bodies,” a YouTube representative assured in a statement. 

The tech giant highlighted its ongoing efforts to keep the platform safe for all users, particularly minors, noting, “We prohibit any content that could put children at risk on our platform, and we consistently invest heavily in the teams and technologies that identify, eliminate, and prevent the circulation of such content.”

In an official email, the spokesperson further emphasized the company’s commitment to collaborating with all stakeholders in the “industry-wide fight” to eradicate the dissemination of CSAM.

Addressing child safety concerns, YouTube took down over 94,000 channels and removed more than 2.5 million videos in a single three-month period of 2023 for violations of its child safety policy.

According to YouTube, in India it displays a warning on CSAM-related search results. The warning informs users that child pornography is a crime and directs them to the national cybercrime reporting portal.

Chandrasekhar Calls for Accountability in Digital India Act

The review coincides with legal action in the United States, where multiple states have sued Meta Platforms and its Instagram unit for allegedly fuelling a youth mental health crisis with addictive content. Chandrasekhar was quoted by the Times of India as emphasizing that “companies must be accountable to digital citizens”.

The statement signals a move towards stricter oversight of such companies and hints at the forthcoming Digital India Act, proposed legislation intended to replace the existing IT Act with a focus on platform responsibilities and accountability. The platforms’ failure to act on child sexual abuse material (CSAM) is a primary reason for considering Rule 7, which would withdraw the immunity granted under Section 79.

Proposed Digital India Act & Meta Charges

The government also plans to address the addictive design of online platforms through new gaming rules under the forthcoming Digital India Act. Responding to growing global concerns, the Act is expected to be placed before the cabinet soon as a replacement for the existing IT Act. The new legislation is expected to increase accountability for social media platforms and revisit the immunity they currently enjoy over content violations.

Moreover, Meta Platforms, which also operates Facebook, faces allegations in the US of misleading the public about the dangers of its services and knowingly drawing young users into addictive social media behaviour.

Child sexual abuse material remains one of the most pressing issues on the internet, demanding urgent solutions worldwide. Globally, the debate around end-to-end encryption is intensifying, with critics arguing that such strengthened security can be exploited to disseminate harmful content.

With the Information Technology Act, 2000, providing the legal framework to address online pornographic content, including child sexual abuse material, the conversation about making platforms more accountable for their content is becoming increasingly important.
