Meta and Google face mounting pressure as experts warn that new regulations will necessitate full automation of content moderation.
Major technology firms operating in India are confronting a severe tightening of operational regulations. Starting February 20, platforms including YouTube and X must comply with a government order to take down unlawful content within three hours of receiving a notice, a sharp reduction from the previous 36-hour window.
The tech companies have yet to issue formal responses: Meta declined to comment, and others did not reply to inquiries. Industry observers, however, believe the shift will fundamentally alter how moderation works in a market of over one billion users.
The Human Cost of Automation
Anushka Jain, a research associate at the Digital Futures Lab, explained that platforms were already struggling to meet the old 36-hour deadline because of the need for human oversight. The new rule, she argues, leaves no room for judgment.
"If it gets completely automated, there is a high risk that it will lead to censoring of content," Jain said.
Jain welcomed the new provisions requiring the labeling of AI-generated content as a step toward transparency, but cautioned that the speed of the required takedowns poses a threat to free expression. The government has not explained the urgency behind the new deadline.
SOURCES: BBC, Digital Futures Lab, Tech Industry Inquiries.
