Last week, the government published a set of draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Once enacted, digital intermediaries will need to ensure that the community standards to which they hold their users accountable are in accordance with Indian law and the constitutional principles of India. This, the government said, became necessary because a number of intermediaries have taken it upon themselves to act in violation of the rights of Indian citizens.

The new amendments also propose the creation of a grievance appeal committee that will be responsible for dealing with "problematic content" expeditiously. Users dissatisfied with the way an intermediary has handled their complaint will be able to appeal its decision to this body and have the matter resolved within 30 days.

The proposal has been met with widespread dismay. Editorials, including in this newspaper, have called it yet another attempt by the government to restrict or interfere with free speech. The remark that some intermediaries acted in violation of citizens' rights is being interpreted as a pointed reference to instances where intermediaries, despite pressure from the Center, refused to remove content that did not violate their community guidelines. As a result, what the government sees as an escalation mechanism offering users redress against unfair decisions by the social media platforms they use, many in civil society see as merely a tool of government censorship.

Both sides are right. And a little bit wrong.

Content moderation is a tricky issue. It calls for each piece of offending content to be assessed against a number of different legal standards: who authored it, the harm it might cause to someone's reputation, its legality given the age of its target audience, and so on. Even though social media platforms are designed to enable freedom of expression, they must also eliminate, or at least mitigate, the harms that can arise from its exercise. They must strike a balance between the rights of those who publish and the rights of those offended by what is published.

Content moderators often have complex decisions to make. They have to decide what stays up and what needs to be removed. Where do you draw the line between speech that is acceptable and speech that is not? More often than not, the issues are clear enough that even poorly trained content moderators can get them right. Occasionally, however, even the most experienced among their ranks will not.

Some of the decisions they have to make in the course of a working day are so difficult that even the best legal minds would be stumped. Are disparaging remarks posted about an individual defamatory, as alleged? Do they malign his character with falsehoods, or are they in fact based on truth? Is this remix of an existing song original enough to be considered a new work, or does it require a license from the copyright holder before it can be published? Is a statement expressing angst over a decision merely normal human frustration, or an attempt to foment violent unrest?

Civil society fears that if the proposed grievance appeal committee gives the government the final say on issues like these, it will use that power to suppress dissent. While I share this concern, I have similar reservations about leaving all of these decisions entirely in the hands of private enterprise.

After all, not all appeals to the grievance appeal committee will be about government takedowns. Some will deal with illegal content, such as copyright violations. What happens if an artist's original composition is flagged for removal over an imagined infringement, in a decision that is not overturned even on appeal? For most emerging artists, the only path to commercial success is being able to reach the audiences that these social networks offer. If they are forced, without recourse, to take down their content, it could be the end of their careers.

That said, it is impossible to ignore the concern raised by civil society. If an appeal is filed by a government agency whose takedown notice has been rejected, isn't it likely that a government-appointed appeal panel will decide in favor of its own agency? How do we mitigate the likelihood of such an outcome while preserving the right of appeal?

One solution could be for the industry to establish a self-regulatory appeal body to which appeals against all content moderation decisions can be directed. It could be composed of a cross-section of industry and legal experts, so that its decisions are sufficiently robust, informed by industry context as well as by applicable laws and legal precedent.

Ideally, this body should function as an appeals forum for all content moderation decisions, regardless of the platform from which an appeal originates. This would place it outside the control of the platforms themselves, giving the process a measure of independence that internal grievance systems lack. And since it will not be run by the government, it should have the neutrality required to remain impartial when deciding on takedown notices issued by the government.

The government has already indicated that it is ready to consider self-regulatory alternatives. It is now up to the industry to get the design right.

Rahul Matthan is a partner at Trilegal and also has a podcast under the name Ex Machina. His Twitter handle is @matthan
