The Ministry of Electronics and Information Technology (MeitY) proposed key changes to the regulations governing intermediary liability in India during the last week of December 2018. These changes, while not yet finalised, are currently a topic of debate and discussion.
The amendments make fundamental changes to the safe harbour provisions currently contained in the Information Technology Act, 2000 (IT Act) and the Information Technology (Intermediaries Guidelines) Rules, 2011. The IT Act and Rules currently provide that an entity would not be liable for any third-party information or content made available or hosted by it, if it is an ‘intermediary’. Naturally, the intermediary in question is also expected to demonstrate, among other things, ‘due diligence’. The concept of due diligence itself is limited to: ensuring that appropriate rules, regulations, and policies are in place; taking curative action within thirty-six hours of receiving notice of unlawful content; preserving associated information and documents for a period of ninety days; and providing such assistance or information as required by a Government or law enforcement agency for the prevention, detection, investigation, prosecution and punishment of offences.
The amendment, however, seeks to expand the obligations of, and expectations from, such intermediaries. Here are some of the crucial pitfalls of the proposed new intermediary liability regime:
On-soil requirement
The amendment makes it mandatory for any intermediary with more than 50 lakh users in India to be a company incorporated in India. This has far-reaching implications for intermediaries who operate global platforms from offshore locations. One could argue that the requirement of local presence is essential to enable enforcement agencies to effectively manage any threat to the safety or security of citizens. However, the requirement of appointing a nodal person of contact and an alternate senior designated functionary (as prescribed under the proposed amendment) would serve exactly this purpose. The government can easily liaise and co-ordinate with such appointed officials to enforce law and order. Mandating local presence in the form of an incorporated entity appears to be an excessive measure on the part of the government.
Notably, all companies incorporated in India are already mandated to have a physical, registered office in the country, and the address of that registered office must be submitted to the Registrar of Companies immediately after incorporation. Consequently, the second requirement, that the intermediary have a permanent registered office in India with a physical address, would be superfluous.
Another important concern that begs clarification is the consequence of not complying with the local-presence requirement. Would such non-compliance imply that the intermediary is prevented from offering Indian users access to its platform altogether, or merely that the intermediary would not be able to take advantage of the safe harbour provisions that would otherwise have been available to it? The former position would mean that any intermediary seeking to provide services to Indian residents would need to do so from India. Such entities would not only have to set up shop in India, but also replicate their platforms solely for the local region. Requiring such drastic steps would pose great practical and economic challenges for intermediaries.
Proactive content monitoring
Another critical change sought to be made is the requirement for intermediaries to deploy technology-based automated tools or appropriate mechanisms, with appropriate controls, for ‘proactively identifying and removing or disabling public access’ to unlawful information or content. The importance of placing greater responsibility on intermediaries for content hosted by them cannot be denied, in the wake of recent incidents such as mob violence triggered by fake news. However, requiring intermediaries to ‘proactively’ filter content would be grossly unfair and impractical to implement. Intermediaries such as social media platforms may not be able to put in place such proactive censoring mechanisms owing to the volume of information received, processed or hosted by them on a daily basis.
While intermediaries could potentially develop and launch artificial intelligence (AI) programs to assist with the filtration process, doing so would require enormous investment, and would not automatically solve the issues surrounding unlawful content online. Consequently, censorship by such platforms would involve recruiting several moderators, who would then be expected to pore over copious amounts of information and either approve it or block access to it. Although this may be done on a reactive basis, as and when the intermediary is made aware of such content, requiring proactive censorship on such a wide basis would be extremely onerous. Notably, several smaller entrepreneurs may not have the wherewithal to undertake such a cumbersome process during the initial years of their business. This assumes even more importance in light of the ambiguous nature of the term ‘unlawful content’.
The direction to proactively censor information also seems to be in direct conflict with the notice-and-take-down process which has, until now, formed one of the pillars of India’s safe harbour regime. The proposed changes would negate the Supreme Court’s verdict in Shreya Singhal v Union of India, where the Court noted the dangers of requiring private parties to adjudge the lawfulness of content on their platforms. Enacting the proposed amendment is likely to promote mass private censorship, and may trigger a wave of over-aggressive content removal by intermediaries for fear of attracting unwanted attention and liability.
Scope of ‘unlawful content’
A key concern that remains unaddressed in the proposed amendment is the exact scope of what would constitute ‘unlawful content’. Rule 3(2) prohibits intermediaries from knowingly hosting or publishing information which, amongst other things, may be ‘grossly harmful, harassing, blasphemous, defamatory, obscene, pornographic, paedophilic, libellous, invasive of another's privacy, hateful, or racially, ethnically objectionable, disparaging, relating or encouraging money laundering or gambling, or otherwise unlawful in any manner whatever’. Terms such as ‘harmful’, ‘blasphemous’ and ‘obscene’ are subjective, and their interpretation may vary from person to person.
The lack of clarity in the definition of unlawful content may lead to the much-feared ‘censorship creep’. Without specific illustrations and reasoned guidelines, ambiguous terms may be susceptible to broad and over-reaching interpretations which exceed their legislative intent. For example, an unclear definition of ‘hateful’ content may be used to suppress legitimate dissent or prevent newsworthy content from being made public. On the other hand, limiting unlawful content to material containing child pornography, rape videos, and recordings of real-life violence (so-called ‘snuff films’) would be easier to monitor, while at the same time safeguarding users’ freedom of expression.
Enabling traceability of unlawful content
As per the proposed amendment, intermediaries must now enable law enforcement authorities to trace the origins of any unlawful content on their platforms. Essentially, this requirement would disallow true end-to-end encryption for communications (as currently provided by WhatsApp, for example), thus potentially jeopardising users’ right to privacy. Again, while the need to identify offenders cannot be denied, doing away with encryption in totality, and thereby restricting users’ ability to express their opinions freely without fear of surveillance, would not be desirable. Stakeholders must instead work with Government agencies to find a solution which strikes a balance between these two seemingly opposing ideas.
Checks and balances
Lastly, the proposed amendment does not appear to make any room for checks and balances to ensure that the power to censor is not abused or overused, by the government and intermediaries alike. It may be useful to have intermediaries publish reports with details of the agencies that have made take-down requests, the frequency of such requests, and a brief description of the unlawful content in question. This would, to a certain extent, ensure that newsworthy matters, including political criticism and dissent, are not suppressed under the garb of unlawful content.
The article is authored by Sherill Pal, senior associate, J Sagar Associates.
Views expressed are personal.