Effective regulation

A draft of The Information Technology Intermediaries Guidelines (Amendment) Rules, 2018, was issued in December last year by the Ministry of Electronics and Information Technology and put up for public consultation till January 31, 2019. The draft seeks to replace the existing rules, framed in 2011, for governing “intermediaries” — a term that covers all information technology (IT) and IT-enabled services (ITeS) platforms and content aggregators, including network service providers, internet service providers, search engines, online payment sites, and social media platforms. The amendment was formulated to increase the accountability of social media platforms. The proposed guidelines would ensure that major global social media platforms comply with local laws. Any platform with over 5 million users in India, or any platform specifically notified by the government, must be incorporated under Indian law and have a registered office in India. Such an entity must also appoint a nodal compliance officer in India to coordinate with law enforcement agencies and monitor compliance with Indian law. In short, big technology companies must be regulated properly and effectively, given the tendency to misuse social media to spread fake news and rumours.

But the draft has some disquieting aspects. It tightens the web of surveillance and censorship by drawing private parties into the exercise, which implies a loss of privacy and anonymity for social media users, and it uses very broad, undefined terms to describe content that is to be taken down. It may be noted that India still has no privacy law and no explicit protection for digital speech and privacy. First, the draft guidelines ask intermediaries to “trace” content creators within 24 hours of receiving such a demand from the authorities. This is technically difficult to do without breaking encryption, which, in turn, implies a complete loss of both privacy of content and anonymity. Second, the amendments require intermediaries to deploy technology to proactively identify and remove content of various kinds. This could usher in private sector censorship alongside private sector surveillance. Such content is described using highly subjective, undefined terms such as “disparaging” and “harmful”. Moreover, “blasphemy”, which is not a crime, is cited as a ground for a takedown.

The role of IT intermediaries is contentious in most jurisdictions. Social media platforms that allow commentary will inevitably receive anonymous content. Enlightened democracies, which have both privacy laws and free speech protections, treat such content on the basis of two broad provisions. One is a “safe harbour” provision, under which the intermediary itself is not held liable if content created by an anonymous poster turns out to be unlawful or defamatory; the intermediary is merely required to take down such content within a certain time frame once notified by the authorities. The second is that content created by anonymous persons is treated on the same basis as offline content (in printed media and books) and online content generated by known parties: if the content is lawful, it should not be taken down merely because it was created by an anonymous poster. These common-sense principles are missing from the draft. Private sector intermediaries should not be asked to perform censorship and surveillance. Any takedown should be based on a court order, or an order by a senior official in the government department concerned. Moreover, there should be a provision for appealing against a takedown.