The Centre’s stance aligns with Section 69 of the Information Technology Act, 2000 (IT Act), under which an intermediary must extend ‘all facilities and technical assistance’ to the government to decrypt data. This is supplemented by Section 2(g) of the IT Rules, 2009, which states that platforms must provide such decryption facilities to the ‘extent possible’.
Facebook Inc has argued that messages on its mobile application WhatsApp are secured by end-to-end encryption. The technology works like this: when a user sends a message, it is encrypted with the recipient’s public key. Only the intended recipient holds the matching private key to unlock it, and the company says it has no access to this key. The ‘extent possible’ provision in the Rules, Facebook argues, can only mean the platform’s ability to decrypt, which, in this case, it lacks.
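The scheme described above is public-key encryption: anyone can encrypt with a recipient’s public key, but only the holder of the matching private key can decrypt. A minimal textbook-RSA sketch in Python illustrates the idea (toy numbers only; WhatsApp in fact uses the Signal protocol with much larger keys, so this is purely conceptual):

```python
# Toy textbook-RSA sketch -- illustrative only, not how WhatsApp works in detail.

# Hypothetical small primes for demonstration; real keys are 2048+ bits.
p, q = 61, 53
n = p * q                # public modulus
phi = (p - 1) * (q - 1)
e = 17                   # public exponent  -> public key is (n, e)
d = pow(e, -1, phi)      # private exponent -> private key is (n, d)

message = 42                       # a message, encoded as a number < n
ciphertext = pow(message, e, n)    # sender encrypts with recipient's PUBLIC key
decrypted = pow(ciphertext, d, n)  # only the recipient's PRIVATE key decrypts

print(decrypted == message)  # True
```

The point at issue in the case follows directly: a server that relays only `ciphertext` and never holds `d` cannot read the message, which is the basis of Facebook’s ‘extent possible’ argument.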
The argument has, therefore, transformed into a matter of national security versus individual privacy. “National security is paramount, but after the Puttaswamy case, ‘Right to Privacy’ has been declared a fundamental right. This case tests the applicability of this right vis-à-vis the former,” says Dhruv Suri, partner at PSA Legal.
Many groups have argued that these guidelines could have a chilling effect on freedom of speech under Article 19. “We cannot jeopardise the right to privacy of citizens of this country, either. There should not be a situation where the government has easy access to each and every chat just because the Rules are vague,” says Kazim Rizvi, founding director of The Dialogue, a policy think-tank.
The Centre says the liability to decrypt has always lain with intermediaries under the IT Act. It argues that firms cannot build an encryption mechanism and subsequently claim they cannot decrypt. “The intermediaries, however, say they don’t have a problem with decryption, but the government must do it,” adds Suri.
Policy experts have argued that end-to-end encryption is not a technology restricted to intermediaries, and that forcing them to decrypt would simply push criminals to build new encrypted services. And since the government has widened the scope of those who can request access to “any government agency”, they say the Rules pose a potential mass-surveillance threat.
The Centre has also prescribed a limit of just 72 hours to take down a post it deems as “threatening public health or safety” (which many have argued is a vague term); this could prove quite onerous for social media platforms.
In the Shreya Singhal case, the Apex Court held against pre-censorship and blanket bans of any kind. “Actual knowledge of the content being illegal needs to be established,” says Simranjeet Singh, partner at Athena Legal.
Companies argue that the guidelines pose problems for smaller businesses and start-ups as well. While the IT Act is silent on the point, the draft Rules say intermediaries with over 5 million users must be locally incorporated. This, along with the Rules’ apparent extra-territorial application, is argued to violate the parent Act. “The government must keep in mind that Rules, being secondary legislation, cannot override the main legislation, which is the IT Act,” says Pavan Duggal, advocate, Supreme Court.
Such Rules may also act as a barrier to market entry. Further, stringent decryption mechanisms are extremely hard for start-ups to set up. “These Rules may affect venture funding in India, and could also pose risks to start-ups trying to establish their businesses with limited resources,” says Manuj Garg, co-founder of myUpchar.com.
The case, therefore, boils down to proportionality, as laid down in the Puttaswamy case. The Centre says it will come up with the final guidelines in another couple of months. Whether the guidelines turn out to be proportionate to the threat remains to be seen.
Key provisions under draft guidelines
Intermediaries with over 5 million users must be locally incorporated, with a physical address in India
They must respond to a request from a government agency within 72 hours
They must use currently available technological tools to identify and remove unlawful content
They must remove any content that harms public health or safety
They must be able to decrypt content and trace its origin