After pushback from civil society, the protocol for data-sharing was released. The protocol was opaque and ambiguous in several key respects. Personal data, including medical information and location data, would be gathered and stored for a minimum of 30 days, and possibly until November 2020, or even later.
The data could be shared with any government organisation, and also with private organisations if deemed “necessary”, without any detail as to when and why such sharing would be necessary. While data would be anonymised, the anonymisation process was not described either.
Soon after release, researchers reported bugs. The code was then published on GitHub. It now turns out that the code released on GitHub is not only incomplete; according to researchers, it is not the actual code the app runs.
Contact-tracing apps have been released by governments in Singapore, Israel, the UK, Australia, Italy, Austria, Germany, and elsewhere. Google and Apple have also jointly developed a contact-tracing framework, ensuring smartphones running Android or iOS are covered.
The government apps, as well as the private-sector apps, are mostly open-source. Open source allows the code to be studied by independent researchers, so bugs can be quickly discovered and patched. It is important that the source code for both user-side and server-side operations be open, so that security leaks can be identified and plugged. The server-side code of the Aarogya Setu (AS) app has not been released, and researchers say the actual code is not what has been published on GitHub.
Apart from China, most governments have not made using contact-tracing apps mandatory. In many apps, data is stored on the user’s handset, to be accessed only if asked for. The AS data is stored on the handset, but it can be uploaded to a cloud server at the government’s discretion.
AS uses Bluetooth and location data. Most other apps use Bluetooth, but many do not use location data. Some use the new DP-3T (Decentralized Privacy-Preserving Proximity Tracing) protocol, developed in response to the pandemic. DP-3T assigns randomised, anonymised unique IDs to users, and these IDs change frequently. The Google-Apple framework uses a similar system. These ephemeral IDs allow proximity to a corona-positive person to be detected without identity or location being disclosed.
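The rotation of ephemeral IDs can be sketched in a few lines of Python. This is a simplified illustration, not the actual DP-3T specification: the real protocol derives ephemeral IDs from a secret day key using a pseudo-random function, but the epoch labels, ID length, and rotation constants below are assumptions made for the sketch.

```python
import hashlib
import hmac
import os

EPOCHS_PER_DAY = 96  # e.g. a fresh ephemeral ID every 15 minutes (illustrative)

def next_day_key(sk: bytes) -> bytes:
    """Rotate the secret day key: tomorrow's key is a hash of today's."""
    return hashlib.sha256(sk).digest()

def ephemeral_ids(sk: bytes) -> list:
    """Derive one short ephemeral ID per epoch from the day key with a PRF."""
    return [hmac.new(sk, f"epoch-{i}".encode(), hashlib.sha256).digest()[:16]
            for i in range(EPOCHS_PER_DAY)]

sk = os.urandom(32)               # initial secret key, kept only on the handset
today = ephemeral_ids(sk)         # IDs broadcast over Bluetooth today
tomorrow = ephemeral_ids(next_day_key(sk))
```

Because each broadcast ID is a short-lived pseudo-random value, a passive observer cannot link two sightings of the same handset, let alone recover the user's identity.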
If Bluetooth is on, two handsets communicate when they are in close proximity. The app can thus flag proximity to users who may be corona-positive. This can throw up false positives, since Bluetooth works through walls. But location is not recorded. (Location data also throws up false positives.) AS uses location data, which means the whereabouts and movements of users are known 24x7.
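The proximity-flagging step described above can be sketched as follows. The function names, the 14-day window, and the sample IDs are assumptions for illustration; the key point is that the matching runs entirely on the handset, so neither identity nor location needs to leave the device.

```python
from datetime import datetime, timedelta

def exposure_matches(observed, positive_ids, now, window_days=14):
    """Return overheard ephemeral IDs that appear on the published list of
    IDs from corona-positive users, restricted to the exposure window.
    The check runs locally on the handset."""
    cutoff = now - timedelta(days=window_days)
    return [eph for eph in positive_ids
            if eph in observed and observed[eph] >= cutoff]

# Illustrative data: IDs this handset overheard nearby, with timestamps.
observed = {
    b"eph-aaa": datetime(2020, 4, 1, 10, 0),   # too old to matter
    b"eph-bbb": datetime(2020, 5, 9, 18, 30),  # recent encounter
}
# IDs published by the health authority for users who tested positive.
positive_ids = [b"eph-bbb", b"eph-zzz"]

hits = exposure_matches(observed, positive_ids, now=datetime(2020, 5, 10))
```

Here only `b"eph-bbb"` matches: it was overheard within the window, while `b"eph-zzz"` was never seen by this handset.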
Data must be anonymised if it is to be stored or shared, since it includes medical information and location. Anonymising is hard, and the protocol does not detail how it will be done. It is easy to de-anonymise data if any clues remain: given a phone number, or a handset IMEI, the user is known. Given location data over 30 days, it would be possible to compile a dossier detailing a user’s habitual routines.
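A toy sketch shows how little effort such a dossier takes. The cell labels, hour thresholds, and sample trace below are invented for illustration; the underlying idea is standard: the most frequent location at night is almost certainly home, and the most frequent location during working hours is almost certainly the workplace, even if the trace carries no name.

```python
from collections import Counter

# 30 days of (hour-of-day, coarse location cell) samples for one
# "anonymous" user -- shortened here to a handful of points.
trace = [(2, "cell-17"), (3, "cell-17"), (14, "cell-42"), (15, "cell-42"),
         (2, "cell-17"), (14, "cell-42"), (3, "cell-17"), (15, "cell-42")]

def likely_home_and_work(trace):
    """Most frequent cell at night ~ home; most frequent by day ~ work."""
    night = Counter(cell for hour, cell in trace if hour < 6 or hour >= 22)
    day = Counter(cell for hour, cell in trace if 9 <= hour < 18)
    return night.most_common(1)[0][0], day.most_common(1)[0][0]
```

Cross-referencing the inferred home and work cells with any public directory would typically narrow the "anonymous" trace down to one person.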
Net-net, the app gathers a lot of data that is unnecessary for the task it supposedly performs. How it stores this data, and how and why it may be shared, is also opaque. The server-side code is definitely not open-source. The app may exclude people who don’t own smartphones from travelling. The data it gathers could be used to build a profile of every user’s daily routines.
India lacks a digital privacy law, though the Supreme Court ruled that privacy was a fundamental right in September 2017. A committee under retired Justice B N Srikrishna drafted a model law, released into the public domain in July 2018. That model legislation has since been redrafted in a fashion that Justice Srikrishna has described as “Orwellian”. The new draft legislation proposes to allow the government to perform surveillance as it likes, with no checks and balances. The design of the Aarogya Setu app will facilitate widespread surveillance.