In the process of combating the pandemic, a multitude of potential technological solutions have been proposed, from QR apps that can locate people who were present in a particular place at a given time, to digital vaccination passports or contact tracing applications.
These contact tracing apps were designed to detect when a person has been in contact with someone who has tested positive for coronavirus, so that the appropriate tests can be carried out and preventive isolation applied.
In the first case - without entering into discussions about which adoption percentages ensure effectiveness - it is clear that there is a benefit to being able to locate people who have been exposed to the virus and who, therefore, can contribute to the spread of COVID-19.
However, the risks are obvious. We must bear in mind that we are dealing with health-related data, and that a number of the actors involved come from outside the public administration, such as providers of information society services.
Proportionate use of location data
In order to ensure transparency and increase trust in the system, the European Data Protection Board (EDPB) approved Guidelines 04/2020 on the use of location data and contact tracing tools in the context of the COVID-19 outbreak. According to this document, the EDPB understands that the data protection legal framework was designed to be a flexible instrument and, as such, is able to provide an efficient response in limiting the pandemic while, at the same time, protecting fundamental human rights and freedoms.

Therefore, it proposes a proportionate use of location data and contact tracing tools for specific purposes, with mechanisms that guarantee the rights of individuals by relying on information about the proximity of users rather than the tracing of individual movements. It is for this reason that proximity detection solutions based on the DP-3T standard for decentralized proximity tracing have progressively been chosen in order to preserve privacy.
Apps using this technology are capable of registering devices that have been close enough, for long enough, to have contributed to the spread of the disease if a positive case is subsequently detected, without needing to identify the person who has tested positive. For this reason, these solutions are preferable to centralized servers that have access to all the information.
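The decentralized idea can be illustrated with a minimal sketch in the spirit of DP-3T. All names, key sizes and slot counts below are illustrative assumptions, not the actual specification: each device derives short-lived broadcast identifiers from a secret daily key, and only the daily keys of confirmed positives are ever uploaded, so matching happens entirely on the user's device.

```python
import hashlib
import os

# Illustrative sketch of decentralized proximity matching in the
# spirit of DP-3T. Key sizes, slot counts and derivation are
# simplified assumptions, not the real protocol.

def daily_key() -> bytes:
    """Each device generates a fresh random secret key per day."""
    return os.urandom(32)

def ephemeral_ids(key: bytes, slots: int = 96) -> list:
    """Derive short-lived broadcast identifiers from the daily key.
    Nearby observers see only unlinkable pseudonyms, never the key."""
    return [hashlib.sha256(key + slot.to_bytes(2, "big")).digest()[:16]
            for slot in range(slots)]

# Device A broadcasts its ephemeral IDs; device B records the ones
# it hears nearby (here, three time slots of contact).
key_a = daily_key()
heard_by_b = set(ephemeral_ids(key_a)[:3])

# If A later tests positive, A uploads only its daily keys.
published_keys = [key_a]

# B downloads the published keys and re-derives the IDs locally;
# a match means B was exposed. The server never learns who met whom.
exposed = any(eid in heard_by_b
              for k in published_keys
              for eid in ephemeral_ids(k))
assert exposed
```

The point of the design is visible in the last step: the server only ever holds anonymous daily keys of positives, while the record of encounters stays on each phone.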
Unnecessary risk to users' rights
Having said that, tracing apps do not consist solely of this proximity-tracing technological base. They include different routines needed to communicate with the health institutions' servers, send confirmation of a positive result, or download the lists of reported keys.
It is here that potential vulnerabilities can appear, allowing users to be identified and, therefore, creating an unnecessary risk to their rights. In fact, in the case of the Spanish app, Radar COVID, problems were subsequently detected that had to be solved in new versions of the application. Likewise, compliance problems regarding the transparency obligations indicated in the Guidelines were also detected.
The first problem occurred during the development of the application itself. The pilot program was carried out in June 2020 on the island of La Gomera in the Canary Islands, but it was not until September that citizens were able to access the application's source code.
It was then, with the opening of the repository, that third parties detected privacy issues in the app's implementation of DP-3T, such as the absence of dummy traffic when a positive case was sent to the servers. Since the application only connected to the servers in the event of a positive case, identifying which users had tested positive by observing network traffic became trivial.
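Why dummy (cover) traffic matters can be sketched in a few lines. This is not Radar COVID's actual code, only an assumed minimal model: if a client contacts the server exclusively on a positive diagnosis, the mere existence of a request leaks the diagnosis to anyone watching the network; random dummy uploads break that inference.

```python
import random

# Assumed minimal model, not the real Radar COVID client, showing
# why the absence of dummy traffic was a privacy issue.

def upload_without_cover(is_positive: bool) -> bool:
    """Naive client: a network request happens only when positive,
    so observing a request reveals the diagnosis."""
    return is_positive

def upload_with_cover(is_positive: bool, dummy_rate: float = 0.3) -> bool:
    """Client also sends indistinguishable dummy uploads at random;
    a request no longer implies a positive diagnosis."""
    return is_positive or random.random() < dummy_rate

# With cover traffic, many negative users also generate requests,
# so an observed upload reveals little about any individual.
random.seed(0)
negatives_seen = sum(upload_with_cover(False) for _ in range(1000))
assert negatives_seen > 0
```

In a real deployment the dummy requests would also need to match genuine uploads in size and timing, otherwise an observer could still tell them apart.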
The second problem arose when, despite opening the repository and providing access to the source code, the documentation relating to the application's risk analysis and data protection impact assessment was not published. It was this circumstance that prompted Rights International Spain, in coordination with the Civil Liberties Union for Europe, to initiate the necessary strategic litigation to obtain the documentation.
Data protection impact assessment not delivered in time
According to the Guidelines, the EDPB considers that a data protection impact assessment (DPIA) must be carried out before implementing such a tool, since the processing is likely to be high risk (health data, anticipated large-scale adoption, systematic monitoring, use of a new technological solution). Furthermore, in the same document, the EDPB strongly recommends publishing the DPIA.
Therefore, these recommendations, as well as compliance with the basic principles of transparency, imply that the DPIA should have been produced even before the pilot program, and that access to it should have been facilitated to any interested citizen, without the need to make a request to that effect. This way, citizens' confidence in the system would increase, since they would have first-hand access to the information used to guarantee their rights.
The Ministry's refusal to hand over the documentation was appealed before the Transparency Council, which finally agreed with our arguments: it considered that a public document already existed, different from the one that SEDIA indicated would be published later, and that limitations on access to public information must be interpreted restrictively.
Meanwhile, the Ministry published a purported DPIA in the GitHub repository dedicated to the app, but one with significant deficiencies. First of all, as with the original document, it carries no electronic signature and merely indicates a date of completion.
Curiously, this date is prior to the request we made, which was in fact rejected on the grounds of alleged changes still to be made, yet there is no indication that the document was subsequently modified. In addition, although the document claims to be under version control and labels itself version 2.0, it contains no version history of any kind, preventing citizens from knowing what changes have been made since it was first produced.

Of course, we fully acknowledge that both the DPIA and the app itself are ultimately live tools, susceptible to modification during their useful life. They must adapt to changes such as the detection of vulnerabilities and recommendations for improvement. But this does not mean that citizens cannot have access to this information.
We can go further: once the Ministry finally delivered the requested document, we observed that:
- It does not have an electronic signature
- It is declared that it was carried out in August
If the pilot program was conducted in June 2020, and the app was already in wide distribution in August, close to the date of the report, then the potential risks cannot have been taken into account for the pilot, nor can the evaluation's recommendations have been incorporated before the full launch. Furthermore, the lack of an electronic signature on the document makes it impossible to verify that it was actually produced on the dates indicated.
In addition to all of the above, and despite our having obtained access to the document, the Ministry's transparency practice remains deficient. Even though the development of these solutions demands great transparency, the document provided to us has not been included in the publicly accessible repository (neither directly nor as part of the version control of the uploaded document). Citizens who want to access it are therefore forced to file a claim of their own.
If no complaint had been filed before the Transparency Council, the Ministry would have simply closed the case, not provided any information, and would have published a DPIA that does not correspond to the application developed.
In our opinion, the development of the whole campaign shows that there is still little practice in promoting fully transparent action, especially where open-source development is concerned. Keeping the code closed, and the time it took to grant third parties access (with the indication that this would be done with the app already in production), delayed the fixing of several vulnerabilities that were quickly detected once the code was uploaded to the repository. All this has hindered the adoption of a tool that is already complex per se.
We are still waiting for the results of the Spanish Data Protection Agency's investigation, given that, in our opinion, several actions may have been potentially contrary to data protection regulations.
It is important that when we talk about healthcare, and even more so when a new technological solution is to be introduced, all the circumstances of the case are taken into account so that the best possible action is taken.