title: 10 requirements for the evaluation of "Contact Tracing" apps
date: 2020-04-06 15:19:28
updated: 2020-04-06 15:19:28
author: linus
tags: pressemitteilung, updates

"Corona apps" are on everyone's lips as a way to contain the SARS-CoV-2 epidemic. CCC publishes 10 requirements for their evaluation from a technical and societal perspective.

<!-- TEASER_END -->

Currently, technically supported "contact tracing" is being considered as a means to counteract the spread of SARS-CoV-2 in a more targeted manner. The general motivation is to allow greater freedom of movement for broad sections of society by enabling quick tracing and interruption of infection chains. Contacts of infected persons should be alerted sooner and thus be able to quarantine themselves more quickly, which in turn should prevent further infections. A "Corona app" would therefore protect neither us nor our contacts directly: it would be designed to break chains of infection by protecting the contacts of our contacts.

## Contact Tracing as a risk technology

There are a number of suggestions for the technical implementation of this concept. These proposals range from dystopian systems of full surveillance to targeted, completely anonymous methods of alerting potentially infected persons without knowledge of the specific person.

In principle, the concept of a "Corona app" involves an enormous risk due to the contact and health data that may be collected. At the same time, there is a chance for "privacy-by-design" concepts and technologies that have been developed by the crypto and privacy community over the last decades. With the help of these technologies, it is possible to unfold the epidemiological potential of contact tracing without creating a privacy disaster. For this reason alone, all concepts that violate or even endanger privacy must be strictly rejected.

In the following, we outline social and technical minimum requirements for such technologies. The CCC sees its role in this debate as that of an advisor and observer. *We will not recommend* specific apps, concepts or procedures. However, we *advise against* the use of apps that do not meet these requirements.

## I. Societal requirements

### 1. Epidemiological sense & purpose

The basic prerequisite is that "contact tracing" can realistically help to significantly and demonstrably reduce the number of infections. The validation of this assessment is the responsibility of epidemiology. If it turns out that "contact tracing" via app is not useful or does not fulfil its purpose, the experiment must be terminated.

The application and any data collected must be used exclusively to combat SARS-CoV-2 infection chains. Any other use must be technically prevented as far as possible and legally prohibited.

### 2. Voluntariness & freedom from discrimination

To be epidemiologically effective, a "contact tracing" app requires a high degree of dissemination in society. This wide distribution must not be achieved by force, but only by implementing a trustworthy system that respects privacy. Against this background, there must be neither fees for its use nor financial incentives for using it.

People who refuse to use it must not experience any negative consequences. Ensuring this is a matter for politics and legislation.

The app must regularly inform people about how it operates. It must allow for simple temporary deactivation and permanent removal. Restrictive measures, e.g. an "electronic shackle" function to monitor compliance with contact restrictions, must not be implemented.

### 3. Fundamental privacy

Only with a convincing concept based on the principle of privacy can social acceptance be achieved at all.

At the same time, verifiable technical measures such as cryptography and anonymisation technologies must ensure user privacy. It is not sufficient to rely on organisational measures, "trust" and promises. Organisational or legal hurdles against data access cannot be regarded as sufficient in the current social climate of state-of-emergency thinking and possible far-reaching exceptions to constitutional rights through the Infection Protection Act.

We reject the involvement of companies that develop surveillance technologies, which would amount to "covid washing". As a basic principle, users should not have to "trust" any person or institution with their data, but should enjoy documented and tested technical security.

### 4. Transparency and verifiability

The complete source code for the app and infrastructure must be freely available without access restrictions to allow audits by all interested parties. Reproducible build techniques must be used to ensure that users can verify that the app they download has been built from the audited source code.

## II. Technical requirements

### 5. No central entity to trust

Completely anonymous contact tracing without omniscient central servers is technically possible. Making users' privacy dependent on the trustworthiness and competence of the operator of central infrastructure is not technically necessary. Concepts based on such "trust" are therefore to be rejected.

In addition, the promised security and trustworthiness of centralised systems - for example, the promise that IP addresses will not be linked to anonymous user IDs - cannot be effectively verified by users. Systems must therefore be designed to guarantee the security and confidentiality of user data exclusively through their encryption and anonymisation concept and the verifiability of the source code.

### 6. Data economy

Only the minimal amount of data and metadata necessary for the purpose of the application may be stored. This requirement prohibits the central collection of any data that is not specific to a contact between people and its duration.

If additional data such as location information is recorded locally on the phone, users must not be forced or tempted to pass this data on to third parties, let alone publish it. Data that is no longer needed must be deleted. Sensitive data must also be securely encrypted locally on the phone.

For any voluntary data collection for epidemiological research purposes that goes beyond the actual purpose of contact tracing, clear and separate consent must be obtained explicitly in the app's interface, and it must be possible to revoke this consent at any time. This consent must not be a prerequisite for use.

### 7. Anonymity

The data that each device collects about other devices must not be suitable for deanonymising their users. The data that each person may pass on about themselves must not be suitable for deanonymising that person. It must therefore be possible to use the system without collecting or being able to derive personal data of any kind. This requirement prohibits unique user identifiers.

IDs for "contact tracing" via wireless technology (e.g. Bluetooth or ultrasound) must not be traceable to persons and must change frequently. For this reason, it is also forbidden to link the IDs to, or derive them from, accompanying communication data such as push tokens, telephone numbers, IP addresses, device IDs, etc.

### 8. No creation of central movement or contact profiles

The system must be designed in such a way that movement profiles (location tracking) or contact profiles (patterns of frequent contacts traceable to specific people) cannot be established, whether intentionally or unintentionally. Methods such as central GPS/location logging or linking the data to telephone numbers, social media accounts and the like must therefore be rejected as a matter of principle.

### 9. Unlinkability

The design of the temporary ID generation must be such that IDs cannot be interpreted and linked without possession of a user-controlled private key. They must therefore not be derived from information that directly or indirectly identifies the user. Regardless of the way IDs are communicated in the event of infection, it must be ruled out that the collected "contact tracing" data can be chained over longer periods of time.

### 10. Unobservability of communication

Even if the transmission of a message within the system is observed (e.g. via communication metadata), it must not be possible to conclude that a person is themselves infected or has had contact with infected persons. This must be ensured both with regard to other users and with regard to infrastructure and network operators or attackers who gain insight into these systems.

## Role of the CCC

For well over 30 years, the CCC has engaged in voluntary work at the intersection of technology and society. [Our ethical principles](/en/hackerethics) stand for privacy, decentralisation and data economy – and against any form of surveillance and coercion.

Without claiming to be exhaustive, in this article we name minimum privacy requirements that a "Corona app" must meet in order to be socially and technologically tolerable at all. Under no circumstances will the CCC provide a concrete implementation with its approval, a recommendation, a certificate or a test seal.

It is the responsibility of the developers of contact tracing systems to prove that these requirements are met, or to have this proven by independent third parties.