title: 10 requirements for the evaluation of "Contact Tracing" apps
date: 2020-04-06 15:19:28 
updated: 2020-05-02 22:03:25 
author: linus
tags: update, pressemitteilung, anonymisierung

"Corona apps" are on everyone's lips as a way to contain the SARS-CoV-2 epidemic. CCC publishes 10 requirements for their evaluation from a technical and societal perspective.

<!-- TEASER_END -->

Currently, technically supported "contact tracing" is being considered
as a means of counteracting the spread of SARS-CoV-2 in a more targeted
manner. The general motivation is to allow greater freedom of movement
for the broad spectrum of society by allowing quick tracing and
interruption of infection chains. Contacts of infected persons should be
alerted more quickly and thus be able to quarantine themselves more
quickly. This, in turn, should prevent further infections. A "corona
app" could therefore protect neither ourselves nor our contacts: It
would be designed to break chains of infection by protecting the
contacts of our contacts.

## Contact Tracing as a risk technology

There are a number of suggestions for the technical implementation of
this concept. These proposals range from dystopian systems of full
surveillance to targeted, completely anonymous methods of alerting
potentially infected persons without knowledge of the specific person.

In principle, the concept of a "Corona App" involves an enormous risk
due to the contact and health data that may be collected. At the same
time, there is an opportunity for "privacy-by-design" concepts and
technologies that have been developed by the crypto and privacy
community over the last decades. With the help of these technologies, it
is possible to realise the epidemiological potential of contact tracing
without creating a privacy disaster. For this reason alone, all concepts
that violate or even endanger privacy must be strictly rejected.

In the following, we outline social and technical minimum requirements
for such technologies. The CCC sees itself in an advisory and
observing role in this debate. *We will not recommend* specific apps,
concepts or procedures. However, we *advise against* the use of apps that
do not meet these requirements.

## I. Societal requirements

### 1. Epidemiological sense & purpose

The basic prerequisite is that "contact tracing" can realistically help
to significantly and demonstrably reduce the number of infections. The
validation of this assessment is the responsibility of epidemiology. If
it turns out that "contact tracing" via an app is not useful or does not
fulfil its purpose, the experiment must be terminated.

The application and any data collected must be used exclusively to
combat SARS-CoV-2 infection chains. Any other use must be technically
prevented as far as possible and legally prohibited.

### 2. Voluntariness & freedom from discrimination

For epidemiologically significant efficacy, a "contact tracing" app
requires a high degree of dissemination in society. This wide
distribution must not be achieved by force, but only by implementing a
trustworthy system that respects privacy. Against this background, there
must be neither fees for its use nor financial incentives for using it.

People who refuse to use it must not experience any negative
consequences. Ensuring this is a matter for politics and legislation.

The app must regularly inform people about its operation. It must allow
for simple temporary deactivation and permanent removal. Restrictive
measures, e.g. an "electronic shackle" function to enforce contact
restrictions, must not be implemented.

### 3. Fundamental privacy

Only with a convincing concept based on the principle of privacy can
social acceptance be achieved at all.

At the same time, verifiable technical measures such as cryptography and
anonymisation technologies must ensure user privacy. It is not
sufficient to rely on organisational measures, "trust" and promises.
Organisational or legal hurdles against data access cannot be regarded
as sufficient in the current social climate of state-of-emergency
thinking and possible far-reaching exceptions to constitutional rights
through the Infection Protection Act.

We reject the involvement of companies developing surveillance
technologies as "covid washing". As a basic principle, users should not
have to "trust" any person or institution with their data, but should
enjoy documented and tested technical security.

### 4. Transparency and verifiability

The complete source code for the app and infrastructure must be freely
available without access restrictions to allow audits by all interested
parties. Reproducible build techniques must be used to ensure that users
can verify that the app they download has been built from the audited
source code.
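
The core of such a check can be illustrated with a short sketch (the file names are hypothetical, and in practice signing metadata would first have to be stripped or normalised): an artifact rebuilt independently from the audited source and the distributed artifact must hash to the same value.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical artifacts: one rebuilt locally from the audited source,
# one downloaded through the official distribution channel.
local_build = sha256_of(Path("app-rebuilt-from-source.apk"))
distributed = sha256_of(Path("app-as-distributed.apk"))

if local_build == distributed:
    print("Match: the distributed app corresponds to the audited source.")
else:
    print("Mismatch: the distributed app cannot be tied to the audited source.")
```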

## II. Technical requirements

### 5. No central entity to trust

Completely anonymous contact tracing without omniscient central
servers is technically possible. Making users' privacy dependent on
the trustworthiness and competence of the operator of a central
infrastructure is not technically necessary. Concepts based on such
"trust" are therefore to be rejected.

In addition, the promised security and trustworthiness of centralised
systems (for example, against linking IP addresses with anonymous user
IDs) cannot be effectively verified by users. Systems
must therefore be designed to guarantee the security and confidentiality
of user data exclusively through their encryption and anonymisation
concept and the verifiability of the source code.
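
As a rough sketch of what such a decentralised design can look like (loosely modelled on proposals such as DP-3T, not on any specific app; all values below are made up): only the ephemeral IDs of users who report an infection are ever published, and each phone checks locally whether it has observed any of them, so no server learns who met whom.

```python
# Decentralised exposure matching, sketched: the match happens on the phone,
# so no central party learns the user's contacts. All values are made up.

# Ephemeral IDs this phone observed nearby (kept only locally).
locally_observed_ids = {
    bytes.fromhex("aa" * 16),
    bytes.fromhex("bb" * 16),
    bytes.fromhex("cc" * 16),
}

# Ephemeral IDs published by users who reported an infection, downloaded
# as a batch. The batch says nothing about who observed them.
published_infected_ids = {
    bytes.fromhex("bb" * 16),
    bytes.fromhex("dd" * 16),
}

# Local set intersection: only the phone itself learns about a possible exposure.
matches = locally_observed_ids & published_infected_ids
print(f"{len(matches)} possible exposure(s) found in this batch.")
```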

### 6. Data economy

Only minimal data and metadata necessary for the application purpose may
be stored. This requirement prohibits the central collection of any data
that is not specific to a contact between people and its duration.

If additional data, such as location information, is recorded locally on
the phones, users must not be forced or tempted to pass this data on to
third parties or even publish it. Data that is no longer needed must be
deleted. Sensitive data must also be securely encrypted locally on the
phone.
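
As a minimal sketch of local encryption at rest (assuming the third-party `cryptography` package; a real app would keep the key in the platform's hardware-backed keystore rather than in a variable):

```python
from cryptography.fernet import Fernet  # third-party package "cryptography"

# In a real app the key would live in the device's hardware-backed keystore
# and never leave the phone; generating it here is only for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical locally recorded contact event.
record = b'{"observed_id": "bb...bb", "duration_s": 900}'

stored = cipher.encrypt(record)    # only this ciphertext is written to disk
restored = cipher.decrypt(stored)  # readable again only with the local key
assert restored == record
```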

For voluntary data collection for epidemiological research purposes that
goes beyond the actual purpose of contact tracing, a clear, separate
consent must be explicitly obtained in the app's interface and it must
be possible to revoke it at any time. This consent must not be a
prerequisite for use.

### 7. Anonymity

The data that each device collects about other devices must not be
suitable for deanonymising their users. The data that each person may
pass on about themselves must not be suitable for deanonymising the
person. It must therefore be possible to use the system without
collecting or being able to derive personal data of any kind. This
requirement prohibits unique user identifications.

IDs for "contact tracing" via wireless technology (e.g. Bluetooth or
ultrasound) must not be traceable to persons and must change frequently.
For this reason, it is also forbidden to link IDs to, or derive them
from, accompanying communication data such as push tokens, telephone
numbers, IP addresses, device IDs, etc.
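
A minimal sketch of such rotating IDs, in the spirit of DP-3T-style proposals and not the specification of any deployed app: each broadcast ID is derived via a keyed hash from a randomly generated daily secret, so it carries no device or user information and changes every interval.

```python
import hashlib
import hmac
import secrets

# Fresh random secret per day, generated locally and derived from nothing
# that identifies the user or the device.
daily_key = secrets.token_bytes(32)

def ephemeral_id(day_key: bytes, interval: int) -> bytes:
    """Derive the ID broadcast during one short time interval (e.g. 15 min)."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# IDs rotate frequently and cannot be linked to each other, or to a person,
# without knowledge of daily_key.
for interval in range(3):
    print(interval, ephemeral_id(daily_key, interval).hex())
```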

### 8. No creation of central movement or contact profiles

The system must be designed in such a way that movement profiles
(location tracking) or contact profiles (patterns of frequent contacts
traceable to specific people) cannot be established either intentionally or
unintentionally. Methods such as central GPS/location logging or linking
the data to telephone numbers, social media accounts and the like must
therefore be rejected as a matter of principle.

### 9. Unlinkability

The design of the temporary ID generation must be such that IDs cannot
be interpreted and linked without possession of a user-controlled
private key. They must therefore not be derived from any directly or
indirectly user-identifying information. Regardless of the way IDs are
communicated in the event of infection, it must be ruled out that the
collected "contact tracing" data can be chained over longer periods of
time.

### 10. Unobservability of communication

Even if the transmission of a message is observed in the system (e.g.
via communication metadata), it must not be possible to conclude that a
person is themselves infected or has had contact with infected
persons. This must be ensured both with regard to other users and to
infrastructure and network operators or attackers who gain insight into
these systems.
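
One illustrative way to approach this requirement (a sketch of the general cover-traffic idea, not any app's actual design) is to have every client upload a fixed-size payload on the same schedule, whether or not it has anything to report, so that an observer of the network cannot distinguish reporting from non-reporting users.

```python
import secrets
from typing import Optional

PAYLOAD_SIZE = 1024  # every upload has exactly this size, real or dummy

def build_upload(report: Optional[bytes]) -> bytes:
    """Return a fixed-size payload: a padded real report, or random dummy data."""
    if report is None:
        # Dummy upload, indistinguishable from a real one to an outside observer.
        return secrets.token_bytes(PAYLOAD_SIZE)
    if len(report) > PAYLOAD_SIZE:
        raise ValueError("report too large for the fixed-size payload")
    # Pad the real (already encrypted) report; framing is omitted in this sketch.
    return report + secrets.token_bytes(PAYLOAD_SIZE - len(report))

# Every client sends on the same schedule, regardless of infection status.
assert len(build_upload(None)) == len(build_upload(b"encrypted report")) == PAYLOAD_SIZE
```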

## Role of the CCC

For well over 30 years, CCC has engaged in voluntary work at the
intersection between technology and society. [Our ethical
principles](/en/hackerethics) stand for privacy, decentralization and
data economy – and against any form of surveillance and coercion.

Without claiming to be exhaustive, in this article we name minimum
privacy requirements that a "Corona App" must meet in order to be
socially and technologically tolerable at all. CCC will under no
circumstances ever endorse a concrete implementation with its approval,
a recommendation, a certificate or a test seal.

It is the responsibility of the developers of contact tracing systems to
prove the fulfilment of these requirements or to have it proven by
independent third parties.