© 2025 SURF
The American company Amazon has made headlines several times for monitoring its workers in warehouses across Europe and beyond.1 What is new is that a national data protection authority has recently issued a substantial fine of €32 million to the e-commerce giant for breaching several provisions of the General Data Protection Regulation (GDPR) with its surveillance practices. On 27 December 2023, the Commission nationale de l'informatique et des libertés (CNIL), the French data protection authority, determined that Amazon France Logistique infringed, among other provisions, Articles 6(1)(f) (principle of lawfulness) and 5(1)(c) (data minimization) GDPR by processing some of the workers' data collected by handheld scanners in the distribution centers of Lauwin-Planque and Montélimar.2 Scanners enable employees to perform tasks such as picking and scanning items while continuously collecting data on quality of work, productivity, and periods of inactivity.3 According to the company, this data processing is necessary for various purposes, including quality and safety in warehouse management, employee coaching and performance evaluation, and work planning.4 The CNIL's decision centers on data protection law, but its implications reach far beyond it, into workers' fundamental right to health and safety at work. As noted in legal literature and policy documents, digital surveillance practices can have a significant impact on workers' mental health and overall well-being.5 This commentary examines the CNIL's decision through the lens of European occupational health and safety (EU OHS). Its scope is limited to how the French authority interpreted the data protection principle of lawfulness, taking into account the impact of some of Amazon's monitoring practices on workers' fundamental right to health and safety.
Data collected from fitness trackers worn by employees could be very useful for businesses. The sharing of this data with employers is already a well-established practice in the United States, and companies in Europe are showing an interest in introducing such devices among their workforces. Our argument is that employers' processing of their employees' fitness tracker data is unlikely to be lawful under the General Data Protection Regulation (GDPR). Wearable fitness trackers, such as Fitbit and Apple Watch devices, collate intimate data about the wearer's location, sleep, and heart rate. As a result, we consider that they not only represent a novel threat to the privacy and autonomy of the wearer, but that the data gathered constitutes 'health data' regulated by Article 9. Processing health data, including, in our view, fitness tracking data, is prohibited unless one of the specified conditions in the GDPR applies. After examining a number of legitimate bases which employers can rely on, we conclude that the data processing practices considered do not comply with the principle of lawfulness that is central to the GDPR regime. We suggest alternative schemas by which wearable fitness trackers could be integrated into an organization to support healthy habits amongst employees, but in a manner that respects the data privacy of the individual wearer.
The modern economy is largely data-driven and relies on the processing and sharing of data across organizations as a key contributor to its success. At the same time, the value, amount, and sensitivity of processed data is steadily increasing, making it a major target of cyber-attacks. A large fraction of the many reported data breaches happened in the healthcare sector, mostly affecting privacy-sensitive data such as medical records and other patient data. This makes data security technologies a priority item on the agenda of many healthcare organizations, such as the Dutch health insurance company Centraal Ziekenfonds (CZ). In particular when it comes to sharing data securely, practical data protection technologies are lacking, as they mostly focus on securing the link between two organizations while being completely oblivious of what happens with the data after sharing. For CZ, searchable encryption (SE) technologies, which allow data to be shared in encrypted form while enabling private search over the encrypted data without the need to decrypt, are of particular interest. Unfortunately, existing efficient SE schemes completely leak the access pattern (= pattern of encrypted search results, e.g., identifiers of retrieved items) and the search pattern (= pattern of search queries, e.g., frequency of identical queries), making them susceptible to leakage-abuse attacks that exploit this leakage to recover what has been queried for and/or (parts of) the shared data itself. The SHARE project will investigate ways to reduce the leakage in searchable encryption in order to mitigate the impact of leakage-abuse attacks while keeping the performance high enough for practical use.
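To make the two kinds of leakage concrete, the following is a minimal toy sketch of a deterministic SE index (keywords mapped to fixed pseudorandom tokens via HMAC). All names and the construction itself are illustrative assumptions for exposition, not the schemes studied in SHARE; real SE schemes are considerably more involved.

```python
import hmac
import hashlib

# Hypothetical toy construction: a deterministic searchable index.
# The secret key is shared between data owner and querier; the server
# only ever sees tokens and (encrypted) document identifiers.
KEY = b"illustrative-secret-key"

def trapdoor(keyword):
    """Deterministic search token for a keyword (HMAC-SHA256 as a PRF)."""
    return hmac.new(KEY, keyword.encode(), hashlib.sha256).hexdigest()

def build_index(docs):
    """Map each keyword token to the list of document ids containing it."""
    index = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(trapdoor(w), []).append(doc_id)
    return index

docs = {"rec1": ["diabetes"], "rec2": ["diabetes", "asthma"], "rec3": ["asthma"]}
index = build_index(docs)

# Access-pattern leakage: the server learns exactly which record ids
# match each query, even though keywords stay hidden.
assert sorted(index[trapdoor("diabetes")]) == ["rec1", "rec2"]

# Search-pattern leakage: repeating a query produces the identical token,
# so the server can count how often each (unknown) keyword is searched.
assert trapdoor("asthma") == trapdoor("asthma")
```

The two assertions at the end are precisely the leakage profiles named in the text: the result identifiers (access pattern) and the recognizability of repeated queries (search pattern), which leakage-abuse attacks exploit statistically.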
Concretely, we propose the construction of SE schemes that allow the leakage to be modeled as a statistic released on the queries and shared dataset in terms of ε-differential privacy, a well-established notion that informally says that, after observing the statistic, you learn approximately (determined by the ε-parameter) the same amount of information about an individual data item or query as if the item had not been present in the dataset or the query had not been performed. Naturally, such an approach will produce false positives and negatives in the querying process, affecting the scheme's performance. By calibrating the ε-parameter, we can achieve various leakage-performance trade-offs tailored to the needs of specific applications. SHARE will explore the idea of differentially-private leakage on different parts of SE with different search capabilities, starting with exact-keyword-match SE schemes with differentially-private leakage on the access pattern only, up to schemes with differentially-private leakage on the access and search pattern as well as on the shared dataset itself, allowing for more expressive query types like fuzzy-match, range, or substring queries. SHARE comes with an attack lab in which we investigate existing and new types of leakage-abuse attacks to assess the mitigation potential of our proposed combination of differential privacy with cryptographic guarantees in searchable encryption. To stimulate commercial exploitation of SHARE results, our consortium partners CZ and TNO will take the lead on applying and evaluating our envisioned technologies in various healthcare use-cases.
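The ε-calibrated trade-off between leakage and accuracy can be sketched with a standard randomized-response mechanism over per-document membership bits: each document's true match/no-match bit is reported truthfully with probability p = e^ε/(1+e^ε) and flipped otherwise, which is ε-differentially private per document. This is a generic textbook mechanism chosen here for illustration, under our own assumptions; it is not the specific SHARE construction.

```python
import math
import random

def noisy_result(matches, all_ids, eps, rng):
    """Release an access pattern with eps-DP randomized response per document.

    Each document's membership bit is reported truthfully with probability
    p = e^eps / (1 + e^eps) and flipped with probability 1 - p, yielding the
    false positives/negatives mentioned in the text.
    """
    p = math.exp(eps) / (1.0 + math.exp(eps))
    released = set()
    for doc_id in all_ids:
        truth = doc_id in matches
        reported = truth if rng.random() < p else not truth
        if reported:
            released.add(doc_id)
    return released

# A large eps keeps p close to 1, so the released pattern is almost always
# the true result; a small eps drives p toward 1/2, hiding membership at
# the cost of many wrong results. This is the leakage-performance dial.
rng = random.Random(0)
true_matches = {"rec1", "rec2"}
corpus = ["rec1", "rec2", "rec3", "rec4"]
print(noisy_result(true_matches, corpus, eps=50.0, rng=rng))
```

With ε = 50 the flip probability is astronomically small, so the released set coincides with the true matches; lowering ε toward 0 makes each reported bit nearly a coin toss, which is the false-positive/false-negative cost the project text describes.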