This post is the fifth in a seven-post series. To see all the posts in the series, check out Recap: Privacy Engineering Practice and Respect 2020.
The three PEPR 2020 talks on privacy-preserving technologies are:
- Privacy in Deployment
- Design of a Privacy Infrastructure for the Internet of Things
- A Backdoor by Any Other Name, and How to Stop It
Privacy in Deployment
In Privacy in Deployment, Patricia Thaine presents on behalf of her colleagues Pieter Luitjens and Parinaz Sobhani. Patricia begins by providing a definition of data privacy under the General Data Protection Regulation (GDPR). In addition to the GDPR definition, Patricia says that privacy is about maintaining customer trust, building better technology through Privacy Enhancing Technologies (PETs), and having peace of mind from doing the right thing.
This talk walks you through Private AI's Privacy Enhancing Technologies Decision Tree. The decision tree, in combination with the talk, helps you understand what PETs are available, when and how they may be useful, and the different resources or support that may be available for them. The decision tree provides insight into PETs like:
- Trusted Execution Environments (e.g., Intel SGX)
- Secure Multiparty Computation
- Homomorphic Encryption
- Anonymization (e.g., Differential Privacy)
- Pseudonymization (e.g., Deidentification)
- Federated Learning
- Synthetic Data
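To make one of these PETs concrete, here is a minimal sketch (my own illustration, not from the talk) of the Laplace mechanism, the classic way to release a differentially private count. The function names and parameters are hypothetical; a real deployment would use a vetted library rather than hand-rolled noise.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent Exp(1) draws is a standard
    # Laplace sample; scaling it gives Laplace(0, scale).
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def private_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    # Laplace mechanism: add noise with scale = sensitivity / epsilon.
    # For a counting query, one person can change the result by at
    # most 1, so the sensitivity is 1.
    return true_count + laplace_noise(sensitivity / epsilon)
```

Smaller values of epsilon add more noise and therefore give stronger privacy; the noisy counts remain accurate on average, which is what makes the data still useful for practical tasks.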
In conclusion, Patricia states that privacy is possible. With the right technologies, you can protect users' data while continuing to use it for practical tasks. Privacy is also practical. You can maintain customer trust, lower the risk of regulatory fines, gain a competitive advantage, and unlock access to data that would otherwise be off-limits.
Design of a Privacy Infrastructure for the Internet of Things
In Design of a Privacy Infrastructure for the Internet of Things, Norman Sadeh presents the work he and his colleagues have done to give users greater awareness of, and control over, their privacy when encountering Internet of Things (IoT) devices.
For iPhones, all (or most) of the application privacy settings are presented in one place. However, unlike downloaded applications, you do not have the same control over the IoT devices all around you—you may not even know these devices exist. Norman suggests combining QR codes located near devices with location-based discovery to communicate the privacy options of IoT devices to users.
To achieve this in a scalable way, Norman proposes a unified privacy infrastructure. Device owners and manufacturers would publicize IoT devices and their available privacy options through an IoT Portal, while users could discover nearby devices and their associated privacy options through a mobile application. Designing such a privacy infrastructure brings several challenges. The infrastructure should:
- Be usable and understandable by people with varying technical acumen
- Be jurisdiction-agnostic
- Support data subject rights without managing those rights itself
- Discourage abuse by authenticating users and holding them accountable
The infrastructure should leverage reusable device templates for common devices (e.g., an Alexa smart speaker). If a template is not available, a wizard can assist users in describing the behavior of one-off IoT devices. Users are also given the choice of how they are notified by the mobile application (e.g., when they walk by a new IoT device, when their data is collected, etc.). For a more detailed breakdown of these design challenges, check out the talk!
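As a rough illustration of what a reusable device template might capture, here is a hypothetical sketch (the field names and values are my own assumptions, not the project's actual schema):

```python
from dataclasses import dataclass

@dataclass
class DeviceTemplate:
    """Hypothetical reusable template describing an IoT device's privacy behavior."""
    device_type: str        # e.g., "smart speaker"
    data_collected: list    # kinds of data the device gathers
    retention_days: int     # how long collected data is kept
    purposes: list          # why the data is collected
    opt_out_available: bool # whether users can opt out of collection

# A device owner could instantiate the shared template rather than
# describing the device's behavior from scratch.
alexa = DeviceTemplate(
    device_type="smart speaker",
    data_collected=["audio"],
    retention_days=90,
    purposes=["voice assistance"],
    opt_out_available=True,
)
```

A shared template like this is what makes the approach scalable: the privacy behavior of a common device is described once, and every deployment of that device reuses it, with the wizard reserved for one-off devices.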
A Backdoor by Any Other Name, and How to Stop It
In A Backdoor by Any Other Name, and How to Stop It, Max Hunter presents the continued work by the Electronic Frontier Foundation (EFF) to push back on bad legislation and fight for users' ability to use strong encryption. This includes fighting the Clipper chip, export restrictions on development and sharing of strong encryption, and potential plans for mandatory cryptographic backdoors under the Obama administration. The next frontier for the EFF is challenging the EARN IT Act in the US and traceability mandates in India and Brazil.
At the time of this presentation, the EARN IT Act would create an advisory committee responsible for defining the "best practices" that companies should or must follow. Many view this as an attempt to require disclosure of private messages and, in effect, break end-to-end encryption. Some also recommend client-side scanning to allow lawful access to content; however, this has the same effect of breaking end-to-end encryption.
Meanwhile, Brazil's Fake News Law would compel service providers to produce a complete chain of communication for encrypted messages. Max suggests such legislation would break WhatsApp's Privacy by Design approach, which deliberately avoids tracking forwarded messages. WhatsApp would need to collect additional information from clients, violating users' trust in privacy-centric software. However, Max states that not only will such changes harm users, they also will not assist law enforcement with eliminating misinformation.
I hope these posts have piqued your interest in PEPR 2020 and future iterations of the conference! If you are interested in other sessions at PEPR 2020, check out Recap: Privacy Engineering Practice and Respect 2020.
If you liked this post (or have ideas on how to improve it), I'd love to know as always. Cheers!