Recap: PEPR 2020 — Design

Missed PEPR 2020 and want a recap of the privacy design talks? Read this.

Overview

This post is the third in a seven-post series. To see all the posts in the series, check out Recap: Privacy Engineering Practice and Respect 2020.

The three PEPR 2020 talks on privacy design are:

  1. How to (In)Effectively Convey Privacy Choices with Icons and Link Text
  2. Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces
  3. Throwing Out the Checklist

How to (In)Effectively Convey Privacy Choices with Icons and Link Text

In How to (In)Effectively Convey Privacy Choices with Icons and Link Text, Florian Schaub and Lorrie Cranor present how they designed and tested privacy icons for use with the California Consumer Privacy Act (CCPA). Icons can help guide users' attention, communicate concepts concisely, work independently of language and culture, and aid recognition across services.

The speakers followed a seven-step process for developing CCPA privacy icons, including icon ideation, icon evaluation, refinement, and the evaluation of accompanying link text. During testing, they found the following attributes to be important in privacy icons: Attention, Discoverability, Comprehension, Expectation, Utility, and Behavior. Finally, Florian and Lorrie identify a few key takeaways from their research:

  1. Icon design is hard and should be tested with users
  2. Privacy choice icons are rooted in simple and familiar concepts
  3. Icons should be accompanied by link text
  4. Wide adoption of icons does not mean they are effective
  5. Privacy icons are one piece of usable privacy choices
  6. Privacy icons should be standardized across legislation

Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces

In Beyond the Individual: Exploring Data Protection by Design in Connected Communal Spaces, Martin Kraemer discusses communal privacy, a less-researched counterpart to the more commonly discussed individual privacy. While legislation is traditionally concerned with an individual's right to privacy, laws also provide exceptions (GDPR) or ambiguous rights (CCPA) for households. This research examines how privacy is affected for groups of people in shared spaces like homes, cafes, and airports.

In these types of settings, heterogeneous groups may have dynamic social structures, varying responsibilities within the group, and different technical capabilities. To help address this, Martin presented two hypothetical scenarios, managing communal privacy in a coffee shop and in a home, to two research groups: one focused on critiquing the present scenario and the other on envisioning the future of privacy in communal spaces.

Martin mentions a few reflections and observations based on his research. The initial pilots were limited to students and their lived experiences. However, the personas presented to participants included personal characteristics (e.g., a person's technical capabilities), which allowed participants to build rules and relationships matching those needs. Among other things, Martin also found that participants were resourceful in making privacy-preserving decisions in communal spaces.

For a deeper look at the two hypothetical scenarios and more in-depth takeaways, I recommend you watch the talk!

Throwing Out the Checklist

In Throwing Out the Checklist, Dan Crowley discusses growing a privacy-by-ethos culture at Quizlet and moving away from relying solely on compliance checklists. While frameworks like Privacy by Design tell us how to approach privacy, implementing them in practice is sometimes difficult. This may manifest as a checklist of requirements: triage questionnaires, PIAs, DPIAs, policy approvals, risk assessments, vendor reviews, certifications, and more.

For small and rapidly growing organizations, Dan suggests that "the single most impactful thing you can do as your company's 'privacy person' is to turn everyone else into a 'privacy person' too." Rather than throwing out the checklists completely, keep them to yourself and focus instead on culture, asking others in your organization:

  • Why does privacy matter?
  • What are our principles?
  • What could go wrong?
  • What more can go right?
  • What do our users expect of us?
  • What can you, as an engineer, designer, etc. teach me?

Dan closes by describing key takeaways for building a privacy-by-ethos culture at Quizlet: make it fun, start early, be consistent and present, invite conversations, provide real-life rewards, and use what's already there to your advantage.

Wrapping Up

I hope these posts have piqued your interest in PEPR 2020 and future iterations of the conference! To explore the other sessions at PEPR 2020, check out Recap: Privacy Engineering Practice and Respect 2020.

If you liked this post (or have ideas on how to improve it), I'd love to hear from you, as always. Cheers!