
Recap: PEPR 2020 — Frameworks and Risks

Missed PEPR 2020 and want a recap of the Frameworks and Risks talks? Read this.

Overview

This post is the seventh and final blog post in a seven-post series recapping the PEPR 2020 conference. If you're wondering what PEPR is or want to see the other PEPR conference recaps, check out this post!

The three PEPR 2020 talks on Frameworks and Risks are:

  1. Assessing Privacy Risk with the IPA Triad
  2. Engineering Ethics into the NIST Privacy Framework
  3. When Engineers and Lawyers Talk: Right-Sizing Your Data Protection Risk Profile

Assessing Privacy Risk with the IPA Triad

In Assessing Privacy Risk with the IPA Triad, Mark Funk proposes the IPA Triad (Identity, Presence, Activity) to help describe what it means to be private—analogous to how security's CIA Triad (Confidentiality, Integrity, Availability) describes what it means to be secure. Identity describes data that reveals who an individual, group, or population is; Presence describes data that reveals where those people have been; and Activity describes data that reveals what happened at a given point in time.

Similar to the CIA Triad, it's difficult to find data that doesn't touch multiple dimensions of the IPA Triad. Using a credit card at a store likely reveals who you are, which store you're shopping at, and when the purchase was made. Mark suggests a qualitative risk assessment across Severity, Scope, and Likelihood to measure risk based on the sensitivity of the data.
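A minimal sketch of how such an assessment might be recorded in code—the triad dimensions and the three rating axes follow the talk, but the three-level scale, the field names, and the scoring helper are my own illustration, not Mark's method:

```python
from dataclasses import dataclass
from enum import Enum


class Rating(Enum):
    """Coarse qualitative ratings; the three-level scale is an assumption."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class IPARisk:
    """Risk assessment for one piece of data across the IPA Triad."""
    data_name: str
    identity: Rating    # who the data reveals
    presence: Rating    # where the data places them
    activity: Rating    # what the data says happened
    severity: Rating    # how bad disclosure would be
    scope: Rating       # how many people would be affected
    likelihood: Rating  # how likely misuse or exposure is

    def qualitative_score(self) -> int:
        """One illustrative way to rank items for review, not part of the talk."""
        exposure = max(self.identity.value, self.presence.value, self.activity.value)
        return exposure * self.severity.value * self.scope.value * self.likelihood.value


# Example: a credit card transaction touches all three IPA dimensions.
card_purchase = IPARisk(
    data_name="credit_card_transaction",
    identity=Rating.HIGH, presence=Rating.HIGH, activity=Rating.HIGH,
    severity=Rating.HIGH, scope=Rating.MEDIUM, likelihood=Rating.MEDIUM,
)
print(card_purchase.qualitative_score())
```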

Mark then shares two examples of where the IPA Triad and its associated risk assessment can be applied. For developer APIs, you can use the IPA Triad to assess the privacy risk of each API resource and then assess the overall risk of a particular OAuth scope based on the resources it grants access to. Similarly, for device sensors, you can assess what a device is capable of inferring about Identity, Presence, or Activity and isolate the data it exposes; if you must expose some sensor data, can you degrade its fidelity to reduce the risk to the user?
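One way to picture the developer-API exercise is to tag each API resource with the IPA dimensions it exposes and roll those tags up to the OAuth scope that grants access to it. The resource and scope names below are hypothetical, not from the talk:

```python
# Hypothetical API resources tagged with the IPA dimensions they expose.
RESOURCES = {
    "/me/profile":   {"identity"},
    "/me/locations": {"identity", "presence"},
    "/me/purchases": {"identity", "presence", "activity"},
}

# Hypothetical OAuth scopes mapped to the resources they grant access to.
SCOPES = {
    "profile.read": ["/me/profile"],
    "history.read": ["/me/locations", "/me/purchases"],
}


def scope_exposure(scope: str) -> set:
    """Union of IPA dimensions reachable through a single OAuth scope."""
    exposure = set()
    for resource in SCOPES[scope]:
        exposure |= RESOURCES[resource]
    return exposure


print(scope_exposure("history.read"))  # {'identity', 'presence', 'activity'}
```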

Engineering Ethics into the NIST Privacy Framework

In Engineering Ethics into the NIST Privacy Framework, Jason Cronk introduces the foundational components of the NIST Privacy Framework and compares it against the NIST Cybersecurity Framework. The framework defines five core functions: Identify, Govern, Control, Communicate, and Protect. Within these functions are various categories and subcategories that elaborate on the particular privacy outcomes to achieve, e.g., "physical access to data and devices is managed."
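To make the function → category → subcategory structure concrete, here is a tiny sketch of how an organization might track its profile against that hierarchy. The slice of the framework shown is abbreviated and paraphrased for illustration, not a faithful excerpt of the framework core:

```python
# Illustrative slice of the NIST Privacy Framework core:
# function -> category -> subcategories (paraphrased; see the framework for the real core).
PRIVACY_FRAMEWORK = {
    "Protect-P": {
        "Identity Management, Authentication, and Access Control": [
            "Physical access to data and devices is managed",
        ],
    },
    "Govern-P": {
        "Governance Policies, Processes, and Procedures": [
            "Organizational privacy values and policies are established and communicated",
        ],
    },
}

# A hypothetical organization profile recording how each selected subcategory is addressed.
profile = {
    "Physical access to data and devices is managed": "badge-controlled server rooms; visitor logs",
}

for function, categories in PRIVACY_FRAMEWORK.items():
    for category, subcategories in categories.items():
        for sub in subcategories:
            status = profile.get(sub, "not yet addressed")
            print(f"{function} / {category}: {sub} -> {status}")
```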

Jason then provides an overview of definitions of privacy based on social norms. These include Westin's States of Privacy (solitude, intimacy, anonymity, and reserve), Calo's Harms (objective and subjective), and Prosser's Privacy Torts (false light, intrusion upon seclusion, public disclosure, and appropriation), among others. They also include Daniel Solove's Taxonomy of Privacy, which comprises 16 types of privacy harm across four broad categories (information collection, information processing, information dissemination, and invasion).

Next, Jason maps these privacy norms to one another based on common features, e.g., trust, intimacy, objective harms, and breach of confidentiality. Once grouped, these norms can be used to define organization-specific privacy values that map onto the NIST Privacy Framework—yielding privacy values that are distinct, understandable, and applicable to your organization.
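As a hedged sketch of what such a mapping might look like in practice—the grouping, the privacy value, and the category labels below are all invented for illustration:

```python
# Hypothetical grouping of privacy norms by a shared feature.
NORM_GROUPS = {
    "confidentiality and trust": [
        "Westin: intimacy",
        "Prosser: public disclosure",
        "Solove: breach of confidentiality",
    ],
}

# A derived organization-specific privacy value, mapped onto the framework
# areas it informs (labels are illustrative, not quoted from the framework).
PRIVACY_VALUES = {
    "We only share user data with parties the user would expect": {
        "norm_group": "confidentiality and trust",
        "framework_categories": ["Control-P: disclosure management",
                                 "Communicate-P: data processing awareness"],
    },
}

for value, mapping in PRIVACY_VALUES.items():
    norms = ", ".join(NORM_GROUPS[mapping["norm_group"]])
    print(f"Value: {value}\n  grounded in: {norms}\n  maps to: {mapping['framework_categories']}")
```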

When Engineers and Lawyers Talk: Right-Sizing Your Data Protection Risk Profile

In When Engineers and Lawyers Talk: Right-Sizing Your Data Protection Risk Profile, Rafae Bhatti describes how you can recognize the most important risks for a company based on its unique risk profile.

Rafae describes right-sizing a risk profile through several analogies. Depending on your legal obligations and technical rationale, you may underestimate or overestimate the need to apply certain technical controls to meet a legal requirement. For example, whether data should be encrypted depends on the sensitivity of that data as well as the legal definitions of personal data relevant to your company.
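As a toy illustration of that kind of right-sizing decision—the sensitivity tiers and the rule itself are my own, not from the talk:

```python
def should_encrypt_at_rest(is_personal_data: bool, sensitivity: str) -> bool:
    """Toy right-sizing rule: encrypt when the data is legally personal data
    or is otherwise rated sensitive. Tiers and rule are illustrative only."""
    sensitive_tiers = {"high", "regulated"}
    return is_personal_data or sensitivity in sensitive_tiers


# A pseudonymized analytics event may not need the same controls as a health
# record, depending on how "personal data" is defined in your jurisdiction.
print(should_encrypt_at_rest(is_personal_data=False, sensitivity="low"))  # False
print(should_encrypt_at_rest(is_personal_data=True,  sensitivity="low"))  # True
```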

The talk also covers four hypothetical scenarios that prompt you to ask whether you're right-sizing your risk profile. If a user requests that you delete all data about them, should you? Are there situations in which you may be permitted to retain some or all of that data? If a report was unintentionally disclosed to a third party, does this constitute a data breach? What if the data was encrypted or contained no personal data? The answers may vary based on these factors as well as the jurisdictions of your company and your users.

If you're interested in hearing these scenarios and the key takeaways from each, I recommend watching this talk!

Wrapping Up

I hope these posts have piqued your interest in PEPR 2020 and future iterations of the conference. Don't forget to check out the other Conference Recaps for PEPR 2020 as well!

If you liked this post (or have ideas on how to improve it), I'd love to know as always. Cheers!