Data Privacy Convening: Challenges, Opportunities, and the Prospects for Meaningful Data Privacy Legislation in the US

From Internet Community Wiki
General Information
Dates: 26 November 2018
Location: Washington, DC, USA
URL: Event Page

Report: PDF

Issue Areas
Privacy and Data Protection
Organizers
Internet Governance Lab at American University
ISOC-DC

The AU Internet Governance Lab in partnership with the Internet Society Washington, DC Chapter (ISOC-DC) convened experts and stakeholders from industry, government, and civil society to explore the problem, path, and promise of data privacy and surveillance. The half-day event focused on current and emerging issues in data privacy, including public perceptions of data privacy, implications of GDPR and a possible data privacy framework in the US, the relationship between Internet governance and data privacy, and the human rights implications of the privacy debate.

Agenda

12:00pm – 12:30pm  Lunch and Introduction by Dr. Aram Sinnreich
12:30pm – 2:00pm Data privacy in context

• The Problem: Data Privacy, State and Commercial Surveillance

• The Path: How Can We Achieve Data Privacy in the U.S.?

• The Promise: How Can Internet Governance Privilege Fairness and Data Privacy?

2:00pm – 2:30pm Coffee break
2:30pm – 4:00pm Looking ahead

• Prescriptive takeaways toward meaningful data privacy regulation in the U.S.

• Recommendations to lawmakers and other key stakeholders

• Spring Privacy Symposium

Report on Discussion

Download Report PDF

Part 1. Defining terms and framing the discussion: What do we mean by “data privacy”?

  • The Privacy Stack:
    • Confidentiality: Once data is collected, what are the limitations on sharing and use?
    • Fair information practices: Once data is collected and shared, what are individuals’ rights to change, control or correct their information?
    • Consumer protection
    • The rights framework -- two perspectives:
      • Individual rights framing – tapping into something essentially American that could give US regulation teeth (e.g. personal autonomy is at the center of debates over AI).
      • Collective communal rights – personal autonomy framing runs the risk of feeding data ownership narratives. 
    • Data ethics and the importance of regulation shaped by a strong ethical and moral compass.  
      • Privacy as a public good – essential to human rights and innovation.
    • Data hygiene and literacy.
      • Communicating values is going to be critical to getting the country invested.
      • Worth noting that literacy efforts in developing nations have often been coopted by governments. 
      • Lack of awareness is really starting to erode personal autonomy.
      • Translating risk – the risks of handling data are understood (if not always acted upon) by corporations but not by individuals. Need to make sure individuals/users are at the center of risk analysis and accurately communicate these risks to individuals. 
    • Data as property – transactional framing
      • Need to emphasize the value in mining one’s own data as a means of establishing ownership and personal responsibility.  
  • When framing the problem, it’s important to consider the entire ecosystem, including IoT. The Internet and IoT can’t function without massive data collection.
  • Important to consider the risks that are beyond the immediate horizon -- it’s not about unaccountable power but unforeseen unaccountable power. 
  • The word “privacy” is extremely misleading for the public. Scholars tend to talk in abstractions about privacy, but users don’t understand how it applies in practice. From a scholarly perspective, the data stack would include “data affordances,” “coerciveness,” and “control.”
  • Privacy is also incredibly contextual. 
  • Timing and momentum are important. It was critical for GDPR and it will almost certainly prove critical in the US context. Need to avoid the entropy of perfectionism (e.g. cars can still be dangerous but that doesn’t mean we don’t regulate them).
  • The consequences of failing to protect privacy:
    • The post-consent era
    • The contemporary panopticon can be used to intentionally disrupt democratic practices
    • The credit score state
    • Weaponization of data against the press.
    • Differential pricing
    • Underserved communities disproportionately feel the effects of data breaches. 
    • Stakeholder incentives: what incentivizes individuals to consent, and what are the institutional incentives to use and deploy this data?
  • Are there affirmative values we can attach to data privacy? To the extent we can engineer through tech, policy, and markets, what’s the outcome we want to create?
    • A balance of rights. For instance, balancing the right to be forgotten with freedom of speech, freedom of press, and public’s right to know.  
    • Don’t gather more data than you need now. 
    • Think, “what’s the minimum viable data?” 

Part 2. Policy recommendations

  • GDPR 
    • Is it implementable in the context of the First Amendment and the Commerce Clause?
    • It does a good job of establishing a supervisory agency, but a single-purpose agency leads to law enforcement ignoring data privacy regulations.
    • GDPR reflects European tradition and values. It would be a mistake to just copy this in the US. 
    • Disproportionately impacts small businesses with prohibitive cost structure. 
    • Sloppy implementation – cookie notices have become the mattress tags of the Internet.
    • Need to identify what the legitimate and proportional uses of data are.
    • GDPR is an affirmative set of responsibilities. Those rights can travel as things change. 
    • GDPR is not just about privacy but about the exploitation of data. 
    • Elements transferable from GDPR to the US include data breach laws, data processing rules, data minimization, and purpose limitations.
  • Should the goal be harmonization, or should regulations be particular to specific jurisdictions?
    • California has, in effect, set a de facto federal standard.
    • Harmonization could set the bar too low and it tends to ignore developing nations. 
    • Rather than harmonization, is there a minimum set of standards or a floor that could be agreed upon?
  • If we were drafting federal law what would it look like?
    • Perhaps adapting the Creative Commons model of layered contracts: versions aimed at individual users, technologists, and lawyers?
    • Transparency. But what do we want transparency for and how do we empower users to push back without opting out entirely?
  • There is a tendency to fixate on one social media company and pretend that nothing else will emerge to take its place.
  • Data retention – how long should a company keep user data (6 months? As long as it is legitimately needed?)
  • Opt-in, opt-out, or some combination of both? Most would prefer opt-in. 

Part 3. Looking ahead to spring privacy symposium

  • Public event with keynote from prominent lawmaker. 
  • Around 200 people (students, faculty, policymakers, technologists, law enforcement, general public). 
  • Launch of Privacy Wiki with readings, key stakeholders, laws, and other relevant information.