
Crowdsourced, Actionable and Verifiable Contextual Informational Norms

Published 18 Jan 2016 in cs.CY | (1601.04740v4)

Abstract: There is often a fundamental mismatch between programmable privacy frameworks on the one hand and the ever-shifting privacy expectations of computer system users on the other. Building on the theory of contextual integrity (CI), our paper addresses this problem by proposing a privacy framework that translates users' privacy expectations (norms) into a set of actionable privacy rules expressed in the language of CI. These norms are then encoded as a Datalog logic specification, yielding an information system that can verify whether information flows are appropriate and users' privacy thereby preserved. A particular benefit of our framework is that it automatically adapts as users' privacy expectations evolve over time. To evaluate the proposed framework, we conducted an extensive survey involving more than 450 participants and 1,400 questions to derive a set of privacy norms in the educational context. Based on the crowdsourced responses, we demonstrate that our framework can derive a compact Datalog encoding of the privacy norms, which can in principle be used directly to enforce the privacy of information flows within this context. In addition, our framework can automatically detect logical inconsistencies between individual users' privacy expectations and the derived privacy logic.
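To make the abstract's idea concrete, the following is a minimal sketch, not the paper's actual Datalog encoding, of checking an information flow against CI-style norms. In CI, a norm names a sender role, an information attribute, the information subject, a recipient role, and a transmission principle; the specific role and attribute names below (e.g. "professor", "grade", "registrar") are illustrative assumptions for the educational context the paper surveys.

```python
# Each norm is a tuple:
#   (sender_role, attribute, subject_role, recipient_role, transmission_principle)
# A hypothetical set of crowdsourced norms; the real paper derives these
# from survey responses and compiles them into Datalog rules.
NORMS = {
    ("professor", "grade", "student", "registrar", "confidentiality"),
    ("professor", "grade", "student", "student",   "confidentiality"),
    ("registrar", "enrollment", "student", "dean", "confidentiality"),
}

def flow_is_appropriate(sender, attribute, subject, recipient, principle):
    """A flow is appropriate iff it matches some norm (Datalog-style lookup)."""
    return (sender, attribute, subject, recipient, principle) in NORMS

# A professor sharing a student's grade with that student is allowed;
# sharing it with an advertiser matches no norm and is blocked.
print(flow_is_appropriate("professor", "grade", "student", "student", "confidentiality"))
print(flow_is_appropriate("professor", "grade", "student", "advertiser", "confidentiality"))
```

In an actual Datalog encoding each norm would be a fact or rule and the appropriateness check a query, which also enables the logical-consistency analysis the abstract mentions; the set-membership test above only illustrates the flow-verification step.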
