Asset Details
Towards formalizing the GDPR’s notion of singling out
by Cohen, Aloni; Nissim, Kobbi
in Computer Sciences / General Data Protection Regulation / Mathematical analysis / Physical Sciences / Privacy / Security / Statistical analysis / Uncertainty
2020
Journal Article
Overview
There is a significant conceptual gap between legal and mathematical thinking around data privacy. The effect is uncertainty as to which technical offerings meet legal standards. This uncertainty is exacerbated by a litany of successful privacy attacks demonstrating that traditional statistical disclosure limitation techniques often fall short of the privacy envisioned by regulators. We define “predicate singling out,” a type of privacy attack intended to capture the concept of singling out appearing in the General Data Protection Regulation (GDPR). An adversary predicate singles out a dataset x using the output of a data-release mechanism M(x) if it finds a predicate p matching exactly one row in x with probability much better than a statistical baseline. A data-release mechanism that precludes such attacks is “secure against predicate singling out” (PSO secure). We argue that PSO security is a mathematical concept with legal consequences. Any data-release mechanism that purports to “render anonymous” personal data under the GDPR must prevent singling out and, hence, must be PSO secure. We analyze the properties of PSO security, showing that it fails to compose. Namely, a combination of more than logarithmically many exact counts, each individually PSO secure, facilitates predicate singling out. Finally, we ask whether differential privacy and k-anonymity are PSO secure. Leveraging a connection to statistical generalization, we show that differential privacy implies PSO security. However, and in contrast with current legal guidance, k-anonymity does not: There exists a simple predicate singling out attack under mild assumptions on the k-anonymizer and the data distribution.
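As a rough illustration of the isolation condition described in the abstract, the sketch below checks whether a predicate matches exactly one row of a dataset. This is only a toy illustration under assumed names and structure (dict-valued rows, a Boolean predicate); the paper's actual definition is probabilistic and also involves the data-release mechanism M(x) and a statistical baseline adversary, which are not modeled here.

```python
# Illustrative sketch only: the "isolation" part of predicate singling out.
# A predicate p isolates a row of dataset x when it matches exactly one row.
# Names and structure here are assumptions, not the authors' formal definitions.

from typing import Callable, Sequence

Row = dict  # a record, e.g. {"age": 34, "zip": "10001"}

def isolates(p: Callable[[Row], bool], x: Sequence[Row]) -> bool:
    """Return True iff predicate p matches exactly one row of dataset x."""
    return sum(1 for row in x if p(row)) == 1

# Toy usage: a highly specific predicate isolates a single record.
dataset = [
    {"age": 34, "zip": "10001"},
    {"age": 34, "zip": "10002"},
    {"age": 51, "zip": "10001"},
]
p = lambda row: row["age"] == 34 and row["zip"] == "10002"
print(isolates(p, dataset))  # True: exactly one matching row
```

In the paper's setting, the adversary must produce such an isolating predicate from the mechanism's output alone, and the attack only counts if it succeeds with probability much better than the baseline; the sketch above shows only the isolation check itself.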
Publisher
National Academy of Sciences