Paper ID | D2-S1-T4.3 |
Paper Title | Robust Local Differential Privacy |
Authors | Milan Lopuhaä-Zwakenberg, Eindhoven University of Technology, Netherlands; Jasper Goseling, University of Twente, Netherlands |
Session | D2-S1-T4: Local Differential Privacy |
Chaired Session | Tuesday, 13 July, 22:00 - 22:20 |
Engagement Session | Tuesday, 13 July, 22:20 - 22:40 |
Abstract |
We consider data release protocols for data X = (S, U), where S is the sensitive component; the released data Y should contain as much information about X as possible, measured by the mutual information I(X;Y), without leaking too much about S. We introduce the Robust Local Differential Privacy (RLDP) framework to measure privacy. This framework relies on the underlying distribution of the data, which must be estimated from the available data. Robust privacy guarantees ensure privacy for all distributions in a confidence set based on this estimate. We also present three algorithms that construct RLDP protocols from a given dataset. One of these approximates the confidence set by a polytope and uses results from robust optimisation to yield high-utility release protocols. However, it relies on vertex enumeration and becomes computationally infeasible for large input alphabets. The other two algorithms are low-complexity and build on randomised response. Experiments verify that all three algorithms offer significantly improved utility over regular LDP.
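
The abstract compares against regular LDP and mentions randomised response as a building block. As a point of reference only (not the paper's algorithms), the sketch below shows standard k-ary randomised response, which satisfies ε-LDP, and the utility I(X;Y) it attains for a given input distribution; the alphabet size k, privacy level eps, and the uniform input distribution are illustrative assumptions.

```python
# Illustrative baseline only: k-ary randomised response (standard eps-LDP
# mechanism) and the mutual information I(X;Y) it achieves for a chosen
# input distribution p. This is not one of the paper's RLDP algorithms.
import numpy as np


def randomised_response_channel(k: int, eps: float) -> np.ndarray:
    """Return the k x k channel matrix Q with Q[x, y] = P(Y=y | X=x)."""
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)  # report the true value
    p_flip = (1.0 - p_keep) / (k - 1)             # report any other value
    Q = np.full((k, k), p_flip)
    np.fill_diagonal(Q, p_keep)
    return Q


def mutual_information(p: np.ndarray, Q: np.ndarray) -> float:
    """I(X;Y) in nats for input distribution p and channel Q."""
    joint = p[:, None] * Q                   # P(X=x, Y=y)
    p_y = joint.sum(axis=0)                  # P(Y=y)
    outer = p[:, None] * p_y[None, :]        # P(X=x) P(Y=y)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / outer[mask])))


if __name__ == "__main__":
    k, eps = 4, 1.0
    p = np.full(k, 1.0 / k)                  # assumed uniform input distribution
    Q = randomised_response_channel(k, eps)
    print(f"I(X;Y) = {mutual_information(p, Q):.4f} nats")
```

The RLDP protocols in the paper instead optimise the release channel over a confidence set of input distributions, which is why they can exceed the utility of this fixed randomised-response baseline.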
|