Exploring the Privacy Bound for Differential Privacy: From Theory to Practice

Authors

X. He, Y. Hong, Y. Chen

DOI:

https://doi.org/10.4108/eai.8-4-2019.157414

Keywords:

Differential Privacy, Inference, Privacy Bound

Abstract

Data privacy has attracted significant interest in both the database theory and security communities over the past few decades. Differential privacy has emerged as a new paradigm for rigorous privacy protection regardless of an adversary's prior knowledge. However, the meaning of the privacy bound ε and how to select an appropriate ε may still be unclear to general data owners. More recently, some approaches have been proposed to derive upper bounds on ε for specified privacy risks. Unfortunately, these upper bounds suffer from deficiencies (e.g., the bound depends on the data size, or may be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the privacy bound ε in differential privacy to privacy risks understandable to generic users, and present an in-depth theoretical analysis of it. Finally, we conduct experiments to demonstrate the effectiveness of our model.
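The paper itself develops the conversion from ε to concrete privacy risks; as a rough, standalone illustration of the kind of translation involved (not the authors' formulation), the sketch below uses the standard Bayesian argument that an ε-differentially private mechanism limits how much an observed output can shift an adversary's belief about whether a target record is in the dataset. The function name and the uniform prior are assumptions made only for this example.

```python
import math

def posterior_bound(epsilon, prior):
    """Upper bound on an adversary's posterior belief that a target record
    is present, after observing one output of an epsilon-DP mechanism.

    Epsilon-DP caps the likelihood ratio of any output under two neighboring
    datasets at exp(epsilon), so Bayes' rule bounds the posterior by
    exp(epsilon) * prior / (exp(epsilon) * prior + (1 - prior)).
    """
    odds = math.exp(epsilon) * prior / (1.0 - prior)  # bound on posterior odds
    return odds / (1.0 + odds)

# Illustration (assumed uniform prior of 0.5): epsilon = 1.0 caps the
# adversary's posterior at about 0.73, while epsilon = 0.1 caps it near 0.52.
for eps in (0.1, 0.5, 1.0, 2.0):
    print(f"epsilon = {eps:4.1f} -> posterior risk <= {posterior_bound(eps, 0.5):.3f}")
```

Under these assumptions, the bound turns an abstract ε into a number a data owner can read directly as a worst-case inference probability, which is the flavor of interpretation the abstract refers to.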

Published

25-01-2019

How to Cite

He, X., Hong, Y., & Chen, Y. (2019). Exploring the Privacy Bound for Differential Privacy: From Theory to Practice. EAI Endorsed Transactions on Security and Safety, 5(18), e2. https://doi.org/10.4108/eai.8-4-2019.157414
