Exploring the Privacy Bound for Differential Privacy: From Theory to Practice
DOI:
https://doi.org/10.4108/eai.8-4-2019.157414

Keywords: Differential Privacy, Inference, Privacy Bound

Abstract
Data privacy has attracted significant interest in both the database theory and security communities over the past few decades. Differential privacy has emerged as a paradigm for rigorous privacy protection regardless of an adversary's prior knowledge. However, the meaning of the privacy bound ε, and how to select an appropriate ε, may still be unclear to general data owners. More recently, some approaches have been proposed to derive upper bounds on ε for specified privacy risks. Unfortunately, these upper bounds suffer from deficiencies (e.g., the bound depends on the data size, or may be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the privacy bound ε of differential privacy into privacy risks understandable to generic users, and present an in-depth theoretical analysis of it. Finally, we have conducted experiments to demonstrate the effectiveness of our model.
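The abstract concerns interpreting the privacy bound ε as a concrete, user-understandable risk. As a minimal illustrative sketch (not the paper's actual conversion), the snippet below implements the standard Laplace mechanism for ε-differential privacy and one common hypothesis-testing interpretation of ε: under ε-DP, an adversary who must decide which of two neighboring datasets produced an output (with a uniform prior) is correct with probability at most e^ε / (1 + e^ε). Function names are illustrative only.

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release true_value under epsilon-DP via Laplace noise.

    Samples Laplace(0, sensitivity/epsilon) noise using the
    inverse-CDF method.
    """
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    noise = -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return true_value + noise

def distinguishing_bound(epsilon):
    """Upper bound on an adversary's success probability.

    Under epsilon-DP, distinguishing two neighboring datasets
    (uniform prior) succeeds with probability at most
    e^eps / (1 + e^eps); at eps = 0 this is 1/2, i.e. a blind guess.
    """
    return math.exp(epsilon) / (1.0 + math.exp(epsilon))

# Small epsilon -> risk close to a coin flip; larger epsilon -> higher risk.
print(distinguishing_bound(0.1))  # ~0.525
print(distinguishing_bound(1.0))  # ~0.731
```

This kind of mapping from ε to a success probability is one way a data owner could read ε as a risk; the paper's own conversion and its theoretical analysis may differ.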
License
Copyright (c) 2022 EAI Endorsed Transactions on Security and Safety
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
This is an open-access article distributed under the terms of the Creative Commons Attribution CC BY 3.0 license, which permits unlimited use, distribution, and reproduction in any medium so long as the original work is properly cited.
Funding data
- National Social Science Fund of China, Grant No. 61672303