Exploring the Privacy Bound for Differential Privacy: From Theory to Practice
Keywords: Differential Privacy, Inference, Privacy Bound
Data privacy has attracted significant interest in both the database theory and security communities over the past few decades. Differential privacy has emerged as a new paradigm for rigorous privacy protection regardless of an adversary's prior knowledge. However, the meaning of the privacy bound ε, and how to select an appropriate ε, may still be unclear to general data owners. More recently, some approaches have been proposed to derive upper bounds on ε for specified privacy risks. Unfortunately, these upper bounds suffer from deficiencies (e.g., the bound depends on the data size, or may be too large), which greatly limits their applicability. To remedy this problem, we propose a novel approach that converts the differential privacy bound ε into privacy risks understandable to generic users, and present an in-depth theoretical analysis of it. Finally, we have conducted experiments to demonstrate the effectiveness of our model.
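To illustrate the kind of ε-to-risk conversion the abstract describes, a minimal sketch follows. It uses the well-known relation (not necessarily the paper's specific model) that, under ε-differential privacy and a uniform 50/50 prior over two neighboring databases, an adversary's posterior success probability of distinguishing them is at most e^ε / (1 + e^ε); the function name `risk_upper_bound` is a hypothetical helper introduced here for illustration.

```python
import math

def risk_upper_bound(epsilon: float) -> float:
    """Upper bound on an adversary's success probability of
    distinguishing two neighboring databases under epsilon-DP,
    assuming a uniform (50/50) prior: e^eps / (1 + e^eps).

    Note: this is a standard illustrative bound, not the specific
    conversion proposed in this paper.
    """
    return math.exp(epsilon) / (1.0 + math.exp(epsilon))

# epsilon = 0 gives 0.5 (no better than random guessing);
# epsilon = ln(3) caps the adversary's success probability at 0.75
print(risk_upper_bound(0.0))
print(risk_upper_bound(math.log(3)))
```

Read this way, a concrete ε becomes a statement a data owner can interpret directly: ε = ln(3) means an adversary can be correct at most 75% of the time, rather than an abstract multiplicative bound on output probabilities.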
Copyright (c) 2022 EAI Endorsed Transactions on Security and Safety
This work is licensed under a Creative Commons Attribution 3.0 Unported License.
National Social Science Fund of China
Grant No. 61672303