Differentially Private Normalizing Flows for Privacy-Preserving Density Estimation

Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society (2021)

Abstract
Normalizing flow models have risen as a popular solution to the problem of density estimation, enabling high-quality synthetic data generation as well as exact probability density evaluation. However, in contexts where individuals are directly associated with the training data, releasing such a model raises privacy concerns. In this work, we propose the use of normalizing flow models that provide explicit differential privacy guarantees as a novel approach to the problem of privacy-preserving density estimation. We evaluate the efficacy of our approach empirically using benchmark datasets, and we demonstrate that our method substantially outperforms previous state-of-the-art approaches. We additionally show how our algorithm can be applied to the task of differentially private anomaly detection.
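The paper's actual architecture and privacy accounting are not reproduced in this page. As a minimal illustrative sketch of the two ingredients the abstract names, the code below fits a one-dimensional affine flow (exact log-density via the change-of-variables formula) with a DP-SGD-style update: per-example gradient clipping plus Gaussian noise. All names, the flow family, and every hyperparameter (clip norm, noise multiplier, learning rate) are assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "affine flow": x = mu + sigma * z with z ~ N(0, 1).
# Change of variables gives the exact log-density:
#   log p(x) = log N((x - mu)/sigma; 0, 1) - log sigma
def log_density(x, mu, log_sigma):
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    return -0.5 * (z ** 2 + np.log(2 * np.pi)) - log_sigma

# Per-example gradients of the negative log-likelihood w.r.t. (mu, log_sigma).
def per_example_grads(x, mu, log_sigma):
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    g_mu = -z / sigma        # d(-log p)/d mu
    g_ls = 1.0 - z ** 2      # d(-log p)/d log_sigma
    return np.stack([g_mu, g_ls], axis=1)

# DP-SGD-style step: clip each example's gradient to L2 norm `clip`,
# average, then add Gaussian noise scaled by noise_mult * clip.
def dp_sgd_step(params, batch, clip=1.0, noise_mult=0.5, lr=0.1):
    g = per_example_grads(batch, *params)
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip)  # per-example clipping
    noise = rng.normal(0.0, noise_mult * clip / len(batch), size=2)
    return params - lr * (g.mean(axis=0) + noise)

# Fit the flow on synthetic data drawn from N(3, 2^2).
data = rng.normal(3.0, 2.0, size=2000)
params = np.array([0.0, 0.0])  # (mu, log_sigma)
for _ in range(500):
    batch = rng.choice(data, size=128)
    params = dp_sgd_step(params, batch)

fitted_mu, fitted_sigma = params[0], np.exp(params[1])
```

Because the noised parameters define a proper density, the same `log_density` can score held-out points, which is the connection to the anomaly-detection application mentioned in the abstract: points with unusually low log-density under the privately trained flow are flagged as anomalies.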
Keywords
differential privacy,density estimation,normalizing flows,anomaly detection