Differentially private publication for related POI discovery

JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING (2022)

Abstract
Among advanced privacy-preserving methods, differential privacy (DP), which injects independent Laplace noise, has become an influential mechanism owing to its provable and rigorous privacy guarantee. In practice, however, the POI data to be protected is always correlated, and independent noise may then disclose more information than expected. Recent studies attempt to optimize the sensitivity function of DP by taking the correlation strength between POIs into account, but this comes at the cost of a substantial growth in noise level. To remedy this problem, this paper analyzes the degradation of the expected privacy level of DP on correlated POI data and proposes a solution to mitigate it: a generalized Laplace mechanism that achieves the intended privacy guarantee. Specifically, we design a practical iteration mechanism, including an update function, to carry out the generalized Laplace mechanism when facing large-scale queries. Experimental evaluation on real-world datasets from multiple domains shows that our solution consistently outperforms state-of-the-art mechanisms in data utility while providing the same privacy guarantee as other approaches for correlated POI data.
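For reference, the baseline independent-noise Laplace mechanism that the abstract starts from can be sketched as follows. This is a minimal illustration of standard ε-DP noise addition, not the paper's generalized mechanism for correlated POI data; the function name, the sensitivity value, and the example numbers are assumptions chosen for illustration.

```python
import numpy as np

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a noisy answer satisfying epsilon-differential privacy
    by adding independent Laplace noise with scale sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return true_answer + np.random.laplace(loc=0.0, scale=scale)

# Example: a count query over POI records has sensitivity 1, since adding or
# removing a single record changes the count by at most 1 (illustrative values).
true_count = 1280  # hypothetical number of POIs matching a query
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(noisy_count)
```

When the records are correlated, one individual's presence can influence many records at once, so noise calibrated to a per-record sensitivity of 1 no longer guarantees the nominal ε; this is the degradation the paper targets.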
Keywords
POI data, Correlated POI, Differential privacy, Privacy preserving