Bounds on the Privacy Amplification of Arbitrary Channels via the Contraction of fα-Divergence
Grosse, Leonhard; Saeidian, Sara; Oechtering, Tobias J.; Skoglund, Mikael
Permalink
https://hdl.handle.net/2142/130261
Description
Issue Date
2025-09-17
Keyword(s)
Privacy amplification
Local differential privacy
Strong data processing inequalities
Rényi divergence
f-divergence inequalities
Abstract
We examine the privacy amplification of channels that do not necessarily satisfy any local differential privacy (LDP) guarantee by analyzing their contraction behavior in terms of fα-divergence, an f-divergence related to Rényi divergence via a monotonic transformation. We present bounds on contraction for restricted sets of prior distributions via f-divergence inequalities and derive an improved Pinsker's inequality for fα-divergence based on the joint-range technique of Harremoës and Vajda [1]. The resulting bound is tight whenever the total variation distance exceeds 1/α. By applying these inequalities in a cross-channel setting, we obtain strong data processing inequalities for fα-divergence that can be adapted to use-case-specific restrictions on the input distributions and the channel. Applying these results to privacy amplification shows that even very sparse channels can yield significant privacy amplification when used as a postprocessing step after locally differentially private mechanisms.
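For orientation, the fα-divergence referenced in the abstract is commonly taken to be the Hellinger divergence of order α; the exact definition and normalization used by the authors may differ, so the following is only a sketch of one standard convention and of how it connects to Rényi divergence through a monotone transformation, together with the generic form of a strong data processing inequality:

% Hellinger divergence of order alpha, an f-divergence with generator
% f_alpha(t) = (t^alpha - 1)/(alpha - 1)  (standard convention; may differ from the paper's):
\[
  \mathcal{H}_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\left(\sum_{x} \frac{P(x)^{\alpha}}{Q(x)^{\alpha - 1}} \;-\; 1\right).
\]
% Renyi divergence is then a monotone (logarithmic) transformation of H_alpha:
\[
  D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,\log\!\Big(1 + (\alpha - 1)\,\mathcal{H}_\alpha(P \,\|\, Q)\Big).
\]
% A strong data processing inequality for a channel W states that pushing both
% distributions through W contracts the divergence by a factor eta_f(W) < 1:
\[
  D_f(PW \,\|\, QW) \;\le\; \eta_f(W)\, D_f(P \,\|\, Q),
\]
where PW and QW denote the output distributions of W under inputs P and Q.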
Publisher
Allerton Conference on Communication, Control, and Computing
Series/Report Name or Number
2025 61st Allerton Conference on Communication, Control, and Computing Proceedings