Files in this item

File: KAIROUZ-DISSERTATION-2016.pdf (6MB)
Description: (no description provided)
Format: PDF (application/pdf)

Description

Title: The fundamental limits of statistical data privacy
Author(s): Kairouz, Peter
Director of Research: Viswanath, Pramod
Doctoral Committee Chair(s): Viswanath, Pramod
Doctoral Committee Member(s): Oh, Sewoong; Hajek, Bruce; Borisov, Nikita; Srikant, Rayadurgam
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree: Ph.D.
Genre: Dissertation
Subject(s): Privacy
Information Theory
Data Privacy
Statistics
Multi-Party Computation
Security
Local Differential Privacy
Privacy-Preserving Machine Learning Algorithms
Information Theoretic Utilities
f-Divergence
Mutual Information
Statistical Inference
Hypothesis Testing
Estimation
Abstract: The Internet is shaping our daily lives. On the one hand, social networks like Facebook and Twitter allow people to share their precious moments and opinions with virtually anyone around the world. On the other, services like Google, Netflix, and Amazon allow people to look up information, watch movies, and shop online anytime, anywhere. However, with this unprecedented level of connectivity comes the danger of being monitored. There is an increasing tension between the need to share data and the need to preserve the privacy of Internet users. The need for privacy arises in three main contexts: (1) the global privacy context, where private companies and public institutions release personal information about individuals to the public; (2) the local privacy context, where individuals disclose their personal information to potentially malicious service providers; and (3) the multi-party privacy context, where different parties cooperate to interactively compute a function defined over all the parties' data. Differential privacy has recently surfaced as a strong measure of privacy in all three contexts. Under differential privacy, privacy is achieved by randomizing the data before releasing it, which leads to a fundamental tradeoff between privacy and utility. In this thesis, we take a concrete step towards understanding the fundamental structure of privacy mechanisms that achieve the best privacy-utility tradeoff. The tradeoff is formulated as a constrained optimization problem: maximize utility subject to differential privacy constraints. We show, perhaps surprisingly, that in all three privacy contexts the optimal privacy mechanisms share the same combinatorial staircase structure. This result is a direct consequence of the geometry of the constraints that differential privacy imposes on the privatization mechanisms. (A minimal illustrative sketch of one such mechanism follows this record.)
Issue Date: 2016-04-20
Type: Thesis
URI: http://hdl.handle.net/2142/92686
Rights Information: Copyright © 2016 by Peter Kairouz
Date Available in IDEALS: 2016-11-10
Date Deposited: 2016-08
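
As a minimal sketch of a staircase mechanism, the Python snippet below implements binary randomized response, the canonical mechanism satisfying epsilon-local differential privacy: it reports the true bit with probability e^epsilon / (1 + e^epsilon) and flips it otherwise. The function name and interface are illustrative assumptions, not code from the dissertation.

    import math
    import random

    def randomized_response(bit: bool, epsilon: float) -> bool:
        # Illustrative sketch, not from the dissertation: keep the true bit
        # with probability e^eps / (1 + e^eps) and flip it otherwise. The
        # resulting two-level conditional distribution is the simplest
        # instance of the staircase structure the abstract describes.
        p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
        return bit if random.random() < p_truth else not bit

For example, with epsilon = 1 the truthful-report probability is about 0.731, so roughly 27% of reports are flipped; larger epsilon means less randomization and correspondingly weaker privacy.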

