The fundamental limits of statistical data privacy
Kairouz, Peter
Permalink
https://hdl.handle.net/2142/92686
Description
- Title
- The fundamental limits of statistical data privacy
- Author(s)
- Kairouz, Peter
- Issue Date
- 2016-04-20
- Director of Research (if dissertation) or Advisor (if thesis)
- Viswanath, Pramod
- Doctoral Committee Chair(s)
- Viswanath, Pramod
- Committee Member(s)
- Oh, Sewoong
- Hajek, Bruce
- Borisov, Nikita
- Srikant, Rayadurgam
- Department of Study
- Electrical and Computer Engineering
- Discipline
- Electrical and Computer Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- Ph.D.
- Degree Level
- Dissertation
- Keyword(s)
- Privacy
- Information Theory
- Data Privacy
- Statistics
- Multi-Party Computation
- Security
- Local Differential Privacy
- Privacy-Preserving Machine Learning Algorithms
- Information Theoretic Utilities
- f-Divergence
- Mutual Information
- Statistical Inference
- Hypothesis Testing
- Estimation
- Abstract
- The Internet is shaping our daily lives. On the one hand, social networks like Facebook and Twitter allow people to share their precious moments and opinions with virtually anyone around the world. On the other, services like Google, Netflix, and Amazon allow people to look up information, watch movies, and shop online anytime, anywhere. However, with this unprecedented level of connectivity comes the danger of being monitored. There is an increasing tension between the need to share data and the need to preserve the privacy of Internet users. The need for privacy appears in three main contexts: (1) the global privacy context, as when private companies and public institutions release personal information about individuals to the public; (2) the local privacy context, as when individuals disclose their personal information to potentially malicious service providers; (3) the multi-party privacy context, as when different parties cooperate to interactively compute a function that is defined over all the parties' data. Differential privacy has recently surfaced as a strong measure of privacy in all three contexts. Under differential privacy, privacy is achieved by randomizing the data before releasing it. This leads to a fundamental tradeoff between privacy and utility. In this thesis, we take a concrete step towards understanding the fundamental structure of privacy mechanisms that achieve the best privacy-utility tradeoff. This tradeoff is formulated as a constrained optimization problem: maximize utility subject to differential privacy constraints. We show, perhaps surprisingly, that in all three privacy contexts, the optimal privacy mechanisms have the same combinatorial staircase structure. This deep result is a direct consequence of the geometry of the constraints imposed by differential privacy on the privatization mechanisms.
- Graduation Semester
- 2016-08
- Type of Resource
- text
- Permalink
- http://hdl.handle.net/2142/92686
- Copyright and License Information
- Copyright © 2016 by Peter Kairouz
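The abstract's idea of "randomizing the data before releasing it" can be illustrated with randomized response, the classic binary mechanism for the local privacy context; for binary alphabets it has the two-level (staircase-like) form the thesis studies. The sketch below is illustrative only and is not code from the dissertation; the function names and the unbiased-estimator inversion are assumptions for this example.

```python
import math
import random

def randomized_response(bit: int, eps: float) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it.

    This satisfies eps-local differential privacy: for any output y and any
    two inputs, the likelihood ratio P(y | x) / P(y | x') is at most e^eps.
    """
    p_true = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < p_true else 1 - bit

def estimate_fraction(reports: list[int], eps: float) -> float:
    """Unbiased estimate of the true fraction of 1s from privatized reports.

    Inverts E[observed] = p * theta + (1 - p) * (1 - theta), where p is the
    truthful-reporting probability and theta is the true fraction of 1s.
    """
    p = math.exp(eps) / (1.0 + math.exp(eps))
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    eps = 1.0
    data = [1] * 3000 + [0] * 7000  # true fraction of 1s is 0.3
    reports = [randomized_response(b, eps) for b in data]
    print(estimate_fraction(reports, eps))  # close to 0.3, with eps-dependent noise
```

Smaller `eps` (more privacy) pushes `p_true` toward 1/2, which inflates the variance of the estimate; this is the privacy-utility tradeoff the abstract formulates as a constrained optimization problem.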
Owning Collections
Dissertations and Theses - Electrical and Computer Engineering
Graduate Dissertations and Theses at Illinois (primary)