Séminaire de Cryptographie


David Xiao


Privacy, incentives, and truthfulness

Imagine the government is taking a census, and you as an individual are worried that by participating, private information about you (such as your address, age, or ethnicity) may eventually be revealed when the government publishes the census data. How can the government assure you that, by using an appropriate release mechanism that "sanitizes" the census data, no individual's privacy will be compromised?

This question has long been studied in the statistics community, and more recently the computer science community has contributed the formal notion of differential privacy, which captures the idea that "no individual's data can have a large effect on the output of the release mechanism". This has been interpreted to mean that individuals should be comfortable revealing their information, since little of it is leaked.
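The abstract does not spell the definition out, but the standard formalization (due to Dwork, McSherry, Nissim, and Smith) makes "no large effect" precise: a randomized mechanism M is ε-differentially private if, for every pair of databases D and D' differing in a single individual's record, and every set S of possible outputs,

$$\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].$$

For small ε we have e^ε ≈ 1 + ε, so changing one person's data changes the probability of any outcome by at most a factor close to 1.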

In this talk, we first give an introduction to this fast-developing area of research. We then examine the above interpretation of the guarantees of differential privacy. We argue that the interpretation is incomplete: unless participation in the database somehow explicitly benefits the individuals, they will refuse to participate regardless of whether the release mechanism is differentially private. We then show that, by combining differential privacy with the notions of incentives and truthfulness from game theory, one can take (almost) any release mechanism that motivates individuals to participate and modify it so that it also satisfies differential privacy.
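The construction announced in the talk is not given in the abstract. As a point of reference only, the classical way to "modify" a numeric release mechanism so that it satisfies differential privacy is the Laplace mechanism of Dwork, McSherry, Nissim, and Smith. The sketch below is ours, with hypothetical function names and example query, and unlike the talk's result it does nothing to preserve incentives:

```python
import numpy as np

def laplace_mechanism(release_fn, data, sensitivity, epsilon, rng=None):
    """Standard Laplace mechanism: make a numeric release function
    epsilon-differentially private by adding noise calibrated to its
    sensitivity.

    release_fn  -- the original (non-private) release mechanism
    sensitivity -- max change in release_fn's output when a single
                   individual's record is added or removed
    epsilon     -- privacy parameter; smaller means more private
    """
    rng = rng or np.random.default_rng()
    true_answer = release_fn(data)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_answer + noise

# Hypothetical usage: privately release how many census respondents
# are over 60. A counting query has sensitivity 1, since one person's
# record changes the count by at most 1.
ages = [23, 67, 45, 71, 34, 62]
count_over_60 = lambda xs: sum(1 for a in xs if a > 60)
print(laplace_mechanism(count_over_60, ages, sensitivity=1, epsilon=0.5))
```

The design point is that the noise scale grows with the mechanism's sensitivity and shrinks with ε, which is exactly what bounds any single individual's effect on the published output.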