You are cordially invited to attend the PhD thesis defense of Mohammad Nabil ALAGGAN, entitled:

Private Peer-to-peer similarity computation in personalized collaborative platforms

Jury composition:

Mr. Daniel LE METAYER, Research Director at Inria Grenoble, Reviewer
Mr. Marc-Olivier KILLIJIAN, Research Scientist at CNRS-LAAS Toulouse, Reviewer
Mr. Luis RODRIGUES, Professor at Departamento de Engenharia Informática, Instituto Superior Técnico, Universidade Técnica de Lisboa, Examiner
Mr. Ludovic ME, Professor at Supélec Rennes, Examiner
Ms. Anne-Marie KERMARREC, Research Director at Inria Rennes, Thesis Advisor
Mr. Sébastien GAMBS, Researcher at Inria Rennes, Thesis Co-advisor

Abstract:

In this thesis, we consider a distributed collaborative platform in which each peer hosts his private information, such as the URLs he liked, the news articles that caught his interest, or the videos he watched, on his own machine. Then, without relying on a trusted third party, the peer engages in a distributed protocol, combining his private data with other peers' private data to perform collaborative filtering. The main objective is to receive personalized recommendations or other services, such as a personalized distributed search engine. User-based collaborative filtering protocols, which depend on computing user-to-user similarity, have been applied to distributed systems. As computing the similarity between users requires access to their private profiles, this raises serious privacy concerns. In this thesis, we address the problem of privately computing similarities between peers in collaborative platforms. Our work provides a private primitive for similarity computation that can make collaborative protocols privacy-friendly. We address the unique challenges of applying privacy-preserving similarity computation techniques to dynamic, large-scale systems. In particular, we introduce a two-party cryptographic protocol that ensures differential privacy, a strong notion of privacy. Moreover, we solve the privacy budget issue, which would otherwise prevent peers from computing their similarities more than a fixed number of times, by introducing the notion of a bidirectional anonymous channel. We also develop a heterogeneous variant of differential privacy that can provide different levels of privacy to different users, and even to different items within a single user's profile, thus taking into account different privacy expectations. Moreover, we propose a highly efficient non-interactive protocol for releasing a small and private representation of a peer's profile that can be used to estimate similarity. Finally, we study the problem of choosing an appropriate privacy parameter, both theoretically and empirically, by designing several inference attacks that demonstrate for which values of the privacy parameter the resulting privacy level is acceptable.
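To give a rough idea of what a non-interactive, differentially private profile release can look like, the following minimal Python sketch encodes a profile into a Bloom filter and perturbs each bit with randomized response before sharing it; another peer can then estimate similarity from the released filters without further interaction. This is only an illustration under assumed parameters (filter size m, number of hash functions k, privacy parameter epsilon) and invented function names, not the protocol developed in the thesis.

import hashlib
import math
import random

def bloom_filter(items, m=256, k=4):
    # Encode a profile (a set of item identifiers) into an m-bit Bloom filter
    # using k hash functions derived from SHA-256.
    bits = [0] * m
    for item in items:
        for i in range(k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            bits[int(digest, 16) % m] = 1
    return bits

def flip_bits(bits, epsilon):
    # Randomized response: keep each bit with probability e^epsilon / (1 + e^epsilon),
    # flip it otherwise. Each released bit is then epsilon-differentially private;
    # the per-profile guarantee also depends on how many bits a single item touches (k).
    p_keep = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return [b if random.random() < p_keep else 1 - b for b in bits]

def similarity(f1, f2):
    # Naive inner-product similarity between two released filters; a practical
    # estimator would also correct for the bias introduced by the bit flipping.
    return sum(a * b for a, b in zip(f1, f2)) / len(f1)

# Usage: each peer releases its perturbed filter once; any other peer can then
# estimate similarity locally, as often as needed, without further interaction.
alice = flip_bits(bloom_filter({"url-1", "url-2", "url-3"}), epsilon=3.0)
bob = flip_bits(bloom_filter({"url-2", "url-3", "url-4"}), epsilon=3.0)
print(similarity(alice, bob))

The appeal of such a non-interactive release, as discussed in the abstract, is that the perturbed representation is published once and can be reused for any number of similarity estimates, which avoids consuming additional privacy budget at every comparison.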