  • Christine Task - Defining Differential Privacy for Social Network Analysis

  • Tuesday, September 10, 2013 2:30 PM - 3:30 PM EDT
    Online
    Purdue University

    Abstract:
    Privacy of social network data is a growing concern that threatens to limit access to this valuable data source. Analysis of the graph structure of social networks can provide valuable information for revenue generation and social science research, but unfortunately, ensuring this analysis does not violate individual privacy is difficult. Simply anonymizing graphs, or even releasing only aggregate results of analysis, may not provide sufficient protection. Differential privacy is an alternative privacy model, popular in data mining over tabular data, that uses noise to obscure individuals' contributions to aggregate results and offers a very strong mathematical guarantee that individuals' presence in the dataset is hidden. Analyses that were previously vulnerable to identification of individuals and extraction of private data may be safely released under differential-privacy guarantees. We review two existing standards for adapting differential privacy to network data and analyze the feasibility of several common social-network analysis techniques under these standards. Additionally, we propose out-link privacy and partition privacy, novel standards for differential privacy over network data, and introduce powerful private algorithms for common network analysis techniques that were infeasible to privatize under previous differential privacy standards.
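
    To make the noise mechanism mentioned in the abstract concrete, the sketch below shows one common way a single graph statistic could be released under edge-level differential privacy using the standard Laplace mechanism. It is a minimal illustration only: the choice of statistic (edge count), the function name, and the epsilon value are assumptions for this example and are not taken from the talk.

    # Illustrative sketch: releasing a graph's edge count with Laplace noise
    # under edge-level differential privacy. Adding or removing one edge
    # changes the edge count by at most 1, so the query's sensitivity is 1
    # and the noise scale is 1 / epsilon.
    import networkx as nx
    import numpy as np


    def noisy_edge_count(graph: nx.Graph, epsilon: float) -> float:
        """Return the edge count perturbed by Laplace(0, 1/epsilon) noise."""
        true_count = graph.number_of_edges()
        scale = 1.0 / epsilon  # sensitivity 1 divided by the privacy budget
        noise = np.random.laplace(loc=0.0, scale=scale)
        return true_count + noise


    if __name__ == "__main__":
        # A small random graph stands in for a social network.
        g = nx.erdos_renyi_graph(n=100, p=0.05, seed=1)
        print("true edge count:", g.number_of_edges())
        print("private estimate (epsilon=0.5):", noisy_edge_count(g, epsilon=0.5))

    Stronger standards such as node privacy, or the out-link and partition privacy standards proposed in the talk, change what counts as a neighboring graph and therefore how much noise a given analysis requires; this sketch only illustrates the basic add-noise-to-an-aggregate idea.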