The 1st IEEE ICDM International Workshop

on Privacy and Discrimination in Data Mining

December 12, 2016 - Barcelona

ABSTRACT

In recent years, large quantities of data on individuals have been collected by different organizations, both public and private. Such data repositories are often used for statistical research and data mining. In addition, service providers can easily track individuals' actions, behaviors, and habits, and by analyzing large collections of person-specific information they may learn patterns, models, and trends that can be used to provide personalized services. While the potential benefits of data mining are substantial, the collection and analysis of sensitive personal data clearly raise concerns about users' privacy, confidentiality, freedom, and potential discrimination. Discrimination may ensue from training data mining models (e.g., classifiers) on data that are biased against certain protected groups (defined, for example, by ethnicity, gender, or political preferences).

When privacy is addressed at a technical level, it fosters the dissemination and adoption of emerging knowledge-based applications. Obtaining the potential benefits of data mining with privacy- and discrimination-aware technology can enable wider social acceptance of the many new services and applications based on knowledge discovery. Source data of particular importance include, for instance, biomedical patient data, web usage logs, mobility data from wireless and sensor networks, and social networking data. In all of these examples, the process of knowledge discovery from data may be very useful for all parties involved, service providers and users alike; at the same time, it carries a substantial risk of privacy violation and of unfair practices such as discrimination.

Privacy preservation and, more recently, discrimination prevention are crucial aspects of data mining that have captured the attention of researchers and practitioners alike. Despite the growing awareness and the significant efforts made so far, many open issues still deserve further investigation.

The workshop on Privacy and Discrimination in Data Mining (PDDM) aims to gather researchers and practitioners interested in the privacy and discrimination aspects of data mining, from technical as well as social and legal points of view. We hope to attract the interest of researchers and practitioners in privacy-preserving data mining and data publishing across a wide range of domains, such as web mining, biomedical data mining, spatio-temporal data mining, social network mining, and data mining on the cloud. A special focus will be given this year to the topics of discrimination detection and prevention.

We invite contributions addressing important privacy and discrimination problems that emerge in data mining, papers describing practical applications of privacy-preserving technologies, survey papers that critically analyze the current state of the art, and position papers that raise awareness and provide guidance towards open problems.

Contact us at: pddm2016@eurecat.org