The Workshop on Data and Algorithmic Transparency (DAT'16) is being organized as a forum for academics, industry practitioners, regulators, and policy makers to come together and discuss issues related to the increasing role that "big data" algorithms play in our society. Our goal is to provide a venue for fruitful discussions and high-quality academic research papers focused on increasing understanding and transparency of large-scale data collection and the systems and algorithms that it powers. The workshop is co-located with two other highly related venues: the Data Transparency Lab Conference and the FATML'16 Workshop, and we encourage attendees to consider attending these other events as well.
Updates
- November 3, 2016: All of the DAT'16 papers are now available on the Program Page.
- October 21, 2016: Registration for DAT'16 is now open — more details are available on the Venue Page.
- October 20, 2016: We have posted the schedule for DAT'16!
- October 17, 2016: We're excited to announce that DAT'16 will be hosted at NYU Law School's Lipton Hall, a short subway ride from DTL'16 at Columbia!
- October 7, 2016: We are pleased to announce that DAT'16 accepted 15 papers!
- August 31, 2016: Due to a number of requests, we've decided to push back the paper submission deadline one week; it's now September 16, 2016 at 11:59:59pm Hawaii time.
- August 10, 2016: We're aiming for a very interactive workshop, and are soliciting submissions of both unpublished and previously-published work.
- July 24, 2016: We're thrilled to announce a terrific program committee of 18 researchers!
- June 30, 2016: The webpage for DAT'16 is now live!
Transparency and oversight of the algorithmic world: A new role for computer science research
The pervasiveness of data and algorithmic systems in society has generated a new class of research questions that the public is intensely interested in: Are my smart devices surreptitiously recording audio? Does my search history allow inferring intimate details that I haven’t explicitly searched for? Is the algorithm that decides my loan application biased? Do I see different prices online based on my browsing and purchase history? Are there dangerous instabilities or feedback loops in algorithmic systems ranging from finance to road traffic prediction?
Answering these questions requires empirical investigation of computer systems in the wild, with the goal of bringing transparency to these systems. Computer scientists are uniquely poised to carry out this research. The nascent literature on these topics makes clear that a combination of skills is called for: building systems to support large-scale, automated measurements; instrumenting devices to record and reverse-engineer network traffic; analyzing direct (leakage-based) and indirect (inference-based) privacy vulnerabilities; experimenting on black-box and white-box algorithmic systems; simulating and modeling these systems; machine learning; and crowdsourcing.
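To make one of these methods concrete, here is a minimal sketch of black-box experimentation on an algorithmic system: paired probes that hold every input fixed except one attribute, then compare the outputs for divergence. The `opaque_price` function is a hypothetical stand-in for a real service whose internals the auditor cannot see; in practice the probes would be live requests issued through instrumented clients.

```python
def opaque_price(product_id, user_profile):
    """Hypothetical black box: returns a price that (unknown to the
    auditor) secretly depends on the user's purchase history."""
    base = 100.0
    if user_profile.get("frequent_buyer"):
        base *= 1.10  # hidden personalization
    return round(base, 2)


def audit_personalization(black_box, product_id, profile_a, profile_b, trials=10):
    """Paired black-box probes: query the system with two profiles that
    differ in a single attribute, and report whether the observed
    outputs diverge."""
    prices_a = {black_box(product_id, profile_a) for _ in range(trials)}
    prices_b = {black_box(product_id, profile_b) for _ in range(trials)}
    return prices_a != prices_b


personalized = audit_personalization(
    opaque_price, "sku-123",
    {"frequent_buyer": True},
    {"frequent_buyer": False},
)
```

Real audits of this kind must additionally control for noise (A/B tests, load balancing, caching) by repeating probes and comparing within- versus between-profile variation, which the repeated `trials` gesture at here.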
Computer science research today is largely siloed into disciplinary communities, none of which is well suited to tackle these interdisciplinary challenges. We call for the emergence of a new research community aimed at providing transparency and ethical oversight of digital technologies through empirical measurement and analysis. We envision this research feeding into a broader effort that would include law, policy, enforcement, the press, privacy advocates, and civil-liberties activists.
This new field is complementary to many existing disciplines. It draws techniques from measurement research, but investigates systems “from the outside” and is concerned with societal effects of systems rather than performance characteristics. It is also similar to security research, but the systems being studied do not have specifications of correct behavior. Finally, transparency research informs areas such as privacy-by-design and discrimination-aware data mining in creating systems that respect privacy and minimize bias. (Note that we are co-located with the FATML'16 workshop.)
Investigating computer systems to identify effects of societal concern amounts to holding companies’ feet to the fire, and this will raise new ethical challenges. For example, is it a conflict of interest for a transparency researcher to accept industry funding? Regardless, we are confident that as the new community comes together to solve technical challenges, it will evolve an ethos and a set of norms to navigate these dilemmas as well.