Dear Liberation Tech colleagues,

Following up on my September email, a number of recordings and resources related to the conference are online. Two of them had wide participation from the Internet, so thank you to the Internet and the Liberation Tech mailing list for that.
Some of the things available now:

- The Compressed Auditing Algorithms Film Festival - a ~45-minute program of short films and excerpts investigating depictions of algorithms in popular media and education (all videos online)
- The Computer Says No: The Bad News About Online Discrimination in Algorithmic Systems - a recording of a 90-minute interactive roundtable intended for a general audience
- How to Watch Them Watching You: Researching Social Media, Online Platforms, and Algorithmic Systems From the Outside - a recording of a 90-minute interactive roundtable intended for Internet researchers
- Just Google Me* - an alternative business card that you can hand out instead of your own, if you want to
- Algorithms Dérive - an app-based, self-guided, collaboratively produced tour intended to promote reflection about algorithms
- The Top 10 Kinds of Fairness - a 20-minute talk promoting new research directions for the FAT (Fairness, Accountability, Transparency) technical community. It argues for an expansive definition of fairness that goes beyond "statistical" fairness or "comparative" fairness, and for automated monitoring for fairness as a research problem.

All are at http://auditingalgorithms.science/ with more things on the way.

I hope this is helpful,
Christian

---------- Forwarded message ----------
From: Christian Sandvig <csand...@umich.edu>
Date: Tue, Sep 26, 2017 at 11:56 PM
Subject: Mark Your Calendar: Live-streams of two roundtables Thu/Fri
To: Liberation Technologies <liberationtech@lists.stanford.edu>

Hello liberationtech colleagues,

We'll be hosting a workshop with some possibly familiar people later this week. I'm writing to this list because we will be live-streaming two of our public events on YouTube, and I am wondering if you would like to "attend." We will be taking questions from the Internet via Twitter and the BKC question tool. Even if you are not in the right timezone, I hope this will be of interest.
I dare Europe to stay awake for the first one, and Australians to stay awake for the second one. Please forward this email as appropriate.

Live-Streamed Events:

THE COMPUTER SAYS NO: The Bad News About Online Discrimination in Algorithmic Systems
Thursday, September 28, 2017
4:00-5:30 p.m. Eastern Daylight Time (UTC/GMT -4 hours)
http://auditingalgorithms.science/?p=53

HOW TO WATCH THEM WATCHING YOU: Researching Social Media, Online Platforms, and Algorithmic Systems From the Outside
Friday, September 29, 2017
10:00-11:30 a.m. Eastern Daylight Time (UTC/GMT -4 hours)
http://auditingalgorithms.science/?p=64

The overall occasion is a workshop entitled "Auditing Algorithms: Adding Accountability to Automated Authority," explained here: http://auditingalgorithms.science/ Its goal is a white paper about this area of research, which I expect to email this list about again in the future.

We are also working on some weirder participatory online activities -- if you like these topics, you'll want to jump in on them. More on that after these events.

For more context, here is a related quote:

"The equations of big-data algorithms have permeated almost every aspect of our lives. A massive industry has grown up to comb and combine huge data sets — documenting, for example, Internet habits — to generate profiles of individuals. These often target advertising, but also inform decisions on credit, insurance and more. They help to control the news or adverts we see, and whether we get hired or fired. They can determine whether surveillance and law-enforcement agencies flag us as likely activists or dissidents — or potential security or criminal threats…. Largely absent from the widespread use of such algorithms are the rules and safeguards that govern almost every other aspect of life in a democracy. There is an asymmetry in algorithmic power and accountability… Fortunately, a strong movement for greater algorithmic accountability is now under way.
Researchers hope to find ways to audit for bias…. Society needs to discuss in earnest how to rid software and machines of human bugs." –Unsigned Editorial, Nature (2016)

Hope you'll be "there",
Christian (on behalf of the co-organizers)

--
http://www-personal.umich.edu/~csandvig/
--
Liberationtech is public & archives are searchable on Google. Violations of list guidelines will get you moderated: https://mailman.stanford.edu/mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change password by emailing the moderator at zakwh...@stanford.edu.