If you are considering a pivot to AI, and want to hear ideas for how to BOTH fulfil a real social need and do great philosophy, or just want to talk through your plans, feel free to reach out to MINT.
Seth Lazar will be traveling to the US in October, giving talks at Princeton, Cornell Tech, and Cornell.
Well done to Jake Stone for passing his Thesis Proposal Review! He’s working on ‘A Theory of Justice for Algorithmic Systems’, and it’s going to smash. He did a great job crossing this important hurdle. Well done Jake!
Collection of FAccT Keynotes and Panels
Panel on Algorithmic Governance of the Public Sphere at FAccT, Seoul 2022
This panel discussion, curated and chaired by Seth Lazar, featured insights from Frederik Zuiderveen Borgesius, Min Kyung Lee, Wilneida Negrón, and Rida Qadri. Watch the whole session here:
Welcome to FAccT video by the General Chairs
ACM FAccT is happening June 21-24, and MINTies and friends have played a big role in bringing it together. All now just hoping that neither covid nor WWIII gets in the way…
Seth will give a keynote lecture at the 10th Oxford Studies in Political Philosophy Workshop, in Tucson, Arizona, in October 2022.
A new draft is ready on legitimacy, authority, and the political value of explanations, due to be my keynote for the Oxford Studies in Political Philosophy workshop in Tucson, October 2022.
We argue that, as well as the more obvious concerns about the downstream effects of ML-based decision-making, there can be moral grounds for criticising these models’ predictions themselves. We introduce and defend a theory of predictive justice, according to which differential model performance for systematically disadvantaged groups can be grounds for moral criticism of the model, independently of its downstream effects. As well as helping resolve some urgent disputes around algorithmic fairness, this theory points the way to a novel dimension of epistemic ethics, related to the recently discussed category of doxastic wrongs.