The investigation showed that the algorithm, built for Rotterdam by the consultancy Accenture, discriminated on the basis of ethnicity and gender. Most impressively, it demonstrated in exacting detail how and why the algorithm behaved the way it did. (Congrats to the Lighthouse/Wired team, including Dhruv Mehrotra, who readers may recall helped us investigate crime prediction algorithms in 2021.) Cities around the world, and quite a few U.S. states, are using similar algorithms built by private companies to flag citizens for benefits fraud. Yet we know very little about how they work, and not for lack of trying.