Racial Justice Series: Algorithmic Justice Recap
By Andy Budzinski
On May 24, 2023, panelists discussed the intersection of algorithms, machine learning, and discrimination in law and government. Panelists included:
- Enid Zhou (she/her), Senior Counsel at Electronic Privacy Information Center (“EPIC”)
- Alex Ault (he/him), Policy Counsel at Lawyers’ Committee for Civil Rights Under Law’s Digital Justice Initiative
- Kevin De Liban (he/him), Director of Advocacy at Legal Aid of Arkansas
- Co-moderators Sebastien Monzon Rueda & Bardia Bastin
As Enid Zhou explained, algorithms are rules-based code that replace human decision-making, using mathematical formulas to marshal data toward a specific outcome. While algorithms are commonly associated with the private sector – how banks determine who gets a loan, or how a streaming service decides which shows to recommend to a viewer – Ms. Zhou explained how widespread algorithms have become in the public sector as well. Public agencies can use an algorithm to determine who gets priority for certain government benefits; to assign children to schools in a particular region; to tell police which areas to patrol and whom to target; and more.
Ms. Zhou explained that these tools can reinforce bias and discrimination because they are built by people who harbor biases of their own. The data algorithms rely on — what they treat as “normal” — often reflects discrimination, bias, and inequality, which the algorithm then treats as “the right outcome.” Ms. Zhou noted the importance of giving people access to the data on which decisions are based, so they can see how algorithms are affecting their lives and communities and can challenge those decisions.
Alex Ault spoke about laws that attempt to push back on how algorithms are used, and specifically about the Stop Discrimination by Algorithms Act (SDAA) currently under consideration in the DC Council. The SDAA bans algorithmic discrimination on the basis of protected traits in “important life opportunities,” such as credit, education, employment, and housing. It also requires companies to audit algorithms for discriminatory patterns in their results, report that data to the Office of the Attorney General, and disclose in plain language when and how an algorithm has denied an important life opportunity to a D.C. resident. Importantly, as Mr. Ault noted, the bill authorizes the Attorney General to enforce the law and creates a private civil claim. The bill improves public notice and oversight of private entities using algorithms. It is not clear whether it applies to government agencies, but it is a step in the right direction.
Kevin De Liban described his work at Legal Aid of Arkansas challenging Arkansas’s use of algorithms to cut home healthcare benefits. In 2014, the state began using an algorithm to adjust how many hours of care would be covered for recipients. Mr. De Liban’s clients began reporting dramatic cuts in their benefits, with real human consequences. Mr. De Liban’s office successfully challenged the use of the algorithm under the Due Process Clause, proving that it was poorly coded and relied on inaccurate information about the level of care required for serious chronic conditions. Mr. De Liban’s account reinforced the need for litigation to push back on unjust algorithmic outcomes, and to change the incentive structure that allowed such an outcome in the first place.
The panelists reminded us that, in many cases, the public cannot tell how algorithms make decisions or what data they use to do so. Private citizens’ data is being collected in vast quantities. It is important to limit how easily that data can be collected, but just as important to shed light on how algorithms are using that data to make biased, flawed, and unjust decisions.
Andy Budzinski is a Board Member of Washington Council of Lawyers and is a Co-Chair of the Communications Committee.