WASHINGTON, D.C. -- U.S. Senators Cory Booker (D-NJ) and Ron Wyden (D-OR), along with Rep. Yvette D. Clarke (D-NY), today introduced the Algorithmic Accountability Act, which would require companies to study and fix flawed computer algorithms that result in inaccurate, unfair, biased or discriminatory decisions impacting Americans.

"50 years ago my parents encountered a practice called 'real estate steering' where black couples were steered away from certain neighborhoods in New Jersey. With the help of local advocates and the backing of federal legislation they prevailed. However, the discrimination that my family faced in 1969 can be significantly harder to detect in 2019: houses that you never know are for sale, job opportunities that never present themselves, and financing that you never become aware of -- all due to biased algorithms," Booker said. "This bill requires companies to regularly evaluate their tools for accuracy, fairness, bias, and discrimination. It's a key step toward ensuring more accountability from the entities using software to make decisions that can change lives."

"Computers are increasingly involved in the most important decisions affecting Americans' lives -- whether or not someone can buy a home, get a job or even go to jail. But instead of eliminating bias, too often these algorithms depend on biased assumptions or data that can actually reinforce discrimination against women and people of color," Wyden said. "Our bill requires companies to study the algorithms they use, identify bias in these systems and fix any discrimination or bias they find."

"Algorithms shouldn't have an exemption from our anti-discrimination laws. Our bill recognizes that algorithms have authors, and without diligent oversight, they can reflect the biases of those behind the keyboard," Clarke said. "By requiring large companies to not turn a blind eye towards unintended impacts of their automated systems, the Algorithmic Accountability Act ensures 21st Century technologies are tools of empowerment, rather than marginalization, while also bolstering the security and privacy of all consumers."

The bill follows numerous reports of computer algorithms producing biased and discriminatory results. Earlier this month, the Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by allowing advertisers to discriminate based on race, religion and disability status. Last year, Reuters reported that Amazon had shuttered an automated recruiting tool that was biased against women.

The Algorithmic Accountability Act would:

Authorize the Federal Trade Commission (FTC) to create regulations requiring companies under its jurisdiction to conduct impact assessments of highly sensitive automated decision systems. This requirement would apply both to new and existing systems.

Require companies to assess their use of automated decision systems, including training data, for impacts on accuracy, fairness, bias, discrimination, privacy and security.

Require companies to evaluate how their information systems protect the privacy and security of consumers' personal information.

Require companies to correct any issues they discover during the impact assessments.

The bill is endorsed by tech and civil rights groups, including Data for Black Lives, the Center on Privacy and Technology at Georgetown Law and the National Hispanic Media Coalition.

"At Data for Black Lives we are building a movement of organizers, scientists and technologists using data science to create concrete and measurable change in the lives of Black people. The Algorithmic Accountability Act of 2019 helps advance this work by putting accountability and public education front and center. We know first-hand the harmful impact that automated decision systems have on parents fighting for access to quality education, black mothers engaging health systems in how to provide care that protects their newborns, and activists fighting against community disinvestment and deprivation. Beyond regulation, we are hopeful that this legislation will lead to a broader discussion about the tremendous potential for data systems, if used ethically, to uplift, empower, and democratize our communities." - Data for Black Lives

"Automated decisions are not neutral decisions. They turn on human data. As long as humans are biased, algorithms will be biased too. This bill will force companies to reckon with that reality." - Center on Privacy & Technology at Georgetown Law

"Algorithms control various aspects of a digital economy. They determine which candidates will be interviewed and how much they will be paid; who will be targeted for or excluded from advertisements; and how much consumers will pay for goods and shipping online. Still, the public is largely in the dark about how personal data is collected, manipulated, shared, and stored. Especially for Latinos and other marginalized communities who have been frequent subjects of data harms, the Algorithmic Accountability Act of 2019 injects much-needed transparency and accountability into data frameworks that codify unconscious bias and institutional discrimination into code." - Francella Ochillo, VP of Policy and General Counsel, National Hispanic Media Coalition

The full bill text is available here.

Background on Booker's record:

Booker was one of the first lawmakers to call for increased antitrust scrutiny of major tech companies and he has consistently fought to ensure that extant discrimination and bias in our society does not become augmented and automated in the future.

Last year, Booker secured a commitment from Facebook CEO Mark Zuckerberg -- for the first time -- to conduct a civil rights audit at Facebook. Among other things, the audit is meant to address algorithmic bias on Facebook's platforms.

In July, Booker partnered with Senator Wyden to survey 39 federal law-enforcement agencies about their use of facial recognition technology and what policies, if any, they have put in place to prevent abuse, misuse and discrimination. That same month, he also joined other senators in requesting that the Government Accountability Office (GAO) study commercial and government uses of facial recognition technology and examine what safeguards companies and law enforcement have implemented to prevent misuse and discrimination based on race, gender and age.

In October 2018, Senator Booker secured language in the FAA Reauthorization Act requiring the TSA to report to Congress on methods to eliminate bias based on race, gender and age as the TSA begins deploying facial recognition technology for domestic flights.