Washington, D.C. – U.S. Senators Cory Booker (D-NJ), Ron Wyden (D-OR), Sherrod Brown (D-OH), Edward Markey (D-MA), and Kamala Harris (D-CA), and U.S. Representatives Yvette Clarke (D-NY), Ayanna Pressley (D-MA), and Rashida Tlaib (D-MI), today asked the U.S. Department of Housing and Urban Development (HUD) to review policies regarding the use of facial recognition technologies in federally assisted housing.
In a letter to HUD Secretary Ben Carson, the lawmakers pointed out the threats that facial recognition technology poses to marginalized communities, warning that it opens the door to unchecked government surveillance.
“[HUD] is responsible for creating and ensuring discrimination-free practices in all communities,” the lawmakers wrote. “However, as numerous civil rights experts have pointed out, when public housing and federally assisted property owners install facial recognition security camera systems, they could be used to enable invasive, unnecessary and harmful government surveillance of their residents. Those who cannot afford more do not deserve less in basic privacy and protections. They should not have to compromise their civil rights and liberties nor accept the condition of indiscriminate, sweeping government surveillance to find an affordable place to live.”
“These false and biased judgments can exacerbate the vulnerabilities that marginalized groups already face in life, such as the overcriminalization of people of color and transgender individuals. Potential sharing of this data, particularly with law enforcement, further heightens concerns about the risk this technology poses to vulnerable communities,” the lawmakers continued.
The lawmakers requested a response from HUD by January 24, 2020. The letter to HUD can be found here.
Background on Booker's record fighting algorithmic bias:
Booker was one of the first lawmakers to call for increased antitrust scrutiny of major tech companies, and he has consistently fought to ensure that extant discrimination and bias in our society does not become augmented and automated in the future.
Earlier this month, Booker and Wyden sent letters to the Federal Trade Commission (FTC), the Centers for Medicare and Medicaid Services (CMS), and five of the largest health care companies in the nation following disturbing revelations of flawed algorithms that were impacting health care needs for black patients.
Booker recently introduced the No Biometric Barriers to Housing Act, which prohibits the use of facial recognition and other biometric identification technology in public housing units, a growing trend that poses significant privacy risks for vulnerable communities.
In October, Booker submitted public comments opposing a proposed HUD rule that would allow housing providers to avoid liability for algorithmic discrimination if they use algorithmic tools developed by a third party. With this proposed rule, HUD, the nation’s main civil rights enforcer for housing discrimination, effectively provides a road map for landlords and lenders who wish to discriminate by algorithm.
Booker and Wyden authored the Algorithmic Accountability Act, which requires companies to fix discriminatory algorithms and outlines methods the federal government can use to mitigate the impacts of such algorithms.
Last year, Booker secured a commitment from Facebook CEO Mark Zuckerberg -- for the first time -- to conduct a civil rights audit that, among other items, would address the algorithmic biases on its platforms. Also last year, he secured language in the Federal Aviation Administration Authorization Act requiring TSA to report to Congress on methods to eliminate bias on race, gender, and age as it begins deploying facial recognition on domestic flights.
Booker also partnered with Senator Wyden in July 2018 to survey 39 federal law enforcement agencies about their use of facial recognition technology and what policies, if any, they have put in place to prevent abuse, misuse, and discrimination. That same month, he also joined Senators in requesting that the Government Accountability Office study commercial and government uses of facial recognition technology and examine what safeguards companies and law enforcement have implemented to prevent misuse and discrimination based on race, gender, and age.