WASHINGTON, D.C. – U.S. Senator Cory Booker (D-NJ) today called on Facebook to enact a number of additional reforms on the heels of its recent announcement that it would undergo a civil rights audit and political bias review. Specifically, Booker pushed Facebook to use its data as a source for good by creating a public data trust and establishing industry-wide ethical frameworks.

“Announcing plans for a civil rights and safety audit is an important and significant step,” Booker wrote in a letter sent today to Facebook Co-Founder and CEO Mark Zuckerberg. “However, the audit is one action among many Facebook should take to provide its users a more inclusive, equitable, and safe platform. I urge Facebook to partner with outside stakeholders to use Facebook’s vast trove of data as a force for positive change.”

“I also strongly suggest that Facebook work with organizers, activists, technologists and data ethicists to establish industry-wide ethical frameworks and implement a Data Code of Ethics,” Booker added.

Booker also called on the tech giant to improve diversity within its ranks, citing unacceptable statistics such as the fact that only 1 percent of the firm’s U.S. technical employees identify as Black.

“If Facebook is truly committed to eliminating harassment and discrimination on its platform and understanding how social media and big data affect underserved communities, then recruiting, hiring, and retaining diverse researchers and data scientists must be an imperative,” Booker wrote.

Today’s letter follows similar questions Booker asked of Zuckerberg during a hearing in the Senate last month. In response to Booker’s questions, Zuckerberg indicated openness to auditing the firm’s practices and policies as they relate to discrimination on the platform.

Full text of the letter can be found below.

Dear Mr. Zuckerberg:

I am encouraged by the announcement that Facebook will facilitate an independent civil rights and safety audit of Facebook’s products and policies in order to respond to the rampant hate speech, discriminatory practices, censorship, and surveillance occurring on the platform. In response to questions I posed to you during last month’s Senate hearing on Facebook’s data malpractice, you stated such an audit was a “very good idea.” I am heartened that this idea is being bolstered with real action.  A thorough, open assessment of Facebook’s operations is a necessary step toward better understanding how Facebook’s products and policies failed to protect vulnerable communities.

Yet, there is more to be done.  Building communities and truly bringing the world closer together will require more than mitigating the harm caused by Facebook’s missteps.  Rather, Facebook must commit to harnessing its data to positively affect those communities. This is not a task that Facebook should endeavor upon on its own.

Therefore, I urge Facebook to partner with outside stakeholders to use Facebook’s vast trove of data as a force for positive change.  Advocates have suggested that Facebook create a public data trust—a clearinghouse where students, community leaders, organizers, scientists and developers can access anonymized Facebook data for research in service of the public interest.[1]  Indeed, there are already notable examples of Facebook using its internal researchers and technology to serve the public good by: proactively detecting suicidal posts before they’re reported[2]; increasing our understanding of housing prices[3]; and helping relief organizations around the world respond to natural disasters.[4]  The possibilities for Facebook’s user data are endless and, accordingly, its use should not be restricted to internal Facebook researchers and scientists.  Partnering with community-led organizations in a data trust could help to scale their work in racial justice, women’s health, affordable housing, violence reduction, and many other fields in ways simply not possible outside of Facebook.

To be sure, even the most well-intentioned research demands oversight and accountability, and Facebook has not always conducted its experiments in ways that were forthcoming and transparent.[5]  Given the sensitivity of the research data and potential for abuse, I also strongly suggest that Facebook work with organizers, activists, technologists and data ethicists to establish industry-wide ethical frameworks and implement a Data Code of Ethics.  Additionally, just as the civil rights and safety audit will subject Facebook’s practices and policies to external review, Facebook’s research efforts should also be rigorously reviewed and analyzed by an outside entity.

Finally, as I mentioned during the hearing, Facebook must improve its diversity numbers. Silicon Valley remains disturbingly behind in matters of inclusion. You recently announced that by the end of the year Facebook will create 5,000 new positions for, among other functions, reviewing content.[6]  However, only 1 percent of Facebook’s U.S. technical employees identify as Black.[7]  If Facebook is truly committed to eliminating harassment and discrimination on its platform and understanding how social media and big data affect underserved communities, then recruiting, hiring, and retaining diverse researchers and data scientists must be an imperative.

Announcing plans for a civil rights and safety audit is an important and significant step.  However, the audit is one action among many Facebook should take to provide its users a more inclusive, equitable, and safe platform.  Facebook’s mission statement is “Give people the power to build community and bring the world closer together.” Giving people the power means exactly that.  I urge you to make your decision to work with outside entities on an audit the genesis of a larger commitment by Facebook to share information and resources with the community that makes Facebook what it is.



[1] https://medium.com/@YESHICAN/an-open-letter-to-facebook-from-the-data-for-black-lives-movement-81e693c6b46c

[2] https://techcrunch.com/2017/11/27/facebook-ai-suicide-prevention/

[3] https://research.fb.com/publications/social-networks-and-housing-markets/

[4] https://research.fb.com/facebook-disaster-maps-methodology/

[5] https://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html

[6] https://www.cnbc.com/2018/03/23/facebook-privacy-scandal-has-a-plus-thousands-of-new-jobs-ai-cant-do.html

[7] Facebook Diversity Update, August 2, 2017, https://fbnewsroomus.files.wordpress.com/2017/08/fb_diversity_2017_final.pdf