It’s time for Facebook to be honest about its content management

By Sharon Y. Eubanks, Chief Counsel

Published on October 27, 2020

October 27, 2020 – This week, Mark Zuckerberg, Facebook's CEO, returns (virtually) to the Hill to testify before the Senate Committee on Commerce, Science, and Transportation. The hearing, titled "Does Section 230's Sweeping Immunity Enable Big Tech Bad Behavior?", will examine the consequences of Section 230's liability shield. In doing so, the Committee should also question Mr. Zuckerberg about statements he has made regarding terror and hate content on Facebook. The question the Committee should focus on is whether Facebook has been honest about the ways in which it has allowed, and in many circumstances actively encouraged, extremists and hate groups to thrive on its platform.

The National Whistleblower Center is supporting a confidential whistleblower who exposed Facebook's auto-generation of content for terror and hate groups. The government is aware of what Facebook is doing. It's time now for Facebook to come clean about its content management and to stop misleading everyone about what's really happening.

Public companies must accurately describe the material risks to their businesses. Publicly traded companies, including Facebook, are obligated to communicate honestly with shareholders about matters that may materially impact a company's value. Facebook has failed miserably in this regard. For example, Facebook has falsely claimed that hate groups are not allowed on the platform. CEO Mark Zuckerberg represented to Congress in April 2018 that groups whose primary purpose is spreading hate would be banned from Facebook. In fact, pages for many of these hate groups were actually auto-generated by Facebook itself as business pages when someone listed a white supremacist or neo-Nazi organization as an employer. It is clear that Facebook is not merely providing a platform for speech; it is actively contributing to radicalization.

Facebook's role in auto-generating pages for hate and terror groups wholly undermines its claims that it bars such groups and otherwise serves as a content-neutral platform. There is a great divide between Facebook's statements to shareholders and the reality of its handling of hate and terror content.

Facebook has clearly failed to communicate to shareholders the risk of losing its exemption from liability under Section 230 of the Communications Decency Act. Specifically, Facebook faces a clear risk, one not disclosed to its shareholders, that a court, agency, or Congress may determine that its auto-generation of hate and terror group pages, and its failure to remove those pages from its website, strips it of immunity under Section 230. Facebook has also failed to disclose to its shareholders and the public its awareness of the gravity of the problem of hate and terror content on its website and its findings that its current policies are exacerbating that problem. Instead, it has misleadingly assured shareholders and the public that it has a meaningful strategy in place to address the problem.

Terror groups are thriving on Facebook, yet Facebook does little to interfere with their efforts and even goes so far as to generate its own terror content. This is exactly the kind of information that shareholders, potential investors, and the public need to know.
