Hoylman Floats Legislation Holding Tech Companies Accountable for Misinformation & Hate Speech

Social Media/GettyImages

State Sen. Brad Hoylman (D-Manhattan) on Monday announced new legislation (S.7568) to hold social media platforms accountable for knowingly promoting disinformation, violent hate speech, and other unlawful content that could harm others. 

The proposed measure comes the week before the first anniversary of the January 6 insurrection by pro-Trump supporters at the U.S. Capitol, and as vaccine hesitancy continues to fuel the spread of the Omicron variant.

While Section 230 of the Communications Decency Act protects social media platforms from being treated as publishers or speakers of content shared by users on their apps and websites, this legislation instead focuses on the active choices these companies make when implementing algorithms designed to promote the most controversial and harmful content, choices that create a general threat to public health and safety.

Sen. Brad Hoylman

“Social media algorithms are specially programmed to spread disinformation and hate speech at the expense of the public good. The prioritization of this type of content has real-life costs to public health and safety. So when social media platforms push anti-vaccine falsehoods and help domestic terrorists plan a riot at the U.S. Capitol, they must be held accountable. Our new legislation will force social media companies to be held accountable for the dangers they promote,” said Hoylman.

While social media companies claim protection from the legal consequences of their actions by hiding behind Section 230 of the Communications Decency Act, Hoylman argues these platforms are no longer simply hosts for their users’ content.

Many social media companies employ complex algorithms designed to put the most controversial and provocative content in front of users as much as possible. These algorithms drive engagement with their platforms, keep users hooked, and increase profits. Social media companies employing these algorithms are not a passive forum for the exchange of ideas; they are active participants in the conversation, said Hoylman.

Hoylman’s legislation comes as Congress is also looking to rein in social media companies through increased regulation. In October 2021, Frances Haugen, a former Facebook employee, testified to U.S. Senators that the company knew of internal research showing its product was harmful to teenagers but purposefully hid that research from the public. She also testified that the company was willing to use hateful content to keep users on its platform.

Social media amplification has been linked to many societal ills, including vaccine disinformation; encouragement of self-harm, bullying, and body-image issues among youth; and extremist radicalization leading to attacks like the January 6 insurrection at the U.S. Capitol.

Specifically, Hoylman’s proposed legislation would provide a tool for the Attorney General, city corporation counsels, and private citizens to hold social media companies and others accountable when they promote content that they know, or reasonably should know:

  1. Advocates for the use of force, is directed to inciting or producing imminent lawless action, and is likely to produce such action;
  2. Advocates for self-harm, is directed to inciting or producing imminent self-harm, and is likely to incite or produce such action; or
  3. Includes a false statement of fact or fraudulent medical theory that is likely to endanger the safety or health of the public.