WASHINGTON, D.C. — Islamic State gunmen killed American college student Nohemi Gonzalez as she sat with friends in a Paris bistro in 2015, one of several attacks on a Friday night in the French capital that left 130 people dead.
Her family’s lawsuit claiming YouTube’s recommendations helped the Islamic State group’s recruitment is at the center of a closely watched Supreme Court case argued Tuesday about how broadly a law written in 1996 shields tech companies from liability. The law, known as Section 230 of the Communications Decency Act, is credited with helping create today’s internet.
A related case, set for arguments Wednesday, involves a terrorist attack at a nightclub in Istanbul, Turkey, in 2017 that killed 39 people and prompted a suit against Twitter, Facebook and Google, which owns YouTube.
The tech industry is facing criticism from the left for not doing enough to remove harmful content from the internet and from the right for censoring conservative speech. Now, the high court is poised to take its first hard look at online legal protections.
A win for Gonzalez’s family could wreak havoc on the internet, say Google and its many allies. Yelp, Reddit, Microsoft, Craigslist, Twitter and Facebook are among the companies warning that searches for jobs, restaurants and merchandise could be restricted if platforms had to worry about being sued over the recommendations they provide and their users want.
“Section 230 underpins a lot of aspects of the open internet,” said Neal Mohan, who was just named senior vice president and head of YouTube.
Gonzalez’s family, partially backed by the Biden administration, argues that lower courts’ industry-friendly interpretation of the law has made it too difficult to hold Big Tech companies accountable. Freed from the prospect of being sued, companies have no incentive to act responsibly, critics say.
They are urging the court to say that companies can be sued in some instances.
Beatriz Gonzalez, Nohemi’s mother, said she barely uses the internet, but hopes the case will make it harder for extremist groups to use social media.
“I don’t know much about social media or these ISIS organizations. I don’t know nothing about politics. But what I know is that my daughter is not going to vanish just like that,” Gonzalez said in an interview with The Associated Press from her home in Roswell, New Mexico.
Her daughter was a 23-year-old senior at California State University, Long Beach, who was spending a semester in Paris studying industrial design. Her last communication with her mother was a mundane exchange about money via Facebook, two days before the attacks, Gonzalez said.
The legal arguments have nothing to do with what happened in Paris. Instead, they turn on the reading of a law that was enacted “at the dawn of the dot-com era,” as Justice Clarence Thomas, a critic of broad legal immunity, wrote in 2020.
When the law was passed, 5 million people used AOL, then a leading online service provider, Tom Wheeler, the former chairman of the Federal Communications Commission, recalled at a recent conference at Harvard’s Kennedy School of Government. Facebook has 3 billion users today, Wheeler said.
The law was drafted in response to a state court decision that held an internet company could be liable for a post by one of its users in an online forum. The law’s basic purpose was “to protect Internet platforms’ ability to publish and present user-generated content in real time, and to encourage them to screen and remove illegal or offensive content,” its authors, Sen. Ron Wyden, D-Ore., and former Rep. Christopher Cox, R-Calif., wrote in a Supreme Court filing.
Groups supporting the Gonzalez family say companies have not done nearly enough to control content in the areas of child sexual abuse, revenge porn and terrorism, especially in curbing computer algorithms’ recommendation of that content to users. They also say that courts have read the law too broadly.
“Congress never could have anticipated when it passed Section 230 that the internet would develop in the ways it has and that it would be used by terrorists in the ways it has,” said Mary McCord, a former Justice Department official who authored a brief on behalf of former national security officials.
Mohan said YouTube is able to keep people from seeing almost anything that violates the company’s rules, including violent, extremist content. Just 1 video in 1,000 makes it past the company’s screeners, he said.
Recommendations have emerged as the focus of the Supreme Court case. Google and its supporters argue that even a narrow ruling for the family would have far-reaching effects.
“Recommendation algorithms are what make it possible to find the needles in humanity’s largest haystack,” Kent Walker and Google’s other lawyers wrote in their main brief to the Supreme Court.
“If we undo Section 230, that would break a lot of the internet tools,” Walker said in an interview.
Some sites might take down a lot of legitimate content in a display of excessive caution. Emerging forces and marginalized communities are most likely to suffer from such a heavy hand, said Daphne Keller of the Stanford Cyber Policy Center, who joined with the American Civil Liberties Union in support of Google.
The justices’ own views on the issue are largely unknown, except for Thomas’.
He suggested in 2020 that limiting the companies’ immunity would not devastate them.
“Paring back the sweeping immunity courts have read into Section 230 would not necessarily render defendants liable for online misconduct. It simply would give plaintiffs a chance to raise their claims in the first place. Plaintiffs still must prove the merits of their cases, and some claims will undoubtedly fail,” Thomas wrote.