The Supreme Court, the highest court in the United States, hears on Tuesday, February 21, 2023, two complaints filed against the tech giants Google and Twitter. The companies are accused of having published or promoted terrorist content via their platforms and thereby contributing to acts of violence. The hearings could challenge a provision of American law that underpins how internet platforms operate.
Magna Carta
A Magna Carta of the internet to its defenders, a charter of impunity to its critics, Section 230 of the Communications Decency Act of 1996 grants US companies legal immunity for content posted by third parties on their platforms and websites.
Under this federal law, a pillar of free expression on the web for more than a quarter of a century, platforms are not publishers: unlike traditional media, they bear no legal responsibility for the content they host.
No provider or user of an online service shall “be treated as the publisher of information provided by another”, stipulates Section 230, which shields them from liability for offenses that others commit on their sites. It also allows providers to restrict access to content.
When the law was enacted, the internet was still in its infancy. The goal was to foster its growth rather than smother it at birth under an avalanche of lawsuits. Today, however, with the rise of search engines and social media platforms, their algorithms stand accused of amplifying hate and violence.
Already in 2018, a US law made it illegal to aid or facilitate sex trafficking and amended Section 230 to strip immunity from sites that profit from such trafficking.
Algorithms
The two complaints filed with the U.S. Supreme Court – Gonzalez v. Google and Twitter v. Taamneh – could change the game. They challenge the immunity that large internet companies have enjoyed under Section 230, raising the question of liability not only for hosting terrorist content but also for recommending it. For the first time, it is not just the published content that is deemed offensive but the algorithm itself, the very functioning of the platform, that is being called into question.
The Supreme Court is examining in particular the complaint filed by the family of Nohemi Gonzalez, an American student killed at the bistro La Belle Équipe during the Paris attacks of November 2015. Her family accuses YouTube, a subsidiary of Google, of having recommended jihadist content to its users through its algorithms.
“YouTube created and implemented these algorithms. By recommending Islamic State videos, Google helped the group spread its message and therefore provided it with material support,” the student’s parents argue. In their view, the use of an algorithm gives Google an editorial role, which the company denies.
Defense of Google and Meta
The tech giants reject this argument. Google has urged the court “not to weaken a central piece of the modern internet”. “Allowing platforms to be sued over their recommendations would expose them to lawsuits over third-party content at every turn,” Meta said.
Defenders of Section 230 warn of the risk of censorship and curbs on free expression at a time when debates over how social networks handle disinformation and hate speech are raging in the United States.
The Supreme Court is due to render its decision by June 30, 2023. While partisan divides between Republicans and Democrats stall any legislative reform, a ruling by the conservative-dominated high court could reshape the internet landscape.