As U.S. Supreme Court weighs YouTube's algorithms, 'litigation
minefield' looms
February 17, 2023
By Andrew Chung
WASHINGTON (Reuters) - In 2021, a California state court threw out a
feminist blogger's lawsuit accusing Twitter Inc of unlawfully barring as
"hateful conduct" posts criticizing transgender people. In 2022, a
federal court in California tossed a lawsuit by LGBT plaintiffs accusing
YouTube, part of Alphabet Inc, of restricting content posted by gay and
transgender people.
These lawsuits were among many scuttled by a powerful form of immunity
enshrined in U.S. law that covers internet companies. Section 230 of the
Communications Decency Act of 1996 frees platforms from legal
responsibility for content posted online by their users.
In a major case to be argued at the U.S. Supreme Court on Tuesday, the
nine justices will address the scope of Section 230 for the first time.
A ruling weakening it could expose internet companies to litigation from
every direction, legal experts said.
"There's going to be more lawsuits than there are atoms in the
universe," law professor Eric Goldman of the University of Santa Clara
Law School's High Tech Law Institute said.
The justices will hear arguments in an appeal by the family of Nohemi
Gonzalez, a 23-year-old woman from California shot dead during a 2015
rampage by Islamist militants in Paris, of a lower court ruling that,
citing Section 230, dismissed their lawsuit seeking monetary damages
from YouTube's owner Google LLC. Google and YouTube are part of
Alphabet.
The family claimed that YouTube, through its computer algorithms,
unlawfully recommended videos by the Islamic State militant group, which
claimed responsibility for the attacks, to certain users.
A ruling against the company could create a "litigation minefield,"
Google told the justices in a brief. Such a decision could alter how the
internet works, making it less useful, undermining free speech and
hurting the economy, according to the company and its supporters.
It could threaten services as varied as search engines, job listings,
product reviews and displays of relevant news, songs or entertainment,
they added.
Section 230 protects "interactive computer services" by ensuring they
cannot be treated as the "publisher or speaker" of information provided
by users. Legal experts note that companies could employ other legal
defenses if Section 230 protections are curbed.
Calls have come from across the ideological and political spectrum -
including Democratic President Joe Biden and his Republican predecessor
Donald Trump - for a rethink of Section 230 to ensure that companies can
be held accountable. Biden's administration urged the justices to revive
the Gonzalez family's lawsuit.
'GET OUT OF JAIL FREE'
Civil rights, gun control and other groups have told the justices that
platforms are amplifying extremism and hate speech. Republican lawmakers
have said platforms stifle conservative viewpoints. A coalition of 26
states said that social media firms "do not just publish" user content
anymore; they "actively exploit it."
Silhouettes of mobile device users are seen next to a screen projection
of the YouTube logo in this picture illustration taken March 28, 2018.
REUTERS/Dado Ruvic/Illustration
"It's a huge 'get out of jail free' card," Michigan State University
law professor Adam Candeub said of Section 230.
Grievances against companies vary. Some have targeted the way
platforms monetize content, place advertisements or moderate content
by removing or not removing certain material.
Legal claims often allege breach of contract, fraudulent business
practices or violations of state anti-discrimination laws, including
based on political views.
"You could have a situation where two sides of a very controversial
issue could be suing a platform," said Scott Wilkens, an attorney at
Columbia University's Knight First Amendment Institute.
Candeub represented Meghan Murphy, the blogger and writer on
feminist issues who sued after Twitter banned her for posts
criticizing transgender women. A California appeals court dismissed
the lawsuit, citing Section 230, because her claims sought to hold
Twitter liable for its decisions about content she herself had
created.
A separate lawsuit by transgender YouTube channel creator Chase Ross
and other plaintiffs accused the video-sharing platform of
unlawfully restricting their content because of their identities
while allowing anti-LGBT slurs to remain. A judge dismissed their
claims, citing Section 230.
ANTI-TERRORISM ACT
Gonzalez, who had been studying in Paris, died when militants fired
on a crowd at a bistro during the rampage that killed 130 people.
The 2016 lawsuit by her mother Beatriz Gonzalez, stepfather Jose
Hernandez and other relatives accused YouTube of providing "material
support" to Islamic State in part by recommending the group's videos
to certain users based on algorithmic predictions about their
interests. The recommendations helped spread Islamic State's message
and recruit jihadist fighters, the lawsuit said.
The lawsuit was brought under the U.S. Anti-Terrorism Act, which
lets Americans recover damages related to "an act of international
terrorism." The San Francisco-based 9th U.S. Circuit Court of
Appeals dismissed it in 2021.
The company has attracted support from various technology
businesses, scholars, legislators, libertarians and rights groups
worried that exposing platforms to liability would force them to
remove content at even the hint of controversy, harming free speech.
The company has defended its practices. Without algorithmic sorting,
it said, "YouTube would play every video ever posted in one infinite
sequence - the world's worst TV channel."
(Reporting by Andrew Chung; Editing by Will Dunham)
© 2023 Thomson Reuters. All rights reserved. This material may not be
published, broadcast, rewritten or redistributed. Thomson Reuters is
solely responsible for this content.