Thousands of Facebook Groups buzzed with calls for violence ahead of U.S. election
November 07, 2020
By Katie Paul
SAN FRANCISCO (Reuters) - Before Facebook
Inc shut down a rapidly growing "Stop the Steal" Facebook Group on
Thursday, the forum featured calls for members to ready their weapons
should President Donald Trump lose his bid to remain in the White House.
In disabling the group after coverage by Reuters and other news
organizations, Facebook cited the forum's efforts to delegitimize the
election process and "worrying calls for violence from some members."
Such rhetoric was not uncommon in the run-up to the election in Facebook
Groups, a key booster of engagement for the world's biggest social
network, but it did not always get the same treatment.
A survey of U.S.-based Facebook Groups between September and October
conducted by digital intelligence firm CounterAction at the request of
Reuters found rhetoric with violent overtones in thousands of
politically oriented public groups with millions of members.
Variations of twenty phrases that could be associated with calls for
violence, such as "lock and load" and "we need a civil war," appeared
along with references to election outcomes in about 41,000 instances in
U.S.-based public Facebook Groups over the two-month period.
Other phrases, like "shoot them" and "kill them all," were used within
public groups at least 7,345 times and 1,415 times respectively,
according to CounterAction. "Hang him" appeared 8,132 times. "Time to
start shooting, folks," read one comment.
Facebook said it was reviewing CounterAction's findings, which Reuters
shared with the company, and would take action to enforce policies "that
reduce real-world harm and civil unrest, including in Groups," according
to a statement provided by spokeswoman Dani Lever.
The company declined to say whether examples shared by Reuters violated
its rules or say where it draws the line in deciding whether a phrase
"incites or facilitates serious violence," which, according to its
policies, is grounds for removal.
Prosecutors have linked several disrupted militia plots back to Facebook
Groups this year, including a planned attack on Black Lives Matter
protesters in Las Vegas and a scheme to kidnap the governor of Michigan.
To address concerns, Facebook has announced a flurry of policy changes
since the summer aimed at curbing "militarized social movements," including
U.S. militias, Boogaloo networks and the QAnon conspiracy movement.
It says it has removed 14,200 groups on the basis of those changes since
August.
3D printed ballot boxes are seen in front of a displayed Facebook logo in this illustration taken November 4, 2020. REUTERS/Dado Ruvic/Illustration/File Photo
As pressure on the company intensified ahead of the election, Chief
Executive Mark Zuckerberg said Facebook would pause recommendations for
political groups and new groups, although that measure did not prevent
the "Stop the Steal" group from swelling to more than 365,000 members in
less than 24 hours.
'MEANINGFUL CONNECTIONS'
Facebook has promoted Groups aggressively since Zuckerberg made them a
strategic priority in 2017, saying they would encourage more "meaningful
connections," and this year featured the business in a Super Bowl
commercial.
It stepped up Groups promotion in news feeds and search engine
results last month, even as civil rights organizations warned the
product had become a breeding ground for extremism and
misinformation.
The public groups can be seen, searched and joined by anyone on
Facebook. Groups also offer private options that conceal posts - or
the existence of the forum - even when a group has hundreds of
thousands of members.
Facebook has said it relies heavily on artificial intelligence to
monitor the forums and flag posts that may incite violence for human
content reviewers, especially in private groups, which yield few user
reports of bad behavior because members tend to be like-minded.
While use of violent language does not always equate to an
actionable threat, Matthew Hindman, a machine learning and media
scholar at George Washington University who reviewed the results,
said Facebook's artificial intelligence should have been able to
pick out common terms for review.
"If you're still finding thousands of cases of 'shoot them' and 'get
a rope,' you're looking at a systemic problem. There's no way a
modern machine learning system would miss something like that," he
said.
(Reporting by Katie Paul; Editing by Greg Mitchell and Edward Tobin)
© 2020 Thomson Reuters. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.