Facebook’s dilemma: How to police claims about unproven COVID-19
vaccines
August 07, 2020
By Elizabeth Culliford and Gabriella Borter
LONDON/NEW YORK (Reuters) - Since the World
Health Organization declared the novel coronavirus an international
health emergency in January, Facebook Inc <FB.O> has removed more than 7
million pieces of content with false claims about the virus that could
pose an immediate health risk to people who believe them.
The social media giant, which has long been under fire from lawmakers
over how it handles misinformation on its platforms, said it had in
recent months banned such claims as 'social distancing does not work'
because they pose a risk of 'imminent' harm. Under these rules, Facebook
took down a video post on Wednesday by U.S. President Donald Trump in
which he claimed that children are "almost immune" to COVID-19.
But in most instances, Facebook does not remove misinformation about the
new COVID-19 vaccines that are still under development, according to the
company's vaccine policy lead Jason Hirsch, on the grounds that such
claims do not meet its imminent harm threshold. Hirsch told Reuters the
company is "grappling" with the dilemma of how to police claims about
new vaccines that are as yet unproven.
"There's a ceiling to how much we can do until the facts on the ground
become more concrete," Hirsch said in an interview with Reuters, talking
publicly for the first time about how the company is trying to approach
the coronavirus vaccine issue.
Tom Phillips, editor at Full Fact, one of Facebook's fact-checking partners, sees the conundrum this way: "How do you fact check about a
vaccine that does not exist yet?"
For now, misinformation ranging from unfounded claims to complex
conspiracy theories about the vaccines under development is proliferating on
a platform with more than 2.6 billion monthly active users, a review of
posts by Reuters, Facebook fact-checkers and other researchers found.
The worry, public health experts told Reuters, is that the spread of
misinformation on social media could discourage people from eventually
taking the vaccine, seen as the best chance to stem a pandemic that has
infected millions and killed hundreds of thousands worldwide, including
158,000 people in the United States alone.
At the same time, free speech advocates fret about increased censorship
during a time of uncertainty and the lasting repercussions long after
the virus is tamed.
Drawing the line between true and false is also more complex for the new
COVID-19 vaccines, fact-checkers told Reuters, than with content about
vaccines with an established safety record.
Facebook representatives said the company has been consulting with about
50 experts in public health, vaccines, and free expression on how to
shape its response to claims about the new COVID-19 vaccines.
Even though the first vaccines aren’t expected to go to market for
months, polls show that many Americans are already concerned about
taking a new COVID-19 vaccine, which is being developed at a record
pace. Some 28% of Americans say they are not interested in getting the
vaccine, according to a Reuters/Ipsos poll conducted between July 15-21.
Among them, more than 50% said they were nervous about the speed of
development. More than a third said they did not trust the people behind
the vaccine's development.
The U.K.-based non-profit Center for Countering Digital Hate reported in
July that anti-vaccination content is flourishing on social media sites.
Facebook groups and pages accounted for more than half of the total
anti-vaccine following across all the social media platforms studied by
the CCDH.
One public Facebook group called "REFUSE CORONA V@X AND SCREW BILL
GATES," referring to the billionaire whose foundation is helping to fund
the development of vaccines, was started in April by Michael Schneider,
a 42-year-old city contractor in Waukesha, Wisconsin. The group grew to
14,000 members in under four months. It was one of more than a dozen
groups created in recent months dedicated to opposing the
COVID-19 vaccine and the idea that it might be mandated by governments,
Reuters found.
Schneider told Reuters he is suspicious of the COVID-19 vaccine because
he thinks it is being developed too fast to be safe. "I think a lot of
people are freaking out," he said.
Posts about the COVID-19 vaccine that have been labeled on Facebook as
containing "false information" but not removed include one by Schneider
linking to a YouTube video that claimed the COVID-19 vaccine will alter
people’s DNA, and a post that claimed the vaccine would give people
coronavirus. (See Reuters fact-check: https://reut.rs/30t1toW)
Facebook said that these posts did not violate its policies related to
imminent harm. "If we simply removed all conspiracy theories and hoaxes,
they would exist elsewhere on the internet and broader social media
ecosystem. This helps give more context when these hoaxes appear
elsewhere," a spokeswoman said.
Facebook does not label or remove posts or ads that express opposition
to vaccines if they do not contain false claims. Hirsch said Facebook
believes users should be able to express such personal views and that
more aggressive censorship of anti-vaccine views could also push people
hesitant about vaccines towards the anti-vaccine camp.
[Photo: A 3D printed Facebook logo is seen in front of displayed coronavirus disease (COVID-19) words in this illustration taken March 24, 2020. REUTERS/Dado Ruvic/Illustration]
‘IT’S KIND OF ON STEROIDS’
At the crux of Facebook’s decisions over what it removes are two
considerations, Hirsch said. If a post is identified as simply
containing false information, it will be labeled and Facebook can reduce
its reach by limiting how many people are shown the post. For
example, it took this approach with the video Schneider posted
suggesting the COVID-19 vaccine could alter people’s DNA.
If the false information is likely to cause imminent harm, then it
will be removed altogether. Last month, under these rules, the
company removed a video touting hydroxychloroquine as a coronavirus
cure – though only after it racked up millions of views.
In March 2019, Facebook said it would start reducing the rankings
and search recommendations of groups and pages spreading
misinformation about any vaccines. Facebook’s algorithms also lift
up links to organizations like the WHO when people search for
vaccine information on the platform.
Some public health experts want Facebook to lower its removal
standards when considering false claims about the future COVID-19
vaccines. "I think there is a duty (by) platforms like that to
ensure that they are removing anything that could lead to harm,”
said Rupali Limaye, a social scientist at the Johns Hopkins
Bloomberg School of Public Health, who has been in talks with
Facebook. "Because it is such a deadly virus, I think it shouldn’t
just have to be 'imminent.'"
But Jacob Mchangama, the executive director of Copenhagen-based
think tank Justitia who was consulted by Facebook about its vaccine
approach, fears the fallout from mass deletions: "This may have
long-term consequences for free speech when this virus is hopefully
contained," he said.
Misinformation about other vaccines has rarely met Facebook's
threshold for risking imminent harm.
However, in Pakistan last year, the company intervened to take down
false claims about the polio vaccine drive that were leading to
violence against health workers. In the Pacific island state of
Samoa, Facebook deleted vaccine misinformation because the low
vaccination rate was exacerbating a dangerous measles outbreak.
“With regard to vaccines, it's not a theoretical line … we do try to
determine when there is likely going to be imminent harm resulting
from misinformation and we try to act in those situations,” Hirsch
told Reuters.
To combat misinformation that doesn’t meet its removal criteria,
Facebook pays outside fact-checkers – including a Reuters unit – who
can rate posts as false and attach an explanation. The company has
said that 95 percent of the time, people who saw fact-checkers'
warning labels did not click through to the content. [https://bit.ly/33z7Jh6]
Still, the fact-checking program has been criticized by some
researchers as an inadequate response to the amount and speed of
viral misinformation on the platforms. Fact-checkers also do not
rate politicians' posts and they do not judge posts that are
exclusively in private or hidden groups.
Determining what constitutes a false claim regarding the COVID-19
shot is much harder than fact-checking a claim about an established
vaccine with a proven safety record, Facebook fact-checkers told
Reuters.
"There is a lot of content that we see and we don't even know what
to do with it," echoed Emmanuel Vincent, founder of Science
Feedback, another Facebook fact-checking partner, who said the
number of vaccines in development made it difficult to debunk claims
about how a shot would work.
In a study published in May in the journal Nature, physicist Neil
Johnson's research group found that there were nearly three times as
many active anti-vaccination groups on Facebook as pro-vaccination
groups during a global measles outbreak from February to October
2019, and that they were growing faster.
Since the study was published, anti-vaccine views and COVID-19
vaccine conspiracies have flourished on the platform, Johnson said,
adding, "It's kind of on steroids."
(Reporting by Elizabeth Culliford and Gabriella Borter, editing by
Ross Colvin and Edward Tobin)
© 2020 Thomson Reuters. All rights reserved. This material may not be published,
broadcast, rewritten or redistributed.
Thomson Reuters is solely responsible for this content.