Facebook failed to remove reported extremist posts - The Times
[April 13, 2017]
LONDON
(Reuters) - Facebook failed to remove dozens of instances of extremist
content and child pornography even after the social network's moderators
were directly informed of the potentially illegal content, an
investigation by The Times showed on Thursday.
Using a fake profile set up last month, a Times journalist found images
and videos glorifying Islamic State and recent deadly attacks in London
and Egypt, along with graphic images of child abuse, and asked site
moderators to remove them.
Facebook moderators removed some of the reported images but left
untouched pro-jihadist posts praising recent attacks and calling for new
ones. The company appeared to take action only after The Times
identified itself as reporting a story on the matter.
Failure to remove content which is illegal under British law after
company officials have been notified of its existence could expose
Facebook to criminal prosecution for its role in encouraging the
publication and distribution of such imagery.
The social media giant faces new laws in countries around the world to
force it to move faster to combat illegal content but it has struggled
to keep pace as illicit posts can reappear as fast as they are
identified and taken down.
A picture illustration shows a Facebook logo reflected in a person's
eye, in Zenica, March 13, 2015. REUTERS/Dado Ruvic
A Facebook spokesman said the company had now removed all the images
identified by the Times as potentially illegal, acknowledging that they
"violate our policies and have no place on Facebook".
"We are sorry that this occurred," Facebook Vice President of Operations
Justin Osofsky said in a statement. "It is clear that we can do better,
and we'll continue to work hard to live up to the high standards people
rightly expect of Facebook."
A spokesman for London's Metropolitan Police called for individuals to
report extremist content to the force via an online form. The police
declined to comment on whether they were investigating whether Facebook
had failed to act when notified of the illegal content.
"Where material breaches UK terrorism laws, the Counter Terrorism
Internet Referral Unit (CTIRU) will, where possible, seek the removal of
the content by working with the relevant internet hosting company," the
spokesman said.
(Reporting By Eric Auchard; editing by Stephen Addison)
© 2017 Thomson Reuters. All rights reserved. This material may not be
published, broadcast, rewritten or redistributed.