From Clubhouse to Twitter Spaces, social media grapples with live audio
moderation
February 25, 2021 | By Elizabeth Culliford
(Reuters) - The explosive growth of
Clubhouse, an audio-based social network buoyed by appearances from tech
celebrities like Elon Musk and Mark Zuckerberg, has drawn scrutiny over
how the app will handle problematic content, from hate speech to
harassment and misinformation.
Moderating real-time discussion is a challenge for a crop of platforms
using live voice chat, from video game-centric services like Discord to
Twitter Inc's new live-audio feature Spaces. Facebook is also reportedly
dabbling with an offering.
"Audio presents a fundamentally different set of challenges for
moderation than text-based communication. It's more ephemeral and it's
harder to research and action," said Discord's chief legal officer,
Clint Smith, in an interview.
Tools to detect problematic audio content lag behind those used to
identify text, and transcribing and examining recorded voice chats is a
more cumbersome process for people and machines. A lack of additional cues,
such as the visual signals of video or accompanying text comments, can also
make moderation more challenging.
"Most of what you have in terms of the tools of content moderation are
really built around text," said Daniel Kelley, associate director of the
Anti-Defamation League's Center for Technology and Society.
Not all companies make or keep voice recordings to investigate reports
of rule violations. While Twitter keeps Spaces audio for 30 days, or
longer if there is an incident, Clubhouse says it deletes its recordings
when a live session ends without an immediate user report, and Discord
does not record at all.
Instead, Discord, which has faced pressure to curb toxic content like
harassment and white supremacist material in text and voice chats, gives
users controls to mute or block people and relies on them to flag
problematic audio.
Such community models can be empowering for users but may be easily
abused and subject to biases.
Clubhouse, which has similarly introduced user controls, has drawn
scrutiny over whether actions like blocking, which can prevent users
from joining certain rooms, can be employed to harass or exclude users.
The challenges of moderating live audio are set against the broader,
global battle over content moderation on big social media platforms,
which are criticized for their power and opacity, and have drawn
complaints from both the right and left as either too restrictive or
dangerously permissive.
Online platforms have also long struggled with curbing harmful or
graphic live content on their sites. In 2020, a live video of a suicide
on Facebook Inc spread across multiple sites. In 2019, a shooting at a
German synagogue was live-streamed on Amazon.com Inc-owned gaming site
Twitch.
"It's really important for these services to be learning from the
rollout of video-streaming to understand they will face all of the same
kinds of questions," said Emma Llanso, a member of Twitch's Safety
Advisory Council. She added: "What happens when people want to use your
service to livestream audio of an encounter with police or a violent
attack?"
'UP TO INFINITY'
Last Sunday, during the company's public town hall, Clubhouse co-founder
Paul Davison presented a vision for how the currently invite-only app
would play a bigger role in people's lives - hosting everything from
political rallies to company all-hands meetings.
[Photo: The social audio app Clubhouse is seen on a mobile phone in this
illustration picture taken February 8, 2021. REUTERS/Florence Lo/Illustration]
Rooms, currently capped at 8,000 people, would scale "up to infinity" and
participants could make money from "tips" paid by the audience.
The San Francisco-based company's latest round of financing in January valued it
at $1 billion, according to a source familiar with the matter. The funding was
led by Andreessen Horowitz, a leading Silicon Valley venture capital firm.
Asked how Clubhouse was working to detect dangerous content as the service
expanded, Davison said the tiny startup has been staffing up its trust and
safety team to handle issues in multiple languages and quickly investigate
incidents.
The app, which said it has 10 million weekly active users, has a full-time staff
that only recently reached double digits. A spokeswoman said it uses both
in-house reviewers and third-party services to moderate content and has engaged
advisors on the issue, but would not comment on review or detection methods.
In the year since it started, Clubhouse has faced criticism over reports of
misogyny, anti-Semitism and COVID-19 misinformation on the platform despite
rules against racism, hate speech, abuse and false information.
Clubhouse has said it is investing in tools to detect and prevent abuse as well
as features for users, who can set rules for their rooms, to moderate
conversations.
Getting audio content moderation right could help spark new waves of business
and usage for new services and features launched by the major social networks.
One source told Reuters that billionaire entrepreneur Mark Cuban's upcoming live
audio platform 'Fireside,' which describes itself as a "socially responsible
platform," would be curated to avoid the issues other networks have faced.
Twitter, which has long faced criticism over its handling of abuse, is
currently testing Spaces with a group of 1,000 users, beginning with women and
people from marginalized groups.
Hosts are given controls to moderate and users can report problems. But Twitter
is also looking at investing in "proactive detection" - for example,
incorporating audio transcripts into tools Twitter currently uses to detect
problem tweets without users flagging, said Andrew McDiarmid of Twitter's
product trust team.
McDiarmid said Twitter was still deciding how to translate existing rules, like
labeling misinformation, which also apply to the new service, to the audio
arena.
Until Twitter nails down its moderation plan, people who have recently violated
the site's rules are not allowed access to the new feature.
(Reporting by Elizabeth Culliford; editing by Kenneth Li, Jonathan Weber and
Nick Zieminski)
© 2021 Thomson Reuters. All rights reserved. This material may not be published,
broadcast, rewritten or redistributed.