In
an alert circulated this week, the bureau said it had recently
observed an uptick in extortion victims saying they had been
targeted using doctored versions of innocent images taken from
online posts, private messages or video chats.
"The photos are then sent directly to the victims by malicious
actors for sextortion or harassment," the alert said. "Once
circulated, victims can face significant challenges in
preventing the continual sharing of the manipulated content or
removal from the internet."
The bureau said the images appeared "true-to-life" and that, in
some cases, children had been targeted.
The FBI did not go into detail about the program or programs
being used to generate the sexual imagery but did note that
technological advancements were "continuously improving the
quality, customizability, and accessibility of artificial
intelligence (AI)-enabled content creation."
The bureau did not respond to a follow-up message seeking details on
the phenomenon Wednesday.
The manipulation of innocent pictures to make sexually explicit
images is almost as old as photography itself, but the release
of open-source AI tools has made the process easier than ever.
The results are often indistinguishable from real-life
photographs, and several websites and social media channels that
specialize in the creation and exchange of AI-enabled sexual
imagery have sprung up in recent years.
(Reporting by Raphael Satter; Editing by David Gregorio)
© 2023 Thomson Reuters. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.