US President Joe Biden’s spokesperson has said fake, sexually explicit images of Taylor Swift that circulated online were “very alarming”.
White House Press Secretary Karine Jean-Pierre said on Friday that social media companies have “an important role to play in enforcing their own rules”, as she urged Congress to legislate on the issue.
The fake images of the pop star, believed to have been made using artificial intelligence (AI), spread widely on social media this week, with one picture on X viewed 47 million times before the account was suspended.
The group Reality Defender, which detects deepfakes, said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X but also on Meta-owned Facebook and other social media platforms.
The researchers found several dozen different AI-generated images. The most widely shared were football-themed, showing a painted or bloodied Swift in images that objectified her and, in some cases, depicted violent harm against her deepfake likeness.
Ms Jean-Pierre said: “We’re going to do what we can to deal with this issue.
“So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation, and non-consensual, intimate imagery of real people.”
The spokesperson added that lax enforcement against false images too often disproportionately affects women.
Researchers have said the number of explicit deepfakes has increased in recent years, as the technology used to produce such images has become more accessible and easier to use.
In 2019, a report released by the AI firm DeepTrace Labs found that explicit images were overwhelmingly weaponised against women.
Most of the victims were Hollywood actors and South Korean K-pop singers, the report said.
X said in a post on its platform on Friday that it was "actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.
“We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”
Meanwhile, Meta said in a statement that it strongly condemns “the content that has appeared across different internet services” and has worked to remove it.
“We continue to monitor our platforms for this violating content and will take appropriate action as needed,” the company said.
Taylor Swift’s representatives did not respond to a Sky News request for comment.