'Taylor Swift' searches blocked on X after deepfake images go viral

Sexually explicit and abusive fake images of the US singer began circulating widely last week on X, making her the most famous victim of a scourge that tech platforms and anti-abuse groups have struggled to fix.

Taylor Swift becomes the most famous victim of a deepfake scourge after abusive fake images of the US singer began circulating widely last week on X. Photo: AP

Elon Musk's social media platform X has blocked searches for Taylor Swift as pornographic deepfake images of the singer have circulated online.

Attempts to search for her name on the site on Monday resulted in an error message and a prompt for users to retry their search, which added, “Don’t fret — it’s not your fault.”

“This is a temporary action and done with an abundance of caution as we prioritise safety on this issue,” Joe Benarroch, head of business operations at X, said in a statement to multiple news outlets.

#ProtectTaylorSwift

After the images began spreading online, the singer's devoted fanbase of “Swifties” quickly mobilised, launching a counteroffensive on X under the hashtag #ProtectTaylorSwift to flood the platform with more positive images of the pop star.

Some said they were reporting accounts that were sharing the deepfakes.

The deepfake-detecting group Reality Defender said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X.

Some images also made their way to Meta-owned Facebook and other social media platforms.

Researchers have said the number of explicit deepfakes has grown in the past few years, as the technology used to produce such images has become more accessible and easier to use.

In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponised against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.
