Graphic AI images of Taylor Swift went viral on X earlier this week, and now searches for the singer’s name return an error message.
via Page Six:
After we discovered the issue, X told Page Six, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.”
The disturbing hoax images of the 34-year-old billionaire showed her in various sexual scenarios at boyfriend Travis Kelce’s Kansas City Chiefs game.
As soon as the images started going viral, the “Cruel Summer” songstress’ diehard fans came to her defense, begging people not to share them.
Swift is reportedly “furious,” too, and she’s considering taking legal action, a source told the Daily Mail Thursday.
“Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider said.
“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with,” the source added.
Swift has not yet publicly addressed the scandal, but her fans have been flooding X with positive messages about her in an attempt to fight back against the images, known as “deepfakes.”
“people sharing the ai pics are sick and disgusting. protect taylor swift at all costs,” one fan tweeted about the images that show the singer in provocative poses.
“using ai generated pornography of someone is awful and inexcusable. you guys need to be put in jail,” another fan wrote of the person or persons behind the offensive snaps.
The White House also spoke out after the scandal, calling for legislation to protect victims of online harassment.
White House press secretary Karine Jean-Pierre called the incident “alarming” and said the Biden administration is focused on addressing the risks posed by AI.
“Of course Congress should take legislative action,” Jean-Pierre said, according to The Verge. “That’s how you deal with some of these issues.”
Finally, SAG-AFTRA released a statement on the situation, writing, “The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal. As a society, we have it in our power to control these technologies, but we must act now before it is too late.”
The actors’ union said the images are “upsetting, harmful, and deeply concerning.”
Page Six reached out to Swift’s team, but we did not receive an immediate response. The X account that first shared the AI images has since been made private.
We’ve seen explicit AI images of quite a few celebrities. There definitely needs to be some type of legislation passed because it’s only going to get worse.
The post Taylor Swift No Longer Searchable on X After Calling Out Graphic AI Photos appeared first on LOVEBSCOTT.