Microsoft has closed a loophole in its Designer AI image generator app that was reportedly used to create deepfake images of Taylor Swift.
Whenever pop music fans talk about the genre's biggest artists, Taylor Swift's name is bound to come up. She's made quite a name for herself with her music, live shows, merch, and more.
On January 25, 2024, bad actors reportedly used Microsoft's Designer AI generator app to create lifelike deepfake images of the singer, which were promptly spread across social media.
Now, 404Media has reported that Microsoft has fixed a loophole in the app that allowed users to create deepfake images of celebrities.
Microsoft fixes AI app that created Taylor Swift deepfakes
The publication previously reported that the Taylor Swift deepfake images came from various communities online that were using multiple tools including Microsoft’s Designer AI.
On January 29, 2024, 404Media revealed that Microsoft had introduced more protections to Designer, closing the loophole used by the bad actors in the process. Previously, they were able to use slightly misspelled celebrity names along with non-sexual descriptions to generate the deepfake images.
Now, that loophole has been closed, so these deepfake images can no longer be generated through the app.
However, the site states that the community behind these deepfake images is still creating them, just through other unnamed tools.
This update to Designer comes just days after Microsoft CEO Satya Nadella responded to the controversy.
“It’s about global, societal convergence on certain norms, and we can do it, especially when you have law and law enforcement and tech platforms that can come together—I think we can govern a lot more than we give ourselves credit for,” he said.
This isn't the first time Microsoft Designer has gone viral for generating images in 2024, either. At the beginning of January, social media erupted with users creating their own customized Funko Pops.