If you’ve been online over the past week, you’ll have been bombarded with AI images of Barbies in boxes. These include everything from Barbie Beethoven to the Barbie Mirror chicken, and even (terrifyingly) Barbie Trump. It all sounds like fun and games – except an expert is now warning that the trend could be putting users at risk of becoming victims of deepfakes.
The trend comes off the back of the release of ChatGPT’s GPT-4o image generator, which saw the AI bot gain over one million users within just one day. Unsurprisingly, people have been using the technology to do what the Internet does best: make memes. Some have sparked controversy. The recent surge in Ghibli-style images prompted a resurfacing of Spirited Away director Hayao Miyazaki’s condemnation of AI art. Now, the latest iteration involves prompting the bot to turn users into their very own Barbie doll.
It follows on from the innocent real-life trend of cinema-goers posing in life-sized Barbie boxes, which appeared around the release of 2023’s Barbie movie. In many ways, the AI trend has allowed fans to relive the hype and excitement of the box office hit.
It’s also allowed users to create amusing images of celebrities and public figures. If you’ve ever wondered what Prince Harry or Elon Musk would look like as a plastic doll, you no longer have to.
However, according to research by the AI prompt management company AIPRM, uploading your data to ChatGPT does more than allow you to relive funny moments – it could be enabling your image to live online in ways you don’t want it to. This is because, under ChatGPT’s privacy policy, uploaded images can be collected and stored to fine-tune its models.
According to Christoph C. Cemper, founder of AIPRM: “Images shared on AI platforms could be scraped, leaked, or used to create deepfakes, identity theft scams, or impersonations in fake content. You could unknowingly be handing over a digital version of yourself that can be manipulated in ways you never expected.”
Deepfakes are images or videos that use AI to mimic a person’s voice or facial features. The concept first entered the public consciousness in 2017, after a Redditor created a subreddit called r/deepfakes, where face-swapping technology was used to post fake pornographic videos and images of celebrities.
Since then, the problem has only grown: in 2024, the European Commission described deepfakes as a “global crisis”, highlighting how these fake images can be used to convincingly impersonate or misrepresent individuals.
One of the most notable victims of this abuse of AI is pop megastar Taylor Swift. In January 2024, sexually explicit AI-generated images of the singer began to circulate on social media, where they were viewed millions of times. One post was viewed 47 million times before being pulled from X, as reported by the BBC.
While deepfakes can be used for any kind of image manipulation, the most common use is pornography. According to a 2023 report by Home Security Heroes, pornographic deepfakes make up 98% of total deepfake content. Even more concerning, 99% of those targeted are women.
Earlier this year, a British man was arrested for using AI to create pornographic images of women he knew in real life, pulling photos he found on their social media profiles. The images were then posted in a forum glorifying “rape culture”, as reported by the BBC.
More troubling still is that deepfakes are a popular search online. According to a recent study by Kapwing, there were 2,479 searches for deepfakes per million people in the UK in December 2024 – the eleventh highest search volume in Europe.
One way to protect yourself against your images being scraped by ChatGPT, Christoph recommends, is to change your privacy settings: users can opt out of having their uploads used in ChatGPT’s training data.