How Artificial Intelligence Apps Are Fueling Child Sexual Exploitation and What We Can Do About It
As artificial intelligence (AI) reshapes how we communicate and create, it’s also being weaponized in disturbing new ways — most alarmingly, to generate realistic depictions of naked girls and young women and acts of child sexual abuse. With just a few clicks, users, including kids themselves, can create realistic, pornographic images of classmates or imagined minors. For victims, the psychological and social consequences are profound and long-lasting. But the impact on those exposed to these images, including the young people who create them, is also deeply damaging, reinforcing and amplifying the violent, misogynistic, and abusive culture and narrative of pornography. Beyond these images, new technology is also being used to create sex dolls and sex robots, which allow adult men to mimic child sexual abuse.
In the following conversation, we speak with Culture Reframed Research Associate Dr. Eric Silverman, a cultural anthropologist who researches, writes reports, and delivers presentations on different facets of the pornography industry and its harms. Silverman discusses the technological capabilities to make fake and modified sexual images of children and young people, the impact on victims, and the urgent need for better digital safeguards and education. Plus, learn what a truly liberating or “sex positive” culture looks like and access parent and educator resources to discuss and counteract the harms of sexualized media.
Your latest report digs deep into how AI is being used to create realistic depictions of naked young women and girls and acts of child sexual abuse. What should parents and educators know about how easily these technologies are accessed and used?
These sorts of AI websites and platforms are trivially easy for anybody to find and use — even kids. They require no technological sophistication and typically no registration, payment, credit card, or age verification. And they are all presented as mere amusement — just another form of fun and games.
As such, the pornography industry is using these AI programs to attract more and more customers in two ways. First, these platforms re-package the harm and violence of pornography as a playful way to express your “creativity.” Second, AI pornography allows each user — again, even young people — to customize sexualized images to their own individualized tastes. You don’t have to go looking for whatever you want to see — you can make it yourself! But the bottom line has nothing to do with harmless fun: the point of these platforms is to entice free users today to become paying customers tomorrow.
“They are all presented as mere amusement — just another form of fun and games.”
What kind of fake and modified images can these programs create, and what has been the impact on young victims?
These programs can ultimately make any images the user wants to create. Yes, most platforms have certain constraints. You can’t readily, for example, ask it to make an image of a 14-year-old girl. But after a middle-school or high-school child takes a few computer courses in school, they can download programs and adjust them to make anything they wish — with no restrictions.
More commonly, boys in middle and high school are using freely available platforms to “undress” photos of their female schoolmates in order to email and text these fake naked images around the school and community, and post them to social media. The result is that an unsuspecting girl suddenly finds that all her peers are looking at realistic images of her in the nude — even though they are fake. And so are adult strangers, some of whom then try to contact the girl for sex, which, since she is underage, is legally defined as the rape of a child.
“Once images circulate in cyberspace, it is impossible to delete them entirely.”
Once images circulate in cyberspace, it is impossible to delete them entirely. So, these images could haunt a young person for the rest of her life. Anytime someone searches her name — a potential employer, an organization where she wants to volunteer, a university admissions official, etc. — they will think that she has willingly posted naked “selfies” online. The impact of this is devastating for victims. But the impact is also that young boys are effectively told that it is fun, playful, and acceptable to forcefully undress girls and women at will.
What can parents, educators, and other adults do to protect children and young people from the harms of AI?
First and foremost, everybody needs to call upon legislators and schools to insist that we include scientifically sound, age-appropriate, and porn-critical sexual education in the curriculum. We must teach young people the harms of AI pornography and pornography in general. All adults, too, need to contact their elected officials and urge them to introduce and support legislation calling for age verification for all pornographic websites in order to ensure that children are not able to easily access these websites and platforms. We need, too, to start calling out the banks, companies, and financial institutions — which include PayPal, Visa, Mastercard, American Express, Apple Pay, Google Pay, and more — that do business with the companies that create AI pornographic platforms. And parents and caregivers need to start talking with their children today about the harms of these platforms. The resources available from Culture Reframed are a good place to start.
The pornography industry, as well as some academics, have framed technologies like AI-generated porn and sex robots as liberating or “sex positive.” Why is this framing inaccurate?
A truly “sex positive” culture would allow each person to have full ownership and agency over their own intimate lives. But this is not what AI pornography and sex robots do. Rather, the pornography industry forces particular kinds of sexuality onto young people — sexuality that is highly scripted, stereotypical, violent, coercive, and harmful. AI pornography and sex robots are just another way that the porn industry tells a young person what to desire and how to act on those desires. That is sex negative. Sex positivity would allow a young person to develop their own authentic desires, at their own pace, in ways that feel safe to them. AI pornography is the “Joe Camel” of sex: it tries to seduce young people into doing something unhealthy by presenting that harmful behavior as a fun cartoon. The pornography industry is not “sex positive.” Porn hates sex. It only loves profits. A truly liberating sexuality lets each person individually develop, in relationship to others, as they feel ready, secure, and comfortable.
“The pornography industry forces particular kinds of sexuality onto young people — sexuality that is highly scripted, stereotypical, violent, coercive, and harmful.”
What can be done to counteract the violent, misogynistic, and abusive culture of pornography and AI-pornography?
We need to teach young people, in age-appropriate ways, to critically deconstruct AI pornography in the same way that we teach young kids not to believe everything they see in a television advertisement. We need to cultivate the skills necessary for young people to see what is not shown in a pornographic image — how certain images can shape a person’s views and behaviors in unhealthy ways. Last year, global advertising exceeded $790 billion. That’s a lot of money. Corporations and advertisers know well that regularly seeing the same images and videos, which communicate the same messages, profoundly shapes how people think and behave. We need to start teaching young people how to “see” all the harmful ways that AI and non-AI pornography try to change their desires, behaviors, and relationships. And we also need to promote sex positive ways of thinking that foster trust, empathy, honesty, emotional connection, and safety.
Read more about this topic in Silverman’s latest report, AI, Deepfake Children, Child Sex Dolls, and Sex Robots. This report builds on a previous Culture Reframed report, Artificial Intelligence, Virtual Reality, and Pornography: How Misogyny Constrains the Future, which discusses the growing role of artificial intelligence in the pornography industry. While that report focused on the impact of such technologies on adults, the latest report examines the use of AI and additional technologies, such as sex dolls and sex robots, to harm children.