Explicit deepfakes are traumatic. How to deal with the pain.

The fake imagery upends victims' lives, but healing is possible.
By Rebecca Ruiz
Sexually explicit deepfakes traumatize victims, but healing is possible. Credit: Stacey Zhu / Mashable

Therapist Francesca Rossi works with clients who've had real images of themselves turned into sexually explicit content, without their consent. All of her current clients identify as women.

This type of image-based abuse, known as an explicit deepfake generated by artificial intelligence, is frequently perpetrated by a current or former intimate partner, or a known friend, coworker, or neighbor, as a form of harassment and stalking. Rossi, a licensed clinical social worker in New York, has seen her clients heal from this betrayal, but the journey is long, and rarely predictable.

In some states, creating and distributing an explicit deepfake might be against the law, but even so, local law enforcement may have few resources to investigate such cases.

The victim typically has to marshal her own response. Among her options are attempting to track down the imagery and issuing takedown notices where it appears, but there's no guarantee she'll locate all of it. Rossi says explicit deepfakes are often traded between individuals, then downloaded, without the victim's knowledge.

Feeling successful one day doesn't mean the next day will be the same. The imagery may pop up on new platforms. The perpetrator may send it to the survivor's friends, family, and employer. Rossi says survivors naturally become hypervigilant. They often, impossibly, want to avoid the internet altogether. Sometimes they become fixated on monitoring imagery of themselves online, using the internet excessively to do so.

"Being victimized through deepfakes can erase your sense of reality," says Rossi, noting the dissonance survivors feel because the fake imagery looks real and convincing. "They distort your understanding of the world and everything you know to be true."

Why safety planning is critical for healing

Rossi says that people need to feel safe in order to restore their sense of reality. Creating that safety happens through measures big and small.

In the beginning, when the deepfakes are discovered, Rossi says that it's important to gather trusted loved ones who can offer emotional support, help locate where the deepfakes appear, and try to remove them, or develop a strategy for navigating this complex process, possibly in partnership with law enforcement or attorneys.

The U.S.-based Cyber Civil Rights Initiative has an image abuse helpline, along with a thorough guide for what to do once you've become a victim. In the United Kingdom, people can turn to the Revenge Porn Helpline, which aids survivors of intimate image abuse.

In addition, people may want to remove their personal information from databases maintained by data brokers, which can be done through paid services or by contacting the brokers directly. That data, including a person's home address and the names of their family members, can be used for doxxing, harassment, and stalking.

Kate Keisel prioritizes physical and psychological safety planning in her work as cofounder of the New Jersey-headquartered Sanar Institute, which provides trauma-specific mental health services to survivors of interpersonal violence, including image-based sexual abuse.

Keisel says that survivors are often told by well-meaning supporters to stay off the internet when that's simply not an option for personal and professional reasons. That's why physical safety planning can include an understanding that even after initial successful takedown notices, there's no guarantee the imagery won't surface again.


Instead of hoping that the abuse will definitively end, Keisel recommends that survivors implement boundaries related to how they spend their time online, particularly if they find it distressing not to look for images. To increase their psychological safety, survivors may want to set a limit on the number of hours they devote to searching for images of themselves.

Keisel says it can be helpful for survivors to identify and stick to tasks that feel squarely within their control, like removing their personal information from the internet.

Getting and staying grounded

While practical steps, like issuing takedown notices, are key to safety planning, both Keisel and Rossi say survivors also benefit from grounding and mindfulness practices that decrease psychological distress and anxiety.

Survivors suffer particularly because their nervous system perceives a constant threat; deepfakes, after all, have the potential to re-emerge, or may still exist online or in someone else's possession.

A therapist can teach a survivor new techniques, but Keisel says activities that bring someone into the present moment, so they can fully inhabit their body, can also be powerfully calming. These can include trauma-sensitive yoga and Tai Chi.

Rossi also recommends calming strategies that stimulate the senses, such as lighting incense or a candle, and laughing, which can reduce the body's response to fear.

"We can't think our way out of trauma," says Keisel. That's why she believes "somatic," or body-based practices, help a survivor feel safe in the present moment, even if their life has been turned upside down.

Keisel says that there will be moments when a survivor's nervous system goes into panic mode because of a new development, but that it's possible to learn skills to better tolerate that distress.

The combination of safety planning, gaining more control, and self-soothing can put someone on the path to healing, Keisel says.

There is hope

Rossi and Keisel are among several therapists and professionals in the U.S. who specialize in treating survivors of image-based sexual abuse, but their expertise is uncommon. Rossi says she has more consultation requests than she can handle; they've increased markedly since AI software and apps capable of producing explicit deepfakes became more widespread late last year.

The abuse is accelerating at a pace that lawmakers and tech companies aren't matching, though the White House recently issued a call to action for digital platforms and services to tackle the problem.

The White House's recommendations included Congressional action to strengthen legal protections for survivors of image-based sexual abuse, and provide them with critical resources.

Keisel says that those who want to talk to a therapist should consider interviewing them about their treatment practices to see if they're the right fit. Survivors may want to avoid therapists who don't understand image-based sexual abuse, or who aren't trained to use a trauma-sensitive approach.

But Keisel doesn't want survivors to give up on the idea of healing, even if it sometimes feels unimaginable.

"There's this idea that those of us who've experienced this level of trauma are going to be stuck in place where we can't move forward," Keisel says. "When we have the right support in place, we move past these things in life."

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.

