No, Really, I Didn’t Take Off My Clothes

Debbie says:

If I’m talking about me, then I really did take off my clothes for several pictures in Women En Large, and I’m proud of it. I think that’s what makes me even more horrified by Jesselyn Cook’s article in HuffPost, “A Powerful New Deepfake Tool Has Digitally Undressed Thousands of Women.”

The website promises to make “men’s dreams come true.” Users upload a photo of a fully clothed woman of their choice, and in seconds, the site undresses them for free. With that one feature, it has exploded into one of the most popular “deepfake” tools ever created.

… this new site has amassed more than 38 million hits since the start of this year, and has become an open secret in misogynist corners of the web.

I join with HuffPost in not naming or linking to the site, but of course you can find it if you want to.

Launched in 2020, the site boasts that it developed its own “state of the art” deep-learning image translation algorithms to “nudify” female bodies, and that the technology is so powerful there is not a woman in the world, regardless of race or nationality, who is safe from being “nudified.” But it doesn’t work on men. When fed a photo of a cisgender man’s clothed body, the site gave him breasts and a vulva.

With female and female-presenting subjects, on the other hand, the results are strikingly realistic, often bearing no glitches or visual clues that the photos are deepfakes ― that is, synthetically manipulated imagery. This drastically increases the potential for harm. Any vindictive creep, for example, could easily “nudify” and post photos of his ex online to make it seem, very convincingly, as if her actual nudes had been leaked.

I did have to stop and giggle at what the site did to the cisgender man’s clothed body, although I know there is nothing funny about this. Cook calls out revenge porn from a vindictive creep; of course, there is also office revenge, defamation of public or semipublic figures, and the hundreds if not thousands of other ways that those “vindictive creeps” can, once they’re done jerking off to their unavailable fantasy, weaponize this platform to destroy the reputations, careers, and sometimes lives of women. We now know how the next Gamergate will play out …

But surely, this can’t be legal? Wrong again.

The victims of deepfake porn, who are almost exclusively women and girls, are often left with little to no legal recourse as their lives are turned upside down. And although the new site claims that it doesn’t store any images, it does generate shareable links to every “nudified” photo, making it easy for users to spread these pictures all over the internet as well as consume them privately.

It’s unknown who is behind the site, which is riddled with spelling and syntax errors, just as it’s unclear where they are based. The operators did not respond to multiple interview requests from HuffPost. Last month, the U.S. was by far the site’s leading source of traffic, followed by Thailand, Taiwan, Germany and China. Now-deleted Medium posts demonstrating how to use the site featured before-and-after pictures of Asian women exclusively.

Cook goes on to give a useful overview of what hasn’t (and what little has) been done by law to limit the use of deepfakes and other Internet violations of privacy and decency. It all boils down to “hope no one does this to you, and if someone does, hope that you can overcome the consequences.”

The technology isn’t going anywhere, and the vindictive creeps (who represent a truly disgusting percentage of cisgender men) aren’t going anywhere either. What’s needed here is a unified stand by women and girls who have not (yet) been victimized by this: we need to change the laws, and we need to change the behavior of the corporate enablers, and we need to change the norms and expectations. If we care about the right to our own bodies with regard to birth control, abortion, safety from harassment, and so much more, we need to also care about (and fight for) our right to choose who sees us nude, and when, and where.

I took my clothes off because I wanted to, and I would do it again. But I will be damned if I think it’s okay for a man and a piece of software to take my clothes off without my consent.

======================

Follow Debbie on Twitter.

Follow Laurie’s new Pandemic Shadows photos on Instagram.

======================