
24 comments

[–] proudcatlady 21 points (+21|-0)

I nearly died laughing when it said almost all the videos exclusively feature women, but users are encouraged to upload their own faces. Because women are just so eager to see their own faces deepfaked onto the bodies of trafficked or coerced women on depraved and dehumanizing porn sites! Absolutely.

[–] space_out 3 points (+3|-0)

Oooh, I hadn't thought about that - by allowing users to upload their own faces they can get that AGP demographic.

[–] nemesis 21 points (+21|-0)

I find that deepfake technology, while it has intriguing prospects for film restoration, has far more universal consequences that will be long-lasting. I dove into deep learning a while ago out of curiosity, and the predominantly male environment definitely used it for nefarious purposes. There are countless sites and Telegram channels dedicated to using the technology not just on celebrities, but on their coworkers and so-called "friends" too. It is terrifying how eager they are to do this without thinking twice.

There are ways to expose the content as fake, since there are visible signs. But I don't think that even matters, because the damage is done. I imagine I would feel like I had been sexually assaulted, in a way, if that happened to me. This is one of the main reasons I have little to no social media presence.

[–] klo137 18 points (+18|-0)

This absolutely needs to be illegal… it makes half the global population a target for psychological sexual abuse and harassment.

[–] [Deleted] 16 points (+16|-0)

It makes me so angry that the victims are punished for this, and never the psychopaths making or sharing the videos, or the psychopaths contacting the victim's place of employment. It should be made illegal with immediate effect to make or share deepfakes and revenge porn. It should also be illegal to try to damage someone's reputation like these people do.

[–] visits_radio 11 points (+11|-0)

There need to be criminal consequences; it's the only way these men learn.

[–] legopants 8 points (+8|-0)

I thought this was called deepfakes and was illegal.

[–] RisingUp 11 points (+11|-0)

I think it is illegal in many places, but depraved coomers don’t care about the law. It’s not like this app is in the App Store.

[–] zuubat 8 points (+8|-0)

Women this, women that, women here, women there, women all over the place in this article.

Guess who’s never referenced in the article, not even once.

Men.

[–] proudcatlady 7 points (+9|-2)

My 4chan-style solution to sites like these would be to flood them with images of children’s faces. Obviously we can’t do that because of the harms to children, but if there’s one thing most people still agree on, it’s that kids need to be protected from this shit. And there are clearly zero safeguards in place even for kids. Adult women don’t matter, but involve underage boys and I bet those sites would all vanish overnight.

[–] Texture 9 points (+9|-0)

thispersondoesnotexist.com

But I don't think we need to stoop that low. Just tell normal people deepfakes are being used to create child sexual abuse images and what can be done about it, i.e. outlawing deepfakes on defamation grounds.

[–] Mmmm_Brains 5 points (+5|-0)

I believe this was already being used to make CP; it wouldn't be that hard for these creeps. And this is probably just one of many of these kinds of AI and deepfake generators. I don't think we'll ever know how deep this all goes.

[–] space_out 4 points (+4|-0)

I'm like 99% sure that app is from either Russia or China, so flooding it with Putin and Xi Jinping should do the trick. Throw in Kim Jong-un for the threesome scenes.

[–] proudcatlady 1 points (+1|-0)

I have no issue with doing this. If I come across one I’ll get to it.

[–] Texture 6 points (+6|-0)

Can't the app creators be sued for defamation?

[–] space_out 1 points (+1|-0)

If they're from Russia/China/Iran, then probably no luck. Those countries have, for example, ransomware gangs causing millions of dollars in losses for US companies; their governments are aware of them and know the main players, but won't do anything.

If they protect criminals who go after huge companies, what chance do random women have? Unless, of course, someone deepfaked a steamy scene between Putin and Winnie the Pooh...

[–] Liandra 1 points (+1|-0)

They are most likely in a country where they don't care. Sure, someone could sue them, but nothing would happen, except that the person suing runs up big lawyer costs. It might create some publicity, but of course someone has to have the deep pockets to go there.

[–] kalina 5 points (+5|-0)

This is why I have completely stopped uploading any pictures of myself to the internet, ever. I even ask friends not to post anything of me. Your photos can be, and perhaps already are being, used for literally anything online.

[–] Srfthrowaway 0 points (+0|-0)

Yep. When my dad was still alive I had to put my foot down about him posting 30-40 year old photos of us as kids all over his emails. I'm pretty sure he kept doing it, though, and just cut me out of the mailing list haha.

I told him point blank about random men using those photos as porn and he got super huffy. He always had it in his head that nobody he knew personally would ever do anything to hurt a child (he was so, so very mistaken ☹️).

I tried so hard to explain that once those photos are on the internet, anyone can do anything with them and that it's like making a million flyers of the photos and dropping them all over various shopping malls, but I don't think he ever really got that part.

No surprise. The very first picture I received in email (1993) was a nude woman. Every new technology is applied first to degrading us.
