New from 404 Media: FBI arrested a man for allegedly using Stable Diffusion, a text-to-image AI model, to create thousands of images of minors. One of the first known instances of the FBI investigating someone for using AI to create child sexual abuse material. 404media.co/fbi-arrests-man-fo


@josephcox and the article says he was arrested for *sending the images to minors*; the headline is a lie

@sneak @josephcox those were the state charges. Check the second link in the article for the FBI press release.

@dscw @josephcox wow, that's disgusting. why do they think it's CSAM if there is no child, no sex, and no abuse?

@dscw @josephcox “The justice department will aggressively pursue those who produce and distribute child sexual abuse material – or CSAM – no matter how that material was created,” the deputy attorney general, Lisa Monaco, said in a statement after the arrest. “Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive and increasingly photorealistic images of children.”

@sneak @josephcox The article is fairly misleading; he stands accused of quite a few things not mentioned there. He didn't just generate a few random images and leave it at that.

@olives @josephcox the doj claims that machine-generated imagery can be CSAM (which is plainly false) and that it's thus illegal (which seems untrue to me, but i don't know the law).

do you have more details?
