@josephcox and the article says he was arrested for *sending the images to minors*; the headline is a lie
@dscw @josephcox wow, that's disgusting. why do they think it's CSAM if there is no child, no sex, and no abuse?
@dscw @josephcox “The justice department will aggressively pursue those who produce and distribute child sexual abuse material – or CSAM – no matter how that material was created,” the deputy attorney general, Lisa Monaco, said in a statement after the arrest. “Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive and increasingly photorealistic images of children.”
@sneak @josephcox The article is fairly misleading; he stands accused of quite a few things not mentioned there. He didn't just generate a few random images and that was it.
@olives @josephcox the doj claims that machine-generated imagery can be CSAM (which is plainly false) and that it's thus illegal (which seems untrue to me, but i don't know the law).
do you have more details?
@sneak @josephcox those were the state charges. Check the second link in the article for the FBI press release.