
To take an extreme example, child pornography is available on the internet, but society does its best to make it hard to find.


It's a silly thing to even attack. That doesn't mean being OK with it; I just mean that, shortly, it will be possible to generate it on the spot, without it ever needing to be transmitted over a network or stored on a hard drive.

And you can't attack the means of generation either, without essentially making open source code and private computers illegal. The code doesn't have to have a single line in it explicitly about child porn or designer viruses etc. to be used for such things, the same way the CPU or compiler doesn't.

So you would need hardware and software that the user does not control, which can make judgements about what the user is currently doing, or at least log it.


Did its best. Stable Diffusion is perfectly capable of creating that by accident, even.

I’m actually surprised no politicians have tried to crack down on open-source image generation on that basis yet.


I saw a discussion a few weeks back (not here) where someone was arguing that SD-created images should be legal, as no children would be harmed in their creation, and that it might prevent children from being harmed if permitted.

The strongest counter-argument used was that the existence of such safe images would give cover to those who continue to abuse children to make non-fake images.

Things kind of went to shit when I pointed out that you could include an "audit trail" in the EXIF data for the images, including seed numbers and other parameters and even a description of the model and training data itself, so that it would be provable that the image was fake. Software could even be written to automatically test each image, so that those investigating could see immediately that the images were provably fake.
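To make the idea concrete, here is a rough sketch of what such an audit trail could look like. The field names and parameter values are purely illustrative, and Pillow's PNG text chunks are used in place of EXIF proper for simplicity; a real scheme would need a standardized format and, presumably, cryptographic signing so the trail can't just be copied onto a non-fake image.

    # Minimal sketch: embed generation parameters (an "audit trail") in a
    # PNG's metadata and read them back later. A verifier could then re-run
    # the recorded model/seed/settings and compare the output to the file.
    import json
    from PIL import Image
    from PIL.PngImagePlugin import PngInfo

    def save_with_audit_trail(image: Image.Image, path: str, params: dict) -> None:
        """Attach the generation parameters as a JSON text chunk."""
        meta = PngInfo()
        meta.add_text("generation_params", json.dumps(params, sort_keys=True))
        image.save(path, pnginfo=meta)

    def read_audit_trail(path: str) -> dict | None:
        """Return the stored parameters, or None if the trail is missing."""
        with Image.open(path) as img:
            raw = getattr(img, "text", {}).get("generation_params")
        return json.loads(raw) if raw else None

    # Illustrative parameters a generator could record (placeholder values):
    params = {
        "model": "stable-diffusion-v1-5",
        "model_sha256": "<hash of the model weights>",
        "training_data": "<description or manifest reference>",
        "seed": 123456789,
        "steps": 30,
        "cfg_scale": 7.5,
        "prompt": "<full prompt text>",
    }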

I further pointed out that, from a purely legal basis, society could choose to permit only fake images with this intact audit trail, and that the penalties for losing or missing the audit trail could be identical to those for possessing non-fake images.

Unless there is some additional bizarre psychology going on, SD might have the potential to destroy demand for non-fake images, and protect children from harm. There is some evidence that the widespread availability of non-CSAM pornography has led to a reduction in the occurrence of rape since the 1970s.

Society might soon be in a position where it has to decide whether it is more important to protect children or to punish something it finds very icky, when just a few years ago these two goals overlapped nearly perfectly.


> I saw a discussion a few weeks back (not here) where someone was arguing that SD-created images should be legal, as no children would be harmed in their creation, and that it might prevent children from being harmed if permitted.

It's a bit similar to the synthetic rhino horn strategy intended to curb rhino poaching [0]. Why risk going to prison or getting shot by a ranger for a $30 horn? Similarly, why risk prison (and hurt children) to produce or consume CSAM when there is a legal alternative that doesn't harm anyone?

In my view, this approach holds significant merit. Unfortunately, I doubt many politicians would be willing to champion it; they would likely fear having their motives questioned or being unjustly labeled as "pro-pedophile".

[0] https://www.theguardian.com/environment/2019/nov/08/scientis...





