I'm pretty sure that they're aware of the concept but not the term CSAM.

CSAM is just the latest iteration of the term we use for the concept due to the euphemism treadmill.

https://youtu.be/hSp8IyaKCs0?si=l5BbV39-rxC4UY8t



I mean, if you expand the acronym it isn't euphemistic at all!

I think it's actually a little too specific. It's trying to exclude "innocent" things like selfies or "simulated" things like drawings, but those are just as illegal in some or all countries.


> It's trying to exclude "innocent" things like selfies or "simulated" things like drawings, but those are just as illegal in some or all countries.

Selfies shared between teenagers are innocent! When an adult enters the picture is when it becomes a problem, because yes: it’s abuse and exploitation that we’re objecting to. Unless we think morality is a question of aesthetics, then yes, this also means drawings are not in and of themselves our concern.

Why do you think this is unreasonable?


Because the production process didn't involve abuse, yet it's still illegal to produce them. The distribution of them does involve abuse, though.


It’s not entirely clear what this is in reference to or what you’re attempting to say.


It's more nuanced than you think.

How do you feel about 17 year olds collecting nudes of 14 year olds to gawk at and bully them over?


The supposition that this is “more nuanced than I think” given the particular example you’ve chosen strikes me as quite bizarre.

Does “17 year olds collecting nudes of 14 year olds to gawk at and bully them over” not strike you as abusive and/or exploitative? Because it certainly does to me.

I think you’ve chosen to interpret my post in an excessively literal manner (i.e. only adults abuse or exploit teenagers) rather than the far more obvious alternatives I intended (e.g. a 16 year old “sexting” their 17 year old partner).

Or, put another way, if I say we’re opposed to abuse and exploitation, and then you present me with a situation involving abuse and exploitation, of course I’d be opposed to it. Certainly you’re capable of figuring out this isn’t the sort of thing I was talking about.


I'm glad that you agree that selfies shared between teens are not intrinsically innocent and that adults entering into the picture is not the only thing to worry about in this situation.


Shellshock is to PTSD as kiddy porn is to CSAM.
