
How come this is only happening in Spain?


This has been discussed on HN before and the overwhelming HN response was “why would you care, it’s not an actual nude?”

Lots of tasteless people doubled down that they wouldn’t even care if it were their own children or wives (let’s be real, it’ll be 99% women and children) being portrayed like this against their consent (“consent” doesn’t exist in such a context, the argument goes).

The only defense against this, which is already too late for most extant children to employ, is to never post photos of your children online. But hey whatever it takes to avoid scrutiny and regulation eh?

TBC, I’m not saying it’s obvious how to regulate/control/mitigate it, but that’s a totally different argument from the one I encountered which was “there’s no relevant concept of consent.”


For me, it’s not that I’m tasteless, it’s that I don’t think consent is needed to make an image with my face put into a naked body. At least if it’s for personal use.

In the US at least, there are laws that prevent commercialization of my likeness, or harassment, or defamation. And those can be applied if someone sells images of me on a nude body.

But for personal use, it seems as pointless to outlaw as someone imagining me naked. Is that “tasteless?” Perhaps. Of my child, that’s gross.

I’m not sure how to stop this, though, as I think any control trying to outlaw perverts from staring at people, or drawing images of people, or using these apps without consent would cause more harm than the harm it seeks to prevent.


Insofar as we create laws to protect people from undue harm, it is a well known fact that victims of CSAM are victimized twice: first when the CSAM is produced and again when it’s viewed.

“Yeah everyone at school is looking at it, but don’t worry it’s only pixels that represent your naked body!”

The fact people can now produce CSAM that doesn’t inflict the first victimization does not eliminate the second, and the fact the current law wasn’t designed with this technology in mind doesn’t mean no harm is inflicted and we shouldn’t be responding to it.

Of course you’ll argue that someone could paint a nude child with a real person’s likeness and I’ll point out that an actual photograph of a nude child is also not “really them.” We gauge the badness of any location on this spectrum by the harm felt. A thing that ~never happens, ~never causes harm, and is very difficult to produce (high quality paintings of nude children) is strictly less bad than something that costs $0, 0 hours, 0 talent to produce at significantly higher quality (synthetic CSAM).


I don’t buy the “it’s easier to do now so the law must change” argument. You actually just admitted that there isn’t anything fundamentally different between a hand drawn depiction and an AI generated one. Both may be tasteless in most situations, but in neither case is there a victim.


> Insofar as we create laws to protect people from undue harm, it is a well known fact that victims of CSAM are victimized twice

I note you didn't say harmed twice.


> it’s that I don’t think consent is needed to make an image with my face put into a naked body

Can you explain more of your reasoning why you don't think consent is needed here?

I think the main thing here is that there is a very clear difference between mentally imagining someone nude and a fake version of their nude body being created. One is in the mind only and another actually exists.


> I think the main thing here is that there is a very clear difference between mentally imagining someone nude and a fake version of their nude body being created. One is in the mind only and another actually exists.

I think things through to what it will be like when we have digital consciousness. I think that imagining someone naked if our brain is simulated and running digitally so the memory can be saved and retrieved is the same as sketching a nude or writing in someone’s diary. It’s private thoughts and it’s best to not dig into others’ private thoughts.

I think the problem to focus on is when images are sold or traded or distributed.

I think things in my mind exist as much as a private sketch. It’s just that the sketch can be found by others but the thought cannot.


Not only does it exist, but it may also be difficult to distinguish from a non-fake.


Even should you disagree, it's useful to respect others' wishes not to have their faces on those bodies.


How do you know if people don’t want their faces on bodies? Should everyone ask everyone’s consent? Should I register my consent to be nudified, like registering as an organ donor?

There’s a debate in cosplay over whether consent is needed to take someone’s photo. Many people like to require consent. I think it’s polite to ask, but I don’t feel bad if someone takes a picture without asking.

And certainly not if someone takes a picture without me even knowing they did. This seems like something that it’s better for me not to worry about and focus on other things.

Passing a law, or making sure all photos have consent, would be such a pain to implement.


[flagged]


Lots of judgements here.

> you specified perverts. these tend to be male. men.

No, I don’t think this is true. There are perverts of all genders and I don’t think they tend to be male. Happy to see your evidence though.

> you are likely male

No

> bucketing yourself as good

I don’t bucket myself into a good or bad bucket.

> women are an afterthought

Not at all. It’s funny you would bring this up as I don’t think my comment really has any position on men or women.


hilarious.

women were the victims in the article.

in case you forgot.


Women are the victims in this article, but we’re discussing what should be done regarding the topic overall, not just in relation to this article or these victims.

I don’t think anyone proposed banning nudes only for women, or only prohibiting men from making deepnudes. Nor has anyone proposed rules only for Spain. This issue affects a really broad population.


> you’re afraid to curtail the rights of bad men for the sake of protecting women because you are afraid of what might happen to good men.

You could of course ascribe bad motives to your ideological foes, but I find it weird you don't allow "this law is bad for humanity" as a possible motive.


because it is lazy. good and bad are both subjective.

shutting conversation down with “this is bad for humanity” additionally marginalizes a large portion of it. the alternative is also bad for humanity.

i find it weird that you view it as binary.


> i find it weird that you view it as binary.

This implies that you believe in binary right and wrong and that I'm "out", but you lack the conviction to say so.


you described it in a way that implies it to be binary. my noticing is not my belief.


You finding it weird implies you believe there is a right and a wrong way to behave.


These images aren't personal use, though. They are being shared around. How does that not constitute harassment or defamation?

Also, child porn laws apply to people who acquire the material for personal use. You can go to prison for possession, not just for selling. It would take only a small update to the law to have this apply to AI-generated imagery of real children as well. Even that may not be necessary, as it becomes increasingly difficult to differentiate AI-generated photos from real photos.


> These images aren't personal use, though. They are being shared around. How does that not constitute harassment or defamation?

This is the key part: nobody would be upset if these were only for the personal use of the creator - teenagers have been fantasizing about their peers as long as they’ve existed - but the article mentions multiple cases of girls being confronted with non-consensual imagery and at least one of those was an attempt to extort real pictures. Whatever nuanced discussion there is about creating these images doesn’t apply to them being used as weapons.


I’m saying that they are harassment. So prosecute that.

I think in the US any sexual images of children are illegal, even if they are computer generated. So it’s currently illegal to possess.


> I think in the US any sexual images of children are illegal

Citation needed, but unlikely.


I would be interested to hear how they would view the situation if someone was creating deepfakes of their mutilated corpses. Or their family's mutilated corpses. You know, because since it's all synthetic no harm could have possibly been done.


You are changing the question to a different one and then answering it. There should be a fallacy name for this.


What’s the original question?


> How come this is only happening in Spain?

This one I suppose


“How come this is only happening in Spain?”


Answer: it’s not only happening in Spain. It has been discussed on HN before, with similar amounts of pedantry around the corners of the issue and not its real crux, which is that we are facing a proliferation of technology that will produce this sort of material. I really just have a hard time interpreting “why only in Spain” as a good-faith question rather than something meant to steer the conversation around the real problem or portray it as not real (such as the related "if true, why don't we see this of every celebrity" comment above). Maybe it was good faith and I’ll readily take the L for interpreting it otherwise.

In any case I see now that I forgot to put the “it’s not” in my original comment.

This was at least one article that was previously discussed here (not in Spain): https://www.bleepingcomputer.com/news/security/sextortionist...


would it not be “moving the goalpost”?


This is a fine line. There are people who care. But at the same time this crosses boundaries of freedom of speech.

The content is completely generated, and there is an effort to censor this content.

Think about it. Censoring generated content is equivalent to censoring art. It's morally equivalent to censoring art because you don't like the art.

There's a problem from many perspectives, and no one perspective is completely right. There's the female perspective, where women's beauty, rank and character are judged on their perceived promiscuity levels. Pictures obviously affect a woman's ranking, and spouses care too, so this is a factor. But it does not negate the existence of the freedom of speech perspective too.

The problem here is that it's too fuzzy: you can't strictly define what is creative art and what is a reputation-bashing porno. What if the art was generated but just happens to look like the daughter out of pure coincidence? Similar to how a real person can randomly look like another person.

What if someone who looks extremely similar to you starts modelling for porn? Do you ban and censor that person? Do you ban art that coincidentally looks like you? What if there's nude art where half the population finds it offensive and the other half does not?

Once you cross this line, because the boundaries are so fuzzy, it's easy to overstep.

China, a country very much hated by the US, doesn't consider freedom of speech. They jump right in with what they believe to be the most reasonable decision. They would of course severely punish anyone who makes AI-generated porn of real people. But we all know the consequence of this: China crosses too many fuzzy lines.

If you have a wife and daughter, do you prefer the China way or the more "freedom of speech" way?


There’s plenty of material that lands near the fine line and there’s plenty of material that lands nowhere close to it.

Let’s start by saying “synthetically generated pornography of a real person created without their consent is bad.”

This conversation is very very far from trying to figure out the intricacies of complicated cases. This is people saying, “there’s no relevant concept of consent” and “there’s no harm inflicted by pixels which just so happen to look like a real person.”

For the former, you merely have to ask why we have any concept of consent anywhere at all, and the answer is: to make it illegal to do certain things which inflict certain harms. So here is a new specific way to produce a new specific harm.

Don’t think it’s a harm? You merely have to ask someone who has had this done to them if they feel harmed or not. A bunch of 20-50 year old dudes on HN saying “I wouldn’t care lol” doesn’t qualify. I personally am more inclined to trust the victim advocacy groups and researchers who overwhelmingly agree that sharing nonconsensual sexual imagery of someone does inflict harm upon them. This argument is apparently so obvious that we in fact have curtailed freedom of speech along various boundaries of this area, and only complete weirdos or “I am very smart because I logic good” people have a hard time understanding why that is.


>”Synthetically generated pornography of a real person created without their consent is bad.”

I would agree with the premise, but not the line of reasoning that follows. Personally, I would say that the freedom of speech will always be more relevant and much more important than the subjective experience of perceived harm.

Much like I don’t think the government should be able to dictate to a woman whether she’s allowed to have a medical procedure or not, I don’t think a corrupt group of politicians should be the ones to dictate what kind of images my computer is allowed to make, assuming that I’m starting with an entirely ethically-sourced dataset.

I think it’s also worth considering why laws like these exist in the first place. They’re good and important, but I don’t think the root reason for them is to stop these kinds of images from being distributed. I think the root reason for these laws is to stop the production of CSAM using children, which absolutely is incredibly harmful and disgusting. I think that’s a good thing to stop, and if some guy in his basement looking at AI porn is thereby kept from supporting material that actually hurts real, living, breathing kids, then I think I’m conceptually okay with it. If it protects children and reduces the CSAM market, both consumption and production, isn’t that an overall good thing?


The reason it's illegal to distribute and possess CSAM is, to my knowledge, not to mitigate demand for its production, but specifically because we have every indication that its distribution (really, knowledge of its distribution) does inflict subjective harm on the victims.

The reason anything is illegal is because of the subjective experience of perceived harm. There is no harm that exists other than the "subjective", "experienced", and "perceived" kind. The question is just whether the rest of society empathizes enough with that harm to protect against it.

I do find the argument around allowing (fully) synthetically produced imagery in order to satiate demand for CSAM without harming real people an interesting one. I'm not sure where I land, but probably somewhere near "if it actually reduces demand for harm-producing imagery, then good, and if it does not reduce that demand, then bad." I.e. this position would probably increase everyone's chances of coming across synthetic CSAM, which in turn might actually increase the market for "authentic" material. Not sure how this would play out.


If we banned things just because they harm people, then journalism would be out of business. Any damning article with an associated damning photograph would be censored. In your blindness you have crossed several lines already.

It's your attitude that creates places like China, where they censor and stop freedom of speech. I wouldn't want people like you in power. I would prefer someone like me, who can see both sides of the story.

> “synthetically generated pornography of a real person created without their consent is bad.”

It's not of a real person. The argument can easily be made that the person just happens to look like the real person. I guarantee you for every nude painting of a fictitious person there is at least one person in the world who looks identical to that painting.

In fact, the science says there are roughly seven people sharing any given face. https://www.audacy.com/y98/news/you-have-at-least-six-doppel....

That renders almost all realistic nude art of fictitious people illegal under your rules.

>This argument is apparently so obvious that we in fact have curtailed freedom of speech along various boundaries of this area, and only complete weirdos or “I am very smart because I logic good” people have a hard time understanding why that is.

Nah. I'm not like that. Let me spell out what you're saying here without that "logic". You are calling me a pedantic, over-logical asshole while trying to skirt the rules of HN. But your intention is clear: you basically called me a weirdo. Haha, well fuck you too.

The opposite of that would be an idealistic idiot out to protect the sanctity of all women without rational thought. A simp who tries to display his false confidence by getting angry at anything that even questions things that slightly harm women. And I'll be straight here, that simp is you.

You realize women have the upper hand in many arenas in life that are unfair towards men? They can ruin you by declaring rape or harassment. They can fucking end you on just an allegation. In divorces they can take your child and half your wealth. And MANY women exploit this imbalance. The current status quo exists because some idiot like you didn't think shit through and just wanted to "protect" women, so they made these things into laws.

This does not mean that what happened here is not wrong. This topic deserves fair discussion and women deserve to be treated as equals. But this blind allegiance to protect women at all costs and get fucking angry when someone even questions these things is blindness. You put women on some sort of pedestal. Wake the hell up. They are human and it's stupid to suddenly make sweeping censorship laws just to protect their sacred purity. Open your fucking mind.

This topic and its intricacies and far-reaching consequences deserve a genuine discussion without you trying to skirt the rules and call me some overly pedantic weirdo. Don't be a dick.

Engage in conversation with people who talk about these things. Don't be all uppity like you're on some moral high ground and think you're 100 percent justified to just rage because you're protecting women. I definitely don't see this type of behavior when people on HN talk about other issues. Those people who like to pretend to be overly "rational" don't get called out on other topics. But when it comes to women's issues, that's when cliche characters like you appear... trying to be all protective for no reason.


Glad I'm dealing with a sensible interlocutor here.

I was not calling you a weirdo or logic pedant actually, I was referring to people who have a hard time understanding why we have already curtailed freedom of speech in similar areas, such as making it illegal to produce or distribute or view CSAM. Clear "violation" of free speech, yet we seem to be just fine having done it. If you also struggle with that concept due to "we could become China" then yeah, you're a weirdo and/or pedant, but I don't think you've made that argument. Rather my position is that you likely do see why we've made it illegal to produce/distribute/view CSAM, and the same logic powering that position can inform a position here.

And yes I agree that this deserves genuine discussion, which is why I've been laying out my arguments for it and drawing analogies to existing protections we already have which already curtail freedom of speech, as a response to your "we could become China" refrain.


CSAM laws make sense, of course. But there are fuzzy boundaries here too.

If you're 18 and you have sex with a 17-year-old, that can be classified as rape. I'm not sure of the age cutoff for CSAM, but a 17-year-old would fall within this boundary. It doesn't make sense that it's illegal at 17 and suddenly legal at 18.

I think this is a necessary downside to a necessary law. But the thing with AI-generated art is that the boundaries are too fuzzy. You will get worse consequences than just the fuzziness of the 17-versus-18-year-old boundary. And those consequences are farther-reaching.

What you're not seeing is that CSAM law, despite its necessity, means we are already traveling down that path of "China." You are already using that law and related laws to argue for more laws. Draw the trend line. Personally I think censoring AI-generated art could represent a sort of tipping point where everything changes.


What is your opinion on revenge porn, nonconsensual upskirt photos, or peeping Toms (such as hidden cameras in bathrooms)?


Is it a test? The app was offered targeting only Spanish children, riding the current wave of moral panic. The more transgressive it seems, the more attractive it will appear to children craving to be seen as "malote" (a bad boy) by their peers. So yep.

Hey boy, try this "free" app and create fake nudes of your frenemies with the chain of selfies that they drop all the time. Lots of lols guaranteed.

Soon in your own children's room with a 150% probability. This is just a warm-up (IMAO).


It's been found, talked about, and got enough notice to make news which made it to HN... from Spain. Other places are either not aware yet, or not making enough noise. The models are available everywhere - I'm sure it's happening all over the place.


> I'm sure it's happening all over the place.

Meanwhile last week in Sweden

https://www.theguardian.com/world/2023/sep/11/sweden-says-ba...

Probably related.


Some teenager with the right skills and intentions just happened to be living in Spain.

Not all of the tech world lives in Silicon Valley.


It is news in Spain because, in Spain, there is a 1000+ euro fine for transmitting intimate photos of a person (of any age) without their consent. This applies to anyone who transmits the photo, including third parties. The controversy and legal question is whether or not these photos constitute intimate photos.

This probably happens elsewhere, but there is no legal mechanism to prevent it.


You wouldn’t say that about a news story in the New York Times would you?

This is a Spanish newspaper covering news in Spain.


I’ll bet a substantial amount of money it’s happening elsewhere.

And sooner or later I’d expect the developers of at least one of these apps to end up in serious legal trouble.


[flagged]


This seems like a pretty grassroots effort to me: mothers horrified that nude pictures of their 12-year-old daughters are being created with generative AI. These young girls (not yet women) are indeed victims, no narrative construction required, and stories like this can easily spread virally among parents, especially mothers of girls, without any push needed.

I'm not even sure tech regulation is needed or sought: a few lawsuits making this unprofitable as a mobile-first service will probably raise the barrier to entry enough to protect most youngsters. Only the most motivated zoomers can use the command line.

I am suspicious of the fact that this appears to stem from a free service: I assume they are collecting data, but I thought using or storing images of minors required consent from a guardian in Europe?


The mass media (including social networks) get to pick and choose what they broadcast. No matter how hard you try or how despicable something that happened to you is, if the mass media don't want to give you coverage, no one will ever hear of you.

So yes, this is getting publicity because the government wills it.


Does Spain's government control mass media and decide which articles get published and which are quelled?


Yes, it is called "institutional advertising": this way they can pour money into your newspaper if you behave well.


You might want to reconsider claiming sexual harassment as a right-wing position. It’s clear that you don’t like progressives but there’s no evidence even remotely supporting dismissal of this as propaganda, and this kind of thing will affect everyone regardless of their politics.


That is not what I said.

Edit to reply to the comment below (I'm rate limited):

I am not dismissing it as if it was something that didn't matter. I am saying that the coverage of this issue is absolutely overblown, hours and hours nonstop of reporting. I assume you don't live here which is why you don't see it. Every news story where the victim is a female is given an absurd amount of coverage because the government is pushing this narrative where women are always victims and men are always aggressors in the interest of maintaining a divided society.

Things are so bad that Spain currently has unconstitutional laws where men and women receive different sentences for the same crimes, but the three most popular parties are progressive and no one is batting an eye or taking it to court.


Then what is your goal in dismissing it as “propaganda”? That’s not how you’d express disapproval and if we’re talking about crimes being committed against children - which we are - your right-wing political beliefs wouldn’t be relevant. Sexual harassment isn’t supposed to be a right or left wing issue.


Spain has probably the most progressive government in Europe, so they'll use the specter of harming women to pass more laws outlawing more software. It'll then go up to the EU so they can control us a little more.

I must say the story sounds like an older scam in which you "romance" someone to get them to send you naked pictures of themselves and then extort them. The AI just skips the time cost of getting the material.


> Spain has probably the most progressive government in Europe

Uh, citation needed on that one.

https://www.reuters.com/world/europe/spains-far-right-vox-op...

https://sensationalspain.com/is-spain-liberal/


Easy. The election in July has not resulted in a new government yet. (FYI to Americans lamenting FPTP: this is what you get with proportional representation). The previous government is now a caretaker. That government, specifically the PM, is/was in its second term. That government is a coalition of various socialist parties. Your bogeyman far-right party lost seats.

Sources: https://en.wikipedia.org/wiki/2023_Spanish_general_election and my own recollection of media coverage


It seems like you’re here to accuse me of taking sides in a culture war. I won’t engage in that.

What do you think about the part of the article I posted that ranks Spain near the bottom of Europe for progressive policies?

I am incredibly skeptical of the idea that Spain is one of the most progressive/liberal/left wing countries in Europe on a factual level. This is a country that is more religious than most of the rest of Western and Northern Europe.



