
I realized they jumped the shark when they announced the pivots to ads and porn. Markets haven’t caught on yet.




The porn pivot makes perfect sense. Porn is already quite fake and unconvincing and none of that matters.

It might not matter as far as profitability is concerned, but ethically the second-order effects will be very problematic. I am no puritan, but the widespread availability of porn has already affected people's sexual expectations greatly. AI-generated porn is going to remove even more guardrails around behavior previously considered deviant, and people will view it and bring those expectations back to real life.

This is the same argument that people used for video games, "rock music" and violent movies.

I would argue that AI generated porn might be more ethical than traditional porn because the risk of the models being abused or trafficked is virtually zero.


> because the risk of the models being abused or trafficked is virtually zero.

That's not really true. Look at one of the more common uses for AI porn: taking a photo of someone and making them nude.

Deepfake porn exists and it does harm.


The harms associated with someone creating a deep fake of you are real but they're pretty insignificant compared to the harms associated with being sex trafficked or being exposed to an STI or being unable to find traditional employment after working in the industry.

Couldn't you just photoshop that before AI came out?

What if you get a model that is 99% similar to your “target” - what do we do with that?


Think about the change we saw in combat death tolls when things went from flintlock muskets to machine guns, or when battleships gave way to aircraft, and how many people died unnecessarily due to the generals who were slow to update their tactics. Deepfakes are like that because they lower the cost and improve the success rates enough to be transformative and they cause harm which can’t easily be countered. We’re not going to instantly train society to be better at media literacy and the police can’t just ignore reports of sex crimes, so we’re just having to accept that it’s easier to hurt people than it used to be.

Sure, someone skilled could spend an hour or so photoshopping someone nude. But any teenager can do that to a classmate in 30 seconds with AI.

So just because the poor can now do what the rich could do before, what does that mean?

Before, only the rich could afford to pay a pro to do the photoshopping. Now any poor person can get it done.

So why was it fine when only the rich could do it, but a problem when everyone can?


Uhh it wasn’t fine when the rich did it?

Would you support installing public spy cams in everyone's bedrooms so as to end the demand for human trafficking in porn?

No? And I didn't suggest deepfakes should be legal.

I was just pointing out that when you're talking about the scale of harm caused by the existing sex industry compared to the scale of harm caused by AI generated pornographic imagery, one far outweighs the other.


To perhaps make the same point as you in a different way, I have no issue with "deviancy" but I think it can accelerate the cycle of chasing a sugar high.

People spin up ablated models for pennies. You don’t need advanced reasoning for this crap. OpenAI has 8 billion plus in burn. I guess it’s all effectively paying for brand awareness?

Unfortunately, the porn pivot might be their path to "profitability".

Global porn industry revenue is $100B. They won’t take 10% of that. Real humans are already selling themselves pretty cheap or free en masse.

And there's no escape. The Internet was built for gambling and this.

They know where the money is.

I think people hugely overestimate how profitable porn (at least "actual" porn) is. Aylo (the owner of Pornhub) makes peanuts compared to Youtube or Disney.

It’s standard practice for VC companies to enshittify after building a moat, relying on user lock-in. What’s remarkable is how quickly they’ve had to shift gears. And with this rapid pivot it’s questionable how large that moat really is.

The porn / sex-chat one is really disappointing. It seems they've given up even pretending that they are trying to do something beneficial for society. This is just a pure society-be-damned money grab.

They've raised far too much money for those kinda ethics, unfortunately.

My hunch is that they don't have a way to stop anything, so they are creating verticals to at least contain porn, medical, higher-ed users.

Ah... The classic "If we don't do it, someone else will"

Tell that to the thousands of 18 year olds who'll be captured by this predatory service and get AI psychosis


I'm pretty sure that if they hadn't deliberately chosen to train on sex chat/stories, etc., then the LLM wouldn't be any good at it. The model isn't getting this capability by training on Wikipedia or Reddit.

So, it's not a matter of them being unable to prevent the model from doing it and therefore giving up and encouraging it instead (which makes no sense anyway), but rather of them having chosen to train the model to do this. OpenAI is targeting porn as one of their profit centers.


>The model isn't getting this capability by training on WikiPedia or Reddit

I don't know about the former, but the latter absolutely has sexually explicit material that could make the model more likely to generate erotic stories, flirty chats, etc.


OK, maybe a bad example, but it would be easy to create a classifier to identify stuff like that and omit it from the training data if they wanted to, and now that they are going to be selling this I'd assume they are explicitly seeking out and/or paying for the creation of training material of this type.


