It might not matter as far as profitability is concerned, but ethically the second-order effects will be very problematic. I'm no puritan, but the widespread availability of porn has already affected people's sexual expectations greatly. AI-generated porn is going to remove even more guardrails around behavior previously considered deviant; people will view it and bring those expectations back to real life.
This is the same argument that people used for video games, "rock music" and violent movies.
I would argue that AI generated porn might be more ethical than traditional porn because the risk of the models being abused or trafficked is virtually zero.
The harms associated with someone creating a deep fake of you are real but they're pretty insignificant compared to the harms associated with being sex trafficked or being exposed to an STI or being unable to find traditional employment after working in the industry.
Think about the change we saw in combat death tolls when things went from flintlock muskets to machine guns, or when battleships gave way to aircraft, and how many people died unnecessarily because generals were slow to update their tactics. Deepfakes are like that: they lower the cost and improve the success rate enough to be transformative, and they cause harm that can't easily be countered. We're not going to instantly train society to be better at media literacy, and the police can't just ignore reports of sex crimes, so we're left accepting that it's easier to hurt people than it used to be.
No? And I didn't suggest deepfakes should be legal.
I was just pointing out that when you're talking about the scale of harm caused by the existing sex industry compared to the scale of harm caused by AI generated pornographic imagery, one far outweighs the other.
To perhaps make the same point as you in a different way, I have no issue with "deviancy" but I think it can accelerate the cycle of chasing a sugar high.
People spin up ablated models for pennies. You don't need advanced reasoning for this crap. OpenAI has $8 billion plus in burn. I guess it's all effectively paying for brand awareness?
I think people hugely overestimate how profitable porn (at least "actual" porn) is. Aylo (the owner of Pornhub) makes peanuts compared to Youtube or Disney.
It's standard practice for VC-backed companies to enshittify after building a moat, relying on user lock-in. What's remarkable is how quickly they've had to shift gears. And with this rapid a pivot, it's questionable how large that moat really is.
The porn / sex-chat one is really disappointing. It seems they've given up even pretending that they are trying to do something beneficial for society. This is just a pure society-be-damned money grab.
I'm pretty sure that if they didn't deliberately choose to train on sex chat/stories, etc., then the LLM wouldn't be any good at it. The model isn't getting this capability by training on Wikipedia or Reddit.
So it's not a matter of them being unable to prevent the model from doing it and therefore giving up and encouraging it instead (which would make no sense anyway), but rather of them having chosen to train the model to do this. OpenAI is targeting porn as one of its profit centers.
>The model isn't getting this capability by training on Wikipedia or Reddit
I don't know about the former, but the latter absolutely has sexually explicit material that could make the model more likely to generate erotic stories, flirty chats, etc.
OK, maybe that was a bad example, but it would be easy to create a classifier to identify material like that and omit it from the training data if they wanted to. And now that they're going to be selling this, I'd assume they're explicitly seeking out and/or paying for the creation of training material of this type.