Hacker News

You know, if I've noticed anything in the past couple years, it's that even if you self-host your own site, it's still going to get hoovered up and used/exploited by things like AI training bots.

So? What do I care? If some stuff I posted to my website (with no requirement for attribution or remuneration, and also no guarantee that the information is true or valid) can improve the AI services that I use, great.



Wouldn't you feel just a little bad if you worked really hard on something, gave it out for free in the spirit of sharing, and someone came along and said thanks, loser, and sold it for money? Would you want to go on making it for free for them to sell?


No, not really? If others can get my stuff for free, then that means that whoever sells it for money must have done something to make it worth money. So they've earned it.


The ones who really lose are the ones who buy their stuff while yours stays free.


No. If I cared, I wouldn't have posted the information in the first place... or I would have erected a paywall.


Even if no attribution etc is your personal policy that’s not everyone else’s.

The end result is that any authors who care about copyright protection will become less accessible. It's a gold rush for AI bots to capture the goodwill of early internet creators before the well runs dry.


+1

My content is still MY content, and I'd prefer that if an entity is going to make money off of it directly (i.e., it's not a person learning how to code from something I wrote but rather a well-funded company pulling my content for their gain), that I at least have some semblance of consent to it.

That being said, I think there's no point in crying over spilled milk at this stage. The LLM technology is out of the bag, and for every company that attempts to manage content ethically (are there any?) there will be ten that disregard any kind of license or copyright notice and pull that content to train their models anyway.

I write because I want to be a better writer, and I enjoy sharing my knowledge with others. That's the motivation. If it helps at least one person, that's a win in my book, especially in the modern internet where there's so much junk scattered around.


Pretty much all of my work has been published on the internet over the last twenty years. Some of it has been commercial, some open source, and some just for myself.

I’m pretty much done with that now; I doubt I will publish anything online again.


> Even if no attribution etc is your personal policy that’s not everyone else’s.

That's up to the courts. As usual, we will all lose if the copyright maximalists win.


To me it looks like individual creators are the ones most likely to lose.

I was watching an interview with John Warnock (one of the founders of Adobe) and he was proud of the fact that the US went from having 25,000 graphic designers to 2,500,000 largely thanks to software his company created.

I do wonder if we are on the verge of reversing that shift.


The question you should be asking is if we need 2,500,000 graphic designers. Humans have a higher purpose than doing a robot's job.


Humans have a higher purpose than doing whatever job robots can't.


Whatever that purpose is, you're not going to achieve it while doing a job that robots can.


Which is the point I was making in the first place.


Last I checked, creators of a work held copyright for it, and that hasn't changed. So no, this is not a new legal question.


That's not how copyright law works.

That's not how anything works.


Ok. Thanks for your contribution to the discussion.


I think the source of the contrary sentiment goes something like this: AI stuff (especially image generation) is competition for artists. They don't much like competition that can easily undercut them on price, so they want to veto it somehow and lean on their go-to of accusing anybody who competes with them of theft.

The problem in this case is that it doesn't matter. The AI stuff is going to exist, and compete with them, whether the AI companies have to pay some pittance for training data or not.

But the chorus is made worse by two major factors.

First, many of the AI companies themselves are closed-source profiteers. "OpenAI" stepping all over themselves to be the opposite of their own name etc. If all the models got trained and then published, people would be much more inclined to say "oh, this is neat, I can use this myself and it knows my own work". But when you have companies hoovering everything up for free and then trying to keep the result proprietary, they look like scumbags and that pisses people off.

Second, then you get other opportunistic scumbags who try to turn that legitimate ire into their own profit by claiming that training for free should be prohibited so that only proprietary models can be created.

Whereas the solution you actually want is that anybody can train a model on public data but then they have to publish the model/weights. Which is probably not going to happen because in practice the law is likely to end up being what favors one of the scumbags.


I think that's an overly reductive way of looking at it. Artists are, by definition, creators of art. AI-generated "art" (it's not art at all in my eyes) is effectively a machine-based reproduction of actual art, but it doesn't demand the same skill, time, and passion for the craft from the user generating an output, and it certainly generates large profits for those who created the models.

So, imagine the scenario where you, an artist, trained for years to develop a specific technique and style, only for a massively funded company to swoop in, train a model on your art, make bank off of your skill while you get nothing, and now some rando can also create look-alikes (and also potentially profit from them - I've seen AI-generated images for sale at physical print stores and Etsy that mimic art styles of modern artists), potentially destroying your livelihood. Very little to be happy about here, to be frank.

It's less about competition and more about the ethics of how it's done. If another artist learned the same techniques and then managed to produce similar art, do you think there would be just as visceral a reaction to them publishing their work? Likely not, because it still required skill to achieve what they did. Someone with a model and a prompt is nowhere near that same skill level, yet they now get to reap the benefits of the artist's developed craft. Is this "gatekeeping what's art"? I don't think so. Is this fair in any capacity? I don't think so either. We're comparing apples to pinecones.

All that being said, I do agree that the ship has sailed - the models are there, the trend of training on art AND written content shared openly will continue, and we're yet to see what the consequences of that will be. Their presence certainly won't stop me from continuously writing, perfecting my craft, and sharing it with the world. My job is to help others with it.

My hunch is that in the near term we'll see a major devaluing of both written and image material, while a premium will be put on exceptional human skill. That is, would you pay to read a blog post written and thoroughly researched by Molly White (https://mastodon.social/@molly0xfff@hachyderm.io) or Cory Doctorow (https://pluralistic.net/), or some AI slop generated by an automated aggregator? My hunch is you'd pick the former. I know I would. As an anecdotal data point, and speaking just for myself, if I now see that someone uses AI-generated images in their blog post or site, I almost instantly close the tab. The same applies to videos on YouTube that have an AI-generated thumbnail or static art. It somehow carries a very negative connotation for me.


> It's less about competition and more about the ethics of how it's done. If another artist learned the same techniques and then managed to produce similar art, do you think there would be just as visceral a reaction to them publishing their work? Likely not, because it still required skill to achieve what they did.

Now suppose that the other artist studies to learn the techniques -- several of them do -- and then Adobe offers them each two cents and a french fry to train a model on it, which many accept because the alternative is that the model exists anyway and they don't even get the french fry. Is this more ethical somehow? Even if you declined the pittance, you still have to compete with the model. Even if you accept it, it's only a pittance, and you still have to compete with the model. It hasn't improved your situation whatsoever.

> My hunch is that in the near-term we'll see a major devaluing of both written and image material, while a premium will be put on exceptional human skill.

AI slop is in the nature of "80% as good for 20% of the price" except that it's more like 40% as good for 0.0001% of the price. What that's going to do is put any artists below the 40th percentile out of work, make it a lot harder for the ones at the 60th percentile and hardly affect the ones at the 99th percentile at all.

But the other thing it's going to do is cause there to be more "art". A lot of the sites with AI-generated images on them haven't replaced a paid artist, they've replaced a site without images on it. Which isn't necessarily a bad thing.


> AI-generated "art" (it's not art at all in my eyes) is effectively a machine-based reproduction of actual art, but doesn't take the same skill level, time, and passion for the craft for a user to be able to generate an output, and certainly generates large profits for those that created the models.

(Shrug) Artists were wrong when they said the same thing about cameras at the dawn of photography, and they're wrong now.

If you expect to coast through life while everything around you stays the same, neither art nor technology is a great career choice.


There is no great career choice when AI can do most intellectual work for a fraction of the cost.



