Nobody will use this; it takes too much time, and it ruins the art by adding a weird texture.
I understand the disappointment of real artists that the tech bros are stealing their lunch, but sadly that's the way all these things go. Pandora's box has been opened; things aren't going back to the way they were.
Even if an artist manages to completely guard themselves against exploitation by AI, within a few years the market expectation for art commissions and work will be 100x what it was before AI art. It's like the scene in There Will Be Blood: the milkshake of traditional artists has already been drunk.
This, trying to use this to stop AI art, is like trying to stop climate change, except 10000 times harder.
1. Like climate change, it's a commons issue. AI learns from artists at large, so a single artist protecting themselves does nothing. Nobody copies specific artists except the very top tier who have distinct and beautiful styles.
2. AIs will still have all the art produced before 2023 to train on, which is a gigantic amount with huge room for optimization in training.
3. This can be trivially circumvented. Given the requirement that the glazed image look identical to the human eye, it must be informationally possible to reproduce the image without the glaze.
4. It should be trivial to train a quick GAN to revert this glaze, given that you can trivially create before-after datasets using this very tool.
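To make point 4 concrete, here is a deliberately simplified sketch in numpy. It models the cloak as a fixed additive perturbation (the real Glaze perturbation is image-dependent and far more sophisticated, so this is only an illustration of the before/after-dataset idea, not an actual attack): run clean images through the "tool" to get pairs, then estimate and subtract the perturbation. A real reversal would need a learned model (e.g. a GAN or denoising network) instead of a simple average.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the cloaking tool: the "glaze" is modeled as one
# fixed additive perturbation pattern. (Assumption for illustration
# only; the real perturbation varies per image.)
H, W = 32, 32
glaze_pattern = 0.05 * rng.standard_normal((H, W))

def glaze(img: np.ndarray) -> np.ndarray:
    """Apply the toy perturbation, clipped to the valid pixel range."""
    return np.clip(img + glaze_pattern, 0.0, 1.0)

# Step 1: build a before/after dataset by running images through the tool.
originals = [rng.random((H, W)) for _ in range(100)]
pairs = [(glaze(x), x) for x in originals]

# Step 2: fit the simplest possible "de-glazer": estimate the average
# residual between glazed and clean images, then subtract it.
estimated_pattern = np.mean([g - x for g, x in pairs], axis=0)

def deglaze(img: np.ndarray) -> np.ndarray:
    return np.clip(img - estimated_pattern, 0.0, 1.0)

# Sanity check on a held-out image: the recovered image should be much
# closer to the original than the glazed one was.
test_img = rng.random((H, W))
glazed = glaze(test_img)
recovered = deglaze(glazed)
err_before = np.abs(glazed - test_img).mean()
err_after = np.abs(recovered - test_img).mean()
```

The point is only that the tool itself supplies unlimited paired training data, which is exactly what a supervised reversal model needs.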
I don't really like this viewpoint, respectfully. It reads like "learn to code", except now that automation is coming for programming as well. Perhaps the new slogan should be "learn to plumb". Your point about climate change is apt: just because it is hard doesn't mean there shouldn't be some sort of protection for artists' intellectual property. Overtrained AI models produce something akin to plagiarism, and there should be legal protections against that.
OK. Let's assume legal protections are put in place tomorrow and today's living artists magically (because it skips all the details of "How do you make that work?") acquire all the rights and enforcement abilities they could wish for. What do we expect changes?
I think the uncomfortable answer is that for a lot of commercial art uses, nothing changes. In many cases people just want something to fulfill a need and aren't really all that picky about it. Maybe the style changes to be one of the Old Masters instead of something current. Getting it fast and cheap means they don't have to think about it much. A great many commercial art jobs will vanish just as they will outside this hypothetical.
Perhaps we should pause and identify what outcome we want before prescribing policy. Are rights our priority, or are we trying to secure the incomes of artists?
The real desired outcome seems to be "artists can make a living from doing art (without needing to be famous)". Unfortunately that goal has been dubious at the best of times, even before AI. The same questions are happening in fiction writing.
It's been dubious in fiction and music for a long time as well. At a remove, this feels less like a major shift and more like a collective agony as a dream that felt within reach recedes into the distance.
I don’t know, I truly don’t. I don’t have any answers, because you might be right, especially for commercial art. Just as with climate change, I don’t see any solution; it all gets worse and worse for everyone involved, except a select few. The outcome I would hope for is that the livelihoods of artists are protected, or at minimum that if they do not want a program to mimic a clearly unique style, it doesn’t. That might just be pie-in-the-sky dreaming at this point, though. All desk work is going to disappear; everyone will be displaced.
What is your goal here? What outcome do you want to achieve? It sounds like your primary goal is financial, with rights being a means to an end. Is that accurate?
That might be accurate; however, it is not just financial, right? I would say that is the bulk of it, though. On the self-worth side, as someone posted below, humans tie self-worth to their profession as well, right? That is probably something wrong with society as a whole, but that is how people work as of this day. In the case of an artist (I am not an artist, just to be clear), they've worked decades to hone this craft, correct? To have it scraped and used at a scale that only a machine can manage, something about that feels wrong. I get that it is an emotional appeal, which is probably why I will never write policy.
Once upon a time, when thread and cloth were all made by hand, the equivalent of a t-shirt cost the equivalent of $5000 in labor alone. Today we consider that absurd. The spinners, fullers, and weavers of the time considered it their livelihood, the way they fed their children, the way they achieved economic and social status, and thus just and right. Today we're far enough removed from those days to consider their perspective transparently self-serving, for all that their distress must have been immensely real and sincere. We use mechanical looms now. We broadly agree that people have better things to do with their lives than spin rough thread, better ways to contribute to society.
So I get why it feels wrong to many, but I also have no trouble placing it and those objections in a context of a recurring historical pattern.
You are ultimately right: progress cannot be stopped and won't be stopped. My only wish is to help mitigate the pain. You brought up t-shirt manufacturing; that is true, and while the Luddites ultimately lost, I thought maybe a lesson society could have gained from that was some level of compassion toward the people it is happening to. Because it's not just artists this time, and it's not just the factory; this time it is every profession that can be done at a computer. I am a programmer and I feel I see the writing on the wall for us too. Soon we will be the Luddites who are scorned. But yes, society and economics don't owe me or any artist a damn thing. This is an emotional post, but reading these stories, the only answer I have come up with is to just allow it to up-end all of our worth, security, and jobs. If you are a 50-year-old corporate artist, hopefully something helps you on your way to a new profession. It's the same reason I feel society doesn't do nearly enough for coal miners. Coal is a nasty product, yes, and a coal miner should clearly not "just learn to code". I don't know what should be done, but clearly something compassionate needed to be done.
Why don't you say what you want? You don't want compassion. Compassion is an internal emotional experience. I am having compassion right now for the hypothetical hordes of unemployed former commercial artists as I consign them to history's scrapheap. Compassion is a story we tell to tug at the heartstrings of ourselves and others.
Compassion is not a policy. Compassion is not a plan. Compassion isn't what coal miners or Luddites wanted. What they wanted was to freeze in amber a way of life that served them well and could be passed on. That's both a reasonable thing to want and a very unreasonable thing to expect.
Or maybe you don't know what you want as an outcome. That's OK. It might be worth thinking about that.
For my own part, I'm not too worried about programmers quite yet. After all, programming is the easy part. All you have to do is get the business to decide precisely what they want and communicate it clearly. For the future, well, adaptability and rapid learning have been the hallmarks of every good engineer I've ever worked with. We're a flexible lot.
I don't know what I want as an outcome. I think that is part of the discussion. You have come at me with a lot of really good questions. I think the discussion is the most important before we hit that hypothetical soon.
For the sake of clarity, I am an engineer as well. I hope you are right. As I see it now, it seems like every single job is on the chopping block before I hit retirement age. We already see experts arguing about this.
I think I am doing a poor job conveying what I want, or what I mean by compassion. As you said, it is true that the Luddites or coal miners in those examples wanted to pause time, and that is clearly a very bad idea when it comes to something like coal. But when I say compassion, it is not that I want to tug at the heartstrings of ourselves and others. I mean it in a selfish way: when we are able to answer the question of what people do when their livelihoods are removed from them, we get a better-functioning society. Staying with the coal miners example, a few studies have shown much higher drug usage as despair has grown in those communities. I see that and have extrapolated it out to the large hypothetical hordes of unemployed white-collar commercial artists, and perhaps all desk workers (myself included).
It legitimately has kept me up at night trying to think what sort of policy, what responsibility do we all have now? Economically nobody owes anyone anything, but is that an ethical answer?
I don't know what that looks like. Which goes back to my original answer, I don't know what I want as an outcome. I just think that perhaps, we will see less animosity and less anxiety when we have better answers to our modern Luddites.
Unlike in the Luddites' time, information is so available today that it would be easy for an artist to predict the demise of their profession (presumably).
Therefore, why is it not the artist's responsibility now to look for alternatives, rather than waiting until they truly become obsolete and personal resources run out? If you went back in time and told the Luddites that in five years their services would no longer be required by society, would they not retrain themselves instantly, instead of waiting out the five years and hoping that society has some sort of welfare program ready for them?
And yet watermarks are a thing. I'm skeptical that no one will use this, especially if you can put it into a pipeline of multiple image transformations that "protect art". It could be bundled into tools or on platforms themselves as part of their value add. YouTube already transcodes your videos, why not DeviantArt encode your art except to verified human accounts or something?
Their efficacy is the more interesting question to me.
The full-resolution examples I’ve seen have shown this as much more destructive to the underlying art than usual watermarks, to the point where I’m not even convinced people would use it on samples when the underlying art wasn’t displayed online but sold through other channels.
> YouTube already transcodes your videos, why not DeviantArt encode your art except to verified human accounts or something?
So the idea is to be “protected” against automated scrapers building datasets for training, say, the core Stable Diffusion models, but not against the huge number of individuals training (and remixing) checkpoints/loras/TIs/Aesthetic Gradients and exchanging them that are driving the ecosystem?
Maybe you have a different viewpoint of the ecosystem than me? Everyone I know who's used this is not an AI expert but still tech savvy. None of them have trained their own models. The path of least resistance for them would be to work around the "protected" art if they were doing something more custom.
For someone with more effort and time to dedicate to cloning a specific artist, maybe it's not as useful. But I know less about that type of use case and the tools in that part of the ecosystem.
> Maybe you have a different viewpoint of the ecosystem than me? Everyone I know that’s used this is not an AI expert but still tech savvy. None of them have trained their own models.
Look at something like Civitai [0] and the number of users submitting things there: pretty much everything there (there’s a few things like poses and tools that aren’t) is one kind or another of trained model (though some are merges of existing models and extractions of one kind of model from another.)
> *Nobody will use this, it takes too much time and it ruins the art by adding a weird texture.*
New Cloudflare offering:
ECA: Edge Cloaked Artwork.
Enabled at the edge with our blazing fast CDN, ECA is a brand new Cloudflare service that offers a groundbreaking solution for digital artists and creators! With our CDN network, we can now cloak artwork to protect it from unauthorized use, while still maintaining its visual appeal to the human eye. Our technology modifies the data of your artwork in a way that's invisible to the naked eye, making it nearly impossible for AI companies to use your artwork in their training samples. This not only ensures that your artwork is protected, but also helps prevent the proliferation of unethical AI practices. Trust us to safeguard your artwork and its value like never before. Try Cloudflare's artwork cloaking service today and experience the peace of mind that comes with true digital security.
The description (as well as the project as a whole) is incredibly disingenuous and capitalizes heavily on artists' fear of having their work trained on by AI.
Even one pass of SD's img2img with low denoise is enough to bypass this data poisoning attack. It's a useless attack that makes training only slightly more inconvenient.
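The intuition behind the img2img bypass can be shown with a toy numpy model (an assumption-laden sketch, not the actual diffusion pipeline): the cloak lives mostly in high-frequency detail, and a low-denoise re-synthesis acts roughly like a low-pass filter, stripping the perturbation while barely changing the underlying content. A box blur stands in for that low-pass effect here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: smooth "image" content plus a small high-frequency
# perturbation standing in for the cloak. (The real cloak and the real
# img2img pass are far more complex; this only illustrates the
# frequency-domain intuition.)
x = np.linspace(0, 1, 256)
image = np.sin(2 * np.pi * 2 * x)               # low-frequency content
perturbation = 0.05 * rng.standard_normal(256)  # high-frequency "glaze"
glazed = image + perturbation

def box_blur(signal: np.ndarray, k: int = 5) -> np.ndarray:
    """Simple moving-average low-pass filter."""
    kernel = np.ones(k) / k
    return np.convolve(signal, kernel, mode="same")

cleaned = box_blur(glazed)

# The blur removes most of the perturbation energy while leaving the
# underlying low-frequency content largely intact.
residual_perturbation = np.abs(cleaned - box_blur(image)).mean()
content_distortion = np.abs(box_blur(image) - image).mean()
```

In this toy the surviving perturbation energy drops well below its original level while the content is nearly unchanged, which is the same asymmetry a low-denoise img2img pass exploits.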
The project also gained infamy when it stole code from a GPL-licensed open source project without giving credit.
Additionally, SD now offers opt-outs for artists, and MD likely does not train on those artists at all; there are far more effective ways to protect art than this algorithm.
This project and a lot of projects in the future will be in a similar vein. Something that appeals to the fear of being cheated or made irrelevant by AI by returning primacy to the human.
It all seems well-intentioned but also naïve. It's like when robots eventually take over all surgery and massively reduce the risk of accidental death: surgeons will get together and insist on banning the robots, claiming humans still do it better, even if they don't.
This is entirely false. My S.O. is relatively well known in comic art communities, and they (and many others) are using this right now. I don't think HN quite grasps how much artists are trying to ensure ownership and governance over their art.