on AI art, copyright and theft
AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we'll call it by the closest equivalent: theft.
Also, someone did an experiment and typed "movie screenshot" into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.
Defining "taken something behind a paywall and thus denied them revenue" as theft is the exact same argument movie studios make when you pirate a movie. Theft implies that the original is gone. If I steal your car, you don't have a car. If I pirate a movie, we both have that movie. As someone who supports piracy, I would be careful to conflate piracy with theft. I think that's the entire point the post is making.
Fuck AI slop. There's enough other arguments against it. It destroys the environment and artists' livelihoods. We can point that out without supporting corporate copyright talking points.
AI images try to replicate the style of popular artists by using their work, often including work that was behind a paywall and taken without payment, thus denying the artists revenue. AI has taken something from the artist, and cost the artist money. Until such a time as we come up with a new word for this new crime, we'll call it by the closest equivalent: theft.
I'd argue it's much closer to piracy or freebooting. Generally, its use doesn't hurt artists, seeing as a random user isn't going to spend hundreds or thousands to hire a talented artist to create shitposts for them. Doesn't necessarily make it okay, but it also doesn't directly hurt anyone. In cases of significant commercial use, or copyright infringement, I'd argue it's closer to freebooting: copying another's work and using it for revenue without technically directly damaging the original. Both of these are crimes, but both are more directly comparable and less severe than actual theft, seeing as the artist loses nothing.
Also, someone did an experiment and typed "movie screenshot" into an AI and it came back with a nearly identical image from Endgame. Not transformative enough to be anything but copyright infringement.
Copyrighted material is fed into an AI as part of how it works. This doesn't mean that anything that comes out of it is or is not copyrighted. Copyrighted material is also used in Photoshop, for example, but as long as you don't use Photoshop to infringe on someone else's copyright, there isn't anything intrinsically wrong with Photoshop's output.
Now, if your complaint is that much of the training data is pirated or infringes on the licensing it's released under, that's another matter. Endgame isn't a great example, given that it can likely be bought with standard copyright limitations, and ignoring that, it's entirely possible Disney has been paid for their data. We do know huge amounts of smaller artists have had their work pirated to train AI, though, and because of the broken nature of our copyright system, they have no recourse - not through the fault of AI, but corrupt, protectionist governments.
All that said, there's still plenty of reasons to hate AI (and especially AI companies), but I don't think the derivative nature of the work is the primary issue. Not when they're burning down the planet, flooding our media with propaganda, and bribing governments, just to create derivative, acceptable-at-best """art""". Saying AI is the problem is an oversimplification - we can't just ban AI to solve this. Instead, we need to address the problematic nature of our copyright laws, legal system, and governments.
No, it is theft. They use an artist's work to make an image they would otherwise pay the artist to make (a worse version, but still). And given that I've seen an image with a deformed Patreon logo in the corner, they didn't pay what they should have for the images. They stole a commission.
And it is copyright violation. There have been successful lawsuits over much less than a direct image of RDJ in the Iron Man suit with the Infinity Stones on his hand. And if they won't pay an artist's rates, there's no way they'd pay whatever Disney would charge them.
Yes, there's a lot of problems with AI. And yes, AI is a part of larger issues. That doesn't mean theft isn't also an issue with AI.
AI is a nazi-built, kitten blood-powered puppy kicking machine built from stolen ambulance parts. Even if stealing those ambulance parts is a lesser sin than killing those kittens, it's still a problem that needs to be fixed. Of course, AI will never be good, so we need to get rid of the whole damn thing.
...thus denying the artists revenue.
This assumes they would otherwise pay for it, and that they measurably harmed the artist's revenue. Those aren't a given.
Until such a time as we come up with a new word...
Use of copyrighted material without permission and possible deprivation of revenue. It doesn't need to be a single word.
I think libel is a good word to start using
Yeah, I don't agree. Unfortunately I'm not articulate enough to explain why I feel this way. I feel like they are glossing over things. How would you describe corporations willfully taking art/data/content from others without any permission, attribution, or payment and creating a tool with said information for the end goal of making profits by leveraging the work of others into a derivative work that competes with the original? (Holy run-on sentence) If there is a better word or term than theft for what generative AI does then they should use it instead.
It's basically for-profit piracy. Which is still kind of a shitty term because actual pirates weren't copying any of the goods they were taking.
The most neutral term might be copyright infringement, though that carries all the baggage of the 'should copyright even exist' discussion.
Alternatively, you could shout 'they took our jobs' to complain that they are letting algorithms and engineers do the work that artists want to do. IDK what to call this, but 'theft' or 'robbery' doesn't sound right.
I think the biggest problem is that the idea of copyright is good, but the implementation - in most places, anyways - is complete dogshit.
Like, I'm fairly certain the original implementation of copyright in the US only lasted 14 years (with an optional renewal). Like, that's more than enough time to profit off whatever you made but short enough that it'll be usable by others within their lifetimes. This whole "life of the author + 100 years" shit needs to die.
I guess what feels off to me is that the generative AI itself does nothing of the sort; it's the corporations creating AI models as a product that do. There are already attempts to make generative AI models that are trained exclusively on data that was licensed for it. I imagine some people would still like to push regulation against companies producing those models, though I am not one of them. I'd like to decouple the arguments of "this use of technology is bad, because it (devalues human works / takes away jobs / ...)" from "the corporations train their generative AI models in an unethical manner".
i get what you mean, but at the same time, i feel like there's not much you can do with AI that you can't do without it (at least, in terms of art), it just takes more time and you'd most likely need to pay someone to do it. so in this case AI would take work opportunities away from people, that's bad, but that's not copyright infringement nor theft.
and i'm worried that by propagating the idea that training an AI model is theft, it could lead to even more regulation on copyright that would just end up hurting smaller artists in the end. like, most people agree that making AI art in the style of Studio Ghibli movies is scummy, but would an indie artist making art in that style be wrong too? you may think not, but if it becomes a crime to emulate an art style using AI, it takes very little extrapolation to make it a crime to emulate an art style without AI. and do i need to say why being able to copyright an art style would be awful?
so in this case AI would take work opportunities away from people, that’s bad, but that’s not copyright infringement nor theft.
I think it's quite literally copyright infringement, assuming the models are fed with work from actual artists who typically don't agree to it. Whether copyright should work this way is another matter.
“It’s fine if a person does it” is a fantastic argument. Saying that it’s ok to allow robots to continue to replace every part of human life will only lead to suffering for literally everything in existence. Computers can destroy and create in milliseconds what might take humans a lifetime to achieve. If this isn’t an incredibly good reason to regulate the shit out of ai then what is?!?!?
Like yes, with current generative AI it's incredibly difficult to get something non-derivative, e.g. by using it as a tool like Photoshop. But that most definitely will not be the case in a few years. This is by far the steepest, slipperiest, most ridiculous slope we could be on and it's not even close.
This is the biggest problem with technology: regulation is reactive, not preemptive. Not taking action immediately has already gotten earth into a ridiculously bad situation. If we continue to allow it, it's only going to get worse and harder to undo.
so it's Harrison Bergeron then? ai must be limited to be equal with humans?
This is interesting. I agree that stealing isn't the right category. Copyright infringement may be, but there needs to be a more specific question we are exploring.
Is it acceptable to make programmatic transformations of copyrighted source material without the copyright holder's permission for your own work?
Is it acceptable to build a product which contains the copyrighted works of others without their permission? Is it different if the works contained in the product are programmatically transformed prior to distribution?
Should the copyright holders be compensated for this? Is their permission necessary?
The same questions apply to the use of someone's voice or likeness in products or works.
Is it acceptable to build a product which contains the copyrighted works of others without their permission? Is it different if the works contained in the product are programmatically transformed prior to distribution?
Somebody correct me if I'm wrong, but my understanding of how image generation models and their training work is that the end product, in fact, does not contain any copyrighted material or any transformation of that copyrighted material. The training process refines a set of numbers in the model, but those numbers can't really be considered a transformation of the input.
To preface what I'm about to say, LLMs and image models are absolutely not intelligent, and it's fucking stupid that they're called AI at all. However, if you look at somebody's art and learn from it, you don't contain a copyrighted piece of their work in your head or a transformation of that copyrighted work. You've just refined your internal computer's knowledge and understanding of the work. I believe the way image models are trained could be compared to that.
the generated product absolutely contains elements of the things it copied from. imagine the difference between someone making a piece of art that is heavily inspired by someone else's work VS directly tracing the original and passing it off as entirely yours
This is generally correct, though diffusion models and GPTs work in totally different ways. Assuming an entity had lawful access to the image in the first place, nothing that persists in a trained diffusion model can be realistically considered to be a copy of any particular training image by anyone who knows wtf they're talking about.
The magic word here is transformative. If your use of source material is minimal and distinct, that's fair use.
If a 4 GB model contains the billion works it was trained on - it contains four bytes of each.
What the model does can be wildly different from any particular input.
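To make that "fixed bag of numbers" point concrete, here's a deliberately crude toy sketch in Python. It is nothing like a real diffusion model (the "update rule" is made up and the model is a single array), so treat it only as an illustration of the mechanism being described: the parameter block is allocated once, each training image only nudges those existing numbers, and the image itself is never stored. The 4 GB / billion-works arithmetic above is the same observation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a fixed block of parameters, a stand-in for real model weights.
# Its size is chosen up front and never grows, no matter how many training
# images are streamed through it.
weights = rng.normal(size=(64, 64))

def train_step(weights, image, lr=0.01):
    # Made-up update rule: nudge the weights toward the image's average value.
    # Only this small adjustment to the existing numbers persists.
    error = image.mean() - weights.mean()
    return weights + lr * error

# Stream 100,000 fake 64x64 "training images", one at a time.
for _ in range(100_000):
    image = rng.normal(size=(64, 64))
    weights = train_step(weights, image)
    # `image` is discarded here; no copy of it is kept anywhere.

print(f"{weights.size * weights.itemsize} bytes of parameters, "
      f"regardless of how many images were used")
```

Whether that kind of lossy, fixed-size residue counts as a "transformation" of the inputs is exactly the legal question the thread is circling.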
Yep. I really hope that the conversation around LLMs moves away from words like "theft". Show me evidence of an artwork that an LLM has concealed from the public. I've looked, and all that I've found is that they make more media more accessible. That's not theft. That's piracy. That's culture-jamming.
So if you want to call it appropriation, fine. It's classic EEE methods, applied to Gonzo, Dada & Punk ideas. Embrace - taking in everything and culture-jamming with meaningless text & images (painfully Dada). Extend - by turning this into both a toy and a corpo "tool", they extended Dada into programming, articles, news media, you name it. Extinguish - when everything is a punk remix, or everything is meaningless Dada, nothing is. Therefore the true punks will be the classicist, reconstructionists, the Bible-beaters, and their ilk. And then Punk is dead.
"AI slop is punk" has the ring of when brew dog claimed they were doing a "punk" equity offering. 😂
Vibe coding is dadaist. If you say so, but I figure intent matters...
The world needs more people like you, and I would love hearing more about all this from you 💝!
Holding private individuals to the same standards as megacorporations doesn't make sense, nor does the reverse.
Megacorporations must be held to stricter standards because of the wealth and power they wield vs ordinary people.
It is not theft when I download a copy of a game that cannot be purchased anymore.
It is theft when OpenAI dumps my novel notes into "the pile" to train their AIs for monetary gain and without my permission.
This is a really good point.
I've always believed - against all legal definitions - that the theft in piracy was (e.g.) copying music and then selling it. Copying for self-use is not theft.
I freely admit the distinction is morally shaky, and you could argue that in both cases the actual theft is depriving the owner of potential monetary gain, but reselling something pirated - in my mind - crosses an ethical line.
AI companies are unethical not because they pirate media, but because they then resell derivatives. If they trained their LLMs and then gave the models away for free, that would be another matter.
Another example: for decades, I resold my programming skills to companies. However, I paid good money for those skills, via my CIS degree, with the explicit understanding of the instructors and the outrageously priced books, that I would be reselling it. LLM scrapers aren't paying anything for the training data. They avoid what little opportunity they have for moral justification.
The theft was the scraping and regurgitating of art that then puts the original artists out of work.
Capitalists found a way to exploit artists even harder, so that now they don't even need to pay them.
I don't think people would care quite as much if gen AI merely existed (I'm sure many would still dislike it, but just for being soulless). But it doesn't just do that, it also destroys artists' livelihoods and the prevalence of their art, using their own work to do it. I don't really care if it's technically theft or not, it's doing bad for society regardless.
"I believe in you. You can come up with a better argument than just theft."
Nah, fuck that shit. If OOP feels so strongly that it's not theft and they wanna change how the population at large is referring to something, then it's on them to provide an alternative and convince others. This weird ass attempt to shame people into doing things their way, especially when they haven't really defined what they consider their way, is absolute horse shit.
This whole post is full of this. The OOP tries to completely remove intent and method from the analysis of whether something is art theft. Those things absolutely factor into it and they're only discounting them in order to push their weird narrative.
AI scraping tons of work belonging to artists and then regurgitating that as original work is fucking gross, no matter what you call it. Theft seems fine to me, but I am open to calling it something else. Unfortunately OOP won't be the one to convince me, since they neither provide reasoning for why calling it theft is bad nor suggest what we should call it instead and why.
The alternative is to call out copywrong in every version and every facet of existence. This isn't theft, it's duplication. The argument is simple: LLMs are the new printing press.
Fan-created derivative works are usually only tolerated (because even capitalists realise that banning fanart would be economic suicide); strictly speaking, they are already illegal in most places. Yes, even if you don't make money off them. The US fair use thing is an exception and can still be challenged by owners of the IP.
yea, but that's already a bad thing (IMO, at least)
copyright law is already awful and over-reaching, we shouldn't make it worse!
AI art generation isn't theft, it's the training that's problematic. These companies are using artists' work for free and without credit to generate massive amounts of profits, while simultaneously putting these artists out of work.
While I'm on this soapbox, making AI art doesn't make you an artist any more than commissioning an art piece does. There is literally no difference between telling the AI what you want it to draw, and telling a human what you want them to draw. You are not an artist, you are a client.
While I’m on this soapbox, making AI art doesn’t make you an artist any more than commissioning an art piece does. There is literally no difference between telling the AI what you want it to draw, and telling a human what you want them to draw. You are not an artist, you are a client.
Except humans are smart and can fill in the blanks of what you mean when you tell them to draw a picture. You don't need any skill because the artist is skilled.
The slop generators are dumb af, massaging them to produce good results is definitely a skill. They aren't good enough to fill in the blanks like a human artist, and it's up to the person writing the prompts to convince it to draw something that doesn't look like shit.
I agree with the comment here that AI image generation is more like piracy in that you are appropriating other artists' works without their permission.
So I mean personally I agree that AI art is soulless and possibly copyright infringement but not theft, whereas for people that consider piracy theft, calling AI art "theft" is not an inconsistent or hypocritical argument, in my opinion. Machine or human doesn't make it stealing or not.
I mean even Zuck's Meta is claiming in court that they aren't stealing books when torrenting massive amounts of them for AI training, lol, so he's consistent there. But Nintendo on the other hand are probably seething at AI Nintendo art being "stolen" from them.
Lemmy's respect for copyright only in relation to the magic content robot is endlessly amusing.
I don't give a shit what public data gets shredded into a gigabyte of linear algebra. That process is transformative. If the result is any good at reproducing a specific input, you did it wrong.
Disclaimer: I've not workshopped this much, so idk if these are the right words to convey how I feel
I feel like using AI to generate images is akin to taking someone's art and applying a light gaussian blur to it or putting an uncredited artist's work in a big game.
I know it's done in a much more intricate way, and it's genuinely impressive how AI companies got it to work so well, but if I try to sell AI generated images, especially if they're meant to be made similar to an artist's work, then that's all I'm doing.
I don't necessarily see it as stealing from artists (though it is threatening the livelihood of a lot of artists), but more as exploiting artists but with a new buzzword.
If I arrange 4 pieces of art in a jpeg and then apply a whacky filter, am I actually creating anything, or am I just exploiting artists and doing something similar to copying and pasting different bits of an essay and then changing every instance of a word to a different synonym?
I believe AI does something similar to that, albeit in a more sophisticated way that looks like creativity.
How do you get this style of thread? I normally can only get the newer, more Twitter-esque UI.
this is just the theme of their blog!
you can see a blog with its custom theme by going to [blog name].tumblr.com instead of tumblr.com/[blog name], and by doing this you also get less restrictions (for example, there's no cookie banner or sign-in wall if you scroll too much)
ofc not all blogs have a retro theme like this, in fact many blogs don't have custom themes at all
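If it helps, the trick above is just a path-to-subdomain swap. Here's a quick Python sketch of the rewrite; the function name and the example blog `staff` are arbitrary, it just assumes the standard tumblr.com/[blog name] path layout described above:

```python
from urllib.parse import urlparse

def blog_theme_url(url: str) -> str:
    """Turn a tumblr.com/[blog name] URL into the [blog name].tumblr.com
    form, which serves the blog with its own custom theme."""
    blog = urlparse(url).path.strip("/").split("/")[0]  # first path segment is the blog name
    return f"https://{blog}.tumblr.com/"

print(blog_theme_url("https://www.tumblr.com/staff"))  # -> https://staff.tumblr.com/
```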
Copying is not theft - https://youtu.be/IeTybKL1pM4
I believe the difference to be that, no matter how much they attempt to copy or derive the art they create, a person will always include their own style (for want of a better word). The subtleties from how they learned to draw/paint/whatever, the unconscious biases (good or bad) that lead to their creative decisions. A person imparts themselves, a collection of a lifetime of experiences, unto their art, whether they try to or not. AI does not have these experiences, it can only attempt to recreate from what it has seen. AI cannot add itself to the art it creates, because it doesn't have a self to add. AI art is intrinsically bad because it cannot exist in a vacuum and cannot add anything of artistic value when it exists in a world of artists.
AI people intentionally misunderstanding how the tech works but defending it anyway lets me know that nobody smart actually supports this garbage.
If it needs the training data to make the output then it is copying the training data
And if the training data was stolen that means AI is theft.
FFS
The premise here is wrong.
The theft isn't "ai made a derivative work". The theft is "human ai bros scraped all the stuff I made, without permission or compensation, and used it as training data".
The problem is that art is being used for purposes the artist explicitly disagrees with. Imagine your artwork as a backdrop for a company that steals candy from babies to feed elephant poachers. In a normal world, you can at least sue that company to take it down.
But when OpenAI uses your artwork to pump thousands of tons of CO2 into the air, you can't do shit, and according to OP, you shouldn't even complain about your work being taken.
Isn't OP's whole point that you should complain, just not call it theft?
op's complaint is that "ai is theft/copyright infringement" is a fundamentally different argument than "ai is energy-intensive/pollutive", and that if you're upset about the co2 then argue against the co2; if you're upset about copyright infringement, why are other derivative works valid? it was in the op actually.
No, the problem is "my work is being used for something I don't want, and didn't agree to". That could be pollution, it could be people who dislike vowels. Artists get a say in the use of their work, and AI bros break those laws.