‘Mass theft’: Thousands of artists call for AI art auction to be cancelled

Thousands of artists are urging the auction house Christie’s to cancel a sale of art created with artificial intelligence, claiming the technology behind the works is committing “mass theft”.

The Augmented Intelligence auction has been described by Christie’s as the first AI-dedicated sale by a major auctioneer and features 20 lots with prices ranging from $10,000 to $250,000 for works by artists including Refik Anadol and the late AI art pioneer Harold Cohen.

72 comments
  • The question of whether AI art is art often fixates on details that I either don't care about or think rest on fallacious reasoning. To be clear, I don't like AI art as a concept, and I think it's often going to be bad art (I'll get into that later), but some of the arguments I see are centered on a strangely essentialist idea that AI art is worse because it inherently lacks humanity, treated as a single, undifferentiated quality: that it's missing an essential spark that would make it into art. I'm a materialist; I think it's entirely possible for a completely inhuman machine to make something deeply stirring and beautiful. Current trends are unlikely to produce that reliably, but I don't think there's something magic about humans that gives them a monopoly on beauty, creativity or art.

    However, I think a lot of AI art is going to end up being bad. This is especially true of corporate art, and less so for individuals (especially those who already have an art background). Part of the problem is that AI art, as it's currently built, will always lack the intense intentionality of human-made art. A probabilistic algorithm correlating words to shapes will never have the kind of intention in the small details that a human artist making the same piece has, because there's no reason for those details beyond probabilistic weight or a random element. I can look at a painting someone made and ask them why they picked the colors they did. I can ask why they chose the lighting, the angle, the individual elements. I can ask why they used certain techniques and not others, what movements they were drawing inspiration from, or what emotions they were trying to communicate.

    The reasons are personal, and they build on the beauty of art as a tool for deep, emotional, intimate communication. A piece of AI art made with the current technology can't have that, not because of some essential nature, but simply because of how it works. The lighting exists as it does because it's the most common way to light things for that prompt. The colors are the most likely colors for the prompt. The facial expressions are the most common ones for that prompt. The prompt is the only thing that really derives from human intention, the only thing you can really ask about, because asking, "Hey, why did you make the shoes this shade of blue? Is it a comment on the modern trend towards dull, uninteresting colors in interior decoration, since they contrast so much with how the rest of the scene is set up?" will only ever get you the answer that the algorithm chose it.

    Sure, you can make the prompts more and more detailed to pack in more intention, but there are small, individual elements of visual art that you couldn't dictate in writing even to a human artist. That lost intentionality means a loss of emotional connection. It means that instead of someone speaking to you, the only thing you can reliably read from AI art is yourself: it only reflects what you already think.

    I'm not a visual artist, but I am a writer, and I have similar problems with LLMs as writing tools for the same reason. When I do proper writing, I put enormous effort and focus into individual word choices. The way I phrase things transforms the meaning and impact of a sentence; the same information can be conveyed in so many ways, each with a completely different focus and intended mood.

    An LLM prompt can't convey that level of intentionality, because if it could, you would just be writing the piece directly.

    I don't think this makes AI art (or AI writing) inherently immoral, but I do think it will often be a less effective tool for deep, emotional connection.

    I think AI art/writing is bad because of capitalism, which isn't an inherent factor. If we lived in fully automated gay luxury space communism, I would already have spent years training an LLM as a next-generation oracle for the tabletop roleplaying games I like. They're great for things like that, but alas, giving them money potentially funds the decline of art as a profession.
