it can only produce an algorithmically-derived mélange of its source data, recomposited in novel forms
Right, it produces derivative data. Not copyrighted material.
By itself, without any safeguards, it absolutely could output copyrighted data (albeit probably not perfectly, but for copyright purposes that’s irrelevant as long as it serves as a substitute). Any algorithm that actually does that should be punished, but OpenAI’s models, with their safeguards, can’t do that.
Hammers aren’t bad because they can be used for bludgeoning, and if we had a hammer that somehow detected it was being used for murder and then evaporated, calling it bad would be even more ridiculous.
but it is still capable - by your own admission - of doing it
…
And if you are comparing LLMs and hammers, you’re just proving how you fundamentally misunderstand what LLMs are and how they work
And a regular hammer is capable of being used for murder. Which makes calling a hammer that evaporates before it can be used for murder “unethical” ridiculous. You’re deliberately missing the point.
And it still profits from the unlicensed use of copyrighted works by using such material for its training data
I just don’t buy this reasoning. If I look at paintings of the Eiffel Tower and then sell my own painting of the building, I’m not violating the copyright of any of the original painters unless what I paint is so similar to one of theirs that it falls outside fair use.
it is a composite of copyrighted work
It’s Stable Diffusion, not a composite. But even if the outputs were composites, I’m allowed to shred a magazine and make a composite image of something else out of the pieces. It’s fair use unless I use those pieces to recreate a copyrighted image.
AI, unlike a human, cannot create unique works of art. It can only produce an algorithmically-derived mélange of its source data, recomposited in novel forms
Find me a single sentence in that entire article that suggests AI art is a composite of source data
You can’t, because how it actually works is wildly different than how you want to believe it works.
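(For reference, here is a minimal sketch of how a diffusion model actually produces an image, assuming the usual DDPM-style formulation. Everything in it is illustrative: predict_noise is a dummy stand-in for the trained denoising network, and T and the noise schedule are arbitrary values, not anyone’s real configuration. The point of the sketch is that generation starts from pure random noise and is iteratively denoised using learned weights; no stored training images are looked up, copied, or pasted together at any step — the training data is only reflected indirectly in the network’s weights.)

```python
# Illustrative DDPM-style sampling loop, not any vendor's actual code.
import numpy as np

T = 50                                  # number of diffusion steps (illustrative)
betas = np.linspace(1e-4, 0.02, T)      # noise schedule (illustrative)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def predict_noise(x, t):
    """Dummy stand-in for the trained denoising network (e.g. a U-Net).
    In a real model this is a neural net whose weights were fit on the
    training set; here it is a placeholder so the sketch runs on its own."""
    return 0.1 * x * (t / T)

x = np.random.randn(64, 64, 3)          # start from pure Gaussian noise

for t in reversed(range(T)):            # iteratively denoise, step by step
    eps = predict_noise(x, t)           # model's guess at the noise present in x
    x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
    if t > 0:                           # inject a little fresh noise except at the last step
        x = x + np.sqrt(betas[t]) * np.random.randn(*x.shape)

image = np.clip((x + 1.0) / 2.0, 0.0, 1.0)   # map the result into [0, 1] for display
```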
Coming from someone who claimed Stable Diffusion was a composite image