- cross-posted to:
- [email protected]
- Nick Clegg, former Meta executive and UK Deputy Prime Minister, has reiterated a familiar line when it comes to AI and artist consent.
- He said that any push for consent would “basically kill” the AI industry.
- Clegg added that the sheer volume of data that AI is trained on makes it “implausible” to ask for consent.
I don’t really disagree with your other two points, but
They sure do, but that is not one of them. That’s de facto copyright infringement or plagiarism, especially if you then turn around and sell that product.
The key point being made is that if you are committing de facto copyright infringement or plagiarism by creating a copy, it shouldn’t matter whether that copy was made through copy-paste, by re-compressing the same image, or by using an AI model. The product here is the copy-paste operation, the image editor, or the AI model, not the (copyrighted) image itself. You can still sell computers with copy-paste (despite some attempts from large copyright holders with DRM), and you can still sell image editors.
However, unlike copy-paste and the image editor, an AI model could memorize and emit training data even when the input does not imply the copyrighted work. (This excludes the case where the image itself was provided, or where a highly detailed description of the work was given; in those cases it would clearly be the user who is at fault and intending for this to happen.)
At the same time, it should be noted that exact replication of training data isn’t desirable in any case, and online services for image generation could include an image similarity check against the training data; many probably do this already.
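As a rough illustration of what such a similarity check could look like, here is a minimal sketch using a perceptual "average hash": near-duplicate images produce nearly identical hashes, so a small Hamming distance flags a likely replication. Everything here is an assumption for illustration (the 8×8 grayscale grids standing in for real images, the hash size, and the distance threshold), not any actual service's pipeline.

```python
# Minimal sketch of a perceptual-hash similarity check (average hash).
# Images are represented as 8x8 grayscale grids (lists of lists, 0-255);
# a real pipeline would first downscale and grayscale the actual images.

def average_hash(pixels):
    """Return a 64-bit hash: each bit is 1 if that pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def looks_like_training_image(candidate, training_hashes, threshold=5):
    """Flag the candidate if its hash is within `threshold` bits of any
    training-image hash (threshold chosen purely for illustration)."""
    h = average_hash(candidate)
    return any(hamming_distance(h, t) <= threshold for t in training_hashes)

# Toy example: a "training image" and a near-identical "generated" copy.
training = [[10 * (r + c) for c in range(8)] for r in range(8)]
generated = [row[:] for row in training]
generated[0][0] += 3  # tiny perturbation

train_hashes = [average_hash(training)]
print(looks_like_training_image(generated, train_hashes))        # True
print(looks_like_training_image([[0] * 8] * 8, train_hashes))    # False
```

In practice, services would likely use more robust perceptual hashes or learned embeddings rather than this toy average hash, but the shape of the check is the same: hash the output, compare against an index of training-data hashes, and block or review anything too close.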