The question about the legal and moral aspects of training on works of other artists is related, but a different discussion.

  • Thorny_Insight@lemm.ee (OP) · 3 months ago

    If AI can create better content than humans can, then people will prefer to consume that. I don’t see why you should artificially limit this. If someone thinks that AI content is not better, then that’s the audience for the remaining human creators. AI can already create better-looking photos than I can, but that has zero effect on my desire to do photography. I don’t see what the issue is.

    • macniel@feddit.org · 3 months ago

      If AI can create better content than humans can, then people will prefer to consume that.

      It can’t create better content, though.

      I don’t see why you should artificially limit this.

      LLMs limit themselves already; there’s no need to artificially limit them further.

      If someone thinks that AI content is not better, then that’s the audience for the remaining human creators.

      Corpos don’t care, and ordinary people don’t care. Does that make it a good thing that corpos can pump out slop without paying artists a living wage, or at least royalties to the people whose work they took as training data (with or without their consent)?

      AI can already create better-looking photos than I can, but that has zero effect on my desire to do photography. I don’t see what the issue is.

      It pretty much can’t. It only mixes and pattern-matches existing photos.

      Coming back to my first point:

      AI can’t create, and when it trains only on itself it collapses (a rough sketch of that feedback loop follows below). Here’s a short video on this:

      https://www.youtube.com/watch?v=ZWvTr5wKGCA
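      A minimal sketch of that feedback loop, not taken from the video: it just refits a Gaussian to samples drawn from its own previous fit (numpy assumed; the sample size and generation count are arbitrary choices). The fitted spread tends to shrink away over generations, which is the kind of collapse being described.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Toy version of the feedback loop: the "model" is just a Gaussian fit,
      # and each new generation is fitted only to samples produced by the
      # previous generation's model.
      mean, std = 0.0, 1.0    # generation 0: the original ("human-made") data
      n_per_generation = 50   # small sample size makes the effect visible quickly

      for generation in range(1, 201):
          synthetic = rng.normal(mean, std, n_per_generation)  # model-generated data
          mean, std = synthetic.mean(), synthetic.std()        # "retrain" on it alone
          if generation % 20 == 0:
              print(f"generation {generation:3d}: mean={mean:+.3f}  std={std:.3f}")

      # The fitted std tends to drift toward zero as generations pass: estimation
      # error compounds, the tails of the distribution vanish, and the model ends
      # up describing a narrow caricature of the original data.
      ```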