• @[email protected]
    link
    fedilink
    English
    369 months ago

    A small team of 7 was able to create something of this magnitude, all thanks to the various tools of today, like Generative AI.

    We talk about the bad stuff with AI. But here’s the good… small mom-and-pop shops being able to release top-tier products like the big companies.

    • circuitfarmer · 23 · 9 months ago

      It’s arguably not good that we’re normalizing people being able to use this while its training relied on other creators who were not compensated.

      • Ethanice · 50 · 9 months ago

        My programming training relied on other creators who were not compensated.

        • @[email protected]
          link
          fedilink
          English
          59 months ago

          Were they in public forums and sites like Stack Overflow and GitHub, where they wanted people to use and share their code?

          • Armok: God of Blood · 2 · 9 months ago

            Stable Diffusion uses a dataset from Common Crawl, which pulled art from public websites that allowed them to do so. DeviantArt and ArtStation allowed this, without exception, until recently.

          • Echo Dot · -1 · 9 months ago

            Where did the AI companies get their code from? It’s scraped from the likes of Stack Overflow and GitHub.

            They don’t have the proprietary code that companies run on, because it’s proprietary and has never been on a public forum available for download.

        • Franzia · 5 · 9 months ago

          I imagine creators who… released their work for free, and/or open source?

            • Franzia · 4 · 9 months ago

              When we’re talking about instructional content and source code, yeah. Visual art online follows a different paradigm.

        • Ech · -9 · 9 months ago

          Humans using past work to improve, iterate, and further contribute themselves is not the same as a program throwing any and all art into the machine learning blender to regurgitate “art” whenever its button is pushed. Not only does it not add anything to the progress of art, it erases the identity of the past it consumed, all for the blind pursuit of profit.

          • @Sethayy · 9 · 9 months ago

            Oh yeah, tell me who invented the word ‘regurgitate’ without googling it. Cause its historical identity is important, right?

            Or how bout who first created the internet?

            It’s ok if you don’t know; this is how humans work, on the backs of giants

            • Ech · -3 · 9 months ago

              Me not knowing everything doesn’t mean it isn’t known or knowable. Also, there’s a difference between things naturally falling into obscurity over time and context being removed forcefully.

              • @Sethayy · 3 · 9 months ago

                And then there’s when it’s too difficult to keep track of them, exactly like how you can’t know everything.

                We probably ain’t gonna stop innovation, so we might as well roll with it (especially when it’s doing a great job redistributing previously expensive assets)

                • Ech · -3 · 9 months ago

                  If it’s “too difficult” to manage, that may be a sign it shouldn’t just be let loose without critique. Also, innovation is not inherently good and “rolling with it” is just negligent.

      • moon_matter · 24 · 9 months ago

        Devil’s advocate: it means that only large companies will have AI, as they would be the only ones capable of paying such a large number of people. AI is going to come anyway, except now the playing field is even more unfair, since you’ve removed the ability for an individual to use the technology.

        Instituting these laws would just be the equivalent of companies pulling the ladder up behind them after taking the average artist’s work to use as training data.

        • @Corkyskog · 1 · 9 months ago

          How would you even go about determining what percentage belongs to the AI vs the training data? You could argue all of the royalties should go to the creators of the training data, meaning no one could afford to do it.

          • moon_matter · 1 · 9 months ago

            How would you identify text or images generated by AI after they have been edited by a human? Even after that, how would you know what was used as the source for training data? People would simply avoid revealing any information and even if you did pass a law and solved all of those issues, it would still only affect the country in question.

      • @[email protected]
        link
        fedilink
        English
        12
        edit-2
        9 months ago

        Then we shouldn’t have artists because they looked at other art without paying.

      • @[email protected]
        link
        fedilink
        English
        119 months ago

        Oonga boonga wants his royalty checks for having first drawn a circle 25,000 years ago.

      • @mindbleach · 6 · 9 months ago

        As distinct from human artists who pay dividends for every image they’ve seen, every idea they’ve heard, and every trend they’ve followed.

        The more this technology shovels into the big fat network of What Is Art, the less any single influence will show through.

      • @[email protected]
        link
        fedilink
        English
        19 months ago

        Literally the definition of greed. They don’t deserve royalties for being an inspiration and moving a weight a fraction of a percent in one direction…

      • @Grumpy · 7 · 9 months ago

        If AI art is stolen data, then every artist on earth is a thief too.

        Do you think artists just spontaneously conjure up art? No. Through their entire lives of looking at other people’s works, they learn how to do stuff, they emulate, and they improve. That’s how human artists come to be. Do you think artists go around asking permission from millions of past artists so they can learn from their art? Do artists track down whoever made the fediverse logo if they want to make similarly shaped art with it? Hell no. Consent in general is impossible too, because a whole lot of them are likely long dead and can’t give consent, to be honest. It’s the exact same way AI is made.

        Your argument holds no consistent logic.

        Furthermore, you likely have a misunderstanding of how AI is trained and how it works. AI models do not store or copy the art they’re trained on. They study shapes, concepts, styles, etc., and encode those concepts as matrices of vectors. Billions of images and words are distilled into a mere 2 gigabytes in something like SD fp16. 2GB is virtually nothing; no compression comes anywhere near that (a rough sanity check of those numbers is sketched at the end of this comment). So unless you actually took very few images and made a 2GB model from just those, it has no capability to store or copy another person’s art. It retains no copy of any existing copyrighted work; it only knows concepts, and concepts like a circle or a square are not copyrightable.

        If you think I’m just being pro-AI for the sake of it, well, it doesn’t matter, because copyright offices all over the world have started releasing their views on AI art, and they’re unanimously in agreement that it’s not stolen. Furthermore, the resulting AI artworks can be copyrighted (there’s a lot more complexity there, but that’s for another day).
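
        For what it’s worth, that size claim is easy to sanity-check. Here’s a minimal back-of-the-envelope sketch in Python, using assumed ballpark figures (a LAION-scale corpus of roughly 2.3 billion image-text pairs and a ~2 GB fp16 checkpoint; both are round numbers picked for illustration):

        # Back-of-the-envelope: how much checkpoint space is there per training image?
        num_training_images = 2_300_000_000      # assumed LAION-scale corpus size
        checkpoint_bytes = 2 * 1024**3           # assumed ~2 GB fp16 checkpoint

        bytes_per_image = checkpoint_bytes / num_training_images
        bits_per_image = bytes_per_image * 8

        print(f"~{bytes_per_image:.2f} bytes (~{bits_per_image:.1f} bits) of weights per training image")
        # Roughly 0.93 bytes (about 7.5 bits) per image: nowhere near enough to
        # store or losslessly compress the training images themselves.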

        • @[email protected]
          link
          fedilink
          English
          -49 months ago

          L take. AI is not a person and doesn’t have the right to learn like a person. It is a tool, and it can be used to replicate others’ art.

          • @Grumpy · 6 · 9 months ago

            What gives a human the right to learn from another person without credit? There is no such inherent right.

            Even if such a right existed, I, as a person who can do AI training, would then have the right to create a tool to assist me in learning, because I’m a person with the same rights as anyone else. If it’s just a tool, which it is, then it is not the AI that has the right to learn; I have the right to learn, and I used that right to make the tool.

            I can use Photoshop to replicate art a lot more easily than with AI. None of us are going around saying Photoshop is wrong (though we did say that before). The AI won’t know any specific art unless it’s an extremely repeated pattern like the “Mona Lisa”. It literally does not have the capacity to contain other people’s art, and therefore it cannot replicate others’ art. I have already proven that mathematically.

            • @[email protected]
              link
              fedilink
              English
              19 months ago

              Yep, these people act like they get to choose who or what ingests their product when they make it available willingly on the internet… oftentimes for free.

              This whole argument falls on its face once you realize they don’t want AI to stop… they just want a cut.

          • Echo Dot · 0 · 9 months ago

            That doesn’t make it bad.

            It’s a tool that can be used to replicate other art, except it doesn’t actually replicate art, does it?

            It creates works based on other works, which is exactly what humans do; whether or not it’s sapient is irrelevant. My work isn’t valuable just because it’s copyrightable. Only a sociopath thinks like that.

  • kae · 19 · 9 months ago

    Good interview. They didn’t let them off the hook, but weren’t pushing an agenda either.

    This is going to be a moving target that someone is going to pay big bucks to figure out in court. International laws are not up to speed on what is or isn’t ok here, and the ethical discussion is interesting to watch unfold.

  • Archmage Azor · 13 · 9 months ago

    I didn’t see the sub at first and thought it was a Kickstarter for a real-life Mars terraforming project

  • @[email protected]
    link
    fedilink
    English
    29 months ago

    Awesome, I didn’t know they had a kickstarter going. No such thing as bad press I guess.

    • @Kerfuffle · 18 · 9 months ago

      Doubled down on the “yea were not gonna credit artist’s our AI stole from”. What a supreme douche

      I don’t think it’s as simple as all that. Artists look at other artists’ work when they’re learning, for ideas, for methods of doing stuff, etc. Good artists have probably looked at a ton of other artwork; they don’t just form their skills in a vacuum. Do they need to credit all the artists they “stole from”?

      In the article, the company made a point of not using AI models specifically trained on a smaller set of works (or some individual artist’s works). With something like that, it would be a lot easier to argue it’s stealing: but the same would be true if a human artist carefully studied another person’s work and tried to emulate their style/ideas. I think there’s a difference between that and “learning” (or learning) from a large body of work without emulating any specific artist, company, individual work, etc.

      Obviously it’s something that needs to be handled fairly carefully, but that can be true with human artists too.

      • [email protected]
        cake
        A
        link
        English
        109 months ago

        I swear I’m old enough to remember this exact same fucking debate when digital tools started becoming popular.
        It is, simply put, a new tool.
        It’s also not the one-and-done magic button that people who’ve never used it think it is.

        The knee-jerk reaction of hating on every piece of art made with AI is dangerous.
        You’re free to like it or not, but it’s already out of the bag.
        Big companies will have the resources to train their own models.
        I for one would rather have it in the public domain than only available to big corps.

      • loobkoob · 1 · 9 months ago

        I wouldn’t call myself a “good artist” at all, and I’ve never released anything, I just make music for myself. Most of the music I make starts with my shamelessly lifting a melody, chord progression, rhythm, sound, or something else, from some song I’ve heard. Then I’ll modify it slightly, add my own elements elsewhere, modify the thing I “stole” again, etc, and by the time I’ve finished, you probably wouldn’t even be able to tell where I “stole” from because I’ve iterated on it so much.

        AI models are exactly the same. And, personally, I’m pretty good at separating the creative process from the end result when it comes to consuming/appreciating art. There are songs, paintings, films, etc, where the creative process is fascinating to me but I don’t enjoy the art itself. There are pieces of art made by sex offenders, criminals and generally terrible people - people who I refuse to support financially in any way - but that doesn’t mean my appreciation for the art is lessened. I’ll lose respect for an artist as a person if I find out their work is ghostwritten, but I won’t lose my appreciation for the work. So if AI can create art I find evocative, I’ll appreciate that, too.

        But ultimately, I don’t expect to see much art created solely by AI that I enjoy. AI is a fantastic tool, and it can lead to some amazing results when someone gives it the right prompts and edits/curates its output in the right way. And it can be used for inspiration, and to create a foundation that artists can jump off, much like I do with my “stealing” when I’m writing music. But if someone gives an AI a simple prompt, they tend to get a fairly derivative result - one that’ll feel especially derivative as we see “raw output” from AIs more often and become more accustomed to their artistic voice. I’m not concerned at all about people telling an AI to “write me a song about love” replacing the complex prog musicians I enjoy, and I’m not worried about crappy AI-generated games replacing the lovingly crafted experiences I enjoy either.

      • Franzia · 0 · 9 months ago

        Artists who look at art are processing it in a relatable, human way. An AI doesn’t look at art. A human tells the AI to find art and plug it in, knowing that the work is copyrighted and not available for someone else’s commercial project to develop an AI.

        • @Kerfuffle · 2 · 9 months ago

          Artists who look at art are processing it in a relatable, human way.

          Yeah, sure. But there’s nothing that says “it’s not stealing if you do it in a relatable, human way”. Stealing doesn’t have anything to do with that.

          knowing that work is copyrighted and not available for someone else’s commercial project to develop an AI.

          And it is available for someone else’s commercial project to develop a human artist? Basically, the “an AI” part is still irrelevant. If the works are out there where it’s possible to view them, then it’s possible for both humans and AIs to acquire them and use them for training. I don’t think “theft” is a good argument against it.

          But there are probably others. I can think of a few.

          • Franzia · 1 · 9 months ago

            I just want fucking humans paid for their work. Why do you tech nerds have to innovate new ways to lick the boots of capital every few years? Let the capitalists make arguments for why AI should own all of our work, for free, rights be damned, and then profit off of it and sell it back to us as a product. Let them do that. They don’t need your help.

            • @Kerfuffle · 3 · 9 months ago

              I just want fucking humans paid for their work

              That’s a problem whether or not we’re talking about AI.

              why do you tech nerds have to innovate new ways to lick the boots of capital every few years?

              That’s really not how it works. “Tech nerds” aren’t licking the boots of capitalists, capitalists just try to exploit any tech for maximum advantage. What are the tech nerds supposed to do, just stop all scientific and technological progress?

              why AI should own all of our work, for free, rights be damned,

              AI doesn’t “own your work” any more than a human artist who learned from it does. You don’t like the end result, but you also don’t seem to know how to come up with a coherent argument against the process of getting there. Like I mentioned, there are better arguments against it than “it’s stealing” or “it’s violating our rights”, because those have some serious issues.

        • @Grumpy · 2 · 9 months ago

          That’s not how AI art works. You can’t tell it to find art and plug it in. It doesn’t have the capability to store or copy existing artworks. It only contains matrices of vectors that encode concepts, and concepts cannot be copyrighted.

          • @Kerfuffle · -1 · 9 months ago

            You can’t tell it to find art and plug it in.

            Kind of. The AI doesn’t go out and find/do anything; people include images in its training data, though. So it’s the human that’s finding the art and plugging it in — most likely through automated processes that just scrape massive amounts of images and add them to the corpus used for training (a toy sketch of what I mean is at the end of this comment).

            It doesn’t have the capability to store or copy existing artworks. It only contains the matrix of vectors which contain concepts.

            Sorry, this is wrong. You definitely can train AI to produce works that are very nearly a direct copy. How “original” the works created by the AI are is going to depend on the size of the corpus it was trained on. If you train the AI on (or put a lot of training weight on) just a couple of works from one specific artist, or something like that, it’s going to output stuff that’s very similar. If you train the AI on 1,000,000 images from all different artists, the output isn’t really going to resemble any specific artist’s style or work.

            That’s why the company emphasized they weren’t training the AI to replicate a specific artist’s (or design company, etc) works.
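
            To make that concrete, here’s a toy sketch (Python, with made-up URLs and a made-up folder name, purely illustrative) of the kind of automated collection step I mean: a script written and run by people that downloads images into a folder that later feeds training.

            # Toy sketch of an automated collection step. The URL list and folder
            # name are invented for illustration; real pipelines work from crawls
            # covering billions of URLs, plus filtering and captioning.
            import os
            import requests

            image_urls = [
                "https://example.com/art/0001.jpg",
                "https://example.com/art/0002.jpg",
            ]

            os.makedirs("training_corpus", exist_ok=True)

            for i, url in enumerate(image_urls):
                resp = requests.get(url, timeout=30)
                if resp.ok:
                    with open(os.path.join("training_corpus", f"{i:08d}.jpg"), "wb") as f:
                        f.write(resp.content)

            # The point: a person builds and runs the collection process; the model
            # itself never goes out looking for anything.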

            • @Grumpy · 2 · 9 months ago

              Sorry, this is wrong.

              As a general statement: no, I am not. You’re constructing an overly specific scenario to make it true. Sure, if I take 1 image and train a model on just that one image, it’ll make that exact same image. But that’s no different than me just pressing copy and paste on a single image file, and the latter does the job a whole lot better too. This entire counterargument is nothing more than being pedantic.

              Furthermore, if I’m giving such specific instructions to the AI, then I am the one who’s replicating the art. It doesn’t matter whether I use a pencil to trace the existing art, use Photoshop, or create a specific AI model; I am the one who’s doing that.

              • @Kerfuffle · 1 · 9 months ago

                As a general statement: No, I am not.

                You didn’t qualify what you said originally. It either has the capability or it doesn’t: you said it didn’t; it actually does.

                You’re making an over specific scenario to make it true.

                Not really. It isn’t that far-fetched that a company would see an artist they’d like to use but also not want to pay that artist’s fees so they train an AI on the artist’s portfolio and can churn out very similar artwork. Training it on one or two images is obviously contrived, but a situation like what I just mentioned is very plausible.

                This entire counter argument is nothing more than being pedantic.

                So this isn’t true. What you said isn’t accurate with the literal interpretation and it doesn’t work with the more general interpretation either. The person higher in the thread called it stealing: in that case it wasn’t, but AI models do have the capability to do what most people would probably call “stealing” or infringing on the artist’s rights. I think recognizing that distinction is important.

                Furthermore, if I’m making such specific instructions to the AI, then I am the one who’s replicating the art.

                Yes, that’s kind of the point. A lot of people (me included) would be comfortable calling doing that sort of thing stealing or plagiarism. That’s why the company in OP took pains to say they weren’t doing that.

    • @[email protected]
      link
      fedilink
      English
      89 months ago

      How would they credit the artists? Generative AI is trained on thousands and millions of images and data points from equally numerous artists. He might as well say, “I give credit to humanity.”

      • [email protected]
        cake
        A
        link
        English
        99 months ago

        I only consume art from people born of mute mothers isolated from society during their pregnancy and then born into sensory deprivation chambers.
        It is the only way to ensure proper pure art as all other artists are simply rehashing prior work.

      • Ech · -2 · 9 months ago

        Generative AI is trained on thousands and millions of images and data points from equally numerous artists.

        Congrats on pinpointing the problem.

    • @[email protected]
      link
      fedilink
      English
      -19 months ago

      That’s over. Just let it go. It’s never going back in the bottle, and artists will never see a penny from AI that trained on their art. It’s not fair, but life isn’t fair.