Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use::The tool’s creators aim to force AI model developers to pay artists if they want to train on uncorrupted data.

  • @[email protected]
    60 points · 5 months ago

    Ben Zhao, the University of Chicago professor behind this, stole GPLv3 code for his last data-poisoning scheme. GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used; you also can’t distribute your project as binary-only or proprietary software. When pressed, his team released only the code for their front end, remaining in violation of the terms of the GPLv3 license.

    Nightshade also only works against open source models, because the only major models with open weights are Stable Diffusion’s; companies like Midjourney and OpenAI with closed source models aren’t affected by this. Attacking a tool that the public can inspect, collaborate on, and use free of cost isn’t something that should be celebrated.

    • @[email protected]
      19 points · 5 months ago

      It kinda seems sus. It’s a closed-source binary, and you’re the one doing the tagging for an AI. Why should I trust these people with my art when they basically admit I’m labeling my own stuff for their algorithm, and the code is closed source?