• gsfraley@lemmy.world · 10 months ago

    Sure, but this is less than nothing. It literally applies 0 friction against AI and is completely and totally unenforceable. AND it’s a laughing stock for everyone and sucks the oxygen out of better AI regulation groups and think-tanks.

    • Imgonnatrythis · 10 months ago

      Why? If a California corporation is pumping out AI content and it doesn’t have watermarks, why can’t this be enforced? It’s not a solution for every use case, but I fail to see how it fails completely.

    • FatCrab@lemmy.one · 10 months ago

      This is actually an effective measure when you sit down and think about it from a policy perspective. Right now, the biggest issue with AI-generated content on the corporate side is that there is no IP right in the generated content. Private enterprise generally doesn’t like distributing content it doesn’t have the ability to exercise complete control over. However, distributing generated content without marking it as generated reduces that risk exposure, potentially enough to swing the value calculus in favor of its use, because people will just assume there are rights in the material. Now, if you force this sort of marking, that heavily alters the calculus.

      Now people will say wah wah wah, there’s no way to really enforce it, people will lie, etc. But that’s true for MOST of our IP laws. Nevertheless, they prove effective at accomplishing many of their intents. The majority of private businesses are not going to intentionally violate regulatory laws if they can help it, and when they do, it’s more often than not because they thought they’d found a loophole but were wrong. And yes, that’s even accounting for the fact that there are many examples of illegal corporate activity.