I’m so glad I’m not growing up in this age of smartphones, social media, and bullshit generators. Life was hell enough in the 90s without all that noise.

  • brucethemoose@lemmy.world · ↑31 · 3 months ago

    TBH kids need a new culture/attitude towards digital media, where they basically assume anything they consume on their phones is likely bogus. Like a whole new layer of critical thinking.

    I think it’s already shifting this direction with the popularity of private chats at the expense of “public” social media.

    • Pennomi@lemmy.world · ↑21 · 3 months ago

      Yep, at this point your real nudes could leak and you can just plausibly claim they’re faked.

      Something will change, but I’m not sure where society will decide to land on this topic.

        • xmunk · ↑18 · 3 months ago

          If no one knows about it you could claim not to have it.

          • leftzero@lemmynsfw.com · ↑1 · 3 months ago

            Birthmarks don’t seem like the kind of thing AI would generate (unless asked), though…

            (And, as model collapse sets in and generated images become more and more generic and average, things like birthmarks will become more and more unlikely…)
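
            The model-collapse point above can be shown with a toy sketch (a hypothetical minimal setup, not any real image model): each "generation" fits a Gaussian to the previous generation's samples and then samples its training data for the next generation from that fit. Since no generation ever sees the real data again, the spread of the distribution tends to shrink, and rare details in the tails (like birthmarks) are the first to vanish.

```python
# Toy illustration of "model collapse": a model trained repeatedly on its
# own samples tends to lose the tails of the original distribution.
import random
import statistics

random.seed(42)

N_SAMPLES = 20        # deliberately small samples to exaggerate the effect
N_GENERATIONS = 2000

# "Real" data: standard Gaussian samples.
data = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]

stds = []
for _ in range(N_GENERATIONS):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    stds.append(sigma)
    # The next generation never sees the real data, only the model's output.
    data = [random.gauss(mu, sigma) for _ in range(N_SAMPLES)]

print(f"generation 0 spread: {stds[0]:.3f}")
print(f"final spread:        {stds[-1]:.3g}")
```

            With real generative models the dynamics are far more complicated, but the qualitative effect is the same: outliers get averaged away first.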

            • xmunk · ↑1 · 3 months ago

              AI is quite unpredictable… it’s sort of only useful because of how random it is. But my point is that the knowledge is either public or private: if it’s private you can deny the image outright, and if it’s public you can attribute the detail to public knowledge.

    • ObjectivityIncarnate@lemmy.world · ↑16 ↓1 · 3 months ago

      That’s definitely going to happen organically, especially since this is a genie that is never going back in the bottle. It’s only going to become more convincing, more accessible, and more widespread, even for purely ‘self-contained’ use, especially by hormone-flooded teenagers.

      • brucethemoose@lemmy.world · ↑1 · 3 months ago

        Even if all AI development were outlawed, this minute, everywhere, the genie is already out of the bottle. Flux 1.dev is basically photorealistic for many situations, and there will always be someone hosting it in some sketchy jurisdiction.

    • kibiz0r@midwest.social · ↑3 ↓2 · 3 months ago

      I kinda doubt anyone is getting “fooled” by these at this point, though that is a whole nother layer of horrible hell in store for us…

      Right now, we’re dealing with the most basic questions:

      • Is it immoral (and/or should it be illegal) for people to be trading pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) for people to privately make pornographic approximations of you?
      • Is it immoral (and/or should it be illegal) to distribute software which allows people to make pornographic approximations of others?
      • yeather@lemmy.ca · ↑2 ↓1 · 3 months ago

        1. Illegal
        2. Immoral
        3. If built specifically for it? Illegal. If it’s a general tool with proper safeguards that can nonetheless be circumvented to make them anyway? No.