• FiniteBanjo@lemmy.today · 3 months ago

      If the product is as unethical as mass IP theft and the replacement of workers who actually have ethics or comprehension skills, then I get the feeling expectations aren’t the major issue.

      • kata1yst · 3 months ago

        There are many ML/AI models that are doing a lot more good than harm. The shitty mass-market chatbots and art generators are mostly hype and greed.

        But mathematics, physics, healthcare, and many other fields have embraced models that accomplish amazing things that humans with similar resources simply could not.

        It’s a problem of application.

          • FiniteBanjo@lemmy.today · 3 months ago

          Sure, and I thank you for the clarification, but in this context “AI” pretty clearly meant the mainstream LLM products.

      • grue@lemmy.world · 3 months ago

        mass IP theft

        That’s not a thing, both because “IP” is dishonest loaded language and because copyright infringement is different from theft.

        I 1000% agree that what they’ve done is completely unethical – particularly because including copyleft works in the training data ought to require every single output to be copyleft – but I do not concede to your framing.

      • mindbleach · 3 months ago

        Because copyright is sooo important and beloved.

        Never mind that distilling all books ever written down to a gigabyte of linear algebra is pretty darn transformative.

          • FiniteBanjo@lemmy.today · 3 months ago

          If you take an artist’s work and sell it as your own without giving any credit or remuneration at all to said artist, you’re an asshole regardless of legality; since it’s also illegal, there should at least be some repercussions for the assholery.

          Fun fact: books get written because publishers pay people to write them.

            • mindbleach · 3 months ago

            Who’s talking about copies? We’re talking about an inscrutable network that tried to guess the next letter when shown part of a book. When it was right, weird math shit happened across umpteen layers of random numbers, and a zillion guesses later it’ll churn out new books. Not good ones. Not sensible ones. But a lot more than the ctrl+C / ctrl+V accusations would suggest.
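
            (For anyone curious, here’s a toy sketch of that “guess the next letter” idea — just character-pair counts in plain Python, nowhere near a real model; the sample text and names are made up for illustration.)

```python
from collections import Counter, defaultdict
import random

# Toy illustration of "guess the next letter": count which character
# tends to follow which, then sample new text from those counts.
# A real LLM learns this with billions of parameters, not a lookup table.
corpus = "it was the best of times, it was the worst of times"  # stand-in corpus

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start="i", length=40):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        chars, weights = zip(*options.items())
        out.append(random.choices(chars, weights=weights)[0])
    return "".join(out)

# Churns out "new" text: not good, not sensible, but not a copy either.
print(generate())
```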

            We already had the technology to copy text on a computer. In fact you can find any book ever published, for free, sometimes by accident. Yet the industry wasn’t strangled by piracy. That’s never how it works. Books were written for thousands of years before copyright existed. You’re on a website rooted in open-source ethos, demonstrating that artificial control and even monetary incentive are not strictly necessary.

  • blandfordforever@lemm.ee · 3 months ago

    Whatever. I just used AI to write my performance evaluation at work. I fed it a bunch of garbled, incoherent nonsense and it made me sound productive AF.

  • SlopppyEngineer@lemmy.world · 3 months ago

    That was to be expected. It’s not the answer to everything and has inherent limitations that can’t be solved quickly by throwing money at it.

    The problem is that so much money is tied up in this. It has been pushing up the stock market. People have been fired because AI was going to take over their jobs. The fallout from that is going to be painful. Dot-com-crash painful, maybe subprime-mortgage-crash painful.

    • MajorHavoc@programming.dev · 3 months ago

      The fallout from that is going to be painful. Dot-com-crash painful, maybe subprime-mortgage-crash painful.

      Yep. And folks on the news are gonna be all confused how this could happen.

      They laid off their talent to bet on bullshit. That’s how it happened.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone · 3 months ago

    What’s disturbing to me is this:

    Coworker: There’s a study in Denmark where they were able to train ten penguins to do clerical work. Three of them make as few errors as humans.

    Upper Manager: Excellent. Lay off the entire office staff and find us four hundred penguins.