• 9point6@lemmy.world · 10 months ago

    This looks more like a floating point issue than a mistake an LLM would make

    • Cosmicomical@kbin.social · 10 months ago

      There are no LLMs involved in this picture; to train an LLM you’d need 100x the training data. The panel is about an ordinary ML model.

    • CodeMonkey@programming.dev · 10 months ago

      But a floating point issue is exactly the type of issue an LLM would make (it does not understand what a floating-point number is or why it should be treated differently). To be fair, a junior developer would make the same type of mistake.
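      The classic floating-point pitfall the comment is gesturing at can be shown in a few lines (a generic Python sketch, not from the comic itself):

```python
import math

# IEEE 754 binary floats cannot represent most decimal fractions exactly,
# so a naive equality check silently fails.
total = 0.1 + 0.2
print(total == 0.3)           # False: total is actually 0.30000000000000004

# The fix a careful reviewer would ask for: compare within a tolerance.
print(math.isclose(total, 0.3))  # True
```

      This is the kind of bug that compiles, runs, and passes a casual glance, which is why generated code needs a reviewer who knows to look for it.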

      A junior developer is, hopefully, being mentored by more senior coworkers who are extra careful with code reviews and would spot the bug for the dev. Machine generated code needs an even higher level of scrutiny.

      It is relatively easy to teach a junior developer to write code that is easy to read and conforms to the team’s style guide.