• BudgetBandit
    1 year ago

    IMHO as long as no new random “neurons” form, it’s not AI as in Artificial Intelligence, just “a lot of ifs”

    • 31337
      11 months ago

      I think the human brain works kind of the opposite of that. Babies are born with a shitload of neural connections, then the connections decrease over a person’s lifetime. ANNs typically do something similar to that while training (many connection weights will be pushed toward zero, having little or no effect).
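      The shrink-toward-zero effect the comment describes can be sketched with a toy example: an L1 penalty on a linear model (one common way sparsity arises; the exact mechanism varies by architecture and regularizer — this is an illustration, not a claim about any specific LLM). Only 2 of 20 input features actually matter, and training drives most of the other weights toward zero, much like pruning unused connections.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # Toy data: y depends on only 2 of 20 input features.
      X = rng.normal(size=(500, 20))
      true_w = np.zeros(20)
      true_w[[3, 7]] = [2.0, -1.5]
      y = X @ true_w + 0.01 * rng.normal(size=500)

      # Linear regression with an L1 penalty, fit by subgradient descent.
      w = np.zeros(20)
      lr, lam = 0.01, 0.1
      for _ in range(2000):
          grad = X.T @ (X @ w - y) / len(y) + lam * np.sign(w)
          w -= lr * grad

      # Most of the 18 irrelevant weights end up near zero ("pruned"),
      # while the 2 informative ones stay large.
      near_zero = int(np.sum(np.abs(w) < 0.1))
      print(near_zero, w[3], w[7])
      ```

      The L1 term adds a constant pull toward zero on every weight, so weights whose features don't help reduce the loss get squashed — loosely analogous to synaptic pruning.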

      But yeah, these LLMs are typically trained once, then frozen during use. “Online learning” is a type of training that continually learns, but current online methods typically lead to worse models (ANNs “forget” old things they’ve “learned” when learning new things — so-called catastrophic forgetting).
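      The forgetting effect is easy to reproduce in miniature. In this sketch (a deliberately extreme toy, not how any real LLM is trained) a single linear model is trained on “task A”, then sequentially on a conflicting “task B”; afterward its error on task A has blown up, because the same weights were overwritten.

      ```python
      import numpy as np

      rng = np.random.default_rng(1)

      def train(w, X, y, lr=0.1, steps=500):
          # Plain gradient descent on mean squared error.
          for _ in range(steps):
              w = w - lr * X.T @ (X @ w - y) / len(y)
          return w

      def mse(w, X, y):
          return float(np.mean((X @ w - y) ** 2))

      X = rng.normal(size=(100, 5))
      w_a = rng.normal(size=5)      # weights defining "task A"
      w_b = -w_a                    # "task B" directly conflicts with A
      y_a, y_b = X @ w_a, X @ w_b

      w = np.zeros(5)
      w = train(w, X, y_a)          # learn task A
      err_a_before = mse(w, X, y_a) # near zero: A is learned
      w = train(w, X, y_b)          # now learn task B sequentially
      err_a_after = mse(w, X, y_a)  # A's performance collapses

      print(err_a_before, err_a_after)
      ```

      Mitigations like replay buffers or elastic weight consolidation exist, but making them work at LLM scale without degrading the model is still an open problem.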