Office Space meme:

“If y’all could stop calling an LLM ‘open source’ just because they published the weights… that would be great.”

  • mindbleach

    I’m down for talking about licenses instead.

    If your pile of bullshit wizard math is MIT enough for me to take it wherever the fuck I please, I don’t care what it was trained on.

    We can talk about reproducible builds when compile times come down below one million hours.

    • Prunebutt@slrpnk.net (OP)

      This time, it’s about getting insight into the training data: any benchmark of an LLM is moot if independent auditors don’t know the training dataset.