Office space meme:
“If y’all could stop calling an LLM “open source” just because they published the weights… that would be great.”
I’m down for talking about licenses instead.
If your pile of bullshit wizard math is MIT enough for me to take it wherever the fuck I please, I don’t care what it was trained on.
We can talk about reproducible builds when compile times come down below one million hours.
This time, it’s about getting insight into the training data: any benchmark of an LLM is moot if independent auditors can’t inspect the training dataset.