Fun fact: If you google those codes you find out that they are “real” codes, but they don’t actually activate Windows. I think they are used as placeholders in the upgrade from Windows 8 to 10 or something, but I don’t know the specifics.
ChatGPT actually can’t create new “words”, just regurgitate words that it’s seen somewhere before!
Yep yep, statistical analysis as to the frequency of tokens in the training text.
Brand new, never-before-seen Windows keys have a frequency of zero occurrences per billion words of training data.
That isn’t actually what’s important. It’s the frequency of the tokens, which can be as short as single characters. The frequency of those is certainly not zero.
LLMs absolutely can make up new words, word combinations, or sentences.
That’s not to say chatgpt can actually give you good windows keys, but it isn’t a fundamental limitation of LLMs.
Okay, I’ll take your word for it.
I’ve never ever, in many hours of playing with ChatGPT as a toy, had it make up a word. Hallucinate wildly, yes, but not stogulate a word out of nothing.
I’d love to know more, though. How does it combine new words? Do you have any examples of words ChatGPT has made up? This is fascinating to me, as it means the model is much less chained to the training data than I thought.
A lot of compound words are actually multiple tokens so there’s nothing stopping the LLM from generating the tokens in a new order thereby creating a new word.
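To make that concrete, here’s a toy sketch (hypothetical vocabulary, not a real GPT tokenizer): if the subword pieces exist in the vocabulary, nothing stops them from being emitted in an order that forms a word no training text ever contained.

```python
# Toy illustration: subword pieces from a fixed vocabulary can be
# recombined into words that never appeared in any training corpus.
# (Hypothetical vocabulary; real BPE tokenizers learn tens of
# thousands of pieces from data.)
vocab = ["flum", "jangle", "snow", "ball", "fire"]

seen_words = {"snowball", "fireball"}  # pretend training corpus

# Every pairwise combination of pieces is a word the model *could* emit.
candidates = {a + b for a in vocab for b in vocab if a != b}
novel = candidates - seen_words

print("flumjangle" in candidates)  # True: the pieces exist
print("flumjangle" in seen_words)  # False: the whole word was never seen
```

The same reasoning applies to character-level tokens and Windows-key-shaped strings: each piece is frequent in training data even when the full combination is brand new.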
It can create new words, I just verified this. First word it gave me: flumjangle. Google gives me 0 results. Maybe Google is missing something and it exists in some data out there, Idk.
I’m not sure what is so impressive about this though. Language models can string words together in unique ways, why would it be different for characters?
I’m just surprised to hear that google hasn’t found out about my flumjangles yet.
Sure it can create new words. “It can’t create new tokens” would be more correct, I think. But a token is just a text fragment, and, as far as I know, they can range from being several words to being single characters.
If it was Windows 95 it could generate them
https://www.youtube.com/watch?v=cwyH59nACzQ&t=306s
I got it to say vilumplox. It doesn’t return any Google search results.
Tokens are almost never multiple words. Think of them like “information units”. If you have a plural word “hats”, there are two tokens: the “hat” one and the “s” that adds more information about it being plural. Combinations of words only really occur for proper names.
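The splitting described above can be sketched with a toy greedy longest-match tokenizer. This is a simplification with a made-up vocabulary (real tokenizers like BPE or WordPiece learn their vocabularies from data), but it shows the idea: words fall apart into the longest known pieces, down to single characters if nothing longer matches.

```python
# Toy greedy longest-match tokenizer over a hypothetical vocabulary.
VOCAB = {"hat", "s", "h", "a", "t", "flum", "jangle"}

def tokenize(text: str) -> list[str]:
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest vocabulary match starting at position i.
        for j in range(len(text), i, -1):
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            raise ValueError(f"no token for {text[i]!r}")
    return tokens

print(tokenize("hats"))        # ['hat', 's']
print(tokenize("flumjangle"))  # ['flum', 'jangle']
```

Because “flum” and “jangle” tokenize cleanly even though “flumjangle” was never in the vocabulary, a model can both read and produce such a never-before-seen word.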