Edit: I see now it’s an article and not just you asking a question lol. I’ll leave it up anyway.
You know when you’re typing on your phone and you have that bar above the keyboard showing you what word it thinks you’re writing? If you click the word before you finish typing it, it can even show you the word it thinks you’re going to write next. GPT works the same way, it just has waaaay more data it can sample from.
It’s all just a very advanced predictive text algorithm.
Ask it a question about basketball. It’s been trained on basically every document about basketball it could get, and it’s learned how often they reference hoops, Michael Jordan, sneakers, the NBA, etc. Then it just outputs things that are highly referenced, in a structure that makes grammatical sense.
For instance, if you had the word ‘basketball’, it knows it’s very unlikely for the word before it to be ‘radish’ and much more likely to be a word like ‘the’ or ‘play’, so it just strings words together in whatever way is most probable.
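If you want to see that idea in toy form, here’s a rough Python sketch of a count-based next-word predictor. To be clear, this is just an illustration of the “what word usually comes next” idea, not how GPT actually does it; the real thing learns neural network weights over huge amounts of text instead of keeping a raw count table.

```python
from collections import Counter, defaultdict

# A tiny stand-in corpus for "all the text about basketball".
corpus = "we play basketball in the park and we play basketball at the gym".split()

# Count how often each word follows each other word (a simple bigram table).
next_word_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_word_counts[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen right after `word` in the corpus."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("play"))  # -> 'basketball', since that's what usually follows 'play' here
print(predict_next("the"))   # -> 'park' (it ties with 'gym'; the first one counted wins)
```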
That’s the basics anyway.
I wonder what an ELI5 version of ‘stored weights’ would be in this context.