Large language models like ChatGPT learn by practicing with lots of examples, just like you learn to read by reading many books.
Learning by Example
Imagine you have a friend who wants to learn how to cook. Every time they watch a cooking show or try a new recipe, they get better at guessing what ingredients are needed for a dish. That's kind of like how ChatGPT learns: it practices with millions of sentences and paragraphs from all over the internet.
Getting Better with Practice
Every time ChatGPT reads a sentence or answers a question, it gets feedback, like when you get a sticker for doing well on a test. Over billions of these practice sessions, it becomes really good at predicting what word comes next, and that helps it write full sentences and even answer your questions!
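The "predict the next word" idea above can be sketched with a toy word-counting model. This is only an illustration: real models like ChatGPT use large neural networks, not simple counts, and the training text and function names here are made up for the example.

```python
from collections import Counter, defaultdict

# Toy "language model": count which word follows which in some text,
# then predict the most commonly seen next word. The training text
# below is invented purely for illustration.
training_text = (
    "the cat sat on the mat . the cat ate the fish . "
    "the dog sat on the rug ."
)

next_word_counts = defaultdict(Counter)
words = training_text.split()
for current, following in zip(words, words[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the training text."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" most often in this text
```

A real model does something much richer, looking at whole passages rather than a single word, but the feedback loop is the same in spirit: each guess is compared against the word that actually came next, and the model is nudged to guess better.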
Examples
- A child learns to speak by listening to their parents and repeating words.
- Imagine a robot that memorizes millions of sentences and tries to guess the next word in a sentence.
- Like a student studying for a test with every book ever written.
See also
- How do large language models learn to talk like humans?
- How do large language models like ChatGPT work?
- How do AI language models generate text like humans?
- What are convolutional neural networks?
- How do large language models like GPT-4o actually generate text?