Context Window

128K ChatGPT Context Window

“Larger context basically means you can write a larger text prompt, and get a larger and more detailed response back. So you could for example copy the text from multiple pages from a book (up to 300 pages, if the claims from the announcement are accurate), and then ask it to summarize the content, analyze, identify key points or themes, etc.” https://www.reddit.com/r/ChatGPT/comments/17pa61n/what_does_the_128k_context_window_mean_for/
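To make the "300 pages" figure concrete: the window is measured in tokens, not pages, so the practical check is to count tokens before sending. A minimal sketch using the tiktoken library, assuming a 128K-token window; the file name and reserved reply budget are illustrative:

```python
# Check whether a long document fits in a 128K-token context window
# before sending it. Window size and reply budget are assumptions
# for illustration, not values from any official API.
import tiktoken

CONTEXT_WINDOW = 128_000   # advertised window size, in tokens
RESPONSE_BUDGET = 4_096    # tokens reserved for the model's reply (assumed)

def fits_in_context(text: str, model: str = "gpt-4") -> bool:
    """Return True if `text` plus the reserved reply budget fits the window."""
    encoding = tiktoken.encoding_for_model(model)
    n_tokens = len(encoding.encode(text))
    return n_tokens + RESPONSE_BUDGET <= CONTEXT_WINDOW

book_text = open("book.txt", encoding="utf-8").read()
if fits_in_context(book_text):
    print("Whole book fits; summarize in one request.")
else:
    print("Too long; split into chunks and summarize each.")
```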

“It also means that the AI will remember more of your long conversations. For example, let’s say you ask it to give you ideas for a story and it says, “This is a character named Paul whose brother is named Lenny.”” https://www.reddit.com/r/ChatGPT/comments/17pa61n/what_does_the_128k_context_window_mean_for/

“Then you keep asking for more and more details about the story, and it comes up with a story about Paul traveling to France and doing all of these interesting things. If you chat long enough and then ask it for the name of Paul’s brother, that first message could land outside of the context window, which means the AI will forget the answer it previously gave you. It might reply that Paul’s brother is named Dave, or it might even say that Paul doesn’t have a brother.” https://www.reddit.com/r/ChatGPT/comments/17pa61n/what_does_the_128k_context_window_mean_for/

“A longer context window allows you to have much longer conversations before it starts to “forget” things.” https://www.reddit.com/r/ChatGPT/comments/17pa61n/what_does_the_128k_context_window_mean_for/
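A sketch of the mechanism behind this "forgetting": chat APIs are stateless, so the client resends the conversation history each turn and must trim it to fit the window, with the oldest messages dropped first. The per-message token overhead is a rough assumption, and the window here is made absurdly small so the effect is visible:

```python
# Simulate how early messages fall out of a fixed token budget.
# The 4-token-per-message overhead is a rough approximation.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

def trim_to_window(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the most recent messages whose total tokens fit in max_tokens."""
    kept, used = [], 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = len(enc.encode(msg["content"])) + 4  # rough per-message overhead
        if used + cost > max_tokens:
            break  # this message and everything older is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

history = [
    {"role": "assistant", "content": "This is a character named Paul whose brother is named Lenny."},
    {"role": "user", "content": "Tell me more about Paul's trip to France."},
    # ... many more turns ...
    {"role": "user", "content": "What is the name of Paul's brother?"},
]
# With a tiny window, the first message is trimmed away:
# the model never sees "Lenny" and may invent a different answer.
print(trim_to_window(history, max_tokens=30))
```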

The largest models, such as Google's Gemini 1.5, presented in February 2024, can have a context window of up to 1 million tokens (a 10-million-token context window was also "successfully tested").

Snippet from Wikipedia: Large language model

A large language model (LLM) is a language model trained with self-supervised machine learning on a vast amount of text, designed for natural language processing tasks, especially language generation.

The largest and most capable LLMs are generative pretrained transformers (GPTs), which are largely used in generative chatbots such as ChatGPT, Gemini or Claude. LLMs can be fine-tuned for specific tasks or guided by prompt engineering. These models acquire predictive power regarding syntax, semantics, and ontologies inherent in human language corpora, but they also inherit inaccuracies and biases present in the data they are trained on.
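To illustrate "guided by prompt engineering": the same model can be steered toward a specific task with a system message, no retraining required. A minimal sketch assuming the openai Python client; the model name and instructions are illustrative:

```python
# Steering a general-purpose model with a system message instead of
# fine-tuning. Assumes the openai client (v1+) and OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-turbo",  # illustrative model name
    messages=[
        # The system message sets task-specific behavior for this request only.
        {"role": "system", "content": "You are a copy editor. Reply only with the corrected text."},
        {"role": "user", "content": "Their going to the libary tomorow."},
    ],
)
print(response.choices[0].message.content)
```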