"Human: you seem to remember the context of the chat, how many context you keep, the most recent previous output, or a few conversations
AI: I generally remember the last few conversations and I'm also capable of keeping context in longer conversations."
I don't think there is a way to feed context back to the script efficiently; maybe it needs a local database or a fixed-length queue to sustain the subject, to mimic the web-browser experience at the terminal.
The real magic of ChatGPT versus old chatbots is that it maintains context. How to keep that via API calls is something I don't fully understand yet; I guess I need to read the API reference.
ChatGPT feeds the context of the conversation back into itself. When it hits the max token limit, it summarizes the conversation and feeds that summary back into itself instead. This is one of the ways ChatGPT can start losing track of what it was talking about: the summary may leave out key details or mis-summarize the conversation.
For example, a conversation starts with the prompt:
The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.
Human: Hello, who are you?
AI: I am an AI created by OpenAI. How can I help you today?
Human: I am fine. Working hard.
The completion is then appended in line, along with the next Human turn:
The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.
Human: Hello, who are you?
AI: I am an AI created by OpenAI. How can I help you today?
Human: I am fine. Working hard.
AI: That's great to hear! Do you need any assistance with anything?
Human: How many pecks are in a bushel?
This prompt is roughly 93 tokens.
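(If you want to verify counts like this yourself, OpenAI's tiktoken library exposes the tokenizers. A minimal sketch; my choice of the p50k_base encoding is an assumption that matches the davinci-era completion models used in this example.)

```python
# Sketch: counting prompt tokens with tiktoken. "p50k_base" is an
# assumption, matching davinci-era completion models.
import tiktoken

enc = tiktoken.get_encoding("p50k_base")
prompt = "The following is a conversation with an AI assistant. ..."  # full prompt above
print(len(enc.encode(prompt)))  # token count of what you are about to send
```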
And then I get:
The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.
Human: Hello, who are you?
AI: I am an AI created by OpenAI. How can I help you today?
Human: I am fine. Working hard.
AI: That's great to hear! Do you need any assistance with anything?
Human: How many pecks are in a bushel?
AI: A bushel contains 8 US (or 32 UK) pecks. Is there anything else I can help you with?
Human:
And then...
The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.
Human: Hello, who are you?
AI: I am an AI created by OpenAI. How can I help you today?
Human: I am fine. Working hard.
AI: That's great to hear! Do you need any assistance with anything?
Human: How many pecks are in a bushel?
AI: A bushel contains 8 US (or 32 UK) pecks. Is there anything else I can help you with?
Human: What produce does that measure?
AI: A bushel is a dry measure used for measuring fruits, vegetables, and other grain products.
Human:
You can see that each time, the entire chat history is fed back in as part of the prompt. That is how you maintain the context of the chat.
To do this as part of a command-line interface, yep - it would need something like a .gpthistory file which gets fed into each message... though this eats tokens. See the sketch below.
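Here is a minimal sketch of that loop, assuming the legacy openai-python (pre-1.0) Completion API and an OPENAI_API_KEY in the environment; the .gpthistory filename is just the one suggested above:

```python
# Sketch: a terminal chat that keeps context by replaying the whole
# history on every call. Assumes the legacy openai-python (<1.0)
# Completion API; ".gpthistory" is the scratch file suggested above.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

HISTORY_FILE = ".gpthistory"
PREAMBLE = ("The following is a conversation with an AI assistant. "
            "The assistant is helpful, creative, clever, and very friendly.\n")

def load_history():
    # Replay everything said so far; start fresh if there is no history yet.
    if os.path.exists(HISTORY_FILE):
        with open(HISTORY_FILE) as f:
            return f.read()
    return PREAMBLE

while True:
    history = load_history()
    user = input("Human: ")
    history += f"Human: {user}\nAI:"
    resp = openai.Completion.create(
        model="text-davinci-003",
        prompt=history,          # the entire conversation so far
        max_tokens=150,
        stop=["Human:"],         # stop before the model invents our next turn
    )
    answer = resp.choices[0].text.strip()
    print(f"AI: {answer}")
    with open(HISTORY_FILE, "w") as f:  # persist for the next call
        f.write(history + " " + answer + "\n")
```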
You can see some of this in the playground if you look at the 'view code' button at the top.
Feed the output of the previous prompt back into itself.
If you look at https://platform.openai.com/playground/p/default-chat?model=... you will see that the prompt for the next completion is all of the previously generated conversation.
This can consume tokens at an accelerating rate.
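One way to keep that growth in check, and roughly the summarization behavior described earlier in this thread, is to compress the history once it passes a token budget. A sketch under the same assumptions (legacy Completion API, p50k_base encoding); the 2000-token budget and the summary prompt wording are arbitrary choices of mine:

```python
# Sketch: summarize-and-continue once the replayed history gets too long.
# Whatever the summary drops is gone for good, which is exactly how the
# model can "lose track" of earlier details.
import os
import openai
import tiktoken

openai.api_key = os.environ["OPENAI_API_KEY"]
enc = tiktoken.get_encoding("p50k_base")
TOKEN_BUDGET = 2000  # arbitrary example budget

def maybe_summarize(history: str) -> str:
    if len(enc.encode(history)) <= TOKEN_BUDGET:
        return history  # still cheap enough to replay verbatim
    resp = openai.Completion.create(
        model="text-davinci-003",
        prompt="Summarize this conversation, keeping all key facts:\n\n" + history,
        max_tokens=300,
    )
    summary = resp.choices[0].text.strip()
    return "Summary of the conversation so far: " + summary + "\n"
```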