$20 is equivalent to what, 10,000,000 tokens? At ~750 words/1k tokens, that’s 7.5 million words per month, or roughly 250,000 words per day, 10,416 words per hour, 173 words per minute, every minute, 24/7.
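For anyone who wants to check the arithmetic, here's the same back-of-the-envelope math as a script (assuming the ~$0.002 per 1k tokens price and the ~750 words per 1k tokens rule of thumb from the comment above):

```python
# Back-of-the-envelope: how many words $20 buys at ~$0.002 per 1k tokens.
budget_usd = 20.0
price_per_1k_tokens = 0.002      # assumed API price
words_per_1k_tokens = 750        # rough English average

tokens = budget_usd / price_per_1k_tokens * 1000        # 10,000,000 tokens
words_per_month = tokens * words_per_1k_tokens / 1000   # 7,500,000 words

per_day = words_per_month / 30   # 250,000 words/day
per_hour = per_day / 24          # ~10,416 words/hour
per_minute = per_hour / 60       # ~173 words/minute

print(int(tokens), int(words_per_month), int(per_day), int(per_hour), int(per_minute))
```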
I uh, do not have that big of a utilization need. It’s kind of weird to vastly overpay
Remember that the previous replies and responses are fed back in. If you’re 20 messages deep in a session, that’s quite a few tokens for each new question. An incredible deal nonetheless!
Presumably the paid API will also give you access when the ChatGPT website is at capacity, and for most people it is probably orders of magnitude cheaper.
Same here. That was the sole reason I upgraded. There were a few times where I really needed ChatGPT at a specific time and got the "we're at capacity" message. $20/mo is nothing to have that go away.
There were a few outages that (obviously) locked me out as a paying subscriber too. What I don't know is how often I was still able to get in while the service was 'at capacity' for free users. Knowing something like that might make me feel better about the value of Premium.
That’s a bummer to hear that outages can lock out paying subscribers. That hasn’t happened to me yet but if it does that would cause me to reconsider the premium subscription.
> 10,416 words per hour, 173 words per minute, every minute, 24/7.
Unless I'm misunderstanding something, it does not sound like that much when every query you make carries several hundred words of prompt, context and "memory". If the input you type is a couple words, but has 1k extra words automatically prepended, then the limits turn into 10 queries per hour, or one per 6 minutes.
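Spelling out that estimate (assuming ~1,000 words of prepended context plus a couple of typed words per query):

```python
# Turn the hourly word budget into a query budget when each query
# carries ~1k words of automatically prepended prompt/context.
words_per_hour = 10_416
words_per_query = 1_000 + 2    # prepended context + a couple typed words

queries_per_hour = words_per_hour / words_per_query   # ~10.4 queries/hour
minutes_per_query = 60 / queries_per_hour             # ~5.8, i.e. one every ~6 minutes

print(round(queries_per_hour), round(minutes_per_query))
```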
Not now, but if it'll end up powering next gen Copilot, email suggestions, search interfaces, etc. you might end up interacting with it a lot more each day, without realizing it.
Well, let's put it differently: all those hypothetical services are using the API in question, so your marginal cost for them taken together adds to $20/month, which they'll pass onto you, and you'll then happily pay, because you find the services useful.
Maybe. I’m pretty frugal and a big fan of doing things myself. I certainly hope that they can some day provide me with enough value to make spending $20 a no-brainer, but until that’s obvious or unavoidable, I’m not giving them $20 ¯\_(ツ)_/¯
"The main input is the messages parameter. Messages must be an array of message objects, where each object has a role (either “system”, “user”, or “assistant”) and content (the content of the message). Conversations can be as short as 1 message or fill many pages."
"Including the conversation history helps when user instructions refer to prior messages. In the example above, the user’s final question of “Where was it played?” only makes sense in the context of the prior messages about the World Series of 2020. Because the models have no memory of past requests, all relevant information must be supplied via the conversation. If a conversation cannot fit within the model’s token limit, it will need to be shortened in some way."
So it looks like you pass in the history with each request.
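Based on the documentation quoted above, the request payload would look something like this (a sketch; the surrounding client call is omitted, but the shape of the messages array matches the quoted docs, including their World Series example):

```python
# The messages array carries the whole conversation on every request,
# because the model has no memory of past calls.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the World Series in 2020?"},
    {"role": "assistant", "content": "The Los Angeles Dodgers won the World Series in 2020."},
    # This question only makes sense given the history above:
    {"role": "user", "content": "Where was it played?"},
]

# To continue the conversation, append the model's reply and the next
# user message; trim the oldest turns if you approach the token limit.
```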
I used the same trick with the previous GPT3 API (da-vinci) and it worked well, I'd pass as one big prompt:
User: hello (previous prompt)
Bot: hi (previous response)
User: who are you? (new prompt)
Bot: (here it continues conversation)
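That single-prompt trick can be sketched as a small helper that flattens the history into one string (the `User:`/`Bot:` labels are just the convention from the example above, not anything the API requires):

```python
def build_prompt(history, new_message):
    """Flatten (user, bot) turns into one completion-style prompt,
    ending with 'Bot:' so the model continues the conversation."""
    lines = []
    for user_msg, bot_msg in history:
        lines.append(f"User: {user_msg}")
        lines.append(f"Bot: {bot_msg}")
    lines.append(f"User: {new_message}")
    lines.append("Bot:")  # the model completes from here
    return "\n".join(lines)

print(build_prompt([("hello", "hi")], "who are you?"))
# User: hello
# Bot: hi
# User: who are you?
# Bot:
```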
I wonder how the new ChatGPT API differs, other than the fact that it's structured (you use JSON to represent the conversation memory separately instead of one large prompt).
I guess I will spend the next day playing around with the new API to figure it out.