TL;DR: it costs around 1.2 cents per GPT-4 chat-block.
Some basic napkin math since I couldn’t find the answer anywhere:
The GPT-4 8K-context API costs 3 cents per 1,000 input tokens and 6 cents per 1,000 output tokens [1].
According to this table GPT spit out, answer length depends on question complexity:
| Question Complexity | Estimated Tokens in Response |
|---------------------|-----------------------------|
| Very simple | 5-20 |
| Simple | 20-50 |
| Moderate | 50-150 |
| Complex | 150-300 |
| Very Complex | 300+ |
Therefore, let’s say 150 tokens for a response and 50 tokens for the input query. To keep things simple, we’ll bill all 200 tokens at the higher output rate of 6 cents per 1,000 tokens, giving us 200 * (6/1000) = 1.2 cents per question-answer pair (hereafter referred to as a chat-block).
(Correspondingly, the gpt-3.5-turbo API, at 0.2 cents per 1,000 tokens, yields 200 * (0.2/1000) = 0.04 cents per chat-block.)
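The per-chat arithmetic above can be sketched as a small helper. This is a minimal sketch under the post's own assumptions: 50 input + 150 output tokens per chat-block, all 200 tokens billed at the output rate as an upper bound.

```python
# Assumption from the text: a chat-block is ~50 input + ~150 output tokens,
# and we conservatively bill all 200 tokens at the output rate.
TOKENS_PER_CHAT = 200

def cost_cents_per_chat(output_rate_cents_per_1k: float) -> float:
    """Upper-bound cost of one chat-block, in cents."""
    return TOKENS_PER_CHAT * output_rate_cents_per_1k / 1000

gpt4_cents = cost_cents_per_chat(6.0)    # GPT-4 8K output rate: 6 cents / 1K tokens
gpt35_cents = cost_cents_per_chat(0.2)   # gpt-3.5-turbo rate: 0.2 cents / 1K tokens
print(gpt4_cents)   # ~1.2 cents per chat-block
print(gpt35_cents)  # ~0.04 cents per chat-block
```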
Now, let’s compare this to OpenAI’s $20/month ChatGPT Plus subscription. On my plan, I can make 50 queries to GPT-4 every 3 hours. Assuming I use all 50 queries once per day at 200 tokens per chat-block, the same usage through the API would cost (50 * 200) * (6/1000) = 60 cents per day. Over a 30-day month, this comes out to $18 in API usage costs.
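The monthly comparison works out as follows, again under the post's assumptions (50 chats/day, 200 tokens each, everything billed at the 6-cent output rate, 30-day month):

```python
# Assumptions from the text, not measured usage data.
CHATS_PER_DAY = 50
TOKENS_PER_CHAT = 200
OUTPUT_RATE_CENTS_PER_1K = 6.0   # GPT-4 8K output rate, used as a ceiling
DAYS_PER_MONTH = 30

daily_cents = CHATS_PER_DAY * TOKENS_PER_CHAT * OUTPUT_RATE_CENTS_PER_1K / 1000
monthly_dollars = daily_cents * DAYS_PER_MONTH / 100
print(daily_cents)      # 60 cents/day via the API
print(monthly_dollars)  # $18/month, just under the $20 subscription
```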
However, the subscription becomes the more economical option if you send more than 50 chats per day, or if your chat-blocks run longer. Additionally, the UI may be more enjoyable for those who don’t want to run their own scripts.
Sources:
[1]: https://openai.com/pricing