I'm trying to feed ChatGPT huge amounts of data but I'm hitting the token limit. I remember there used to be the ability to have sequenced ChatGPT prompts in the same column (with the same context). Is there a way to do that?
Hey, thanks for reaching out! That sounds like a waterfall setup, where you ask the next ChatGPT column to use the previous column's response as its input.
The team is actually looking into some options to optimize this, though that may come later. GPT-4 Turbo can accept a lot more tokens, but OpenAI's rate limits for it are still very low.
For now, I'd suggest including as much context as you can in the first prompt and asking ChatGPT to return a long enough response, say at least 200 words or whatever you need. That should carry enough context into the following prompt.
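As a rough sketch of the waterfall idea: each step feeds the previous response back in as part of the next prompt. `call_chatgpt` here is a hypothetical stand-in for whatever your ChatGPT column actually calls, not a real API:

```python
def call_chatgpt(prompt: str) -> str:
    # Hypothetical placeholder: in practice this would be the
    # ChatGPT column / API call. It just echoes for illustration.
    return f"response to: {prompt[:40]}"

def waterfall(prompts):
    """Run prompts in sequence, feeding each response into the next prompt."""
    context = ""
    responses = []
    for prompt in prompts:
        # Prepend the previous response so context carries forward.
        full_prompt = f"{context}\n\n{prompt}".strip() if context else prompt
        response = call_chatgpt(full_prompt)
        responses.append(response)
        context = response  # the next step uses this response as input
    return responses
```

So each column's formula would reference the previous column's output cell instead of the raw data, which keeps every call under the token limit while the context still flows through.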