Is there any way to tell how many input and output tokens you use on a column for GPT? I've got one sitting in my sheet that is chewing a lot of GPT-4 input, costing a lot, but I can't figure out where… 🙂
Hey Liam, will ask if it would be possible to add a token counter for this... Took a look at the table, and I'd say the column most likely to be using varying amounts of tokens is the one called "✅ AI: Find Business Location", since it takes input from Scrape Website and the result/input lengths seem to vary quite a bit there. In case it helps to check a few of the longer prompts, here's OpenAI's token calculator: https://platform.openai.com/tokenizer
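If you'd rather estimate a whole column at once instead of pasting cells into the tokenizer one by one, here's a rough sketch. It uses OpenAI's common rule of thumb of ~4 characters per token for English text, so it's only an approximation (for exact counts you'd use the tokenizer page above or the tiktoken library); the cell values and per-1k-token price are placeholders you'd fill in from your own sheet and OpenAI's pricing page.

```python
# Back-of-envelope token/cost estimate for a column of prompts.
# Heuristic: ~4 characters per token for typical English text.

def estimate_tokens(text: str) -> int:
    """Approximate token count using the ~4 chars/token rule of thumb."""
    return max(1, round(len(text) / 4))

def estimate_cost(cells, usd_per_1k_tokens: float) -> float:
    """Rough total input cost for a list of cell values."""
    total = sum(estimate_tokens(c) for c in cells)
    return total / 1000 * usd_per_1k_tokens

# Placeholder data: short prompts vs. long scraped-page text.
cells = [
    "Find the business location for this company.",
    "Scraped website text goes here... " * 100,  # long scrape results vary a lot
]

for c in cells:
    print(f"{estimate_tokens(c):>6} tokens (approx) for cell starting: {c[:30]!r}")

# Plug in the current input price per 1k tokens from OpenAI's pricing page:
print(f"~${estimate_cost(cells, usd_per_1k_tokens=0.01):.4f} estimated input cost")
```

The big spread between short prompts and long scraped pages is usually what makes one column's cost hard to predict.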