Question: when I make an API call to gpt-3.5-turbo, I see that it goes through your backend rather than to OpenAI directly. That's OK — I understand you might want to do this for many technical reasons. But I have a question: do you store the calls I make to OpenAI in an internal database?
I think a partnership with Pinecone might be possible. After the post I made, it looks like they're the ones with the most resources and the willingness to build this integration.
The day I see this in a no-code tool like Clay I will be so happy, because it will enable us to go to the next level.
SO much of what we do is classification. But LLMs are overkill for it, and they're too unpredictable to do it reliably at scale.