Hello! Would love to see Clay add native integrations for LLM models hosted on TPUs/fast inference chips (e.g., Llama on Cerebras or Groq). It would make some of the deeper research tasks on large lists so much faster!
Currently, we offer native integrations with several powerful LLM providers, including OpenAI's GPT models, Google's Gemini Pro, and Anthropic's Claude. These integrations support a range of AI tasks, including data cleaning, summarization, copywriting, and deeper research.

While we don't currently have native integrations for models hosted on TPUs or fast inference chips, such as Llama on Cerebras or Groq, our existing AI integrations provide robust capabilities for research tasks. You can access these models either through a Clay-managed account or by bringing your own API keys for specific models.
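In the meantime, one possible workaround: Groq exposes an OpenAI-compatible API, so you can call a Groq-hosted Llama model with your own key (for example, from a script, or wired into a Clay table via an HTTP request column if your plan supports it). Below is a minimal sketch, not an official Clay integration; it assumes the `openai` Python client, Groq's OpenAI-compatible endpoint at `https://api.groq.com/openai/v1`, and an example model ID like `llama-3.1-8b-instant`. Check Groq's docs for current endpoint and model names.

```python
# Minimal sketch: calling a Groq-hosted Llama model directly with your own key.
# Endpoint and model ID are assumptions; verify against Groq's documentation.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    api_key="YOUR_GROQ_API_KEY",                # bring-your-own API key
)

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # example Llama model ID; check Groq's model list
    messages=[
        {"role": "user", "content": "Summarize this company's product line: ..."},
    ],
)

print(response.choices[0].message.content)
```

Because the endpoint is OpenAI-compatible, the same request shape works for any model Groq hosts; you only swap the `model` string.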
Sources:
- Gemini: Generate text with Google Gemini
- Use AI integration overview | Documentation | Clay University
- Prompt Engineering Crash Course [Limitless Research] - Clay University