Request for Clay to Integrate LLM Models on TPUs for Faster Research

hello! would love to see Clay have native integrations for LLM models hosted on TPUs/fast inference chips (e.g., Llama on Cerebras or Groq). Would make some of the deeper research tasks on large lists so much faster!

  • Channeled (APP)

    Currently, we offer native integrations with several powerful LLM models, including OpenAI's GPT models, Google's Gemini Pro, and Anthropic's Claude. These integrations are designed to support a variety of AI tasks, including data cleaning, summarization, copywriting, and deeper research. While we don't currently have native integrations for models hosted on TPUs or fast inference chips, such as Llama on Cerebras or Groq, our existing AI integrations provide robust capabilities for research tasks. You can access these models either through our Clay-managed account or by using your own API keys for specific models.

    Sources:
    • Gemini: Generate text with Google Gemini
    • Use AI integration overview | Documentation | Clay University
    • Prompt Engineering Crash Course [Limitless Research] - Clay University

  • Channeled (APP)

    Is that what you were looking for?

  • Channeled (APP)

    Hi Wes H.! This thread was recently closed by our Support team. If you have a moment, please share your feedback:
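
For anyone who lands here: until a native integration exists, the reply's "use your own API keys" route can be exercised outside Clay, since providers like Groq expose an OpenAI-compatible chat endpoint. The sketch below only builds such a request (it does not send it); the model name `llama-3.1-8b-instant` and the `GROQ_API_KEY` environment variable are assumptions, not Clay configuration.

```python
# Sketch only: Clay has no native Groq integration (per the reply above).
# This builds, but does not send, an OpenAI-style chat completion request
# against Groq's OpenAI-compatible endpoint.
import json
import os
import urllib.request

GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions"


def build_chat_request(
    prompt: str, model: str = "llama-3.1-8b-instant"  # assumed model name
) -> urllib.request.Request:
    """Construct a POST request with the standard chat-completions payload."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        GROQ_CHAT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Assumed env var for the key; sending requires a real key.
            "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        },
        method="POST",
    )


req = build_chat_request("Summarize this company's product line in one line.")
print(req.full_url)
```

To actually send it you would pass `req` to `urllib.request.urlopen` (with a valid key); the same payload shape works for any OpenAI-compatible host, which is why fast-inference providers are easy to swap in.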