Hey guys, quick help needed. I’m creating an automation in n8n that sends my Apollo leads with unidentified emails to Clay for enrichment. To be cautious, I’d like to know Clay’s API limits for sending data. How many rows of unidentified leads can I send to Clay to avoid hitting any rate limits and minimize the chance of failures?
For our API, the general guideline is 10 requests per second (or 20 per second in bursts), which works out to 864,000 requests per day at the sustained rate.
For webhooks specifically, the limit is 10 records per second per workspace, with a maximum burst capacity of 20 records. Each payload must not exceed 100KB. [2]
To stay within safe limits, send your Apollo leads at 10 rows per second or batch them appropriately to avoid rate limit issues.
Did that answer your question?
If you need any more help with your automation or API usage, please let me know. Would you like to provide more details about what you're trying to achieve or any challenges you're facing?
I'm using https://api.clay.com/v3/sources/webhook/ URL to send the rows. Should I apply a loop to wait for 1 second per 10 rows of data and just send the leads in bulk?
Yes, that approach works well. The webhook limit is 10 records per second, with a burst capacity of 20 records. [1]
A 1-second wait per batch of 10 rows keeps you safely within that limit. The Google Sheets example uses similar timing, with Utilities.sleep(1000) between sends. [2]
Keep each payload under 100KB and you'll avoid rate limit issues.
Was that helpful?
If you still need help with your webhook automation or have any other questions, I’m here to assist. Would you like to share more about what you’re trying to resolve or understand?
So in my use case, I'll be sending an Array of JSON with 10 objects per second. (Where each object represents a lead for enrichment). Is that the highest safe limit? How far can I push it?
Yes, 10 objects per second is the safe sustained limit for webhooks. You can briefly burst up to 20 records, but the sustained rate should stay at 10 records per second per workspace. [1]
If you have multiple webhook sources active in your workspace, divide that budget across them, since the 10 records/second limit applies to the workspace as a whole.
Is that what you were looking for?
If you need any further assistance with your webhook usage or API limits, I’m here to help. Would you like to provide more details about your setup or any other questions you have?
On average, if I'm using Clay to enrich Work Emails, how long would it take to enrich 10 rows vs 100 rows and respond back to a webhook via the HTTP API?
Our enrichment model means responses might take a minute or more per batch. [1] Processing time isn't directly proportional to row count: both 10 and 100 rows can take a similar amount of time, since enrichments run in parallel within Clay tables.
Are you planning to use our auto-delete feature to stream data through without storing it in Clay?
Yeah, I don't think I'll need to keep the data in Clay. I asked this question to understand when I should proceed with my workflow in n8n, since I don't want to depend on Clay's webhook response. The response will be handled by a separate async function that only updates a Google Sheet with the enriched data findings. So I'll create a hack of a Wait node to proceed with my workflow regardless of whether Clay responded or not.
Perfect approach! Since you don't need to store data in Clay, consider using our Auto-delete feature. It automatically enriches incoming webhook data, sends results to your destination (like Google Sheets), then deletes the rows, so Clay streams data through rather than storing it. [1]
This keeps things fast and lightweight for high-volume enrichment jobs, which sounds exactly like your use case. [2] Your async approach with the Wait node will work well given Clay's enrichment timing.
