How do I manage large lookups?
For large lookups, use the Limit parameter in the "Lookup Multiple Rows in Other Table" action to cap the number of returned rows—this cuts down response size when you have many columns. [1]
If you're dealing with extremely large datasets, consider writing excessive data to Google Sheets with an ID reference, then importing summarized data back into Clay using lookup actions.
Also, if you only need one row, use "Lookup Single Row in Other Table" instead—it's faster than the multiple row lookup.
But you can only write 300 rows per minute to a Google Sheet.
Was that helpful?
You're right: the Google Sheets API allows 300 write requests per minute per project (reads have the same per-minute cap). When you hit that limit, Google temporarily blocks further requests until the quota resets (typically after a minute).
The best workaround is to retry the request after a brief pause once the quota resets.
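If you're scripting the writes yourself, a minimal retry sketch could look like the TypeScript below. The writeBatch callback is a hypothetical stand-in for whatever call performs the actual append (for example, the Sheets API spreadsheets.values.append endpoint), and the 429 status check assumes your client surfaces Google's rate-limit response directly.

```typescript
// Minimal sketch: retry a Sheets write after the per-minute quota is exhausted.
// `writeBatch` stands in for whatever call performs the append; a 429 response
// is how Google signals that the rate limit was exceeded.
async function writeWithRetry(
  writeBatch: () => Promise<{ status: number }>,
  maxAttempts = 5,
): Promise<void> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await writeBatch();
    if (res.status !== 429) return;                  // success or a non-quota error
    console.warn(`Quota hit on attempt ${attempt}; waiting for the window to reset`);
    await new Promise((r) => setTimeout(r, 60_000)); // quota window resets after ~1 minute
  }
  throw new Error("Still rate-limited after retries");
}
```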
Are you running into this limit when writing data to Sheets, or when looking up data from Sheets back into Clay?
I need to insert 10,000 rows at once 😉
For bulk inserts like that, the standard Google Sheets "Add Row" action won't work due to API rate limits.
Your best option is Google Apps Script with a webhook approach. You'd set up a script that sends batches of data from your Sheet to a Clay webhook table (ideally a passthrough table if you're on Enterprise). The script can control the send rate with Utilities.sleep() to match your enrichment speed. [1]
The webhook method lets you process high volumes over time without hitting the 300/min wall, since you're controlling the flow from the Google side.
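As a rough sketch of what that script could look like, here's one version written in TypeScript (which the Apps Script clasp tooling accepts; plain JavaScript works the same way). The webhook URL, sheet name, batch size, and pause length are placeholders to adjust for your setup.

```typescript
// Rough sketch of the Apps Script side: read rows from a sheet and POST them
// to a Clay webhook table, pausing between batches to control the flow.
const CLAY_WEBHOOK_URL = "PASTE_YOUR_CLAY_WEBHOOK_URL_HERE"; // placeholder
const BATCH_SIZE = 100;  // rows per batch
const PAUSE_MS = 2000;   // pause between batches to match your enrichment speed

function sendRowsToClay(): void {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Data");
  if (!sheet) throw new Error("Sheet 'Data' not found");

  const values = sheet.getDataRange().getValues();
  const headers = values[0].map(String);
  const rows = values.slice(1);

  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);

    // One POST per row keeps the payload shape simple for a webhook table;
    // the batching here only controls pacing, not the request format.
    batch.forEach((row) => {
      const record: Record<string, unknown> = {};
      headers.forEach((h, idx) => (record[h] = row[idx]));
      UrlFetchApp.fetch(CLAY_WEBHOOK_URL, {
        method: "post",
        contentType: "application/json",
        payload: JSON.stringify(record),
      });
    });

    Utilities.sleep(PAUSE_MS); // throttle so the destination can keep up
  }
}
```

Keep in mind that Apps Script caps a single execution at a few minutes, so for 10,000+ rows you may need to split the work across time-driven triggers.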
Are you on the Enterprise plan (which would give you access to passthrough tables)?
I'll build a PostgreSQL database instead with Node.js 😉
That sounds like a solid plan. If you have any questions or need assistance with integrating your PostgreSQL setup, feel free to ask.
If you still need help with your PostgreSQL integration or any other part of your setup, I’m here to assist. Would you like to share more details about what you’re trying to achieve or any challenges you’re facing?
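If it's useful as a starting point, here's a minimal sketch of a chunked bulk insert with Node.js and the node-postgres (pg) library, written in TypeScript. The table name, columns, row shape, and connection string are all placeholder assumptions, not a prescribed schema.

```typescript
// Minimal sketch: bulk-insert rows into PostgreSQL with node-postgres (pg).
// Table/column names and the connection string are placeholders.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

interface Row {
  company: string;
  domain: string;
}

async function bulkInsert(rows: Row[], chunkSize = 1000): Promise<void> {
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);

    // Build one multi-row INSERT per chunk: VALUES ($1,$2), ($3,$4), ...
    const params: unknown[] = [];
    const placeholders = chunk
      .map((row, idx) => {
        params.push(row.company, row.domain);
        return `($${idx * 2 + 1}, $${idx * 2 + 2})`;
      })
      .join(", ");

    await pool.query(
      `INSERT INTO companies (company, domain) VALUES ${placeholders}`,
      params,
    );
  }
}

// Example: 10,000 synthetic rows in chunks of 1,000 (ten statements total).
const data: Row[] = Array.from({ length: 10_000 }, (_, i) => ({
  company: `Company ${i}`,
  domain: `company${i}.example.com`,
}));

bulkInsert(data)
  .then(() => console.log("done"))
  .catch(console.error)
  .finally(() => pool.end());
```

For even larger loads, PostgreSQL's COPY (for example via the pg-copy-streams package) is typically faster than multi-row INSERTs.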