Hey team, I have a webhook automation set up for a Clay table, but starting this morning, attempts to write records to the table via the webhook are failing with: Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}. I have inspected the payload for each failing record and they all look benign (see sample logs below). The requests are also spaced out every 1s, so I don't suspect a rate-limit issue either. My only hunch is that the table I am writing to in Clay is almost at 50K rows (49,896 to be exact), but I'd assume being on the Enterprise plan (which we just signed up for) means that shouldn't be an issue. Please advise.
[info] [sendPeopleSourcingRequestsToClay] Sourcing people via Clay for cdp_people_sourcing_for_account_task of 53555, account_id=0016Q000027uik****
[info] [sendPeopleSourcingRequestsToClay] {
[info] companyInfo: {
[info] sfdcAccountId: '0016Q000027uik****',
[info] name: 'Lake Bluff, Village Of',
[info] domain: 'lakebluff.org',
[info] linkedinUrl: 'http://www.linkedin.com/company/lake-bluff-school-district-65'
[info] }
[info] }
[info] --- Running function: sendPeopleSourcingRequestToClay ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 2/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 3/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 4/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Failed running function: sendPeopleSourcingRequestToClay ---
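For reference, here's roughly what our sending logic looks like (a simplified sketch; the names and payload handling are illustrative, not our exact code):

```ts
// Simplified sketch of our sender: each record is POSTed to the Clay webhook,
// with up to 4 attempts spaced 1000 ms apart (matching the retry logs above).
type CompanyInfo = {
  sfdcAccountId: string;
  name: string;
  domain: string;
  linkedinUrl: string;
};

const CLAY_WEBHOOK_URL = process.env.CLAY_WEBHOOK_URL!; // our webhook endpoint

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function sendPeopleSourcingRequestToClay(companyInfo: CompanyInfo): Promise<void> {
  for (let attempt = 1; attempt <= 4; attempt++) {
    const res = await fetch(CLAY_WEBHOOK_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ companyInfo }),
    });
    if (res.ok) return;
    console.error(`Error evaluating sendPeopleSourcingRequestToClay: ${await res.text()}`);
    if (attempt < 4) await sleep(1000);
  }
  throw new Error("sendPeopleSourcingRequestToClay failed after 4 attempts");
}
```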
While the Enterprise plan does offer unlimited rows in general, this error is likely occurring because you're not using passthrough tables, which are specifically designed for processing large volumes of data efficiently.

To resolve this, you should set up your table as a passthrough table. This feature will:
- Automatically process incoming data
- Enrich it as needed
- Forward it to a designated destination
- Delete the original entries from the table

This approach keeps the table manageable while continuously handling new data.

Also, keep in mind that webhooks have rate limits of:
- 10 records per second of processing
- A maximum burst capacity of 20 records

Sources:
- Import Data to Clay Using Webhooks
- Configuring your table settings | Documentation | Clay University
- Workspace administration documentation | Documentation | Clay University
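If your automation ever needs to send faster than one record per second, a client-side throttle that stays within these limits could look like the sketch below (illustrative only, not an official Clay SDK):

```ts
// Illustrative token-bucket throttle for the documented webhook limits:
// sustained 10 records/second with a burst capacity of 20.
class WebhookThrottle {
  private tokens = 20;          // start with full burst capacity
  private readonly rate = 10;   // refill rate in tokens per second
  private last = Date.now();

  async acquire(): Promise<void> {
    for (;;) {
      const now = Date.now();
      this.tokens = Math.min(20, this.tokens + ((now - this.last) / 1000) * this.rate);
      this.last = now;
      if (this.tokens >= 1) {
        this.tokens -= 1;
        return;
      }
      await new Promise((resolve) => setTimeout(resolve, 100)); // wait for a token to refill
    }
  }
}
```

Calling `await throttle.acquire()` before each POST keeps the sender inside both the sustained and burst limits.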
Looks like you're still looking for help. We're escalating this over to our support team and someone should be in touch soon. In the meantime, feel free to include any other information that may be helpful in solving your issue!
Hey Arindam - Thanks for providing those details! The issue isn't with your table size, but rather with the webhook itself: each webhook has a 50k record limit. While I check with the team about potentially raising this limit for your enterprise account, there's a quick workaround: you can create a new webhook for the same table by clicking the + sign to add a fresh webhook. This will let you continue adding records while keeping everything in the same table. Just to clarify, though: standard tables do have a 50k row limit as well (unless they're passthrough tables). Let me know if you need help setting up the new webhook, and I'll get back to you shortly about the webhook limit if there's anything we can do here. https://downloads.intercomcdn.com/i/o/w28k1kwz/1341169530/be83ca757fc34c10de2fff232026/CleanShot+2025-01-18+at+_06bDXI6CpH%402x.png?expires=1737209700&signature=e31c6c0c8ee7c13f94ff9e6c7642bcb471f96bac1315b97d04eb998c850d5689&req=dSMjF8h4lIRcWfMW1HO4zS9EEMTRlUT287MosJy43Rf8VBra9q9HH%2BjrTrkd%0A%2Finf%0A
Hey Arindam - I heard back from our team. The issue occurs because the table is set up as a regular table rather than a passthrough table. I noticed your account didn't have the passthrough table feature enabled, but we've now activated it for your workspace. This will resolve the webhook record limit issue, since passthrough tables automatically delete rows after processing, preventing them from hitting the 50k record cap. You can follow the steps here to turn it on for the table and create a new webhook inside it. Let me know if you need any extra help, since it's a new feature.
Ok gotcha. Just to add some more context on how we use this table currently:
- the destination table for this webhook is a list of companies
- we have the people search action enabled on the table, which feeds the results to another table
So IIUC, once I reconfigure this as a passthrough table, the rows that were already used for people sourcing will be purged, allowing the new webhook to start working. And going forward, the auto-delete will kick in, preventing a repeat occurrence?
Hey Arindam! Thanks for reaching out. Happy to help ✨. Once you configure your table as passthrough, rows will auto-delete once enriched. However, because your workflow also includes a people search into a different table, we want to safeguard against company records being deleted before the people search is complete. One approach is to add a Lookup Row column that searches the people table and confirms whether a record was found. The passthrough logic can then be conditioned on whether that Lookup integration has run. Additionally, you can include "delay" columns to ensure the Lookup integration only runs once the people search for a company record is complete. Here's a video guide showing this. If you can share your table URL, I'd be happy to review your current setup and suggest improvements to your workflow.
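To make that guard concrete, here's the delete condition expressed as illustrative code (this is just the logic the columns implement, not Clay formula syntax):

```ts
// Illustrative only: the safeguard described above, expressed as plain code.
type CompanyRow = {
  domain: string;
  peopleSearchRan: boolean;   // the People Search action has completed for this row
  lookupFoundMatch: boolean;  // a Lookup Row column found this company in the people table
};

// A company row should only become eligible for auto-delete once the
// downstream people search has run AND its results are confirmed via lookup.
function eligibleForAutoDelete(row: CompanyRow): boolean {
  return row.peopleSearchRan && row.lookupFoundMatch;
}
```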
Gotcha. Here's the destination table of companies - https://app.clay.com/workspaces/239721/tables/t_xRYMHrqkiufZ/views/gv_sZaSQFhg8dA6 and here's the people table - https://app.clay.com/workspaces/239721/tables/t_H2CvgEDXZmsV/views/gv_dv3iRKtshHjw
Also, if I wanted to unblock myself in the meantime while we figure out the best way forward, can I just create a new webhook (i.e., a fresh one that hasn't hit its 50k limit) and let things run?
Hey Arindam! Thanks for sharing. Yes, you can send records to the table through a new webhook. We made a video covering an example workflow (specific to your table flow) that should help prevent company records from being deleted before the people search and company data sync are complete in the people table: https://www.loom.com/share/0cff503e57f649979ab8ecc1611f7bd9 Let me know if you have any questions!
Ok, I have set up the table based on the guidance here. Some follow-ups:
1. Can someone vet the setup real quick and chime in if I missed anything?
2. Zooming out a bit: now that the table is set up as passthrough, I wanna confirm the 50K webhook limit no longer applies, i.e., even if 50K records do get inserted by the new webhook, it wouldn't stop working since old records would continuously get purged upon enrichment?
3. When does the auto-delete kick in? I see new rows that were added and subsequently used for people sourcing, but the rows are still around.
4. This is more about the people sourcing action than the original problem: I want to confirm that a particular row (company) has already been used for people sourcing in the second table. Is there some indicator for that in the table? I see the People Search column getting filled up (see screenshot), but I do not see any contacts found for many of the rows in the second table. Does that mean no contacts were found for those companies?
Hey Arindam! Apologies for the delay. Your setup looks good. We confirmed that the auto-delete function is working as it should; visually it may look like it isn't, because rows are being imported as quickly as they are being deleted. For context, it is configured to delete 100 rows every time the limit is reached.

People Source action: we escalated the example you shared to the engineering team. Based on our tests, the search should have imported a record for that company. We're looking into this and will get back to you with an update. Additional tests do not show that this impacted other records in the table.

For context, records are imported based on the search criteria. In some instances, the company size filter may prevent contacts from being imported for companies that have no company size data in the provider's database: if a company doesn't match the criteria provided, no contacts will be imported for that record.
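Conceptually, the purge behaves like the sketch below (illustrative only, not our actual implementation), which is why the row count hovers near the cap instead of visibly dropping:

```ts
// Conceptual model of the auto-delete: when the table reaches its limit,
// the 100 oldest processed rows are purged, so under steady inflow the
// row count stays just below the cap rather than shrinking visibly.
const TABLE_LIMIT = 50_000;
const PURGE_BATCH = 100;

function ingestRow(table: { rows: unknown[] }, row: unknown): void {
  table.rows.push(row); // new record arrives via the webhook
  if (table.rows.length >= TABLE_LIMIT) {
    table.rows.splice(0, PURGE_BATCH); // purge the 100 oldest rows
  }
}
```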
Great. Thanks for validating the setup and looking into the people source action 🙏
I wanna confirm the 50K webhook limit no longer applies, i.e., even if 50K records do get inserted by the new webhook, it wouldn't stop working since old records would continuously get purged upon enrichment?
If you could confirm this part for me, that would be appreciated 🙏
Hey Arindam! Happy to help ✨. That is correct. With auto-delete enabled, your table won't run into issues with the 50k limit.
