
Webhook Automation Failure: Record Limit Reached for Clay Table


Hey team, I have a webhook automation set up for a Clay table, but starting this morning attempts to write records to the table via the webhook are failing with Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}. I have inspected the payload for each failing record and they all look benign (see sample logs below). The requests are also spaced out every 1s, so I don't suspect it's a rate-limit issue either. My only hunch is that the table I am writing to in Clay is almost at 50K rows (49,896 to be exact), but I'd assume being on the Enterprise plan (which we just signed up for) should make that a non-issue. Please advise.

[info] [sendPeopleSourcingRequestsToClay] Sourcing people via Clay for cdp_people_sourcing_for_account_task of 53555, account_id=0016Q000027uik****
[info] [sendPeopleSourcingRequestsToClay] {
[info]   companyInfo: {
[info]     sfdcAccountId: '0016Q000027uik****',
[info]     name: 'Lake Bluff, Village Of',
[info]     domain: 'lakebluff.org',
[info]     linkedinUrl: 'http://www.linkedin.com/company/lake-bluff-school-district-65'
[info]   }
[info] }
[info] --- Running function: sendPeopleSourcingRequestToClay ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 2/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 3/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Retrying query sendPeopleSourcingRequestToClay (attempt 4/4) in 1000 ms ---
[info] Error evaluating sendPeopleSourcingRequestToClay: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}
[info] --- Failed running function: sendPeopleSourcingRequestToClay ---
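A side note on the retry loop visible in these logs: the 403 here signals a hard record limit, not a transient failure, so the 1-second retries can never succeed. A minimal sketch (Python; `post` is a hypothetical callable standing in for the actual HTTP POST to the Clay webhook) of retry logic that backs off on transient errors but stops immediately on the record-limit response:

```python
import time

def send_with_retry(post, payload, attempts=4, delay_s=1.0):
    """Sketch: retry transient failures, but treat the record-limit 403 as fatal.

    `post` is any callable returning a (status_code, body) tuple; in practice
    it would wrap the real HTTP POST to the Clay webhook URL.
    """
    last = None
    for attempt in range(1, attempts + 1):
        status, body = post(payload)
        if status < 400:
            return status, body
        last = (status, body)
        # "Record limit reached" is a hard cap, not a rate limit: retrying
        # the same webhook will keep failing, so bail out immediately.
        if status == 403 and "Record limit reached" in body:
            break
        if attempt < attempts:
            time.sleep(delay_s)
    return last
```

With this shape, the four identical retries above collapse into a single failed attempt, surfacing the limit problem sooner.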
  • Avatar of Channeled
    Channeled
    APP

    While the Enterprise plan does offer unlimited rows in general, this error is likely occurring because you're not using Passthrough tables, which are specifically designed for processing large volumes of data efficiently. To resolve this, set up your table as a Passthrough table. This feature will:

    * Automatically process incoming data
    * Enrich it as needed
    * Forward it to a designated destination
    * Delete the original entries from the table

    This approach keeps the table manageable while continuously handling new data. Also keep in mind that webhooks have rate limits of:

    * 10 records per second processing
    * A maximum burst capacity of 20 records

    Sources: Import Data to Clay Using Webhooks; Configuring your table settings | Documentation | Clay University; Workspace administration documentation | Documentation | Clay University
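Those limits (10 records per second, burst capacity of 20) map naturally onto a token bucket on the sending side. A minimal sketch in Python; the class and method names are illustrative, not part of any Clay API, and the numbers come straight from the limits quoted above:

```python
class TokenBucket:
    """Client-side pacing for the quoted webhook limits:
    refill at `rate` records/second, allow bursts up to `capacity`."""

    def __init__(self, rate=10.0, capacity=20):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = 0.0  # timestamp of the last refill, in seconds

    def try_send(self, now):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True   # OK to send this record now
        return False      # caller should wait briefly and retry
```

Calling `try_send` before each webhook POST (and sleeping briefly on `False`) keeps the sender inside the quoted limits.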

  • Avatar of Channeled
    Channeled
    APP

    Looks like you're still looking for help. We're escalating this over to our support team and someone should be in touch soon. In the meantime, feel free to include any other information that may be helpful in solving your issue!

  • Avatar of Bo (.
    Bo (.

    Hey Arindam - Thanks for providing those details! The issue isn't with your table size, but with the webhook itself: each webhook has a 50k record limit. While I check with the team about potentially raising this limit for your Enterprise account, there's a quick workaround: you can create a new webhook for the same table by clicking the + sign. This will let you continue adding records while keeping everything in the same table. Just to clarify, standard tables do have a 50k row limit as well (unless they're passthrough tables). Let me know if you need help setting up the new webhook, and I'll get back to you shortly about the webhook limit if there's anything we can do here. https://downloads.intercomcdn.com/i/o/w28k1kwz/1341169530/be83ca757fc34c10de2fff232026/CleanShot+2025-01-18+at+_06bDXI6CpH%402x.png?expires=1737209700&signature=e31c6c0c8ee7c13f94ff9e6c7642bcb471f96bac1315b97d04eb998c850d5689&req=dSMjF8h4lIRcWfMW1HO4zS9EEMTRlUT287MosJy43Rf8VBra9q9HH%2BjrTrkd%0A%2Finf%0A

  • Avatar of Bo (.
    Bo (.

    Hey Arindam - I heard back from our team. The issue occurs because the table is set as a regular table rather than a pass-through table. I noticed your account didn't have the passthrough table feature enabled, but we've now activated it. This will resolve the webhook record limit issue since passthrough tables automatically delete rows after processing, preventing the 50k record cap. You can follow the steps here to activate it and create a new webhook inside this table. Let me know if you need any extra help since it's a new feature.

  • Avatar of Arindam D.
    Arindam D.

    Ok gotcha. Just to add some more context on how we use this table currently:

    1. The destination table for this webhook is a list of companies.
    2. We have a people search action enabled on the table, which feeds the results to another table.

    So IIUC, once I reconfigure this as a passthrough table, the rows that were already used for people sourcing would be purged, allowing the new webhook to start working. And going forward, the auto-delete will kick in, preventing a repeat occurrence?

  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Thanks for reaching out. Happy to help ✨. Once you configure your table for "pass-through", rows will auto-delete once enriched. However, because your workflow also includes a people search in a different table, we also want to safeguard against records being deleted before the people search is complete. One approach is to add a Lookup Row column to search the people table and confirm whether a record was found. The "pass-through" logic can then be configured based on whether the Lookup integration has run. Additionally, you can include "delay" columns to ensure the Lookup integration only runs once the people search for a company record is complete. Here's a video guide showing this. If you can share your table URL, I'd be happy to review your current setup and provide suggestions for your workflow.
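The guard described above (delay columns plus a Lookup Row check gating the pass-through delete) amounts to a per-row predicate. A rough sketch in Python; the field names `delays_complete` and `lookup_ran` are hypothetical stand-ins for the delay and Lookup Row columns, not real Clay fields:

```python
def row_deletable(row):
    """Sketch of the pass-through delete guard described above.

    A company row should only be purged once:
      - the delay columns have finished (the people search had time to run), and
      - the Lookup Row integration has run against the people table,
    so rows are never deleted mid-search. Note a lookup that ran but found
    no contacts still counts as processed.
    """
    if not row.get("delays_complete"):
        return False
    if not row.get("lookup_ran"):
        return False
    return True
```

The point of the predicate is ordering: deletion eligibility depends on the downstream people-table work being observably finished, not merely on the row having been enriched.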

  • Avatar of Arindam D.
    Arindam D.

    Also if I wanted to unblock myself in the meantime while we figure out the best way forward, can I just create a new webhook (that doesn't have the 50k limit) and let things run?

  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Thanks for sharing. Yeah, you can send records to a new webhook table. We made a video covering an example workflow (specific to your table flow) that should help with preventing company records from being deleted before the search & company data sync is complete in the people table: https://www.loom.com/share/0cff503e57f649979ab8ecc1611f7bd9 Let me know if you have any questions!

  • Avatar of Arindam D.
    Arindam D.

    Ok I have set up the table based on the guidance here. Some follow-ups:

    1. Can someone vet the setup real quick and chime in if I missed anything?
    2. Zooming out a bit, now that the table is set up as a passthrough, I want to confirm the 50K webhook limit no longer applies, i.e. even if 50K records do get inserted by the new webhook it wouldn't stop working, since old records would continuously get purged upon enrichment?
    3. When does the auto-delete kick in? I see new rows that were added and subsequently used for people sourcing, but the rows are still around.
    4. This is more about the people sourcing action than the original problem: I want to confirm that a particular row (company) has already been used for people sourcing in the second table. Is there some indicator for that in the table? I see the People Search column getting filled up (see screenshot), but I do not see any contacts found for many of the rows in the second table. Does that mean no contacts were found for this company?

  • Avatar of Arindam D.
    Arindam D.
  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Apologies for the delay. Your setup looks good. We confirmed that the auto-delete function is working as it should. Visually it may look like it isn't, because rows are being imported as quickly as they are being deleted. For context, it is configured to delete 100 rows every time the limit is reached.

    People Source action: we escalated the example you shared to the engineering team. Based on our tests, the search should have imported a record for that company. We're looking into this and will get back with an update. Additional tests do not show that this impacted other records in the table. For context, records are imported based on the search criteria. In some instances, the company-size filter applied may prevent contacts from being imported for records that do not have company-size data in the provider's database. That means contacts are only imported for a company that matches the criteria provided; if it does not match, no contacts are imported for that company record.

  • Avatar of Arindam D.
    Arindam D.

    Great. Thanks for validating the setup and looking into the people source action 🙏

  • Avatar of Arindam D.
    Arindam D.

    I want to confirm the 50K webhook limit no longer applies, i.e. even if 50K records do get inserted by the new webhook it wouldn't stop working, since old records would continuously get purged upon enrichment?

    If you could confirm this part for me, that would be appreciated 🙏

  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Happy to help ✨. That is correct. With auto-delete enabled, your table won't run into issues with the 50k limit.

  • Avatar of Arindam D.
    Arindam D.

    gotcha. thanks!

  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Quick follow-up: there's a limit on the number of records that can be sent via webhook per second. For context, the limit is 10 records per second, with a maximum burst capacity of 20 records. This means the request will have to send records within that limit. This would also allow records to be deleted at the same pace as they are imported (the table is configured to delete 1000 rows every time the limit is reached). Let me know if you have any questions.
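As a quick sanity check on those numbers (a sketch only; the 10 records/second figure is the limit quoted above), draining a full 50,000-record backlog at that pace takes roughly 83 minutes:

```python
# Back-of-the-envelope arithmetic for the quoted webhook throughput cap.
def drain_time_seconds(n_records, rate_per_s=10):
    """Seconds needed to push n_records through a webhook capped at rate_per_s."""
    return n_records / rate_per_s

# A full 50,000-record backlog at 10 records/second:
minutes = drain_time_seconds(50_000) / 60  # roughly 83 minutes
```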

  • Avatar of Daniela D.
    Daniela D.

    Hey Arindam! Thanks for your patience. We heard back from the team and confirmed that the record (directsalestfloors) did not return a result because the LinkedIn URL used to run the search (which shows as unclaimed on LinkedIn) has only one contact associated with it, and they do not match the search criteria (title shows "Sales Professional"). That's why a profile was not added to the People table. Alternatively, if the domain is used for that company, it will return a record that matches your criteria. Let me know if you have any questions!

  • Avatar of Channeled
    Channeled
    APP

    Hi Arindam D.! This thread was recently closed by our Support team. If you have a moment, please share your feedback:

  • Avatar of Channeled
    Channeled
    APP

    Thanks! We've reopened this thread. You can continue to add more detail directly in this thread.

  • Avatar of Arindam D.
    Arindam D.

    Hey folks, reopening this thread since I am now seeing that the table has had no new entries since 1/22. Basically this is what I see:

    1.
    2. We have successfully sent webhook requests to this table per the logs on our end as of 1/28.
    3. I have double-checked the webhook URL and confirmed they match.
    4. No rows have been auto-deleted since 1/22 either, per the logs in the Clay UI (see screenshot), so I don't imagine these companies were added and eventually deleted post-enrichment.

    Can you help sort this out?

  • Avatar of Arindam D.
    Arindam D.

    I also see a similar issue on this table https://app.clay.com/workspaces/239721/tables/t_vkep3NjUjYb7/views/gv_cbF9n6TX7DCE where we basically had the same issue of the webhook reaching 50K rows. I set up a new webhook, which per my understanding from this thread should have reset the limit and allowed the records to get inserted, but I am starting to think the 50K row limit on the table itself is somehow getting in the way.

  • Avatar of Arturo O.
    Arturo O.

    Hey Arindam, thanks for reaching out. Checking on this! You're correct, there's a mix of circumstances here. The webhook itself wouldn't remove enriched records, so it would stop importing once 50k rows are reached; the auto-delete feature is what keeps records flowing. However, only the first table has it enabled, and your current limits are set to allow up to 1000 rows to remain in the table after the conditions have been met before they're deleted. Quick question and observation: there are many rows in both tables that have not been enriched yet, which is why they remain there instead of being deleted. Should they be processed, to allow the delete feature to continue and let more records get pushed in? https://downloads.intercomcdn.com/i/o/w28k1kwz/1355645171/669422b51f587e2a62bbf150edc7/slack_YqBuYdWhkx.png?expires=1738113300&signature=3c1389fd7ded4c74e57815b26b8d3a2fb2aaaf9f694c4df4eca014235d32bf51&req=dSMiE896mIBYWPMW1HO4zdQog%2BNpbkkSaq7WBBQvqhbGUPDvVXpRcK70XLBx%0AiZYv%0A https://downloads.intercomcdn.com/i/o/w28k1kwz/1355647593/e9583c90f4f16daca292b52844fb/chrome_rYacgmNVAo.png?expires=1738113300&signature=a3fafb8dbc1d562fa930ecf4bb6c720e21e4fe0a6de2bddbb20479fd38cd45eb&req=dSMiE896moRWWvMW1HO4zWVtA7zqomJ2j3zJ7cWIm7wdtvciVYmubLsNlLlH%0Ajg1f%0A

  • Avatar of Arturo O.
    Arturo O.

    Alternatively, I can bring the enriched-row limit down for you so fewer records remain after they've been enriched. For example, at 50-100 per table, once records are enriched and that max limit is reached they'll be removed to let new rows come in, making sure the overall 50k table limit is never reached.

  • Avatar of Arindam D.
    Arindam D.

    ok I have a few follow-ups:

    1. In https://app.clay.com/workspaces/239721/tables/t_xRYMHrqkiufZ/views/gv_sZaSQFhg8dA6 I see a bunch of rows where the auto-delete criteria were met (delay2 has success), but these still seem to be around (see screenshot). Why is that?
    2. I was told earlier in the thread that each webhook has a 50k record limit. So when I switched to a new webhook, why did record insertion stop at 50K? Is it because of a limit on rows in the table itself? Tables should have "unlimited rows" on the Enterprise plan, right?

  • Avatar of Arturo O.
    Arturo O.

    Appreciate the follow-up, Arindam! I thought it would help to go over the example via a quick Loom to explain what each feature does and how it behaves. Let me know about those final thoughts as well: https://www.loom.com/share/28961542d46c4251a1878a2a187311e3

  • Avatar of Arindam D.
    Arindam D.

    I am checking it out. Could someone also look at this table - https://app.clay.com/workspaces/239721/workbooks/wb_9r3vGFmEjDCX/tables/t_kH6YAZ9AR9gM/views/gv_qcrdNCVn4YqN - and weigh in on whether it could also be dropping rows sent via webhook because the row count is close to 50K? We have logs confirming some rows were sent this morning, but they are not in the table.

  • Avatar of Arturo O.
    Arturo O.

    Thank you for sharing! In that table, since you're not at the 50k limit yet, it should continue importing new records. With filters applied, it shows 202 were imported today; how many records didn't make it, out of the total? Also, this table doesn't have auto-delete turned on; you can click on the lower right corner to kick that off if needed. https://downloads.intercomcdn.com/i/o/w28k1kwz/1357139538/1bf9ef8d5a453acef554fd277b18/chrome_lCSWNl4VcV.png?expires=1738194300&signature=68d3ad33e7fa7851d8fb7376b995238a5e1fe518bb9d4585042ac088681b6f92&req=dSMiEch9lIRcUfMW1HO4zRmKm9zSfyPzp8FZU7ku%2F0ACYjUTd6JaqgupvdDo%0A1y30%0A

  • Avatar of Arindam D.
    Arindam D.

    I was going off PST timestamps. I actually now see the rate-limit errors on the webhook: [Clay] Error: {"type":"Forbidden","message":"Record limit reached for webhook","details":null}

  • Avatar of Arindam D.
    Arindam D.

    Curious why it's kicking in now when the row count is not yet at 50K?

  • Avatar of Arturo O.
    Arturo O.

    That's a great question. I found a related note the team has on the topic, regarding the limit reached in that table. There's a possibility that records which were sent to the webhook but never added still counted towards the overall limit, or that some of the older records were manually removed at some point. The scenario where you wouldn't encounter a webhook limit is when the hook is attached to a table that has auto-delete enabled, which removes the standard 50k record limit. Otherwise, if you're looking to keep the data in that table, it would be best to duplicate it; that lets you start a new table with the same logic but none of the data, and create a new webhook for that new table.

  • Avatar of Arindam D.
    Arindam D.

    ok, back to prepping the table for auto-delete. At this point https://app.clay.com/workspaces/239721/tables/t_xRYMHrqkiufZ/views/gv_sZaSQFhg8dA6 has most rows ready for deletion (delay2 is set), but I have yet to see the records getting purged. Am I misinterpreting the delete criteria, or is something else off?

  • Avatar of Arturo O.
    Arturo O.

    Thank you for the quick call, Arindam! Closing the loop here for now.

  • Avatar of Channeled
    Channeled
    APP

    Hi Arindam D.! This thread was recently closed by our Support team. If you have a moment, please share your feedback: