With the Enterprise plan and unlimited rows, how do I set up the passthrough so it doesn't delete records after enrichment, but instead automatically sets up another table where the next batch of 50,000 can land? Is there any way around the rolling deletion to get more than 50,000 rows? Tom C. FYI
With the Enterprise plan, you have two main options to handle more than 50,000 rows:
1. Bulk Enrichment (Recommended)
2. Passthrough Tables with Write to Table
Enable passthrough tables to process more than 50,000 rows via rolling deletion, where the oldest rows are removed as new ones arrive. To preserve those records, use the "write to table" feature to automatically copy enriched data to another table before deletion occurs. [3]
Setup Process:
Configure passthrough with your desired retention count (default: 100 rows)
Add a "write to table" column that triggers after enrichment
Map enriched data to a destination table
Set conditions to only write completed enrichments
This creates a continuous flow where enriched records are preserved in separate tables while maintaining the rolling deletion in your main processing table. [4]
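The steps above can be sketched as a simple model. This is illustrative Python only, not the product's actual API: `RETENTION` stands in for the retention count mentioned above, and the `deque` with `maxlen` stands in for the rolling-deletion window.

```python
from collections import deque

# Illustrative model (not product code) of passthrough + "write to table":
# each record is copied to a destination table after enrichment, then enters
# a rolling window that only keeps the most recent RETENTION rows.

RETENTION = 100  # rows kept in the processing table (the default mentioned above)

def process(records, retention=RETENTION):
    processing = deque(maxlen=retention)  # rolling deletion: oldest rows drop off
    destination = []                      # the "write to" table preserves everything
    for record in records:
        enriched = {**record, "enriched": True}
        destination.append(enriched)      # "write to table" fires after enrichment...
        processing.append(enriched)       # ...then the row enters the rolling window
    return processing, destination

window, saved = process([{"id": i} for i in range(250)])
# window holds only the last 100 rows; saved holds all 250 enriched rows
```

The point of the model: the processing table never grows past the retention count, while the destination table accumulates every enriched record.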
Is that what you were looking for?
If you need any more help setting up passthrough tables or managing your data beyond the 50,000-row limit, I’m here to assist. Would you like to provide more details about what you’re trying to achieve so we can continue?
Wouldn't I hit the 50,000-row limit on the 'write to' table as well? And then need to create another 'write to' on that new table, essentially creating the same issue on every table and requiring manual involvement?
You're correct: each destination table also has the 50,000-row limit, creating a cascading issue.
There are additional constraints to be aware of: write to table connections are limited to a maximum of 10 tables total across your workspace, and data can only flow in a linear direction (A → B → C, no loops). [1]
For true scalability beyond these limits, consider:
Bulk Enrichment for processing millions of records without table limits
External storage using Google Sheets with ID references, then importing summarized data back
Splitting workflows across multiple tables focused on specific actions
The write to table approach works for moderate scaling but isn't designed for unlimited growth due to these inherent constraints.
Is that what you were looking for?
So the concept of unlimited rows on the Enterprise plan is effectively not a value add. It still requires manual intervention to execute, plus you still get capped at 500,000 rows across 10 tables of 50,000 records each.
You're correct about the constraints. The Enterprise plan's passthrough tables feature bypasses the 50,000-row limit per table but still requires manual setup and management.
Each table maintains the 50,000-row limit, so with a maximum of 10 chained tables you'd reach 500,000 total rows (10 × 50,000) before needing manual intervention to create new tables or delete old ones.
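That ceiling follows directly from the two limits. A minimal sketch of the arithmetic, assuming the per-table cap and the 10-table chain limit stated above (illustrative code, not the product's API):

```python
# Capacity of a linear write-to-table chain before manual intervention,
# under the limits described above.

TABLE_ROW_LIMIT = 50_000   # per-table row cap
MAX_CHAINED_TABLES = 10    # write-to-table connections, linear A -> B -> C

def total_capacity(tables: int = MAX_CHAINED_TABLES,
                   per_table: int = TABLE_ROW_LIMIT) -> int:
    """Rows storable across the full chain before new tables are needed."""
    return tables * per_table

print(total_capacity())  # 500000
```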
The "unlimited rows" refers to the ability to process data continuously through passthrough tables with rolling deletion, not unlimited storage capacity. For true unlimited storage without manual intervention, you'd need external solutions or bulk enrichment workflows.
Is that what you were looking for?
If you still need help understanding or managing the row limits and passthrough tables, I’m here to assist. Would you like to share more about your specific use case or challenges so we can explore further options?
