Yes, delays are normal with datasets of 20k-30k records. Performance degrades when large tables and multiple actions run simultaneously, which shows up as latency: data takes longer to load.
Try a hard refresh to help with performance:
- Mac: Command + Shift + R (Chrome/Firefox); in Safari, Option + Command + E empties the cache, then Command + R reloads
- Windows: Ctrl + F5 or Ctrl + Shift + R
Performance issues are more likely with larger tables or complex workflows, such as tables linked together. [1] [2]
If you still need help with the delay or have other questions about table enrichment, I’m here to assist. Would you like to share more details about what you’re trying to achieve or any specific concerns you have?
Exactly - large datasets naturally take longer to process. At 20k-30k records, the data volume alone means each enrichment pass requires more processing time.
To improve performance, break large requests into smaller batches rather than processing everything at once. This reduces the load per request and can speed up your overall workflow.
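As a rough illustration of the batching idea, here is a minimal Python sketch. The `enrich_records` function is a hypothetical placeholder, not a real API from any specific tool; substitute whatever enrichment call your platform exposes.

```python
def chunked(records, batch_size=500):
    """Yield successive slices of `records`, each at most `batch_size` long."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

def enrich_records(batch):
    # Hypothetical placeholder: replace with your tool's actual enrichment call.
    return [dict(record, enriched=True) for record in batch]

def enrich_in_batches(records, batch_size=500):
    """Process records in fixed-size batches instead of one giant request."""
    results = []
    for batch in chunked(records, batch_size):
        results.extend(enrich_records(batch))
    return results
```

With 20k-30k records, a batch size of a few hundred keeps each request small; you can tune the size to whatever your tool handles comfortably, and pause between batches if rate limits apply.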
The delays are normal behavior when working with datasets of this size, especially when multiple enrichments are running simultaneously.
If you need any more help managing the delay with large datasets, I’m here to assist. Would you like to share more about your current workflow or any specific challenges you’re facing?