I'm looking to run a big (17K-row) table through the Apify integration, but if I run all rows, the vast majority of them time out. What's a good workaround for this?
To handle a large 17K table with Apify without timing out, try these workarounds:
1. Use conditional runs to process smaller batches. Set up a formula so the integration only runs on a specific subset of your rows at a time (a scripted version of this batching approach is sketched after the list).
2. Implement rate limiting. In the integration settings, define a request limit and duration to control API calls and prevent overloading.
3. If possible, split your table into smaller ones and process them separately.
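If you're comfortable scripting the work yourself rather than relying solely on the built-in integration, a loop like the one below combines points 1 and 2: it sends rows in small batches, pauses between calls, and retries failed batches. This is a minimal sketch, not Apify's own mechanism; the endpoint URL, payload shape (`{"rows": batch}`), batch size, and delay values are all placeholders you'd replace with whatever your actor or workflow actually expects.

```python
import time
import requests

API_URL = "https://example.com/your-apify-endpoint"  # placeholder: replace with your real endpoint
BATCH_SIZE = 200       # rows per request; tune so each batch finishes well under the timeout
DELAY_SECONDS = 2      # pause between batches to respect rate limits
MAX_RETRIES = 3        # retry a failed batch a few times before giving up

def process_in_batches(rows):
    """Send rows in small batches, pausing between calls and retrying failures."""
    for start in range(0, len(rows), BATCH_SIZE):
        batch = rows[start:start + BATCH_SIZE]
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                resp = requests.post(API_URL, json={"rows": batch}, timeout=60)
                resp.raise_for_status()
                break  # batch succeeded, move on to the next one
            except requests.RequestException as err:
                print(f"Batch starting at row {start} failed (attempt {attempt}): {err}")
                time.sleep(DELAY_SECONDS * attempt)  # back off before retrying
        time.sleep(DELAY_SECONDS)  # simple rate limiting between batches

if __name__ == "__main__":
    all_rows = [{"id": i} for i in range(17000)]  # stand-in for your exported table rows
    process_in_batches(all_rows)
```

With ~17K rows and a batch size of 200 that's under 100 requests, so even with a couple of seconds between batches the whole run finishes in a few minutes while keeping each individual call small enough to avoid timing out.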
If these don't resolve the issue, you may need to request human assistance for a more tailored solution.