Automating Batch Processing for API Records in Datasets


I've been Googling and searching for this but can't find any info: is there a way to set up a job to run the "next" set of records/rows in a dataset, rather than the whole table? For example, if an API limit is 200 records per hour, is there a way to have it keep running the remaining records every hour without having to trigger it manually?

  • Channeled (APP)

    Yes, we've recently introduced a scheduling feature that can help automate this process. You can set up scheduled columns to run your workflows automatically at your preferred frequency (daily, weekly, or monthly). To set this up:

    1. Click the gear icon in your table.
    2. Under "Run the table," select "On a schedule."
    3. Choose your frequency and whether to run all columns or only selected ones.

    This lets you update data and run workflows automatically, which can help manage API limits by processing a set number of records at regular intervals. However, we don't currently have a feature that specifically processes the "next" set of records. If you need more precise control, you might need to explore custom solutions or request additional features.
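
    One such custom solution is to keep the batch cursor outside the table yourself. The sketch below is only a rough illustration of the pattern, not Clay's actual API: it assumes a hypothetical REST endpoint (`https://api.example.com/records`) that accepts `offset` and `limit` parameters, and persists the offset to a file so each run resumes where the previous one stopped.

    ```python
    import json
    import os

    import requests  # third-party: pip install requests

    API_URL = "https://api.example.com/records"  # hypothetical endpoint, not Clay's API
    BATCH_SIZE = 200             # matches the 200-records-per-hour limit from the question
    CURSOR_FILE = "cursor.json"  # persists progress between runs


    def load_offset() -> int:
        """Return the offset where the previous run stopped (0 on the first run)."""
        if os.path.exists(CURSOR_FILE):
            with open(CURSOR_FILE) as f:
                return json.load(f)["offset"]
        return 0


    def save_offset(offset: int) -> None:
        with open(CURSOR_FILE, "w") as f:
            json.dump({"offset": offset}, f)


    def process_next_batch() -> int:
        """Fetch and process the next BATCH_SIZE records; return how many were handled."""
        offset = load_offset()
        resp = requests.get(API_URL, params={"offset": offset, "limit": BATCH_SIZE})
        resp.raise_for_status()
        records = resp.json()  # assumed to return a JSON list of records
        for record in records:
            pass  # enrich/transform each record here
        save_offset(offset + len(records))
        return len(records)


    if __name__ == "__main__":
        handled = process_next_batch()
        print(f"Processed {handled} records; next run resumes at offset {load_offset()}")
    ```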

  • Justin

    That's limited to a minimum of once per day; I need it to run every hour, for example.

  • Bo

    Hey there,

    Unfortunately, there's no built-in way to automatically run the "next" set of records based on API limits. Your only scheduling option is through the gear icon next to the action button, where you can set it to run once per day, week, or month.

    This doesn't provide the hourly API-limit workaround you're looking for, though. When scheduled, the system currently processes the entire table rather than picking up where it left off, so an hourly batch would have to be driven from outside Clay (a rough sketch follows).
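
    Since the built-in scheduler bottoms out at once per day, the hourly cadence would have to come from an external driver. Purely as a sketch, and assuming the batch script from earlier in the thread is saved as `batch.py`, a minimal loop could run one batch per hour until the cursor reaches the end of the dataset:

    ```python
    import time

    from batch import process_next_batch  # the hypothetical helper sketched earlier


    # Run one batch per hour to stay under a 200-records-per-hour API limit.
    while True:
        handled = process_next_batch()
        if handled == 0:  # the cursor has reached the end of the dataset
            break
        time.sleep(60 * 60)  # wait an hour before the next batch
    ```

    An OS-level scheduler (cron or Task Scheduler) triggering the script hourly would achieve the same thing without keeping a process running.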

    Let me know if you need any other clarification about scheduling options.
