I have a clay agent scraper that is passed a url and then scrapes and returns a list of items. Then I send the items in the list to a 2nd table for enrichment.
If I set up my scraper enrichment to run on a schedule, will the new items get added to the table with the previous items or do they replace the previous items?
When you schedule your scraper enrichment, whether new items get added or replace previous items depends on your settings.
For scheduled sources, there's an "Update existing rows on re-run" toggle. When it's OFF, each run's new items are appended to the table alongside the previous items. [1] When it's ON, rows matching existing entries are updated with the new information instead of being added again. [2]
For your workflow, where you're sending scraped items to a second table for enrichment, leave "Update existing rows on re-run" toggled OFF so each recurring run creates new rows. [3] That way, newly scraped items are added as separate rows rather than replacing what's already there.
So yes: with the toggle off, new items will be added alongside previous items rather than replacing them.
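To make the two behaviors concrete, here's a minimal sketch of the append-vs-update semantics. This is purely illustrative, not Clay's actual implementation: the use of "url" as the dedup key and the dict-based row shape are assumptions.

```python
# Hypothetical sketch of the two re-run behaviors.
# Assumption: rows are dicts and "url" is the key used to match
# existing rows (Clay's real matching logic may differ).

def apply_run(table, scraped_items, update_existing):
    """Merge one scraper run into a list of row dicts."""
    if not update_existing:
        # Toggle OFF: every scraped item becomes a new row,
        # even if the same URL appeared in a previous run.
        table.extend(scraped_items)
        return table
    # Toggle ON: rows with a matching key are updated in place;
    # items with no match are appended.
    by_key = {row["url"]: row for row in table}
    for item in scraped_items:
        if item["url"] in by_key:
            by_key[item["url"]].update(item)
        else:
            table.append(item)
    return table
```

With the toggle off, re-scraping the same URL yields a duplicate row; with it on, the existing row is refreshed and only genuinely new URLs add rows.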
If you need any more help with setting up your scraper enrichment or have other questions, I'm here to assist. Would you like to provide more details about what you're trying to achieve or any specific issues you're encountering?