Hi Clay support, I am using the Chrome Extension to scrape a Yelp page for information that's interesting for my company. It successfully pulls the data on each page using the recipe I created, but when I click "Add to Workbook" it creates a new workbook each time. Is there a way to have it add to the same workbook each time for the same recipe?
Based on the current functionality, the Chrome Extension creates a new table each time you add scraped data to your workspace: the extension opens a new tab with a ready-made table filled with the information you specified. The available documentation doesn't indicate a built-in option to append data to an existing table directly from the extension, so each use of "Add to Workspace" generates a fresh table. Are you looking to combine data from multiple scraping sessions into one master table, or would you prefer the extension to automatically detect and append to existing tables created with the same recipe?
I want different recipes to append to their respective tables.
The Chrome Extension doesn't currently have a built-in feature to append data to existing tables based on recipe type; every use of "Add to Workspace" opens a new tab with a freshly created table. To have each recipe feed its own table, you would need to consolidate the data after scraping by copying and pasting from the newly created tables into your existing master table for each recipe type. Are you currently using the same recipe across multiple Yelp pages, or do you have different recipes for different types of Yelp data that you want organized into separate tables?
Hey Grant! Thanks so much for reaching out - we're so excited you're using the Chrome extension to scrape! To answer your question: we do not yet support sending the data to a designated table - the extension has to create a new one each time. However, a popular alternative is to download the CSVs and import them manually into the tables you want. Alternatively, I would suggest trying out our Claygent models to do the scraping from within a table. Hope this helps! - Hari
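If you end up with many per-scrape CSV exports, merging them locally before re-importing can cut down on copy-paste. This is not a Clay feature, just a minimal local sketch: it assumes you save each export into an exports/ folder using a hypothetical <recipe>__<anything>.csv naming scheme, and it builds one deduplicated master CSV per recipe that you can then import into the matching table.

```python
import csv
from pathlib import Path
from collections import defaultdict

EXPORT_DIR = Path("exports")   # where the per-scrape CSV downloads are saved (assumption)
MASTER_DIR = Path("masters")   # one merged CSV per recipe lands here
MASTER_DIR.mkdir(exist_ok=True)

# Group export files by recipe, assuming a "<recipe>__<anything>.csv" naming scheme.
groups = defaultdict(list)
for path in sorted(EXPORT_DIR.glob("*.csv")):
    recipe = path.stem.split("__")[0]
    groups[recipe].append(path)

for recipe, paths in groups.items():
    master = MASTER_DIR / f"{recipe}.csv"
    seen = set()            # dedupe identical rows across scraping sessions
    header_written = False
    with master.open("w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        for path in paths:
            with path.open(newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                header = next(reader, None)
                if header is None:
                    continue  # skip empty exports
                if not header_written:
                    writer.writerow(header)  # write the header once per master file
                    header_written = True
                for row in reader:
                    key = tuple(row)
                    if key not in seen:
                        seen.add(key)
                        writer.writerow(row)
    print(f"{recipe}: merged {len(paths)} export(s) into {master}")
```

Run it from the folder that contains exports/; the merged files land in masters/, one per recipe, ready to import.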