Hi, I want a table to scrape a company's website weekly, how do I get a workbook to do this automatically?
To set up automated website scraping, you can follow these steps:

1. Create a new table in your workspace:
   • Open your workspace
   • Click "Create New" > "Table"
   • Choose "Custom Table"
2. For web scraping, you have several options:
   • Use the Clay Chrome extension for basic, static websites
   • Use Clay's native scrapers for general web scraping
   • Use Zenrows for sites with anti-bot protection
   • Use Apify actors for complex scraping needs
3. Configure auto-update settings:
   • Enable the "Auto-update" option in your table settings to automatically refresh data for new rows
   • Use the "Only run if" condition to set specific parameters for when the scraping should occur

Sources:
• Create New Table Guide: How to share a workbook and table as a template | Clay University
• Write to table integration overview | Clay University
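For context on what a "basic, static website" scrape involves under the hood, here is a minimal sketch in plain Python (standard library only). This is purely illustrative of the kind of extraction Clay's scrapers perform for you; the class and function names are our own, not part of any Clay API:

```python
# Illustrative only: extract links and visible text from a static HTML page,
# roughly what a basic scraper collects. Not a Clay API -- Clay handles this
# for you inside the table integration.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkAndTextParser(HTMLParser):
    """Collects href links and visible text fragments from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        # Record the target of every <a href="..."> link.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep non-empty runs of visible text.
        stripped = data.strip()
        if stripped:
            self.text_parts.append(stripped)


def scrape(url):
    """Fetch a page and return (links, body_text). Works for static HTML only."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = LinkAndTextParser()
    parser.feed(html)
    return parser.links, " ".join(parser.text_parts)
```

Note this approach only works for static pages; JavaScript-rendered or bot-protected sites are why the thread mentions Zenrows and Apify.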
Hi Michael R.! This thread was recently closed by our Support team. If you have a moment, please share your feedback:
Thanks! We've reopened this thread. You can continue to add more detail directly in this thread.
So just to clarify, am I able to get current cells/rows to refresh on a regular basis?
Yes, you can have your cells/rows refresh automatically. By default, integrations will auto-update every 24 hours.

To manage auto-updates:

1. Find the Auto-update icon at the bottom right of your table
2. Click "Turn on Table" to enable auto-updates
3. You can turn it off anytime by clicking the Auto-update icon and selecting "Turn Off Table"

Note that you can also set conditional run criteria, which let you specify exactly when and under what circumstances the data should update.

Sources:
• Auto-update and Auto-dedupe Table
• Hunter | Clay University
So I'm wanting a cell to scrape a website every week. Will this do this?
By default, auto-update runs every 24 hours, not weekly. However, you can use conditional run criteria to control when the scraping occurs. The scraping integration allows you to extract data from webpages, including body text, links, emails, phone numbers, and keywords from specified URLs.

You can set up conditional run formulas to determine exactly when the scraping should occur. This gives you control over the timing and circumstances of when the data updates.

To learn more about setting up conditional runs, you can check out the Clay University lesson on AI formulas and conditional runs.

Sources:
• Hunter | Clay University
• Scrape Website with Clay
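As a rough sketch of how a conditional run can turn a daily auto-update into a weekly one: gate the run on the day of the week, so the condition is true only once per week. This is illustrative JavaScript, not Clay's exact formula syntax, and `shouldRunToday` is a hypothetical name:

```javascript
// Hypothetical "Only run if" condition: true only on Mondays, so a
// 24-hour auto-update cycle effectively refreshes the row weekly.
// (Clay's actual conditional-run formula syntax may differ.)
function shouldRunToday(date) {
  // Date.prototype.getDay(): 0 = Sunday, 1 = Monday, ... 6 = Saturday
  return date.getDay() === 1;
}
```

With this in place, the integration is still evaluated every 24 hours, but it only actually scrapes on the day the condition returns true.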
Hey there - just wanted to check in here to see if you needed anything else! Feel free to reply back here if you do.
We haven't heard back from you in a bit, so we're going to go ahead and close things out here - feel free to let us know if you still need something!