Hello, I have one website URL with a list of companies, and there is a "details" button for every company. Behind each details button are the phone number, the company's website URL, and an email address. Instead of manually clicking every details button and copying the information out, I'd like to use Clay to scrape the website and automatically pull this information for every company. Does anyone know a good way to do this in Clay (maybe with Claygent, or some code)?
Use the Clay Chrome Extension to scrape individual company detail pages:
1. Install the Clay Chrome Extension from the Chrome Web Store
2. Navigate to one company's detail page (click a "details" button)
3. Open the Clay extension and click "Select Data to Add from Page"
4. Choose "Select a Single Attribute" and map the attributes you want (phone, email, website)
5. Set URL pattern matching by replacing the specific company identifier in the URL with a variable (e.g., :company_name - see the short sketch after this list for how that matching works)
6. Save your recipe - it will automatically apply to all similar detail pages
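To make step 5 concrete, here is a minimal Python sketch of what URL pattern matching with a :company_name variable amounts to. The domain, path shape, and slug below are placeholder assumptions; the point is just that the variable segment acts as a wildcard, so one saved recipe applies to every company's detail page that shares the same URL shape.

```python
# Minimal sketch of how a ":company_name" URL pattern generalizes one detail
# page to all of them. The domain, path shape, and slug are placeholders.
from urllib.parse import urlparse

PATTERN = "https://example.com/companies/:company_name/details"

def matches_pattern(url: str, pattern: str = PATTERN) -> bool:
    """Return True if the URL has the same path shape as the pattern,
    treating any ':variable' segment as a wildcard."""
    url_segments = urlparse(url).path.strip("/").split("/")
    pattern_segments = urlparse(pattern).path.strip("/").split("/")
    if len(url_segments) != len(pattern_segments):
        return False
    return all(p.startswith(":") or p == u
               for u, p in zip(url_segments, pattern_segments))

print(matches_pattern("https://example.com/companies/acme-gmbh/details"))  # True
print(matches_pattern("https://example.com/about"))                        # False
```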
Alternatively, use Claygent (our AI web scraper) with custom prompts to navigate and extract data from detail pages automatically. You can also use our Scrape Website action to extract content including phone numbers and emails from specified URLs.
I want to use Claygent, but could you give me the code (prompt) for that?
Here's a Claygent prompt to extract company details from detail pages:

Prompt: Please visit this company detail page and extract the phone number, the company website URL, and the email address. If any of these is not found, return "Not found" for that field.

Setup:
1. Add Claygent from the AI section in your enrichment panel
2. Input your detail page URLs in the URL field
3. Use the prompt above in the instruction field
4. Claygent will navigate each detail page and extract the requested information

You can also use our Scrape Website action, which can extract phone numbers and emails directly from specified URLs.
The problem is that there is only one URL, and I want Clay to navigate itself to the specific detail page that corresponds with the club name. Can you give me a code for that?
Here's a Claygent prompt to navigate from your main list page to each detail page automatically:

Prompt: Please visit {{Main List URL}}, find the company named {{Company Name}} in the list, follow its "details" button to that company's detail page, and extract the phone number, company website URL, and email address.

Setup:
1. Create a column with your company names from the list
2. Add Claygent from the AI section
3. Input your main list URL in the URL field
4. Use the prompt above, referencing your company name column
5. Claygent will navigate to each detail page and extract the information

Alternatively, use our Scrape Website action, which can extract phone numbers and emails directly from URLs if you can construct the detail page URLs.
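If it helps to see what that navigation step amounts to in plain code, here is a rough Python sketch (outside of Clay): start from the single list URL, find the entry matching a given club name, and follow its "details" link. The list URL and CSS selectors are assumptions about the page markup and would need to be adapted to the real site.

```python
# Rough sketch of "navigate from one list URL to the right detail page".
# The list URL and CSS selectors are assumptions about the page markup.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

LIST_URL = "https://example.com/clubs"  # placeholder for your single list URL

def find_detail_url(club_name: str, list_url: str = LIST_URL) -> str | None:
    """Return the 'details' link for the list entry whose text contains club_name."""
    soup = BeautifulSoup(requests.get(list_url, timeout=30).text, "html.parser")
    for row in soup.select("li, tr, div.company-row"):  # assumed row markup
        if club_name.lower() not in row.get_text(" ", strip=True).lower():
            continue
        for link in row.find_all("a", href=True):
            if "details" in link.get_text(" ", strip=True).lower():
                return urljoin(list_url, link["href"])
    return None

print(find_detail_url("FC Example"))  # placeholder club name
```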
Hi Jippe - thanks so much for reaching out! Happy to help here. I think the best workflow would be the following:
1) Create a table with your main website URL
2) Add the first Claygent column to extract company names and detail URLs
3) Add a second Claygent column that uses the detail URLs to scrape individual pages
4) Use "Write to other columns" in Claygent to automatically populate separate columns for phone, email, website, etc.

You could use the following prompt for the first Claygent action:

Please visit {{Website URL}} and extract a list of all companies shown on this page. For each company, I need:
1. Company name
2. The URL or link that the "details" button leads to for that specific company

Return the results in this JSON format:
[
  {
    "company_name": "Company Name",
    "detail_url": "full URL to detail page"
  }
]

Make sure to capture the complete URL for each detail button, not just partial links.
Then this for the second one:

Please visit this company detail page: {{Detail URL}}

Extract the following information:
- Phone number
- Company website URL
- Email address
- Any other relevant contact information

Return the data in JSON format:
{
  "phone": "phone number found",
  "website": "company website URL",
  "email": "email address",
  "other_info": "any additional relevant details"
}

If any field is not found, return "Not found" for that field.
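For reference, the second step can also be sketched in plain Python: fetch one detail page and pull out the phone, email, and website. The selector and regexes below are assumptions and would need tuning to the actual site; Claygent does this extraction for you, but the sketch shows the shape of the result you're asking for.

```python
# Rough sketch of the second step: extract contact details from one detail page.
# The CSS selector and regexes are assumptions and may need adjusting per site.
import re
import requests
from bs4 import BeautifulSoup

def scrape_detail_page(detail_url: str) -> dict:
    """Return phone, website, and email found on a company detail page."""
    html = requests.get(detail_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    text = soup.get_text(" ", strip=True)

    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)
    phone = re.search(r"\+?\d[\d ()/\-]{6,}\d", text)
    website = soup.select_one("a.company-website")  # assumed class on the site link

    return {
        "phone": phone.group(0) if phone else "Not found",
        "website": website["href"] if website and website.get("href") else "Not found",
        "email": email.group(0) if email else "Not found",
    }

print(scrape_detail_page("https://example.com/companies/acme-gmbh/details"))
```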
Hope this helps & let us know if you have any more questions! Best, Aditi
One great tool for finding company website URLs in bulk, like you need here, is https://linkfinderai.com/. They have an accurate tool for that, with API access.