Hey guys. I've been trying to work something out with Phantombuster and Clay: pushing every new engagement on my LinkedIn posts (likes and comments) into Clay for further enrichment. Has anybody been able to achieve this? I'd like to set it up with one Clay table that captures who liked or commented, plus the comment text they left, and have it keep refreshing as I keep posting. It will need two Phantoms chained together.
Hey there! Our support team has got your message - we'll be back in touch soon. If you haven't already, please include the URL of your table in this thread so that we can help you as quickly as possible!
Yes, what is the issue?
You have to schedule the Phantombuster run on an automated schedule and send the results to your Clay table.
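(If you end up scripting the hand-off instead of using the native Phantombuster integration, a minimal sketch of the "send the results to your Clay table" step could look like the Python below, assuming you export the Phantom's results as CSV and have created a webhook source in your Clay table. The webhook URL and CSV file name are placeholders, not real values.)

```python
# Minimal sketch: push a Phantombuster CSV export into a Clay table via a
# webhook source. The URL below is a placeholder -- create an
# "Import data from Webhook" source in Clay and paste the URL it gives you.
import csv
import json
import urllib.request

CLAY_WEBHOOK_URL = "https://api.clay.com/REPLACE_WITH_YOUR_WEBHOOK_URL"  # placeholder


def push_rows(csv_path: str) -> None:
    """POST each row of the Phantom's CSV export to the Clay webhook as JSON."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            req = urllib.request.Request(
                CLAY_WEBHOOK_URL,
                data=json.dumps(row).encode("utf-8"),
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                resp.read()  # each successful POST becomes a new row in the table


if __name__ == "__main__":
    push_rows("activity_extractor_result.csv")  # placeholder file name
```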
The problem is that I need two Phantoms that run off one Phantom, and I can't get the data into the Clay table in the way that I want.
I'd like to get this table filled, but I need two Phantoms that run off the Activity Extractor Phantom.
Hey Thijs! Thanks for reaching out. To confirm, is the main blocker that you have to use two separate Phantoms that provide separate data but would like to send them to the same table? Or is it an issue with the data being sent? If you have a table you tested this with, happy to take a look. Feel free to share a screenshot of the Phantoms you have running off the Activity Extractor Phantom as well, or a video explaining your use case. It'll help the team make tailored suggestions.
Hey there - just wanted to check in here to see if you needed anything else! Feel free to reply back here if you do.
We haven't heard back from you in a bit, so we're going to go ahead and close things out here - feel free to let us know if you still need something!
Thanks! We've reopened this thread. You can continue to add more detail directly in this thread.
Hey Daniela D., here is a Loom: https://www.loom.com/share/a23a4f8b9fba4e7880e43d70a63d0a30?sid=7b03a27d-67f2-49eb-9335-97eea06b3c7e It demonstrates exactly what I am trying to achieve.
Hey Thijs! Thanks for sharing. There are two options to approach this:

Option 1: Using two/three Phantoms
* Use the first Phantom to extract your posts. This will include the post snippet, post URL, post date, and other associated fields.
* Use the second Phantom to extract comments and likes. This will capture the post URL, comment text, full name, LinkedIn URL, reaction type, and comment date.

Once both tables are configured, add a Lookup integration in the second table to pull the post text and date from the first table if needed. The Lookup will match based on the post URL and retrieve the associated data if a match is found. Here's a quick guide on using Lookup: https://www.loom.com/share/ad767e3be9914d19a2f6647efe4dfa85?sid=c8b78c32-9bc1-4501-8124-d6c70d2218cf

Option 2: Using an Apify scraper
* Use an Apify scraper that combines all the data fields mentioned above in a single export action. This eliminates the need for multiple tables. Here's an example: https://apify.com/curious_coder/linkedin-post-search-scraper

This scraper can retrieve the post content, date, comments, likes, profile URLs, full names, and the comment date. The engagement date (for comments) will be in Unix format, but you can normalize it with a formula in your Clay table. Here's a quick guide on using Apify: https://www.clay.com/university/lesson/deep-dive-apify-actors-limitless-research

Let me know if you have any questions about setting up either option!
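(As a rough illustration of the Lookup logic in Option 1 and the Unix-date normalization mentioned in Option 2, here's a minimal local sketch in Python. The file names and column names such as postUrl, postContent, postDate, and commentDate are assumptions, so adjust them to match the actual Phantom/Apify exports.)

```python
# Sketch of the Option 1 lookup done locally: join the engagement export
# (comments/likes) to the posts export on the post URL, and normalize a
# Unix timestamp (as returned by the Apify scraper in Option 2) to ISO 8601.
import csv
from datetime import datetime, timezone


def load_posts(path: str) -> dict:
    """Index the posts export by post URL so each engagement can look up post text/date."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row["postUrl"]: row for row in csv.DictReader(f)}


def normalize_unix(ts: str) -> str:
    """Convert a Unix timestamp (seconds or milliseconds) to an ISO 8601 string."""
    value = int(ts)
    if value > 10**12:  # heuristic: values this large are milliseconds
        value //= 1000
    return datetime.fromtimestamp(value, tz=timezone.utc).isoformat()


def merge(posts_csv: str, engagements_csv: str, out_csv: str) -> None:
    posts = load_posts(posts_csv)
    with open(engagements_csv, newline="", encoding="utf-8") as f_in, \
         open(out_csv, "w", newline="", encoding="utf-8") as f_out:
        reader = csv.DictReader(f_in)
        writer = csv.DictWriter(
            f_out,
            fieldnames=reader.fieldnames + ["postText", "postDate", "engagementDateIso"],
        )
        writer.writeheader()
        for row in reader:
            post = posts.get(row["postUrl"], {})  # same match key the Lookup uses
            row["postText"] = post.get("postContent", "")
            row["postDate"] = post.get("postDate", "")
            row["engagementDateIso"] = (
                normalize_unix(row["commentDate"]) if row.get("commentDate") else ""
            )
            writer.writerow(row)


if __name__ == "__main__":
    merge("posts.csv", "engagements.csv", "merged.csv")  # placeholder file names
```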
Hey there - just wanted to check in here to see if you needed anything else! Feel free to reply back here if you do.
We haven't heard back from you in a bit, so we're going to go ahead and close things out here - feel free to let us know if you still need something!