Hi, I'm currently struggling to figure out the best way to guard against duplicate rows before running paid enrichment actions. My workflow is as follows: in Table 1, I run a Google Maps search via an HTTP API call that returns 10 entries in an array. Then, in the next column, I use Write to Other Table to write out some of the values from the array, including the address, as separate rows. Now I want to make sure each row in that table is unique based on physical address. Is there a way to do this? TIA.
💤 You've caught us outside of our support hours (9am-9pm EST), but don't worry - we'll be back in touch within 24 hours (often sooner!). If you haven't already, please include the URL of your table in the thread below so that we can help you as quickly as possible!
Hi, happy to help! To ensure each row in your table is unique based on physical address, you can use the auto-dedupe feature, which prevents duplicates from being added to your table. This is particularly useful before running paid enrichment actions, since it keeps you from spending credits enriching the same record twice. You can also dedupe on company identifiers like website or CID. Screenshot: https://downloads.intercomcdn.com/i/o/w28k1kwz/1214866250/763e24077d89ba6c93b438d470ef/Berrycast_d66hoO0cHi.png?expires=1728895500&signature=a1b263052945baab5dfb1bb04163476e3ae55244114025cae17b0f42638d5031&req=dSImEsF4m4NaWfMW1HO4zT6yEODYsQyV9jneQ8%2B4R98Qvbenh4CHc%2BiaPYDY%0A60Ff%0A
Daniel K. I can easily flip it to CID, which makes it simpler, but how do I run this check before or during the Write to Other Table action? My thinking is that there should be a way to do the check as a step in the write-to-table workflow, so it doesn't create duplicate rows and then enrich a second time. I'm trying to be as efficient as possible with my credits.
Hey, thanks for reaching out, Peter. You can add an auto-dedupe to the table (this is the best option), which won't allow duplicates to live in the table at all. Here's how: [App: Article Inserter] Alternatively, you can keep the existing data and use this approach: https://www.loom.com/share/a39a4569d95244b7886026f3c70812c7
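For anyone reading along, the logic auto-dedupe applies can be sketched outside the tool as well. This is a minimal, hypothetical Python sketch (the `normalize_address` and `write_unique_rows` names are illustrative, not part of any real API): each incoming row's address is normalized into a key, and rows whose key has already been seen are skipped before any paid enrichment step would run on them.

```python
# Hypothetical sketch of dedupe-by-address before a paid enrichment step.
# Not a real Clay API; function and field names here are illustrative only.

def normalize_address(addr: str) -> str:
    """Lowercase, drop commas, and collapse whitespace so trivial
    formatting differences don't produce distinct keys."""
    return " ".join(addr.lower().replace(",", " ").split())

def write_unique_rows(results, existing_keys):
    """Return only rows whose normalized address hasn't been seen yet;
    existing_keys is updated in place so later batches stay deduped."""
    new_rows = []
    for row in results:
        key = normalize_address(row["address"])
        if key in existing_keys:
            continue  # duplicate: skip, so no enrichment credit is spent on it
        existing_keys.add(key)
        new_rows.append(row)
    return new_rows

# Example batch, as if returned by the Google Maps HTTP call:
rows = [
    {"name": "Cafe A", "address": "123 Main St, Springfield"},
    {"name": "Cafe A (dup)", "address": "123 main st Springfield"},
    {"name": "Cafe B", "address": "456 Oak Ave, Springfield"},
]
seen = set()
unique = write_unique_rows(rows, seen)
print([r["name"] for r in unique])  # the duplicate address is filtered out
```

The key design point is the same one auto-dedupe relies on: the uniqueness check happens on a normalized key before the write (and therefore before enrichment), rather than cleaning up duplicates afterwards.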
thanks!
Thank you so much for sharing your feedback Peter G.!