I'm trying to use OpenAI classification with GPT-4o mini. I have about 5 fairly large tables with 2 classification columns in each. In the last 7 hours it's only processed about 10,000 rows. I'm Tier 4 on the API, so it should be fine. Here's a sample of what one of the run infos looks like. Any ideas on why it's chugging or how to speed it up? The prompts are fairly simple and low in token use....it's just classifying based on the LinkedIn description alone.
Even with a Tier 4 API key, slow processing can occur due to various factors. Here are some potential reasons and solutions:
1. Large data volume: Break your task into smaller batches for faster processing.
2. Complex queries: Ensure your prompts are specific and well-defined.
3. Network issues: Check your connection stability.
4. API quotas: Monitor your usage limits and API quotas, even on higher tiers.
5. Column updates: If you've made multiple changes or stopped runs midway, try creating a new column by duplicating the current one and running it on the remaining rows.

To speed up processing, consider using the "Max Cost" parameter in Claygent to set a higher budget for faster results. If issues persist, you may want to review your input data format and double-check your prompts before running tasks.
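For context, "smaller batches" just means splitting your rows into chunks and finishing one chunk before starting the next, rather than firing everything at once. A minimal sketch of that idea outside of Clay (the `classify` stub and the batch size of 100 are illustrative assumptions, not Clay's internals or the real API call):

```python
# Sketch: split rows into fixed-size batches before classification.
# The batch size and the classify() stub are illustrative assumptions.

def chunk_rows(rows, batch_size=100):
    """Yield successive fixed-size batches of rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

def classify(row):
    # Placeholder for the real OpenAI call; here we just tag the row.
    return {"row": row, "label": "placeholder"}

def process_in_batches(rows, batch_size=100):
    results = []
    for batch in chunk_rows(rows, batch_size):
        # Each batch completes before the next starts, keeping the
        # number of in-flight requests bounded.
        results.extend(classify(r) for r in batch)
    return results

rows = list(range(250))
out = process_in_batches(rows, batch_size=100)
print(len(out))  # 250 rows processed, in 3 batches
```

Keeping batches bounded also makes it easier to resume from a known point if a run stalls partway through.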
https://app.clay.com/workspaces/89103/workbooks/wb_ZiBh4hxJpRRc/tables/t_iUxWD6pApjKX/views/gv_ZdQz2nNnCgnf and there are 2 AI columns in each of the tables.
Hey Brian! Apologies for the late response. Did the issue resolve itself? I can see that all rows have run in the current table. Do you have an example of where rows are still queued up and have not run?

The slow processing speed is likely due to a combination of factors:
1. Rate limits on your Tier 4 OpenAI API key
2. Clay's batching system, which processes a certain number of rows at a time
3. Multiple concurrent tables competing for API resources

To improve performance, try:
* Running tables sequentially rather than concurrently
* Breaking your data into smaller batches
* Checking if your OpenAI API key has any concurrent request limitations
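On the "concurrent request limitations" point: if you ever drive the API directly rather than through Clay, you can cap in-flight requests with a worker pool sized to your key's concurrency headroom. A hedged sketch (the worker count of 8 and the `classify` stub are assumptions, not OpenAI's documented limits):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: cap concurrent classification requests with a small worker pool.
# MAX_CONCURRENT and classify() are illustrative; tune to your key's limits.
MAX_CONCURRENT = 8

def classify(description):
    # Placeholder for the real OpenAI chat completion call.
    return "classified:" + description

def classify_all(descriptions, max_workers=MAX_CONCURRENT):
    # The executor never has more than max_workers requests in flight,
    # so a low per-key concurrency cap is respected while still
    # getting parallelism over a large table of descriptions.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(classify, descriptions))

results = classify_all(["desc %d" % i for i in range(20)])
print(len(results))  # 20
```

Running tables sequentially has the same effect at a coarser grain: only one table's worth of requests competes for the key at a time.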
🙏🏽
Hi Brian, Thanks for reaching out. We're aware of an issue where rows in some tables may be deleted when running enrichments. Our team is actively working on resolving this and you can follow live updates here: status.clay.com We'll keep that page updated as we make progress. Thanks so much for your patience while we work on a fix.
We've pushed a fix that should deal with the underlying issue that was causing rows to be deleted in tables with auto-dedupe enabled; that being said, we're still monitoring the situation. Additionally, our team is working on data recovery. We'll share another update as soon as we have more details on data restoration. Thank you for your continued patience.
Hey there - we wanted to let you know that the incident related to rows being deleted in auto-dedupe tables has been resolved. If you have any further questions, please don't hesitate to reach back out! https://status.clay.com/incidents/44a7091d-8b49-4a0c-b417-bd27d498c1f9