Yeah, Claygent credits can get eaten up quickly, especially when you're using the web scraper a lot. One way to keep costs down is to make your prompts super specific so Claygent doesn't have to work overtime figuring stuff out. Also, try filtering your list first so you're only scraping pages you actually need. I usually test things on a small batch first to make sure the setup works before going all in. And for simple scraping, I sometimes use cheaper tools like Browse AI first, then save Claygent for the heavier stuff. Curious if anyone else has found good hacks for this too!