I want to understand how to calculate the credits for Clay signals
When creating a custom signal, you'll see the estimated Clay credit cost per result next to each source you can monitor. [1]
To view your actual credit spend for signals, go to your account name → Settings → Credit usage, then check under the Signals tab of the credit usage dashboard. [2]
The credit cost varies depending on which source you're monitoring and what data you're enriching through the signal.
Did that answer your question?
If you need any more help understanding how credits are calculated for Clay signals, please let me know. Would you like to provide more details about what you're trying to resolve or understand?
for open jobs signal, does the credits cost vary on the number of companies in the companies list?
Yes, signal monitoring costs 0.2 credits per company per run. So if you have 100 companies in your list, each signal run would cost 20 credits (100 × 0.2). The total cost scales directly with the number of companies you're monitoring.
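For example, here is a minimal sketch of that arithmetic in Python. The 0.2 credits/company/run rate comes from the answer above; the run frequency is a hypothetical parameter for illustration:

```python
# Sketch of the signal-credit arithmetic described above.
# Assumes 0.2 credits per company per run; runs_per_month is hypothetical.

CREDITS_PER_COMPANY_PER_RUN = 0.2

def signal_run_cost(num_companies: int) -> float:
    """Credits consumed by a single signal run over a company list."""
    return num_companies * CREDITS_PER_COMPANY_PER_RUN

def monthly_signal_cost(num_companies: int, runs_per_month: int) -> float:
    """Credits consumed per month if the signal runs on a schedule."""
    return signal_run_cost(num_companies) * runs_per_month

print(signal_run_cost(100))          # 20.0 credits per run, matching the example above
print(monthly_signal_cost(100, 30))  # 600.0 credits for a hypothetical daily run
```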
Is that what you were looking for?
Isn't that expensive compared to scraping with a scraper or simply creating an open jobs workflow table?
Clay's signals at 0.2 credits per company are actually cost-effective compared to the alternatives. We often double or triple data coverage at one-fifth or less of the cost of other providers - one enterprise client went from 30% coverage at ~25 cents per enrichment to 80% coverage at less than ~1 cent per enrichment. [1]
The 0.2 credit cost reflects our wholesale rates from data providers rather than marked-up subscription fees.
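One rough way to compare options on equal footing is effective cost per successfully enriched record, since you pay per attempt but only a fraction of attempts return data. A sketch using the figures from the client example above (the coverage rates and per-enrichment prices are from that anecdote; everything else is illustrative):

```python
# Effective cost per covered record: attempts are paid for, but only
# "coverage" percent of them return data. Figures mirror the anecdote above.

def cost_per_covered_record(price_per_attempt: float, coverage: float) -> float:
    """Dollars spent per record that actually comes back enriched."""
    return price_per_attempt / coverage

before = cost_per_covered_record(0.25, 0.30)  # ~$0.83 per covered record
after = cost_per_covered_record(0.01, 0.80)   # ~$0.0125 per covered record

print(f"before: ${before:.4f}, after: ${after:.4f}")
# The gap per usable record is even larger than the per-attempt prices suggest.
```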
Was that helpful?
If you still need help understanding credit costs or comparing options, I’m here to assist. Would you like to share more about what you’re trying to achieve with your signals or workflows?
But the credit cost doesn't make sense. For one of my clients it works fine, but for another the cost is different even though the tables are entirely the same.
The signal tables for open jobs are the same.
Credit costs can vary even with identical signal tables due to several factors:
Common causes of different costs:
- Different AI models selected: older rows might use a 3-credit model while newer rows use the 15-credit Sonnet model
- Filters applied differently before processing
- Automated features enabled on one client's setup but not the other (auto-updates, scheduled runs)
- Validation processes consuming additional credits, even for failed results

To troubleshoot:
- Check whether different AI models were selected for each client
- Review whether auto-updates or recurring processes are enabled differently
- Verify whether other team members triggered processes on shared tables
- Right-click on the signal column and select "Run Info" to see the exact credit breakdown for each client's table, then compare the two breakdowns side by side (see the sketch after this list). [1]
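If you can get the two per-column credit breakdowns into CSV form (for example, by copying the figures out of each table's Run Info), a quick diff makes the divergence obvious. A sketch, assuming hypothetical CSV files with `column` and `credits` headers; this layout is an assumption, not a Clay export format:

```python
# Compare two per-column credit breakdowns to find where the costs diverge.
# The CSV layout (column,credits) is a hypothetical export, not a Clay format.

import csv

def load_breakdown(path: str) -> dict[str, float]:
    """Map each table column name to the credits it consumed."""
    with open(path, newline="") as f:
        return {row["column"]: float(row["credits"]) for row in csv.DictReader(f)}

client_a = load_breakdown("client_a_run_info.csv")
client_b = load_breakdown("client_b_run_info.csv")

# Walk every column that appears in either table and flag mismatched spend.
for col in sorted(client_a.keys() | client_b.keys()):
    a, b = client_a.get(col, 0.0), client_b.get(col, 0.0)
    if a != b:
        print(f"{col}: client A used {a} credits, client B used {b}")
```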
Is that what you were looking for?
If you need any further help understanding the credit cost differences or troubleshooting your signal tables, I’m here to assist. Would you like to provide more details about the specific issues you’re encountering with your clients’ setups?