I managed to create a Google Cloud Run app that is meant to read and write from BigQuery.
I tested the /read endpoint and can import JSON data into a Clay table, but how do I dynamically parse it into rows and columns?
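In case it helps others, a schema-agnostic way to handle this (independent of any Clay-specific parser, which I haven't found documented) is to treat each JSON object as a row and the union of keys as the column set. A minimal Python sketch, where `json_to_table` is a hypothetical helper, not a Clay or BigQuery API:

```python
import json

def json_to_table(payload):
    """Parse a JSON string (an object or an array of objects) into
    (columns, rows) without knowing the schema in advance."""
    data = json.loads(payload)
    records = data if isinstance(data, list) else [data]
    # Columns = union of keys across all records, in first-seen order.
    columns = []
    for rec in records:
        for key in rec:
            if key not in columns:
                columns.append(key)
    # Rows aligned to the column order; missing keys become None.
    rows = [[rec.get(col) for col in columns] for rec in records]
    return columns, rows

cols, rows = json_to_table('[{"name": "Ada", "city": "London"}, {"name": "Grace"}]')
# cols == ["name", "city"]; rows == [["Ada", "London"], ["Grace", None]]
```

Keeping first-seen key order means the column layout stays stable even when later records add or drop fields.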
The /write endpoint is ideally meant to process all rows from a table (as JSON). How do you get a Clay table into JSON format?
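For the /write direction, one pattern that worked for me (not an official Clay feature) is to POST the table rows as a JSON array of objects and convert them server-side into newline-delimited JSON, which is a format BigQuery load jobs accept. A sketch with hypothetical field names:

```python
import json

def rows_to_ndjson(rows):
    """Serialize a list of row dicts into newline-delimited JSON
    (one JSON object per line), suitable for a BigQuery load job."""
    return "\n".join(json.dumps(row, default=str) for row in rows)

ndjson = rows_to_ndjson([
    {"company": "Acme", "arr": 120000},
    {"company": "Globex", "arr": 98000},
])
# Each line of `ndjson` is one self-contained JSON object.
```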
BigQuery seems like a common enough database that someone must have designed a solution for this before, right?
I ended up using a different method with webhooks that does what I need. Thank you!
The biggest help to me and others would be seeing added documentation with examples for common use cases like this. BigQuery is very common, so seeing documentation around its use within Clay (even if not with a direct integration) would definitely help with onboarding.
A lot of the documentation and videos I’ve seen gloss over the technical specifics of implementation. Seeing the specific steps documented with code samples would go a long way.
Hari K., hello! Curious whether you have any updates on a BigQuery integration, or at least a collection of the specific steps discussed above?
Hey there! I've noted down BigQuery here as an integration request. We can't share exact timelines, but our team's decisions are largely informed by customer feedback. In the meantime, you can access BigQuery through an HTTP API column or a webhook setup.
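For the webhook route, if Clay POSTs each row as a flat JSON object (an assumption about the payload shape), a Cloud Run handler can reshape it into the payload that BigQuery's `tabledata.insertAll` streaming endpoint expects. A sketch of just that transformation:

```python
import json

def clay_webhook_to_insertall(body):
    """Convert a webhook POST body (a single row as a JSON object)
    into the request shape used by BigQuery tabledata.insertAll:
    {"rows": [{"json": {...row fields...}}]}"""
    row = json.loads(body)
    return {"rows": [{"json": row}]}

payload = clay_webhook_to_insertall('{"email": "a@example.com", "score": 7}')
# `payload` can then be sent to the insertAll endpoint with authenticated HTTP.
```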
Thanks! Yes, I had a call with a data team at another company, and they showed us the Snowflake<>Clay integration and how they leverage it. We use BigQuery, and having parity with what the Snowflake integration offers would be hugely beneficial.
