Hi Community,

Does anyone else who uses the BigQuery integration have the same problem? The integration has created literally tens of thousands of transfer configurations over the last year, which has made the GCP BigQuery Data Transfers UI more or less unusable.

If so: What do people usually do about this?

Is it really necessary for the integration to create a new configuration every hour, or am I missing something?

Hi Padraig,

This is Yuanyuan from the Amplitude Support team - happy to take a look for you.

Could you please share a bit more context about the transfer configurations you mentioned? Do you use this BigQuery import connection? Screenshots would be very helpful, but if there is any private information you want to share, please let me know and I can start a private thread with you to continue the discussion.

Looking forward to hearing back from you!

Best,
Yuanyuan


P.S. Check out upcoming events and user meetups on our events page.

I think I can give you some high level details here. We use the built-in BigQuery destination - here is a screenshot of the regular load jobs that run:

[screenshot of the scheduled load jobs]

You can see that records get exported every half hour.

Over on the GCP side, this gets implemented via a series of one-off data transfer configurations from an AWS bucket under Amplitude’s control - I can show you some of the info from that:

[screenshots of the transfer configurations]

Happy to start a private thread if you need additional details.
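
For reference, this is roughly how I have been tallying the configs. It is only a sketch using the google-cloud-bigquery-datatransfer Python client, and the project ID and location are placeholders for ours:

```python
from collections import Counter

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Transfer configs are listed under a project/location parent.
# Placeholder project ID; "us" matches our dataset location.
parent = "projects/my-gcp-project/locations/us"

# Tally configs by data source - the integration-created ones show up
# for us as one-off Amazon S3 transfers.
counts = Counter(
    config.data_source_id
    for config in client.list_transfer_configs(parent=parent)
)
print(counts)
```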


Hi Padraig,

Thanks for the extra information! Let me check with the engineer and get back to you.

Best,
Yuanyuan


P.S. Check out upcoming events and user meetups on our events page.

Hi Padraig,

The engineer let me know that this is a known issue that is in our backlog. Before they release a fix, you are free to clean up the previously created transfer configs yourself. Sorry for the inconvenience. I will keep you updated!
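
In the meantime, below is a rough sketch of what a scripted clean-up could look like with the google-cloud-bigquery-datatransfer Python client. The display-name check is only an assumption about how the integration labels its configs in your project, so please verify the filter against your own list before deleting anything:

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder project ID and location - substitute your own values.
parent = "projects/my-gcp-project/locations/us"

for config in client.list_transfer_configs(parent=parent):
    # Assumption: the stale one-off configs can be recognised by their
    # display name. Confirm this matches only what you want removed.
    if "amplitude" in config.display_name.lower():
        print(f"Deleting {config.name} ({config.display_name})")
        client.delete_transfer_config(name=config.name)
```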

Best,
Yuanyuan


P.S. Check out upcoming events and user meetups on our events page.