I am currently ingesting data from Snowflake, using a valid SQL query.
The sync job does not even start for the time-based sync. How long would I have to wait before the data is synced?
Hi Ian and karthika88! Jeremie has escalated this issue to the Technical Support team. I have noticed that both of you have email addresses with the same domain name. Are you both part of the same company?
If so, can you clarify if the Snowflake integration is working now? I checked your org and I see that there are a lot of Snowflake destinations that are connected and are ingesting data. If you are working on different Snowflake ingestions but one of them is working while the other is not, is there a way for you to compare between the two set-ups to see if there are any differences?
karthika88 Regarding the question about lag, can you provide more context? Is it that you created an import but the first ingestion job did not start running until some time after? Or did it start running but perhaps not complete until sometime later? Are you still seeing the data timestamped from when you first created the import or are you seeing data timestamped after the time value you have set?
https://www.docs.developers.amplitude.com/data/sources/snowflake/?h=time+based#time-based-import
I am running into the same issue.
Hi Jeremie Gluckman, is there any support on this? I have not resolved the issue. Thank you.
Thanks for following up karthika88 and Ian. This totally slipped through the cracks. I’m escalating it to the team now.
Thank you! It worked on our end, but there was a lag - that is, the jobs did not start immediately at the time value that I had set.
Can you look into why there was a lag? We did not get any details from the progress dashboard.
Great, thanks
Is it possible to delete jobs which have been in “processing” state for a while?
Hi karthika88! In the UI, there is no way to delete jobs. Can you share with me your use case for why you would want to delete these jobs?
We are running an experiment, and in charts we are seeing a higher number of exposures than assignments. When we drill into users who have an exposure but not an assignment, they do end up having an “Assignment” event in their user event stream. What is preventing them from being included in the roll-up assignment calculation?…
When I’m creating an experiment, there is an option “Saved Pages” here… I searched everywhere and I could not find a single mention of this in the docs, and I could not figure out how to set this up. My goal is to define a list of pages, ideally by importing a CSV. I know I could create rules, but defining that for tens or hundreds…
I am trying to integrate Amplitude in my Kotlin Android application. My dashboard is on the US server zone. I am sure that I copied the right API key for my project, but the logs show that the API key is invalid with status code 401 Amplitude( Configuration( apiKey = BuildConfig.AMPLITUDE, context = androidContext(), serverZone…
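A minimal sketch of one thing worth checking for a 401: making the server zone explicit so the key and the ingestion endpoint match. This assumes the Amplitude Kotlin Android SDK (`com.amplitude.android`); `BuildConfig.AMPLITUDE` is the key field from the post above, and the exact `Configuration` parameters may differ by SDK version.

```kotlin
import com.amplitude.android.Amplitude
import com.amplitude.android.Configuration
import com.amplitude.core.ServerZone

// Hedged sketch: a 401 can occur when the API key belongs to a project
// in one data center but events are sent to the other. Setting the
// server zone explicitly rules out a mismatch. US is typically the
// default; use ServerZone.EU for projects hosted in the EU data center.
val amplitude = Amplitude(
    Configuration(
        apiKey = BuildConfig.AMPLITUDE,  // key copied from the project's settings page
        context = applicationContext,
        serverZone = ServerZone.US       // explicit, matching the dashboard's zone
    )
)
```

If the zone matches and the 401 persists, it is worth re-copying the key from the project settings (not the organization or secret key) to rule out a stray character.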
Hi everybody, I am running an experiment using the visual editor. I have made some copy changes to our subheading. However, when I land on the testing page, I can see the change from version A to version B happening in real-time. I already have a few colleagues complaining that it does not look good. Is there anything I…
The redirect from Control to Treatment does not work automatically under the following conditions: * When the user lands on the Control page via a link. * When the user lands on the Control page via a window.location.href redirect. * When the user lands on the Control page via a header('Location: <target_page>') redirect…