Hello, I'm using Spring Boot and Kotlin to integrate with the Amplitude Export API. The official Amplitude documentation (https://amplitude.com/docs/apis/analytics/export) states that "data is available to export at a minimum within 2 hours of when the servers received it." However, there's no clear information about the…
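For context, a minimal Kotlin sketch of building the hour-granularity export window the Export API expects (timestamps formatted as `YYYYMMDDTHH`; the endpoint URL matches the linked docs, while the date values here are placeholders for illustration):

```kotlin
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// The Export API expects start/end in UTC, formatted as YYYYMMDDTHH.
val hourFormat: DateTimeFormatter = DateTimeFormatter.ofPattern("yyyyMMdd'T'HH")

// Build the export URL for a given hour window.
fun exportUrl(start: LocalDateTime, end: LocalDateTime): String =
    "https://amplitude.com/api/2/export" +
        "?start=${start.format(hourFormat)}" +
        "&end=${end.format(hourFormat)}"

fun main() {
    val start = LocalDateTime.of(2025, 1, 2, 0, 0)
    val end = LocalDateTime.of(2025, 1, 2, 23, 0)
    println(exportUrl(start, end))
    // The request itself is authenticated with HTTP Basic auth using the
    // project's API key and secret key, e.g. via Spring's RestClient/WebClient.
}
```

Given the documented 2-hour availability lag, it is safer to keep the `end` hour at least a couple of hours behind the current time when scheduling recurring exports.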
Hi, I am using a destination to export data to BigQuery. I recently found that the exported data does not exist in BigQuery even though the Amplitude destination UI shows the export as successful. The destination UI reports success, but there is actually no data in BigQuery; it is simply missing.
Hi Team, I have a use case that requires exporting user event data to Snowflake. Currently, the lowest export frequency available in the Amplitude Analytics web app is every 3 hours. However, my use case requires scheduling the export to run every 12 hours or once a day. It would be very helpful if the platform allowed…
As a Product Analyst, I frequently work with a large number of charts in Amplitude and often need to audit, organise, or share this work across teams. It would be extremely helpful to have the ability to export a comprehensive list of all charts within a project—including metadata such as chart name, type, associated…
Presently, the schema validation errors feature has only limited utility, as the only three ways to see this data are:
* in the event plan
* via email
* via Slack
The biggest issue with this is that, for my use cases, the developers responsible for resolving the work are rarely in the Amplitude event plan, and as they…
Today, exporting a cohort includes every recent user property. This is great in theory; however, when exporting a list of more than 100k users, the export takes a long time to generate, and then even longer to open in Excel/Numbers. Can you allow simple column selection in the cohort export tool?
The definition of a Saved Segment is quite limited; for example, it is not possible to:
* exclude users who performed an action
* filter events based on their parameters (and not just the event name)
Is there any workaround? Thanks
Hello there, I am facing some discrepancies between the attribution data I see directly via Amplitude’s front end and what is landing on BigQuery via the export. I have created some segmentation fields with rules based on UTM fields. Let's use the ‘L3-Channel’ field and the date of 2025-01-02 for this example. Both…
I am a free plan user and have set up S3 as the Destination in Connections. Although I can confirm that gz files are being transmitted to S3 every hour, I've noticed that the number of exported events is significantly lower than what is shown in the web UI. When searching for specific users, I've discovered that there is…
Hi, are recurring syncs of Amplitude event data to BigQuery through the Amplitude UI a paid feature? Are they only available to paid customers (and if so, on which plans)? I've seen related topics, but the information in them is inconsistent and I don't know which is valid.