
Hello, I just started at a company that uses Snowplow to send events to different tools.

I took a look at the Snowplow docs and our implementation, and it looks like we're using 'custom structured events', which have a category, a label, and only one property field (which we fill with a JSON string in order to pass several properties).
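
For context, a minimal sketch of that pattern (field names follow Snowplow's canonical event model, e.g. `se_category`/`se_property`; the category, action, and property values below are hypothetical):

```python
import json

# A Snowplow "structured event" whose single property field carries a
# JSON string with several properties packed inside, as described above.
struct_event = {
    "se_category": "checkout",   # hypothetical category
    "se_action": "purchase",     # structured events also carry an action
    "se_label": "web",
    "se_property": json.dumps({  # several properties crammed into one field
        "order_id": "A-1001",
        "total": 49.99,
        "currency": "USD",
    }),
}

# Any downstream consumer has to parse the JSON back out:
props = json.loads(struct_event["se_property"])
print(props["order_id"])  # → A-1001
```

This is exactly why a destination like Amplitude needs a parsing step somewhere in the pipeline before the individual properties become queryable.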

Snowplow’s doc about this is here: https://docs.snowplow.io/docs/understanding-your-pipeline/canonical-event/

 

Question: has anyone successfully integrated Snowplow and Amplitude? If so, how? I did some research and it looks like it’s feasible via Google Tag Manager, but that sounds like a clunky hack.

From the looks of it, I would be better off going the Amplitude SDK route, but the upside of connecting with Snowplow is the ability for Snowplow to send historical data.

 

Thanks in advance!

Hey @GuillaumeWanderu,

I’ve seen the integration you mentioned in the Snowplow docs. It looks like the only way to connect the two is via backend tracking: they use GTM server-side as a middleman that forwards events to Amplitude’s HTTP API. That works more like a webhook/router than a hack per se. Unfortunately, Amplitude doesn’t yet have a native integration in its portal for that specific tool, so that’s the solution they went with.
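
To make the "middleman" idea concrete, here's a hedged sketch of the mapping such a forwarder would perform: turning a Snowplow structured event into the event shape Amplitude's HTTP V2 API expects. The Snowplow field names follow the canonical event model; the `category:action` naming scheme and the endpoint URL are assumptions to verify against Amplitude's docs:

```python
import json

AMPLITUDE_HTTP_API = "https://api2.amplitude.com/2/httpapi"  # check Amplitude's docs

def to_amplitude_event(sp_event: dict) -> dict:
    """Map a Snowplow structured event onto Amplitude's HTTP API event shape."""
    return {
        "user_id": sp_event.get("user_id"),
        # One possible naming scheme: combine category and action.
        "event_type": f'{sp_event["se_category"]}:{sp_event["se_action"]}',
        # Unpack the JSON-packed property field into real event properties.
        "event_properties": json.loads(sp_event.get("se_property") or "{}"),
    }

def build_payload(api_key: str, sp_events: list) -> dict:
    # The HTTP V2 API takes an api_key plus a batch of events.
    return {"api_key": api_key, "events": [to_amplitude_event(e) for e in sp_events]}

# The actual POST (this is what the GTM server-side container would do for you):
# import urllib.request
# req = urllib.request.Request(
#     AMPLITUDE_HTTP_API,
#     data=json.dumps(build_payload("YOUR_API_KEY", events)).encode(),
#     headers={"Content-Type": "application/json"},
# )
# urllib.request.urlopen(req)
```

Whether you do this in GTM server-side, a small service, or a warehouse job, the mapping step is the same; only the transport changes.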

Hope this helps!


Thanks Naryie.

What would the end result be in Amplitude? For instance, can we parse all the event properties (which are currently packed into a JSON string as part of the Snowplow implementation)? What about event names?

Trying to figure out if it’s best to look more into this or to deploy the Amplitude SDK for a cleaner implementation.


@GuillaumeWanderu sorry about the long delay. 😥 At the moment, we don't have a direct Snowplow integration built out for importing events into Amplitude. However, we have had other customers who used Snowplow to send their events into S3, from which you can then pull the events using our S3 Import functionality: https://www.docs.developers.amplitude.com/data/sources/amazon-s3/
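
As a rough sketch of what that S3 route involves: the import expects files of newline-delimited JSON events, compressed, with each line using fields like those in Amplitude's HTTP API (`event_type`, `user_id`, `time`, `event_properties`). The event names, values, and file name below are hypothetical; check the linked S3 Import docs for the exact file and field requirements:

```python
import gzip
import json

# Events already mapped into Amplitude's shape (e.g. by the job that
# lands Snowplow data in S3). All values here are made up.
events = [
    {"user_id": "u-1", "event_type": "checkout:purchase",
     "time": 1700000000000, "event_properties": {"total": 49.99}},
    {"user_id": "u-2", "event_type": "page:view",
     "time": 1700000001000, "event_properties": {"path": "/pricing"}},
]

# One JSON event per line, gzipped — the newline-delimited format
# Amplitude's S3 Import can ingest (hypothetical file name).
with gzip.open("events-2024-01-01.json.gz", "wt") as f:
    for e in events:
        f.write(json.dumps(e) + "\n")
```

You'd then drop files like this into the S3 bucket that the S3 Import source is configured to watch.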

 

This assumes Amplitude will be the final destination of your events, since you are sending data downstream to Amplitude from Snowplow. In that case you should be able to use Amplitude reports as usual; as a precision tracking platform, we log whatever events are sent to Amplitude, regardless of whether they were tagged directly or sent via a warehouse. It ultimately comes down to your preferred method of ingestion.


Hey @GuillaumeWanderu, apologies for the delay.

To answer your last question: yes. You can either use the solution Esther gave you, sending the JSON to S3 (to store/reroute the data) and pulling it in through the S3 data source, or configure the tag manager option, mapping your event/user properties as shown in the Snowplow docs.

Hope it helps!

