
I am trying to call the Amplitude Export API (link here: https://developers.amplitude.com/docs/export-api) with R.

Below is the example from the docs, but I really don't see how to reproduce this call from R using something like GET() or a curl function.

curl -u API_Key:Secret_Key 'https://amplitude.com/api/2/export?start=20150201T5&end=2015020

Hey @Maxence 
There are some online curl-to-code converter tools I use to translate curl commands into a specific language.

You will have to use the httr R package to make it work. I believe this is similar to the Requests library in Python.
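
Roughly, something like this should work as a starting point (untested on my end; the key, secret, and timestamps below are just placeholders for your own values):

library(httr)

# Basic auth with your API key and secret key (placeholders)
res <- GET(
  "https://amplitude.com/api/2/export",
  authenticate("API_KEY", "SECRET_KEY"),
  query = list(start = "20150201T05", end = "20150202T05")
)

status_code(res)  # should be 200 on success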

Let me know if this works!


Many thanks @Saish Redkar! I managed to make the request, as I got a successful 200 HTTP response.

BUT I don’t understand the data I retrieved. It’s some kind of gibberish, as follows:

Do you know how I can decode this? 


Looks like it’s returning a raw vector.

As per the docs, httr will automatically decode content from the server using the encoding supplied in the content-type HTTP header. 

Try 

content(res, "text") 

or 

content(res, "text", encoding = "ISO-8859-1")

This is what I get. I have a feeling the output could be a zip file, as described in the Export API docs, but I don’t know how to read a zip file in R.

Hey Maxence,
Yup. The Export API outputs a zipped archive of JSON files, with one or more files per hour depending on your input parameters.

I’m not that well-versed in R, but I use Python’s Requests library for pretty much all of my Amplitude API calls.

The httr documentation shows how to work with the response content, which might help here.

You can try writing the output of the API call to a zip file and then either unzipping it manually or extracting it with R code, as sketched below.
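
Roughly along these lines (just a sketch, not tested; the file and folder names are placeholders):

# Write the raw response body to disk as a zip archive, then extract it
bin <- content(res, "raw")                    # raw bytes of the zipped export
writeBin(bin, "export.zip")                   # placeholder file name
unzip("export.zip", exdir = "export_data")    # extract the JSON files
list.files("export_data", recursive = TRUE)   # see what came out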

I did some digging and found this, which might help you out. Hope this helps!

 

Is your use case to analyze the raw event json files in R?


@Saish Redkar you are a life saver! :)

From your suggestion I made it work by writing this:

bin <- content(res, "raw")     # response body as raw bytes
writeBin(bin, "myfile.zip")    # save it to disk as a zip archive

That downloaded many zip files, each containing many JSON files.

And I can read the JSON files, thanks a lot! :)
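
In case it’s useful to others, here is roughly how the extracted files can be read, assuming they are newline-delimited JSON (one event per line) and possibly gzipped:

library(jsonlite)

# List the extracted files ("export_data" is just the placeholder folder from the sketch above)
files <- list.files("export_data", recursive = TRUE, full.names = TRUE)

# stream_in() reads newline-delimited JSON into a data frame;
# gzfile() transparently handles both gzipped and plain files
events <- stream_in(gzfile(files[1]))
str(events)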


Glad that worked! Happy to have helped :)
 

