This guide walks you through a basic setup that uses the Supermetrics API to write data from a single Supermetrics query to an Azure Storage Container on a recurring schedule.
Advanced settings like parameters, alternative authentication types, and alternative sinks aren’t covered in this guide.
Prerequisites
To set this up, you’ll need:
- An active Azure subscription
- A resource group within your Azure subscription
- An Azure Data Factory (default settings are sufficient — a Git configuration isn't necessary)
- An Azure Storage account with an Azure Storage Container
Instructions
Step 1: Create a query and K-JSON URL
- Log in to the Supermetrics Hub.
- In the sidebar, click API.
- Select Supermetrics API from the dropdown menu next to the page title.
- Click Select data source.
- Select a data source and connection to use.
- In the sidebar on the left, build your query.
We recommend saving working queries to the custom schema and using a relative time range (e.g., last 7 days) for incremental loads. Smaller, lighter queries help ensure stability, so use them wherever possible.
- Click Run to test your query and review your data in the preview.
- From the URL format options, select K-JSON.
- Copy the K-JSON URL.
Keep the API key safe
The query URL includes an API key that gives anyone with the URL access to your data. Treat this as you would a password and keep it safe. If you need to share the URL publicly, use the Short URL option that does not display your API key.
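If you’d like to sanity-check the K-JSON URL before wiring it into Data Factory, you can fetch it with a short script. This is a minimal sketch using Python’s requests library; the URL value is a placeholder for the K-JSON URL you copied above.

```python
import requests

# Placeholder: paste the K-JSON URL you copied from the Supermetrics Hub.
# The URL embeds your API key, so keep it out of source control.
KJSON_URL = "https://api.supermetrics.com/<your-k-json-url>"

response = requests.get(KJSON_URL, timeout=120)
response.raise_for_status()

# Print a small sample of the parsed response to confirm the query returns data.
payload = response.json()
print(str(payload)[:500])
```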
Step 2: Create a REST linked service
To connect Supermetrics to Azure Data Factory, you’ll need a linked service that uses the REST connector. Learn more about the REST connector in Azure’s documentation.
- Open Azure Data Factory Studio.
- Click Linked services under Manage in the sidebar.
- Click New and then select REST.
- Give your linked service a memorable name (e.g. Supermetrics_Facebook_Campaign).
- Paste the K-JSON URL you copied in Step 1 into the Base URL field.
- Select Anonymous from the Authentication type dropdown menu.
- Leave all other options as they are and click Create.
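The steps above use Data Factory Studio, but the same linked service can also be created from code. The sketch below is a rough equivalent, assuming the azure-identity and a recent azure-mgmt-datafactory (v2+) package; the subscription, resource group, factory, and linked service names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import LinkedServiceResource, RestServiceLinkedService

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"
KJSON_URL = "<k-json-url-from-step-1>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# REST linked service with anonymous authentication, mirroring the portal settings.
rest_ls = LinkedServiceResource(
    properties=RestServiceLinkedService(
        url=KJSON_URL,
        authentication_type="Anonymous",
        enable_server_certificate_validation=True,
    )
)
adf_client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "Supermetrics_Facebook_Campaign", rest_ls
)
```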
Step 3: Create a dataset to use the REST linked service as a source
- Open the Factory resources tab by clicking Author in the sidebar.
- Under Dataset, select New dataset.
- Choose REST, add a meaningful name, and select the linked service you created in Step 2.
- Save the dataset.
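As with the linked service, the dataset can also be scripted. This sketch assumes the same azure-mgmt-datafactory setup as in the Step 2 sketch and uses a hypothetical dataset name, Supermetrics_REST_Dataset.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    DatasetResource,
    LinkedServiceReference,
    RestResourceDataset,
)

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Dataset that reads from the REST linked service created in Step 2.
rest_dataset = DatasetResource(
    properties=RestResourceDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference",
            reference_name="Supermetrics_Facebook_Campaign",
        )
    )
)
adf_client.datasets.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "Supermetrics_REST_Dataset", rest_dataset
)
```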
Step 4: Create a linked service for your Storage Container
To write files to an Azure Storage Container, we’ll need to configure another linked service in Data Factory.
- Click Linked services under Manage on the sidebar.
- Click New and then select Azure Blob Storage.
- Give your linked service a memorable name.
- Select your Azure Subscription and your Azure Storage Account.
- Optionally, test the connection to your container. The test only checks access to the container itself, not to any subdirectories in it. We’ll define the directory structure later.
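If you’re scripting the setup instead, the Blob Storage linked service can be created the same way. This sketch assumes the azure-mgmt-datafactory package and authenticates to the storage account with a connection string rather than the subscription and account pickers used in the portal; all names and the connection string are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"
STORAGE_CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=<account>;"
    "AccountKey=<key>;EndpointSuffix=core.windows.net"
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Blob Storage linked service; SecureString marks the connection string as a secret
# so it isn't echoed back in plain text when the resource is read.
blob_ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(value=STORAGE_CONNECTION_STRING)
    )
)
adf_client.linked_services.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "Supermetrics_Blob_Storage", blob_ls
)
```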
Step 5: Create a dataset for your Storage Container
Now that we’ve connected our Storage Account to Data Factory, we need to create a dataset that we can use in our pipeline operation.
- Open the Factory resources tab by clicking Author on the sidebar.
- Under Dataset, select New dataset.
- Choose Azure Blob Storage.
- Select DelimitedText.
- Give your dataset a familiar name and select the Linked Service you created in Step 4.
- Check the First row as header box.
- Enter the name of your container, any subdirectories you’d like, and the desired name of your output file.
- Select None for Import Schema.
- Save your dataset and continue on to the next step.
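The sink dataset can be scripted as well. This sketch assumes the same azure-mgmt-datafactory setup; the container, folder path, file name, and dataset name are placeholders you’d replace with the values you entered above.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLocation,
    DatasetResource,
    DelimitedTextDataset,
    LinkedServiceReference,
)

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Delimited-text (CSV) sink dataset with the first row treated as a header,
# matching the options selected in the portal.
sink_dataset = DatasetResource(
    properties=DelimitedTextDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference",
            reference_name="Supermetrics_Blob_Storage",
        ),
        location=AzureBlobStorageLocation(
            container="<container-name>",
            folder_path="<optional/subfolder>",
            file_name="supermetrics_output.csv",
        ),
        column_delimiter=",",
        first_row_as_header=True,
    )
)
adf_client.datasets.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "Supermetrics_CSV_Sink", sink_dataset
)
```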
Step 6: Create a pipeline using your dataset and sink
Create a new pipeline or edit an existing one.
- Drag a Copy data activity into your pipeline. (This is under the Move & transform section.)
- Select the new Copy data activity and open the Source tab.
- Specify the source — this will be the dataset you created in Step 3.
- Specify your sink — this will be the Dataset you created in Step 5.
- Use the Validate button to ensure your configuration is valid.
- Use the Debug button to simulate a run of your pipeline.
- After the debug run completes, you should see a file in your Azure Storage Container generated from the Supermetrics API data.
- Once you’re satisfied with your file output, publish your changes.
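If you prefer to confirm the debug output from a script rather than the portal, listing the container’s blobs is enough. This is a minimal sketch assuming the azure-storage-blob package; the connection string and container name are placeholders.

```python
from azure.storage.blob import BlobServiceClient

# Placeholders -- substitute your storage account's connection string and container name.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=<account>;"
    "AccountKey=<key>;EndpointSuffix=core.windows.net"
)
CONTAINER_NAME = "<container-name>"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

# List blob names and sizes so you can confirm the debug run wrote the expected file.
for blob in container.list_blobs():
    print(blob.name, blob.size)
```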
Step 7: Schedule
- Once you’ve published your changes from Step 6, open your pipeline and click the Add trigger button at the top of the pane.
- Choose New/Edit, and then select + New from the Choose trigger dropdown.
- Leave the Type as “Schedule”, and pick your frequency, timezone, and time of day. (Your frequency is determined by your Supermetrics contract.)
- Click OK to save your trigger. Be sure to publish your changes.
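Triggers can also be created and started from code. The sketch below assumes the same azure-mgmt-datafactory setup and a daily recurrence; the start time, trigger name, and pipeline name are placeholders, and the frequency you can actually use depends on your Supermetrics contract.

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"
PIPELINE_NAME = "<pipeline-name-from-step-6>"
TRIGGER_NAME = "DailySupermetricsTrigger"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Daily schedule trigger; the start time's time of day sets when each run fires.
recurrence = ScheduleTriggerRecurrence(
    frequency="Day",
    interval=1,
    start_time=datetime(2024, 1, 1, 6, 0, tzinfo=timezone.utc),
    time_zone="UTC",
)
trigger = TriggerResource(
    properties=ScheduleTrigger(
        recurrence=recurrence,
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(
                    type="PipelineReference", reference_name=PIPELINE_NAME
                )
            )
        ],
    )
)
adf_client.triggers.create_or_update(RESOURCE_GROUP, FACTORY_NAME, TRIGGER_NAME, trigger)

# Triggers are created in a stopped state; starting the trigger activates the schedule.
adf_client.triggers.begin_start(RESOURCE_GROUP, FACTORY_NAME, TRIGGER_NAME).result()
```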