
How to use Azure Data Factory with the Supermetrics API

This guide walks you through a basic implementation that uses the Supermetrics API to write data from a single Supermetrics query to an Azure Storage Container on a recurring schedule.


Advanced settings like parameters, alternative authentication types, and alternative sinks aren’t covered in this guide.


Prerequisites

To set this up, you’ll need:

  - A Supermetrics license that includes access to the Supermetrics API
  - An Azure subscription with access to Azure Data Factory and an Azure Storage Account

Instructions

1: Create a query and K-JSON URL

Step 1: Create a query and K-JSON URL

  1. Log in to the Supermetrics Hub.
  2. In the sidebar, click API.
  3. Select Supermetrics API from the dropdown menu next to the page title.
  4. Click Select data source.
  5. Select a data source and connection to use. 
  6. In the sidebar on the left, build your query.
    We recommend saving working queries to the custom schema and using a relative time range (e.g., last 7 days) for incremental loads. Smaller, lighter queries help ensure stability, so use them wherever possible.
  7. Click Run to test it and review your data in the preview.
  8. From the URL format options, select K-JSON.
  9. Copy the K-JSON URL.

Keep the API key safe

The query URL includes an API key that gives anyone with the URL access to your data. Treat this as you would a password and keep it safe. If you need to share the URL publicly, use the Short URL option that does not display your API key.
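
If you want to sanity-check the URL before wiring it into Data Factory, you can fetch it with a short script. Below is a minimal sketch in Python using the requests package; the URL is a placeholder for your own K-JSON URL, and the exact shape of the response depends on the query you built.

```python
import requests

# Placeholder: paste the K-JSON URL you copied in step 9 (it contains your API key,
# so keep this script out of version control and shared locations).
KJSON_URL = "PASTE_YOUR_K-JSON_URL_HERE"

# Fetch the query results; Supermetrics queries can take a while, so allow a generous timeout.
response = requests.get(KJSON_URL, timeout=300)
response.raise_for_status()

# Print a small sample of the response to confirm the query returns the fields you expect.
payload = response.json()
print(str(payload)[:500])
```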

2: Set up the implementation in Azure Data Factory

Step 2: Create a REST linked service

To connect Supermetrics to Azure Data Factory, you’ll need a linked service that uses the REST connector. Learn more about the REST connector in Azure’s documentation.

  1. Open Azure Data Factory Studio. 
  2. Click Linked services under Manage in the sidebar. 
  3. Click New and then select REST.
  4. Give your linked service a memorable name (e.g., Supermetrics_Facebook_Campaign).
  5. Paste the K-JSON URL you copied in Step 1 into the Base URL field.
  6. Select Anonymous from the Authentication type dropdown menu.
  7. Leave all other options as they are and click Create.
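
For reference, Data Factory Studio also lets you view a linked service as JSON (the code view). The sketch below approximates what that definition looks like for this setup, written as a Python dictionary; the name and URL are placeholders, and the exact properties may differ slightly from what the Studio generates.

```python
# Approximate shape of the REST linked service definition (placeholder name and URL).
rest_linked_service = {
    "name": "Supermetrics_Facebook_Campaign",            # the memorable name from step 4
    "properties": {
        "type": "RestService",
        "typeProperties": {
            "url": "PASTE_YOUR_K-JSON_URL_HERE",          # Base URL = the K-JSON URL from Step 1
            "authenticationType": "Anonymous",            # the API key travels in the URL itself
            "enableServerCertificateValidation": True,
        },
    },
}
```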


Step 3: Create a dataset to use the REST linked service as a source

  1. Open the Factory resources tab by clicking Author in the sidebar. 
  2. Under Dataset, select New dataset.
  3. Choose REST, add a meaningful name, and select the Linked Service you created in Step 2.
  4. Save the dataset.
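
As with the linked service, the dataset’s code view is roughly equivalent to the following sketch (Python dictionary, placeholder names). The relative URL stays empty because the full K-JSON URL already lives in the linked service’s Base URL.

```python
# Approximate shape of the REST dataset (placeholder names).
rest_dataset = {
    "name": "Supermetrics_Rest_Dataset",
    "properties": {
        "type": "RestResource",
        "linkedServiceName": {
            "referenceName": "Supermetrics_Facebook_Campaign",  # the linked service from Step 2
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "relativeUrl": "",  # empty: the full K-JSON URL is already set as the Base URL
        },
    },
}
```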


Step 4: Create a linked service for your Storage Container

To write files to an Azure Storage Container, we’ll need to configure another linked service in Data Factory.

  1. Click Linked services under Manage on the sidebar.
  2. Click New and then select Azure Blob Storage.
  3. Give your linked service a memorable name.
  4. Select your Azure Subscription and your Azure Storage Account.
  5. Optionally, test the connection. The test only checks access to the storage account, not to any containers or subdirectories. We’ll define the directory structure later.
  6. Click Create to save the linked service.
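
The resulting definition is roughly equivalent to the sketch below (placeholder name and values). When you pick a subscription and storage account in the UI, Data Factory stores the connection details for you; the connection string shown here is just one possible form.

```python
# Approximate shape of the Azure Blob Storage linked service (placeholder name and values).
blob_linked_service = {
    "name": "Supermetrics_Blob_Storage",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {
            # Selecting a subscription and storage account in the UI fills this in for you;
            # a connection string is one possible form the saved credentials can take.
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<your-account>;AccountKey=<your-key>",
        },
    },
}
```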


Step 5: Create a dataset for your Storage Container

Now that we’ve connected our Storage Account to Data Factory, we need to create a dataset that we can use in our pipeline operation.

  1. Open the Factory resources tab by clicking Author on the sidebar. 
  2. Under Dataset, select New dataset.
  3. Choose Azure Blob Storage.
  4. Select DelimitedText.
  5. Give your dataset a familiar name and select the Linked Service you created in Step 4.
  6. Check the First row as header box.
  7. Enter the name of your container, any subdirectories you’d like, and the desired name of your output file.
  8. Select None for Import Schema.
  9. Save your dataset and continue on to the next step.
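
Put together, the sink dataset’s code view resembles the following sketch; the container, folder path, and file name are placeholders for the values you entered above.

```python
# Approximate shape of the DelimitedText (CSV) sink dataset (placeholder names and values).
csv_dataset = {
    "name": "Supermetrics_Output_Csv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "Supermetrics_Blob_Storage",  # the linked service from Step 4
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "my-container",               # your container name
                "folderPath": "supermetrics/facebook",     # optional subdirectories
                "fileName": "campaigns.csv",               # desired output file name
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": True,                      # matches the checkbox in step 6
        },
    },
}
```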


Step 6: Create a pipeline using your dataset and sink

Create a new pipeline or edit an existing one.

  1. Drag a Copy data activity into your pipeline. (This is under the Move & transform section.)
  2. Click the new Copy activity and click the Source tab.
  3. Specify the source — this will be the dataset you created in Step 3.
  4. Specify your sink — this will be the Dataset you created in Step 5.
  5. Use the Validate button to ensure your configuration is valid.
  6. Use the Debug button to simulate a run of your pipeline.
  7. You should now see a file in your Azure Container generated from the Supermetrics API.
  8. Once you’re satisfied with your file output, publish your changes.
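
The published pipeline is roughly equivalent to the sketch below (placeholder names), with the REST dataset as the source and the DelimitedText dataset as the sink of a single Copy activity.

```python
# Approximate shape of a pipeline with a single Copy data activity (placeholder names).
pipeline = {
    "name": "Supermetrics_To_Blob",
    "properties": {
        "activities": [
            {
                "name": "Copy Supermetrics data",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "Supermetrics_Rest_Dataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "Supermetrics_Output_Csv", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    "source": {"type": "RestSource"},        # reads from the K-JSON endpoint
                    "sink": {"type": "DelimitedTextSink"},   # writes the CSV file to Blob Storage
                },
            }
        ]
    },
}
```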


3: Create a trigger to schedule your pipeline run

Step 7: Schedule

  1. Once you’ve published your changes from Step 6, open your pipeline and click the Add trigger button at the top of the pane.
  2. Choose New/Edit, and then New.
  3. Leave the Type as “Schedule”, and pick your frequency, timezone, and time of day. (Your frequency is determined by your Supermetrics contract.)
  4. Click OK to save your trigger. Be sure to publish your changes.
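
The trigger you just created corresponds roughly to the definition sketched below (placeholder names, with a daily run as an example); adjust the recurrence to match the frequency allowed by your contract.

```python
# Approximate shape of a schedule trigger that runs the pipeline once a day (placeholder values).
schedule_trigger = {
    "name": "Daily_Supermetrics_Run",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",                      # keep within the frequency your contract allows
                "interval": 1,
                "startTime": "2024-01-01T06:00:00Z",     # placeholder start time
                "timeZone": "UTC",
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "Supermetrics_To_Blob",  # the pipeline from Step 6
                    "type": "PipelineReference",
                }
            }
        ],
    },
}
```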
