One Way to Invoke Fabric Data Factory Pipelines Using the Fabric REST API

I’m an automation weenie; that’s a fact. It’s why I’m drawn to frameworks – especially data engineering execution orchestration frameworks.

Speaking of frameworks, if you haven’t checked out Martin Schoombee’s series titled Building a Framework for Orchestration in Azure Data Factory: A Series, you are missing out. First, Martin is scary smart. Second, he shares practical and clever ideas for designing ADF pipelines and managing ADF pipeline execution. Why do I mention Martin’s (excellent) series at the start of this post? I use the ADF REST API to design metadata-driven ADF execution orchestration frameworks. While Martin and I take different approaches, we’re trying to deliver a very similar solution: automate complex data engineering by abstracting away (at least) some of the complexity. And this post is my first about using Fabric Data Factory REST API methods.

This post is current as of 30 May 2024. There are other posts by fantastic bloggers about how to use the Fabric REST API, but Fabric development is progressing so fast that some of those posts are already out of date. Make no mistake, this post will most likely not age well, and for the very same reason. That’s ok. We bloggers live to serve. I, like all the rest, will endeavor to persevere – and we will all write more posts, Lord willing.

In this post, I share one way to invoke Fabric Data Factory pipelines using the REST API.
I will be using the web version of Postman to call REST API methods.
You can sign up for a free Postman account. Since it’s free, I encourage you to check the box to receive news and offers from them. As I mentioned in an earlier post, you can always unsubscribe if the messages are unhelpful or if they get too “chatty.”


Here’s the plan:

  1. Build a simple pipeline for testing purposes.
  2. Grab the Fabric Authorization token.
  3. Construct the REST API method call.
  4. Test it!

Cool? Let’s roll.

Build a Simple Pipeline

Connect to Fabric.

Click the Data Factory capacity to open the Data Factory Home page:

If you’d rather work in a different workspace, click the Workspaces item from the left menu and then click the “+ New Workspace” button:

When the Create a workspace blade displays, provide a Name and (optional) Description for your shiny, new workspace, and then click the Apply button:

When the workspace displays, click the “+ New” button and then click “Data pipeline”:

When the “New pipeline” dialog displays, enter a Name for your pipeline and then click the Create button:

When the “Build a data pipeline to organize and move your data” screen displays:

  1. Click “Start with a blank canvas”
  2. Scroll to – and then click – the “Wait” activity:

The waitTest pipeline is ready.

Let’s build a second test pipeline while we’re here. To begin, click the API workspace button on the left menu.

When the API workspace displays:

  1. Click the ellipsis beside the waitTest pipeline
  2. Click the “Save as” option:

Name your new (cloned via “Save as”) pipeline “waitTest_params” and then click the Save button:

When the waitTest_params pipeline displays:

  1. Click the Parameters tab
  2. Click the “+ New” button to add a new parameter
  3. Name the new parameter “waitSeconds”
  4. Set the waitSeconds parameter type to Int
  5. Enter a Default value for the waitSeconds parameter (I entered 2)
  6. Click the Save icon:

Ok, we’re done creating test Fabric Data Factory pipelines.

Acquire the Fabric Authorization Token

I’m not sure if the browser matters, but I’m using Edge.

While you’re connected and logged into Fabric, press the F12 key to open the DevTools prompt and then click the “Open DevTools” button to proceed:

When DevTools opens, you may (like me) be overwhelmed by the number of messages displayed by default. If “Console” isn’t selected, click the Console button at the top of the window. You’ll want to focus on the prompt at the bottom of the window:

At the Console prompt, enter “powerBIAccessToken”. When you begin typing, IntelliSense will kick in and help. Click the full command (“powerBIAccessToken”) to complete it, unless you really enjoy typing:

When you execute the “powerBIAccessToken” command (press Enter), a string will be returned. Now, make no mistake, the actual string will be about 4-5 times as long as what I’ve pictured here – so don’t let that throw you:

The powerBIAccessToken value is displayed enclosed in apostrophes. The value does not include the apostrophes, so highlight and copy carefully.

The powerBIAccessToken value changes regularly. I can hear some of you thinking, “How often does the powerBIAccessToken value change, Andy?”

I do not know. I just know that it changes from time to time. It may change each time one logs into Fabric.
I share this tidbit with you to spare you troubleshooting the syntax of your REST API method calls later when the syntax is just fine, but you’re using an expired powerBIAccessToken value.
You are welcome.
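One way to answer the expiry question yourself: the token appears to be a standard Azure AD JWT, so you can decode its payload and read the `exp` claim. A minimal sketch, assuming the token really is a JWT (the helper name is mine):

```python
import base64
import json
from datetime import datetime, timezone

def token_expiry(jwt: str) -> datetime:
    """Decode the (unverified) payload of a JWT and return its 'exp' claim.

    Assumes the powerBIAccessToken is a standard Azure AD JWT:
    three base64url segments separated by dots.
    """
    payload_b64 = jwt.split(".")[1]
    # Restore the padding that base64url encoding strips.
    payload_b64 += "=" * (-len(payload_b64) % 4)
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return datetime.fromtimestamp(payload["exp"], tz=timezone.utc)
```

If the returned time is in the past, you have your answer: the token has expired and it’s time to grab a fresh one from the Console.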

Construct the Call to the REST API Method

To start building the REST API method call, you need a good template. To execute a Fabric Data Factory pipeline on 30 May 2024, I use the following template:
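(The template from my screenshot isn’t reproduced here, so here is a reconstruction. As of this writing, the Fabric Job Scheduler “run on demand” endpoint looks roughly like this – treat the exact URL shape as an assumption and verify against the current Fabric REST API docs:)

```
POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{PipelineId}/jobs/instances?jobType=Pipeline
```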


Here there be placeholders: the text enclosed in curly braces. There are (only) two, and next I show you how to locate their values. I placed this command in Notepad:

If DevTools is still open, close it by clicking the “X” in the almost-upper right corner. If you click the “X” in the really, really-upper right corner, you will close your browser. If I’m right about the powerBIAccessToken value changing every time you log in, you will need to re-capture that value. That’s incentive to click the “X” that’s inline with the Console menu button near the top of the DevTools window. Note: This is why I’m working late to complete this post:

Store the token someplace. If you log out or close the browser, it’s ok. Just remember to get an updated powerBIAccessToken when you start again.

To get the {workspaceId} value, click on the workspace name in the left menu and then copy the GUID-ish-looking value from the URL:

I copied the GUID-ish-looking value from the URL and pasted it over the {workspaceId} placeholder in my template Notepad file:

Next, open the waitTest pipeline:

When the waitTest pipeline displays, copy the GUID-ish-looking value that follows “/pipelines/” in the URL. Paste that GUID-ish-looking value over the {PipelineId} placeholder in your Notepad file:

The REST API method call is now complete (Save the file).
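If you’d rather script the call than click through Postman, here’s a minimal Python sketch of the same request. The endpoint shape and the helper name are my assumptions; only standard-library modules are used:

```python
import json
import urllib.request

# Assumed endpoint shape for the Job Scheduler "run on demand" API.
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_request(workspace_id, pipeline_id, token, parameters=None):
    """Build the POST that asks Fabric to queue an on-demand pipeline run."""
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")
    headers = {"Authorization": f"Bearer {token}"}
    body = None
    if parameters:
        # Wrap pipeline parameters in the executionData envelope.
        body = json.dumps({"executionData": {"parameters": parameters}}).encode()
        headers["Content-Type"] = "application/json"
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

# To actually send it (requires a valid token):
# with urllib.request.urlopen(build_run_request(ws_id, pl_id, token)) as resp:
#     print(resp.status)
```

This does the same job as the Postman steps that follow – build the URL, attach the Bearer token, POST.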


Visit the Postman site.

If you do not have an account, you can create a free account.

I confess I’m a n00b using Postman, so I will definitely miss steps here, for which I apologize in advance. I was able to get to the following screen by opting to use the web interface:

First, I change the GET to POST and then copy-paste the freshly-built REST API method call from my Notepad file, remembering to delete “POST” at the beginning of the REST API method call:

Next, I:

  1. Click Authorization on the header menu
  2. Change the Auth Type to “Bearer Token”
  3. Right-click in the Token textbox
  4. Click “Paste as plain text” to paste the powerBIAccessToken value into the field:

We are ready to test! Click the Send button:

… and here’s where we learn that the powerBIAccessToken value is valid for less time than it takes Andy to write a blog post:

Durnit (apologies for the language…)!

The fix is fairly painless:

  1. Press F12 to open DevTools in Edge
  2. Execute the powerBIAccessToken command in the console
  3. Copy the powerBIAccessToken value
  4. Paste the powerBIAccessToken value into the Token textbox in Postman
  5. Click the Send button again:

No response. It turns out no response is healthy and good.
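For context: in my testing the service queues the run and answers with an HTTP 202 (Accepted), an empty body, and a Location header you can use to find the new job instance. A hedged sketch of that check (the helper name is mine):

```python
def job_instance_url(status_code, headers):
    """Return the URL to poll for job status.

    Assumes Job Scheduler semantics: 202 Accepted, empty body,
    and a Location header pointing at the new job instance.
    """
    if status_code != 202:
        raise RuntimeError(f"expected 202 Accepted, got {status_code}")
    return headers["Location"]
```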

Return to Fabric Data Factory:

  1. Open the Workspace
  2. Click the ellipsis beside the waitTest pipeline
  3. Click “Recent runs”:

There we go – Succeeded!
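If you’d rather not click into Recent runs every time, the Location URL from the 202 response can be polled from code. A sketch – the status field name and the terminal states are assumptions based on my reading of the Job Scheduler docs:

```python
import json
import time
import urllib.request

# Terminal job states, per my reading of the Job Scheduler API (assumption).
TERMINAL_STATES = {"Completed", "Failed", "Cancelled", "Deduped"}

def is_terminal(status):
    return status in TERMINAL_STATES

def poll_job(location_url, token, interval_s=5.0):
    """Poll a job-instance URL until the run reaches a terminal state."""
    req = urllib.request.Request(
        location_url, headers={"Authorization": f"Bearer {token}"})
    while True:
        with urllib.request.urlopen(req) as resp:
            status = json.loads(resp.read())["status"]  # assumed field name
        if is_terminal(status):
            return status
        time.sleep(interval_s)
```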

Next, let’s test the waitTest_params pipeline.

Begin by editing the Postman call:

  1. Click Body
  2. Click raw
  3. Make sure JSON is selected
  4. Add the following parameters JSON to the Body textbox:

  {
    "executionData": {
      "parameters": {
        "waitSeconds": 10
      }
    }
  }

Grab the PipelineID value from the URL like before:

  1. Update the Postman REST API method call
  2. Click Send:

Check the Recent Runs for the waitTest_params pipeline… and “Not started?” Huh. Ok.

We wait a few minutes and look again at Recent Runs. This time we have Succeeded! Let’s examine the run by clicking the name of the pipeline:

Run details show us the pipeline execution succeeded, but it did not use the parameter value we sent from Postman. The execution took only 2 seconds, and we wanted the Wait activity to wait for the value of the waitSeconds parameter, which we sent as 10.

Now I have a confession: this is a contrived failure. I did this to demonstrate what I consider to be the worst kind of failure during testing: execution succeeds, but the target of execution does not do what I wanted it to do.

I did not connect the waitSeconds parameter to the “Wait time in seconds” property of the Wait activity. Here’s how we fix it.

Open the waitTest_params pipeline. Click on the Wait activity and then click Settings. Click inside the “Wait time in seconds” property textbox, and then click the “Add dynamic content [Alt + Shift + D]” link to open the Pipeline expression builder:

When the Pipeline expression builder opens:

  1. Click Parameters
  2. Click waitSeconds
  3. Observe the expression (@pipeline().parameters.waitSeconds) in the Pipeline expression builder textbox
  4. Click OK:


The “Wait time in seconds” property is now configured to use the waitSeconds parameter:

Click Send in Postman.

We have a new successful recent run:

And that run took 11 seconds.

Now the pipeline is working as we want.


While I’m pleased to use the Fabric Data Factory REST API to start a pipeline, we have a long way to go before this solution is as cool as Martin’s framework or frameworks I’ve designed for Azure Data Factory (ADF). But it’s a start, and a good start at that.

As always, I welcome your feedback. Please leave a comment with your thoughts and suggestions.


Andy Leonard

Christian, husband, dad, grandpa, Data Philosopher, Data Engineer, Azure Data Factory, SSIS guy, and farmer. I was cloud before cloud was cool. :{>

