Microsoft Fabric Pipeline Manage
Perform Basic Actions on Pre-made Microsoft Fabric Pipelines
Revision History
1.0.0.6 Initial Release
Properties
Connection
Type: Connection Input
TenantId
Type: String
The Directory (tenant) ID as specified in the Entra App being used for authentication.
ClientId
Type: String
The Application (client) ID as specified in the Entra App being used for authentication.
ClientSecret
Type: Password
The Client Secret value generated in the Entra App. This value is only displayed once, when the Secret is generated.
AccessToken
Type: Password
OAuth Access Token
RefreshToken
Type: Password
OAuth Refresh Token
Action
Type: List Input
The Action to be performed.
Start
Stop
GetStatus
ListAllJobInstances
WorkspaceId
Type: String Input
The Workspace identifier is used to target which workspace your Pipeline is in.
PipelineId
Type: String Input
The Pipeline identifier is used to target the specific Pipeline you want to perform an Action on.
JobId
Type: String Input/Output
The unique identifier for the instance that was just started.
Used by the Start, Stop and GetStatus Actions.
Custom Properties
Type: Custom Properties Input
Used for the Start Action.
Pipelines can define parameters in their setup.
Each parameter can be added to the Node as a Custom Property and will be used by Fabric when the Pipeline is run.
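For orientation, the sketch below shows roughly how these properties map onto a call to the public Microsoft Fabric Job Scheduler REST API when the Start Action runs, with Custom Properties passed as parameters. It is a minimal illustration, not the node's implementation; the endpoint and payload shape follow the public "Run On Demand Item Job" API and should be verified against current Fabric documentation.

```python
# Minimal sketch (not the node's actual implementation) of a Start call against
# the Microsoft Fabric Job Scheduler REST API, assuming an already-acquired
# OAuth AccessToken. Endpoint and payload shape are taken from the public
# "Run On Demand Item Job" API; verify them against current Fabric docs.
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def start_pipeline(access_token, workspace_id, pipeline_id, parameters=None):
    """Start a pipeline job and return the new JobId (job instance id)."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances"
    resp = requests.post(
        url,
        params={"jobType": "Pipeline"},
        headers={"Authorization": f"Bearer {access_token}"},
        json={"executionData": {"parameters": parameters or {}}},
        timeout=30,
    )
    resp.raise_for_status()
    # The API responds 202 Accepted; the job instance URL, ending in the JobId,
    # comes back in the Location header.
    return resp.headers["Location"].rstrip("/").split("/")[-1]
```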
Setup
Definitions
- Workspace: An area inside Fabric where items are created and worked on. More Info
- Pipeline: A user-created tool that automates data ingestion and processing. More Info
- Job / JobId: When a Pipeline is run/started, a Job is created. Every Job has a unique GUID identifier resembling:
1ece390e-2824-3176-b616-7a8173e7fc14
- Run Instance: This is the same as the Job; Microsoft uses these terms interchangeably. You may also see InstanceId, which is the same as the JobId.
- Parameter: A parameter in a Fabric Pipeline is a placeholder that can be dynamically specified when running the pipeline. More Info
- Here is a general guide on how to set up a workspace in Fabric.
- Here is a general guide on how to create a Pipeline in Fabric.
How To Create a basic SQL to Lakehouse Table Ingestion Pipeline
In this How-To, we will create a Pipeline that accepts a custom SQL Query sent in from a Flowgear workflow and copies the output of that query into a specified Lakehouse table.
Prerequisites
- A Microsoft Fabric Workspace
- A destination in your workspace (e.g. a Lakehouse)
- A SQL source connection
Step 1: Create a New Data Pipeline
- In your Workspace, select + New item then Data Pipeline
- Name it (e.g., "SQLCopyJobPipeline") and select Create.
- The pipeline canvas opens.
Step 2: Add a Pipeline Parameter
- Click the pipeline canvas background to open the Properties pane at the bottom.
- You may need to click the Settings gear twice in the top pane to dismiss the initial tutorial.
- At the bottom, go to the Parameters tab.
- Select + New and create this parameter:
Name | Type | Default |
---|---|---|
SqlQuery | String | Leave this field blank. |
This parameter will hold the dynamic SQL query that we send in from the Flowgear Node.
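When the pipeline is later started from Flowgear, the parameter is matched by name, so the key supplied in the Start call must be exactly SqlQuery. A hypothetical illustration of the parameter set that would be sent:

```python
# Hypothetical illustration: the key must match the parameter name created above.
parameters = {"SqlQuery": "SELECT * FROM Contacts WHERE ID = 25;"}
```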
Step 3: Add and Configure the Copy Activity
- In the pipeline canvas, under Activities → Use copy assistant
- In the window that opens, choose SQL Server database
- Enter your credentials for the SQL Database and choose Next
- Choose Query and enter a simple query against the source table. We will replace this shortly with the Parameter we have made.
- Now select the Destination: either a New Fabric item, a New destination, or an existing option from your OneLake catalog
- For this How-To, we'll use a Lakehouse table
- Under Load Settings, choose the option Load to existing table and select your table
- Complete the various settings and Mapping, then select Next
- On the last screen, Review + save, de-select the option at the bottom, Start the data transfer immediately
- We do not want to start the transfer immediately, as we will call it from within Flowgear.
- Click OK
Step 4: Connect the Parameter to the Copy data
- After clicking the last OK in Step 3, you should be back on the Pipeline designer. If not, navigate to your pipeline from your Workspace.
- Click on the Copy data item; it will turn green.
- In the bottom pane you will see various tabs; click Source
- Click in the Query text box; a button labelled Add dynamic content will appear underneath it. Click this.
- A new pane opens; choose the Parameters tab and select the SqlQuery we made earlier, then click OK at the bottom. The resulting expression is shown below.
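After selecting the parameter, the Query field should contain a pipeline expression along these lines (the exact text is generated by Fabric when you pick the parameter, so treat this as an expected shape rather than something to type by hand):

```
@pipeline().parameters.SqlQuery
```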
Step 5: Save the WorkspaceId and PipelineId
- After clicking OK in Step 4, you should be back on the Pipeline designer. If not, navigate to your pipeline from your Workspace.
- In your browser, click on the URL and copy it. Paste it into a text file. It should look like this:
https://app.fabric.microsoft.com/groups/11111111-1111-1111-1111-111111111111/pipelines/aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa?experience=fabric-ObjectId=99999999-9999-9999-9999-999999999999
- Your WorkspaceId is the GUID between /groups/ and /pipelines/. In our example it is 11111111-1111-1111-1111-111111111111.
- Your PipelineId is the GUID between /pipelines/ and ?experience, without the ?. In our example it is aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa.
- We now have a Pipeline and all the details needed to set up the Flowgear node.
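If you prefer to script this step, a small helper along these lines (purely an illustration, not part of the How-To) can pull both GUIDs out of the copied URL using the /groups/ and /pipelines/ pattern described above:

```python
# Illustrative helper: extract WorkspaceId and PipelineId from a copied
# Fabric pipeline URL of the form .../groups/{WorkspaceId}/pipelines/{PipelineId}?...
import re

def ids_from_pipeline_url(url):
    """Return (workspace_id, pipeline_id) from a Fabric pipeline URL."""
    match = re.search(r"/groups/([0-9a-fA-F-]{36})/pipelines/([0-9a-fA-F-]{36})", url)
    if not match:
        raise ValueError("URL does not look like a Fabric pipeline URL")
    return match.group(1), match.group(2)

workspace_id, pipeline_id = ids_from_pipeline_url(
    "https://app.fabric.microsoft.com/groups/11111111-1111-1111-1111-111111111111"
    "/pipelines/aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa?experience=fabric"
)
print(workspace_id, pipeline_id)
```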
Flowgear Node
- You should already have your WorkspaceId, PipelineId and Parameter(s) when you want to use this Connector in a Flowgear Workflow.
- Select an existing MS Fabric Connection or create a new one (remember to Connect your account in the Connection Profile).
- The Start Action will be selected by default.
- Enter the WorkspaceId and PipelineId that you have already obtained.
- Parameter(s) are only used for the Start Action. We suggest using the Multiline Text input for these Custom Properties for ease of use.
  - From the above example of SqlQuery:
    - Add a Custom Property to the Connector.
    - Change the Name to SqlQuery.
    - Change the Input Type to Multiline Text.
    - Put the SQL Query as the value, for example SELECT * FROM Contacts WHERE ID = 25;
- You can now run this node (either individually or as part of a workflow) and the Pipeline will be started with the given SqlQuery.
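If your workflow needs to wait for the Pipeline to finish, pair the Start Action with the GetStatus Action using the returned JobId. For reference, the sketch below polls the same public Fabric Job Scheduler API directly; the endpoint path and status names are assumptions based on the public documentation, not the node's internals.

```python
# Minimal sketch, assuming the same Fabric Job Scheduler API as the Start
# sketch above: poll a job instance (the JobId returned by the Start Action)
# until it reaches a terminal status. Status names are assumptions taken from
# the public Fabric docs and may change; treat this as illustrative only.
import time
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"
TERMINAL = {"Completed", "Failed", "Cancelled", "Deduped"}

def wait_for_job(access_token, workspace_id, pipeline_id, job_id, poll_seconds=15):
    """Return the final status of a pipeline job instance."""
    url = (f"{FABRIC_API}/workspaces/{workspace_id}/items/{pipeline_id}"
           f"/jobs/instances/{job_id}")
    while True:
        resp = requests.get(url, headers={"Authorization": f"Bearer {access_token}"},
                            timeout=30)
        resp.raise_for_status()
        status = resp.json().get("status")
        if status in TERMINAL:
            return status
        time.sleep(poll_seconds)
```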
Remarks
Authentication
Authentication is done using Microsoft Entra. An Entra Application needs to be created to get the TenantId, ClientId, and ClientSecret that are then used in conjunction with a Microsoft login to generate an AccessToken.
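The connector performs token management itself; purely for reference, the sketch below shows the standard Microsoft identity platform refresh-token exchange that the TenantId, ClientId, ClientSecret and RefreshToken properties correspond to. The Fabric scope shown is an assumption and should be checked against the permissions configured on the Entra App.

```python
# Illustrative only: exchange a RefreshToken for a new AccessToken at the
# Microsoft identity platform v2.0 token endpoint. The scope is an assumption;
# confirm it against your Entra App configuration.
import requests

def refresh_access_token(tenant_id, client_id, client_secret, refresh_token):
    """Return the token response (access_token, refresh_token, expires_in, ...)."""
    token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    resp = requests.post(token_url, data={
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "scope": "https://api.fabric.microsoft.com/.default",  # assumed Fabric scope
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()
```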
WorkspaceId and PipelineId
These can be retrieved by the user when browsing in the Fabric application: https://app.fabric.microsoft.com/groups/{WorkspaceId}/lakehouses/{LakehouseId}
While you have the Lakehouse open in the app, the URL will match the above pattern, e.g. https://app.fabric.microsoft.com/groups/c45d2794-fbd7-4287-8763-1da27232f50c/lakehouses/bf45d06a-ab24-4519-a9c6-7fb2e1116b2b
For the PipelineId: When viewing the created Pipeline, the PipelineId is the GUID after /pipelines/
https://app.fabric.microsoft.com/groups/11dadf68-75c7-439a-8385-3c64fcfcdbec/pipelines/1ece390e-8141-4713-a382-7a8173e7fc14