Step-by-Step Guide to Triggering Fabric Pipelines Using Azure Data Factory
You can trigger Fabric Pipelines from Azure Data Factory, but there are some significant limitations. For instance, Spark jobs and high concurrency notebooks cannot run from pipelines, and certain triggers, such as tumbling window, are not fully supported yet. You may also encounter issues with connector authentication and limited support for managed identities, and some activities like Get Metadata and Mapping Data Flow have their own restrictions. When you trigger Fabric Pipelines, always verify resource limits, plan for token refresh, and choose the appropriate authentication method.
Key Takeaways
You can start Fabric Pipelines from Azure Data Factory. You can use Invoke Pipeline Activity, Web Activity, or event triggers. Each way has good points and some limits.
Set up the needed Azure services first. Give the right permissions to your service principal or managed identity. This helps keep pipeline triggering safe and easy.
Use managed identity for authentication if you can. It keeps your credentials safe and makes setup easier. Service principals also work well if you manage secrets carefully.
Web Activity is the most flexible. It can call Fabric REST APIs. You can pass parameters and trigger pipelines in other workspaces. But you must set up authentication with care.
Watch your pipelines often using Azure Data Factory's dashboard and alerts. This helps you find errors early. It keeps your data workflows running well.
Prerequisites
Required Services
You need to set up some Azure resources before you start. Here is what you need:
Azure Data Factory helps you build and run data pipelines.
Microsoft Fabric Workspace is where you make and manage Fabric Pipelines.
Azure Active Directory (Microsoft Entra) is needed for identity and access.
Service Principal (App Registration) is a secure identity for automation tasks.
Microsoft Entra Security Group lets you add your service principal for easier control.
Power BI Admin Portal is where you turn on settings for service principal access.
Fabric Lakehouse Linked Service connects Azure Data Factory to your Fabric workspace.
Power BI Tenant Settings lets users get OneLake data from outside apps.
Tip: Make sure your Azure subscription has enough resources and permissions before you start.
Permissions
You must give the right permissions so Azure Data Factory can trigger Fabric Pipelines. Do these steps:
Give the Data Factory Contributor role to your user or service principal at the resource group or subscription level. This lets you make and manage pipelines.
In the Fabric workspace, add your service principal or security group as a Member, Contributor, or Admin.
Give control plane permissions in the Fabric portal so your service principal can use and run pipelines.
Use T-SQL GRANT statements if you need to let someone access certain tables or resources.
Test your permissions by making a simple API call outside Azure Data Factory to check if it works; a small sketch of such a test follows this list.
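For example, a minimal sketch of such a test in Python might look like this. It assumes you already have an access token for the Fabric API (token acquisition is sketched in the Authentication section) and that your identity can call the List Items endpoint; replace the placeholder values with your own.

import requests  # pip install requests

workspace_id = "<your-workspace-id>"  # assumption: your Fabric workspace ID
token = "<access-token>"              # assumption: a valid bearer token for https://api.fabric.microsoft.com

# List the items in the workspace; a 200 response confirms the identity can reach it.
resp = requests.get(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items",
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code)
print(resp.json() if resp.ok else resp.text)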
Authentication
There are a few ways to let Azure Data Factory talk to Fabric Pipelines:
Managed Identity is the easiest and safest way. You do not need to handle secrets or certificates. Turn on managed identity in your workspace and use it for authentication.
Service Principal (App-Only Authentication) is good if managed identity is not there. Register an app in Azure Active Directory and use its credentials. For better safety, use a certificate instead of a client secret.
OAuth2 (User Delegated) is okay for testing but not great for automated jobs because tokens do not last long.
Note: Always give only the permissions your pipelines need. This keeps your data and resources safe.
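As a rough illustration, the sketch below shows how a managed identity or service principal could obtain a token for the Fabric API in Python. It assumes the azure-identity package and the https://api.fabric.microsoft.com/.default scope; treat it as a starting point rather than a finished implementation.

from azure.identity import ManagedIdentityCredential, ClientSecretCredential  # pip install azure-identity

# Option 1: managed identity - no secrets to manage, works when running on an Azure resource.
credential = ManagedIdentityCredential()

# Option 2: service principal with a client secret (placeholder values are assumptions).
# credential = ClientSecretCredential(
#     tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<client-secret>"
# )

# Request a token scoped to the Fabric REST API.
token = credential.get_token("https://api.fabric.microsoft.com/.default").token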
Trigger Fabric Pipelines Methods
There are different ways to trigger Fabric Pipelines from Azure Data Factory. Each way has its own steps, good points, and problems. Picking the right way helps you automate your data jobs and keep things safe.
Invoke Pipeline Activity
Invoke Pipeline Activity lets you start one pipeline from another. You can use it in the same workspace or, with the preview, in other workspaces too. This is a good choice if you want to link pipelines together.
Pros:
Easy to use for simple pipeline chains.
The preview lets you trigger pipelines in other workspaces and services, like Azure Data Factory and Synapse.
With the preview, you can watch child pipelines in the Fabric Monitoring Hub.
Cons:
The old version only works in the same Fabric workspace.
The old version cannot start Synapse or Azure Data Factory pipelines from Fabric.
The old version does not show child pipeline runs.
Some system variables do not work in the preview. You can work around this by passing parent pipeline info to the child pipeline as parameters, as shown in the example after this list.
You need to set up connection credentials, and the connection must have rights to the child pipeline.
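For example, on the activity's parameters you might map system variables such as @pipeline().Pipeline and @pipeline().RunId to child pipeline parameters named parentPipelineName and parentRunId (the parameter names are just illustrative), so the child pipeline still knows which parent run started it.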
Tip: Check for new updates often. Microsoft is always making this activity better and fixing bugs.
Web Activity Approach
The Web Activity way gives you more choices. You use it to call the Fabric REST API from Azure Data Factory. This is helpful if you want to trigger Fabric Pipelines in a flexible way or send parameters when you run them.
To do this, you add a Web Activity to your pipeline. Set the HTTP method to POST and give the REST API URL for the pipeline you want to run. You also need to handle authentication, usually with managed identity or a service principal. This approach works whether you wait for the pipeline to finish or simply fire and forget.
How it works:
Give your Data Factory a managed identity and give it the right permissions in Azure Entra ID and the Fabric Admin portal.
Keep your credentials safe in Azure Key Vault. This keeps secrets safe and stops you from putting them in your pipeline.
Set up the Web Activity with the REST API endpoint, HTTP method, and request body.
Use the Web Activity to start Fabric Pipelines by calling the pipelines/createrun endpoint. A sketch of what this request looks like follows this list.
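At the HTTP level, the request the Web Activity sends looks roughly like this. It is a sketch only: the URL shown is the job scheduler endpoint used later in this guide, and the empty JSON body assumes a run with no parameters.

POST https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances?jobType=Pipeline
Authorization: Bearer <access token>
Content-Type: application/json

{}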
Pros:
Works in different workspaces and services.
Lets you send parameters and control how things run.
Can work with on-premises or VNET data gateways if you need.
Cons:
You must set up authentication and permissions carefully.
You need to keep secrets safe, best with Azure Key Vault.
If the REST API changes or goes down, your pipelines may not work.
Note: Using managed identity is safer and easier than using client secrets. Always keep credentials in a safe place.
Event Triggers
Event triggers let you start pipelines by time or events. You can use them for regular jobs, batch work, or when something happens. Azure Data Factory has different triggers for different needs.
Pros:
Runs pipelines automatically, so you do not have to do it yourself.
Works for both time-based and event-based jobs.
Good for regular and special data jobs.
Cons:
Some triggers, like tumbling window, may not work for all Fabric Pipelines yet.
Event triggers need the right event sources and permissions set up.
Tip: Use event-based triggers for real-time jobs and schedule triggers for regular batch work.
Choosing the Right Method
Triggering with the REST API has some problems, like tricky authentication and possible API changes. The Web Activity way is the most flexible and works best for most cases. Invoke Pipeline Activity is good for simple chains, mostly in the same workspace. Event triggers help you automate and schedule your jobs.
Pick the way that fits your needs, security, and automation plans. Always test your setup and watch for changes in what is supported.
ADF Pipeline Setup
Add Invoke Pipeline Activity
You can use the Invoke Pipeline Activity to start a pipeline from another pipeline. This is helpful when you want to connect pipelines together.
Follow these steps:
Open your Azure Data Factory workspace. Go to your pipeline.
Find the Invoke Pipeline activity in the Activities pane. Drag it onto your pipeline canvas.
In the settings, pick the pipeline you want to start.
If you need to start a pipeline in another workspace, check that you have the right permissions. Make sure preview features are turned on.
Set up authentication with a managed identity or service principal. Managed identities help you avoid using secrets.
Save your pipeline and publish it.
Tip: Only give your managed identity or service principal the permissions it needs. This helps keep your data safe.
Configure Web Activity
The Web Activity lets you control how you trigger Fabric Pipelines with REST API calls.
Here’s how you set it up:
Register a service principal in Azure. Write down the Client ID, Tenant ID, and Client Secret.
Keep these credentials safe in Azure Key Vault.
In your pipeline, add a Web Activity to get an access token.
Set the URL to https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token. Use POST as the method.
Set the header to Content-Type: application/x-www-form-urlencoded. In the body, add your client ID, secret, tenant ID, and scope.
Use a pipeline variable to save the access token from the response.
Add another Web Activity to call the Fabric Pipeline REST API.
Set the URL to https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances?jobType=Pipeline. Use POST as the method.
Set the header to Content-Type: application/json. Add the Authorization header with the Bearer token. Use {} as the body. A minimal end-to-end sketch of both calls follows these steps.
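If you want to test the same two calls outside Azure Data Factory before wiring up the Web Activities, a minimal Python sketch could look like the one below. The placeholder values are assumptions you must replace, and in a real pipeline the client secret should come from Azure Key Vault rather than being written in code.

import requests  # pip install requests

tenant_id = "<tenant-id>"           # assumption: your Microsoft Entra tenant ID
client_id = "<client-id>"           # assumption: the service principal's application (client) ID
client_secret = "<client-secret>"   # assumption: keep this in Azure Key Vault in real use
workspace_id = "<workspace-id>"     # assumption: the Fabric workspace that contains the pipeline
pipeline_id = "<pipeline-item-id>"  # assumption: the pipeline item ID inside that workspace

# Step 1: get an access token (mirrors the first Web Activity).
token_resp = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://api.fabric.microsoft.com/.default",
    },
)
access_token = token_resp.json()["access_token"]

# Step 2: trigger the pipeline run (mirrors the second Web Activity).
run_resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances",
    params={"jobType": "Pipeline"},
    headers={"Authorization": f"Bearer {access_token}", "Content-Type": "application/json"},
    json={},
)
# A 202 Accepted response normally means the run was queued; the Location header points at the job instance.
print(run_resp.status_code, run_resp.headers.get("Location"))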
Note: Managed identities are the safest way to handle authentication. They let you get tokens without putting secrets in your pipeline.
Parameter Passing
You can send parameters from Azure Data Factory to Fabric Pipelines. This makes your workflows flexible.
Make pipeline parameters in Azure Data Factory for things like file names or folder paths.
Link these parameters to your activities using expressions like @pipeline().parameters.parameterName. When you start Fabric Pipelines, add these parameters in the request body or activity settings, as shown in the sketch after this list.
Inside your Fabric Pipeline, use these parameters to control your data flow.
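As a rough sketch, passing parameters through the REST call might look like the Python snippet below. The executionData/parameters body shape and the parameter names are assumptions, so verify the exact request format against the Fabric REST API documentation for running item jobs on demand.

# Hypothetical request body; the shape is an assumption to verify against the Fabric docs.
run_body = {
    "executionData": {
        "parameters": {
            "fileName": "sales_2024.csv",   # example value for a pipeline parameter named fileName
            "folderPath": "landing/sales",  # example value for a pipeline parameter named folderPath
        }
    }
}
# Pass run_body as the JSON body of the jobs/instances POST shown earlier, instead of the empty {}.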
Best Practice: Keep sensitive values like connection strings in Azure Key Vault. Use linked services to get them safely instead of writing them in your code.
By doing these steps, you can trigger Fabric Pipelines in a safe and flexible way. This keeps your credentials safe and makes your pipelines easy to manage.
Monitoring and Troubleshooting
Monitor Pipeline Runs
It is important to watch your pipeline runs. This helps you know if things work right. Azure Data Factory has tools for this job. You can use the monitoring dashboard in Data Factory. The dashboard shows pipeline runs, trigger runs, and integration runtimes. You can filter what you see and look at details. You can also run pipelines again if you need. Each pipeline has activity runs you can check. You can look at input and output in JSON format. Error messages are easy to find here.
You can also get alerts right away. For example, use a Web Activity with Power Automate to send an email when a pipeline succeeds or fails. This lets you fix problems fast and save money.
Tip: You can pick up to five pipeline activity properties as user properties. This makes it easier to watch your pipelines.
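If you also want to check a triggered Fabric run programmatically, a minimal sketch could poll the job instance returned by the trigger call. The URL and status values below are assumptions based on the Fabric job scheduler API, so confirm them against the current documentation.

import time
import requests  # pip install requests

# Assumption: job_instance_url is the Location header returned by the jobs/instances POST,
# and token is a valid bearer token for the Fabric API.
job_instance_url = "<location-header-from-the-run-call>"
token = "<access-token>"

while True:
    status_resp = requests.get(job_instance_url, headers={"Authorization": f"Bearer {token}"})
    status = status_resp.json().get("status")
    print("Current status:", status)
    if status in ("Completed", "Failed", "Cancelled"):  # assumed terminal states
        break
    time.sleep(30)  # wait before polling again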
Handle Errors
Sometimes, you will see errors when you trigger Fabric Pipelines. Some common problems are schema mismatches, token expiration, and integration runtime issues. Always look at the error messages in the activity run details. If you see UserErrorTypeInSchemaTableNotSupported, check your source and destination schemas. Make sure you handle any DBNull values.
Turn on diagnostic settings to send logs to Log Analytics, then use KQL queries to find problems with CPU or IO. Always use the correlation IDs from failed activities to look for more details.
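For example, a minimal sketch using the azure-monitor-query package could pull recent failed pipeline runs from Log Analytics. The ADFPipelineRun table and its columns assume that Azure Data Factory diagnostic logs are flowing into the workspace, so adjust the query to match your own schema.

from datetime import timedelta
from azure.identity import DefaultAzureCredential   # pip install azure-identity
from azure.monitor.query import LogsQueryClient     # pip install azure-monitor-query

workspace_id = "<log-analytics-workspace-id>"  # assumption: the Log Analytics workspace receiving ADF logs

client = LogsQueryClient(DefaultAzureCredential())

# KQL sketch: failed pipeline runs from the last day, newest first.
query = """
ADFPipelineRun
| where Status == 'Failed'
| project TimeGenerated, PipelineName, RunId, Status
| order by TimeGenerated desc
"""

response = client.query_workspace(workspace_id, query, timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(list(row))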
Best Practices
Follow these steps to fix and improve your pipelines:
Make sure your managed identity or service principal has the right permissions.
Check that your integration runtime is running and can reach the network.
Check your data source and sink settings, including tokens and keys.
Use the ADF Monitor to look at activity run details and error messages.
Turn on Azure Monitor and Log Analytics for more information.
Use error handling patterns, such as failure dependency paths and activity retry settings, to manage problems and retries.
Look at execution plans and make data movement better with partitioning and parallelism.
Set up alerts for failures or SLA problems so you can act fast.
Note: Check your monitoring setup often and update it as your data jobs grow. This helps you find problems early and keep your pipelines working well.
You can start Fabric Pipelines with Azure Data Factory in a few ways. You can use Invoke Pipeline Activity, Web Activity, or event triggers. These ways are reliable and can handle big jobs. They have tools to watch your pipelines and try again if something fails. Using automation makes your data work faster and helps with bigger projects. Watch for new things like AI tools for making pipelines and better security. If you want to learn more, check Microsoft’s guides and best practice tips.
To learn more, look at Microsoft’s resources about advanced integration and monitoring tools.
FAQ
How do you pass parameters from Azure Data Factory to a Fabric Pipeline?
First, make parameters in your Data Factory pipeline. When you start the Fabric Pipeline, add these parameters in the request body or in the activity settings. The Fabric Pipeline will use these parameters when it runs.
Can you use managed identity for authentication?
Yes, you can use managed identity. Turn on managed identity in your workspace. Give it the right permissions. This keeps your secrets safe and makes setup simple.
What should you do if a pipeline fails to trigger?
Check the activity run details in Azure Data Factory. Look for error messages there. Make sure your permissions, authentication, and network settings are correct. Fix any problems you find, then try running the pipeline again.
Are all trigger types supported for Fabric Pipelines?
Not every trigger works with Fabric Pipelines. Schedule and manual triggers work best. Tumbling window and some event triggers may not work well. Always check the latest documentation for updates.
How can you monitor the status of a triggered Fabric Pipeline?
Use the Azure Data Factory monitoring dashboard. You can see pipeline runs and filter by status. You can also look at error details. Set up alerts to get notified if something goes wrong.