A Practical Guide to Moving Workloads from Azure Synapse to Fabric
Moving workloads from Azure Synapse to Microsoft Fabric takes careful planning. You need to understand the differences between the two platforms, because every step of the migration depends on them. For example, you will face challenges such as:
There is no direct or automatic upgrade path.
Some T-SQL commands and features, such as OPENROWSET and Synapse Link, are unsupported or behave differently.
Code and pipelines often need to be changed by hand.
A dedicated migration plan, automation tools, and expert support will help you through the process.
Key Takeaways
Start with a complete inventory of your data, pipelines, and dependencies so you know exactly what needs to move and can plan the migration realistically.
Define clear objectives, pick the right tools (such as Azure Synapse Pathway), and design the target system around Microsoft Fabric’s lakehouse architecture.
Migrate SQL pools, Spark workloads, and data pipelines in stages, automate where you can, and test each part to catch mistakes early.
Validate workloads in a development workspace before going live, monitor performance closely, and fix problems early so production runs smoothly.
Keep data protected with solid governance, keep the platform up to date, and train your team so they can work effectively in Microsoft Fabric after the migration.
Assess Workloads
Inventory Data and Pipelines
Start with a full inventory of everything in your Azure Synapse workspace so you know exactly what has to move. Follow these steps:
Catalog all objects, such as tables, views, stored procedures, functions, and schemas. System views such as sys.tables and sys.procedures give you the details (see the query sketch after this list).
Profile your workloads with Azure Monitor and the Synapse Dynamic Management Views. Review query patterns, user activity, and resource consumption to spot performance problems.
Map where your data comes from and where it goes. Identify upstream sources such as Azure Data Factory pipelines and downstream consumers such as Power BI reports.
Decide what to migrate first. Prioritize workloads that matter most to the business but are not overly complex to move.
Choose between a lift-and-shift approach and redesigning your data models and pipelines.
Build a project plan: assemble a team, set a timeline, and define goals such as cost savings or faster queries.
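A query along these lines, run in the dedicated SQL pool, gives you a starting inventory. It is only a sketch: extend the UNION with other object types and add schema filters to match your workspace.

```sql
-- Hypothetical inventory query for a Synapse dedicated SQL pool.
-- Extend with other object types (functions, external tables) as needed.
SELECT s.name  AS schema_name,
       t.name  AS object_name,
       'TABLE' AS object_type
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
UNION ALL
SELECT s.name, p.name, 'STORED PROCEDURE'
FROM sys.procedures AS p
JOIN sys.schemas AS s ON s.schema_id = p.schema_id
UNION ALL
SELECT s.name, v.name, 'VIEW'
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
ORDER BY object_type, schema_name, object_name;
```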
💡 Tip: Azure Automation can help you collect inventory and track changes. It works with Azure Monitor and other services to make reporting and compliance easier.
Azure Synapse Compatibility
Check whether your workloads will run in Microsoft Fabric, because some Azure Synapse features are missing or behave differently. The main differences to look for include:
OPENROWSET and Synapse Link are unsupported or behave differently.
Spark workloads may need code and library changes.
Azure Data Factory constructs such as linked services, datasets, Integration Runtimes, and mapping data flows must be rebuilt in Fabric.
Advanced features such as real-time analytics and machine learning may not be fully supported yet.
Review your estate against these gaps before you commit to a migration order.
Identify Dependencies
Make a list of all dependencies that could affect your move. These include:
Data formats (Parquet, Delta, CSV)
Networking (VNets, Private Endpoints)
Security (Microsoft Entra ID, Azure Key Vault)
Monitoring (Azure Monitor, Log Analytics)
Governance (Microsoft Purview for access controls and compliance)
Linked services, data source connections, and Spark pool settings
🛡️ Note: Security and compliance are important for your migration plan. Use Microsoft Purview to set access controls and keep your data safe during and after the move.
Plan Migration
Define Objectives
Begin by setting clear goals for the move. Know why you want Microsoft Fabric: common drivers are lower cost, better performance, and easier data access. Align with your team and stakeholders on which of these matters most, then plan along these lines:
Write a plan that covers resources, timeline, and risk handling.
Migrate data in small batches so you spot problems early.
Validate each batch to confirm the data arrived correctly.
Monitor the process and use tooling to keep improving it.
📊 Tip: Track progress with key performance indicators (KPIs) such as reports migrated, active users, or cost savings. Activity logs are a good source for measuring them.
Choose Tools
Pick tools that help you move data and code. Azure Synapse Pathway can save time and reduce mistakes: it translates legacy database objects and SQL code to run on Azure Synapse Analytics, which is faster and cheaper than rewriting by hand. Hevo Data and PolyBase are also options for moving data and schemas.
Azure Synapse Pathway translates scripts automatically and reduces human error.
Manual translation takes longer and is more error-prone.
These tools handle many source systems and large codebases.
⚡ Note: Automation tools make the migration faster and more predictable. They help you avoid mistakes and keep the project on track.
Design Architecture
Design the new system around Microsoft Fabric’s lakehouse architecture. Use the medallion pattern with bronze, silver, and gold layers; it aligns well with Fabric’s OneLake and keeps data flowing cleanly between stages. Expect to rewrite Azure Synapse pipelines and linked services, since automated tooling for those is not yet common. Factor in security, governance rules, and the new skills your team will need, such as Python and the Delta format.
Every business is different. Healthcare and finance, for example, operate under strict regulations. Start with a minimum viable product (MVP) if you must keep systems running while you migrate, and consider a microservices-style decomposition if you need more flexibility and easier updates. Always match the plan to your business needs and adjust it as you go.
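If you adopt the medallion pattern described above, the bronze-to-silver step in a Fabric warehouse can look roughly like this minimal sketch. All schema and table names (bronze.sales_raw, silver.sales) are made up for the example.

```sql
-- Minimal bronze-to-silver sketch for a Fabric warehouse (names illustrative).
CREATE SCHEMA bronze;
GO
CREATE SCHEMA silver;
GO

-- Bronze: land the raw data as-is, with loosely typed columns.
CREATE TABLE bronze.sales_raw
(
    order_id   VARCHAR(20),
    order_date VARCHAR(30),
    amount     VARCHAR(30)
);
GO

-- Silver: a cleaned, typed copy built from bronze with CTAS.
CREATE TABLE silver.sales AS
SELECT
    CAST(order_id   AS INT)            AS order_id,
    CAST(order_date AS DATE)           AS order_date,
    CAST(amount     AS DECIMAL(18, 2)) AS amount
FROM bronze.sales_raw
WHERE order_id IS NOT NULL;
```

A gold layer would then aggregate or model the silver tables for reporting in the same way.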
Migrate Workloads
Moving workloads from Azure Synapse to Microsoft Fabric involves several workstreams: SQL pools, Spark workloads, data pipelines, and manual Azure Data Factory migrations. Each has its own procedure and pitfalls. The steps below help you move each workload smoothly and avoid common mistakes.
SQL Pools
You can migrate SQL pools in a sequence of small steps that keeps your data safe and your systems running (a T-SQL sketch of the key steps follows this list).
Set Up a Staging Area
Create or reuse an Azure Data Lake Storage Gen2 account with the hierarchical namespace enabled. This storage acts as a temporary landing zone for your data.
Assign Permissions
Grant the Synapse Dedicated SQL Pool managed identity the permissions it needs on the staging account. You can also authenticate with SAS tokens.
Link the Data Lake to the SQL Pool
In the SQL pool, create a master key, a database scoped credential, an external file format, and an external data source. These objects let the SQL pool connect to the data lake.
Manage Resources
Create workload classifiers to control resource usage, especially when you move large volumes of data overnight.
Extract Data
Use SQL scripts to export data from the Dedicated SQL Pool to the data lake. Create external tables so the data lands in Parquet format.
Import Data into Fabric
Connect the data lake to Microsoft Fabric, then use CREATE TABLE and COPY INTO statements with SAS tokens to load the data into Fabric.
Migrate Views and Other Objects
Move SQL view definitions and other non-data objects to Fabric.
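Under those assumptions, the core T-SQL looks roughly like this sketch. Everything here (storage URLs, credential and table names, the SAS token) is a placeholder, and your authentication choices may differ; it only illustrates the extract-and-load path described above.

```sql
-- === Run in the Synapse dedicated SQL pool (source) ===
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

-- Authenticate to the staging lake with the workspace managed identity.
CREATE DATABASE SCOPED CREDENTIAL StagingCredential
WITH IDENTITY = 'Managed Service Identity';

CREATE EXTERNAL DATA SOURCE StagingLake
WITH (
    TYPE       = HADOOP,
    LOCATION   = 'abfss://staging@<storageaccount>.dfs.core.windows.net',
    CREDENTIAL = StagingCredential
);

CREATE EXTERNAL FILE FORMAT ParquetFormat
WITH (FORMAT_TYPE = PARQUET);

-- Optional: run the export at low importance so it does not crowd out
-- other work (classifier, workload group, and user are illustrative).
CREATE WORKLOAD CLASSIFIER MigrationExport
WITH (WORKLOAD_GROUP = 'smallrc', MEMBERNAME = 'migration_user', IMPORTANCE = LOW);

-- Export one table to Parquet files in the staging lake (CETAS).
CREATE EXTERNAL TABLE dbo.FactSales_staged
WITH (
    LOCATION    = '/export/FactSales/',
    DATA_SOURCE = StagingLake,
    FILE_FORMAT = ParquetFormat
)
AS
SELECT * FROM dbo.FactSales;

-- === Run in the Fabric warehouse (target) ===
-- Create a table with a matching schema first, then load the Parquet files.
COPY INTO dbo.FactSales
FROM 'https://<storageaccount>.dfs.core.windows.net/staging/export/FactSales/*.parquet'
WITH (
    FILE_TYPE  = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```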
⚠️ Tip: Always test the migration with real queries. Look for missing data, slow performance, and compatibility problems. Keep backups and a rollback plan so an issue never becomes an outage.
Common risks are missing data, broken applications, and degraded performance. You reduce them by planning carefully, using tooling, and testing against real workloads. Also inventory non-data objects such as functions and stored procedures and decide whether each needs to be rewritten or replaced.
Spark Workloads
Moving Spark workloads takes careful planning. Microsoft Fabric offers new features and better performance, but some Azure Synapse capabilities may not be available yet.
Assess Your Workloads
Check whether Fabric Data Engineering supports your Spark workloads; some features are still in development.
Understand Differences
Learn what differs between Azure Synapse Spark and Fabric Spark so you can plan the move accurately.
Create a Migration Plan
Build a plan that fits your environment and covers everything your workloads depend on.
Migrate Spark Items
Move Spark pools, configurations, libraries, notebooks, and Spark job definitions to Fabric.
Move Data and Pipelines
Use OneLake shortcuts to reach data that stays in ADLS Gen2, and migrate pipeline activities such as notebooks and Spark job definitions.
Migrate Hive Metastore Metadata
Move databases, tables, and partitions from Azure Synapse to the Fabric lakehouse.
Set Up the Fabric Workspace
Create a new Fabric workspace and assign the right roles and permissions.
🚀 Note: Microsoft Fabric ships with predefined Spark resource profiles that configure compute for different workload types, giving you faster data loading and better data engineering performance. You can adjust the profiles or create your own.
After the move, you may see faster queries overall. Fabric’s Spark runtime includes built-in optimizations and intelligent caching that can speed up some workloads by up to four times.
Data Pipelines
Migrating data pipelines keeps data flowing in the new environment. You need to move both the data itself and the pipeline logic.
Identify Data and Pipelines
List the data you want in OneLake and identify which pipelines need to move.
Choose a Migration Approach
Keep ADLS Gen2 as the primary storage and create OneLake shortcuts, which avoids copying data.
Or copy the data into OneLake using tools such as mssparkutils fastcp, AzCopy, the Azure Data Factory copy activity, or Azure Storage Explorer.
Migrate Pipeline Activities
Move Spark activities, such as notebooks and Spark job definitions, from Azure Synapse to Data Factory data pipelines in Fabric.
Reference Target Notebooks
Update the new pipelines to point at the migrated notebooks.
Check Compatibility
Review the differences between Azure Synapse Spark and Fabric and adjust your pipelines where needed.
🛠️ Tip: Data movement can introduce quality, downtime, or compatibility problems. Use validation checks, compare schemas, and test thoroughly to keep the data correct. Schedule moves for low-usage windows and keep fallback systems available to limit downtime.
Expect challenges around data mapping, large data volumes, and cost. Migrate in stages and use tools that scale to keep everything working.
Manual ADF Migration
Moving Azure Data Factory pipelines to Microsoft Fabric usually requires manual work. Several features have no automatic migration path, so you rebuild them in Fabric:
Recreate linked services as connections inside pipeline activities.
Redefine datasets within pipeline activities; Fabric does not treat datasets as standalone objects.
Replace Integration Runtimes with on-premises data gateways or managed virtual network gateways.
Rebuild validation activities using Get Metadata, Until, and If Condition activities.
Recreate ADF mapping data flows as Fabric dataflows; Fabric uses Power Query, which is not compatible with ADF’s data flows.
For features that are not supported, use the Invoke pipeline activity to call the existing ADF pipelines remotely.
Recreate schedule triggers as Fabric schedules, and use Activator alerts for event-based triggers.
Adjust your debugging approach; Fabric uses interactive mode, so you start activities manually when testing.
Rebuild Change Data Capture (CDC) artifacts as Copy job items in Fabric.
Plan for licensing and keep up with new Fabric features.
💡 Note: Manual migration takes time and effort. Document each step and test every pipeline after it moves so you catch mistakes early and confirm your data flows still work.
Validate and Optimize
Test Workloads
Verify that workloads behave correctly after the move. Migrate a few important workloads first so you find problems early. Use Fabric notebooks and shared workspaces to keep code and models together, and always test data pipelines and reports in a development workspace before promoting them to production.
Prepare your content and review it for mistakes before you share it.
Use deployment pipelines with development, test, and production stages.
Validate each stage to confirm data and pipelines behave correctly.
Rebind Power BI datasets and fix permissions after the move.
Use checksums and record counts to confirm the data is complete and correct (see the query sketch after this list).
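One lightweight way to do that is to run the same count-and-checksum query on the Synapse source and the Fabric target and compare the outputs. This is only a sketch with a made-up table and columns; if CHECKSUM is unavailable or behaves differently on one engine, swap in another hash or column-level aggregates.

```sql
-- Run the same query against the Synapse source and the Fabric target,
-- then compare the two result rows (table and columns are illustrative).
SELECT
    COUNT_BIG(*)                                                 AS row_count,
    SUM(CAST(CHECKSUM(order_id, order_date, amount) AS BIGINT))  AS simple_checksum
FROM dbo.FactSales;
```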
🧪 Tip: Practice on low-priority pipelines first. This refines your process and lowers risk.
Monitor Performance
After the migration, keep watching how your workloads run. Use the Microsoft Fabric Capacity Metrics app to see how much capacity you consume, track both interactive and background jobs, and look for high compute usage and periods of slowdown.
Track capacity consumption against what you purchased.
Set alerts for when you approach your limits.
Break down usage by item type, such as pipelines or notebooks.
Tune query timeouts and row limits to conserve resources.
Use scaling to absorb peaks when you need more compute.
📈 Note: Keep reporting and ETL workloads in separate workspaces. This saves money and prevents resource contention.
Governance
Good governance keeps your data secure and compliant. Assign clear ownership and stewardship roles. Use Microsoft Purview to track data lineage, apply labels, and enforce security policies, and organize data into domains and workspaces that mirror your business.
Set access controls for workspaces, datasets, and individual columns (see the sketch after this list).
Use the Admin Portal to manage settings and roles in one place.
Monitor and audit data usage with Fabric’s logging tools.
Sync roles with Microsoft Entra ID for consistent access control.
Catalog datasets and pipelines so people can find them.
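As one example of column-level control in a Fabric warehouse, a role can be granted access to only the non-sensitive columns of a table. The role, table, and column names below are made up; if database roles are not in play, grant to the user or group directly.

```sql
-- Create a role for report consumers and expose only non-sensitive columns.
CREATE ROLE report_readers;

-- Column-level grant: the role can read these columns and nothing else
-- on the table (e.g. an Email column stays hidden from it).
GRANT SELECT ON dbo.Customers (CustomerId, Region, Segment) TO report_readers;
```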
🛡️ Tip: Make governance rules early. This helps you follow laws like GDPR and HIPAA.
Post-Migration Tuning
Keep improving the system after the move. Use Fabric’s SaaS features to boost performance and control cost; Dataflow Gen2 and intelligent tuning help you scale and adjust workloads. Watch how people use the platform and resize capacity when needed.
Schedule workloads to avoid peak times and lower cost.
Archive old data to cut storage spend (a sketch follows this list).
Run small pilot projects to gauge usage before buying more capacity.
Train your team on Fabric’s tools so improvement continues.
Review and update your governance and cost plans regularly.
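A simple pattern for that archiving step in a Fabric warehouse is a CTAS into an archive table followed by a delete of the archived rows. The table, column, and cutoff date here are all illustrative, and you would wrap this in your own retention logic.

```sql
-- Copy rows older than the retention cutoff into an archive table...
CREATE TABLE dbo.FactSales_Archive AS
SELECT *
FROM dbo.FactSales
WHERE order_date < '2023-01-01';

-- ...then remove them from the hot table to cut active storage.
DELETE FROM dbo.FactSales
WHERE order_date < '2023-01-01';
```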
🚀 Note: Fabric’s SaaS platform helps you scale, manage, and tune your workloads with built-in tools and automation.
Follow these steps and the migration becomes manageable: assess your workloads and their dependencies, build a plan with clear goals and the right tools, migrate your SQL, Spark, and pipeline workloads with as much automation as possible, and then validate and optimize the new environment.
Many teams find that a lakehouse built on OneLake and open Delta formats makes the move easier, and the newer migration tools help considerably. After the migration, keep testing your pipelines and invest in training your team. Take advantage of Fabric’s real-time analytics to get more from your data, watch for new features, and review your governance regularly to get the most out of Microsoft Fabric.
FAQ
What is the first step when moving workloads from Azure Synapse to Fabric?
Start by making a list of your data, pipelines, and dependencies. This helps you know what you need to move and lets you plan your migration.
Can I automate the migration process?
Partially. Tools such as Azure Synapse Pathway and Hevo Data automate parts of the work, but certain pipelines and features still need manual effort.
How do I handle unsupported features in Fabric?
Look at your workloads to find features that Fabric does not support. Change or rebuild those parts using Fabric’s tools and follow best practices.
Will my security settings transfer automatically?
Your security settings do not move over by themselves. You have to set up permissions, roles, and access controls again in Fabric to keep your data safe.
How do I test if my migration was successful?
Test your workloads in a development workspace. Check if your data is correct, run your pipelines, and use Fabric’s monitoring tools to make sure everything works.