Migrating from Azure Data Factory to Fabric Data Factory: A Step-by-Step Guide

  • aferencz21
  • Nov 4
  • 2 min read

As organizations embrace Microsoft Fabric for unified analytics, many teams are asking: How do we move from Azure Data Factory (ADF) to Fabric Data Factory (FDF)? This guide walks you through the migration process, highlights key differences, and shares best practices for a smooth transition.



Why Fabric Data Factory?


Fabric Data Factory brings the power of data integration and orchestration into the Fabric ecosystem. It offers:

  • Power Query-based transformations with Dataflows Gen2.

  • 200+ connectors for diverse data sources.

  • Seamless integration with OneLake and Power BI.

  • Built-in governance and AI-powered Copilot for pipeline design.


Unlike ADF, which is a standalone Azure service, FDF is SaaS-based, simplifying management and enabling end-to-end analytics within Fabric.


ADF vs FDF: Quick Comparison

| Feature | Azure Data Factory | Fabric Data Factory |
| --- | --- | --- |
| Platform model | Azure PaaS | SaaS in Microsoft Fabric |
| Integration runtime | Requires setup | Cloud compute by default |
| Dataflows | ADF Data Flows | Power Query-based Dataflows Gen2 |
| Linked services | Separate objects | Inline connections |
| Governance & security | Azure RBAC | Unified Fabric security |
| Copilot integration | Not available | Available for design and troubleshooting |

Step-by-Step Migration Guide


1. Prepare Your Environment

  • Ensure ADF and Fabric workspaces share the same Microsoft Entra ID tenant.

  • Create a Fabric workspace in the same region as your ADF instance.

  • Verify permissions and network connectivity for both platforms.


2. Inventory and Assessment

  • List ADF assets: pipelines, triggers, linked services, datasets.

  • Map feature parity:

    • Linked Services → Connections

    • Datasets → Inline properties

    • Triggers → Activator framework

  • Identify gaps (e.g., SSIS support) and plan redesigns.
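A quick way to build that inventory is to parse the ARM template you can export from ADF Studio (Manage → ARM template → Export ARM template) and count assets by resource type. The sketch below assumes a standard exported template; the file path is a placeholder.

```python
import json
from collections import Counter

def inventory_adf_assets(arm_template_path: str) -> dict:
    """Count ADF assets (pipelines, datasets, linkedServices, triggers)
    from an exported ARM template."""
    with open(arm_template_path) as f:
        template = json.load(f)
    counts = Counter()
    for resource in template.get("resources", []):
        # ARM resource types look like "Microsoft.DataFactory/factories/pipelines";
        # the last segment is the asset kind.
        asset_type = resource["type"].split("/")[-1]
        counts[asset_type] += 1
    return dict(counts)

# Example: inventory_adf_assets("arm_template.json")
# might return {"pipelines": 12, "datasets": 30, "linkedServices": 5, "triggers": 3}
```

The resulting counts give you the scope of the migration and feed directly into the feature-parity mapping above.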


3. Plan Migration Strategy

  • Decide what to reuse, translate, or redesign.

  • Build test scenarios for validation.

  • Establish rollback and side-by-side testing plans.


4. Bring ADF into Fabric (Optional Preview)

  • Use the “Bring your ADF to Fabric” feature:

    • In Fabric workspace, select New Item → Azure Data Factory.

    • Mount existing pipelines and test before full upgrade.


5. Upgrade Pipelines

  • Use the Microsoft.FabricPipelineUpgrade PowerShell module:

    • Import pipelines.

    • Map linked services to Fabric connections.

    • Validate upgraded pipelines against test scenarios.
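The PowerShell module handles the mapping for you, but conceptually the step is a lookup from each linked-service reference to a Fabric connection ID. The sketch below illustrates that idea only; the field names (`linkedServiceName`, `fabricConnectionId`) and connection IDs are placeholders, not the module's actual schema, and the real IDs come from Fabric's Manage connections page.

```python
# Hypothetical mapping of ADF linked-service names to Fabric connection IDs.
CONNECTION_MAP = {
    "AzureSqlLinkedService": "conn-guid-sql",    # placeholder ID
    "AdlsGen2LinkedService": "conn-guid-adls",   # placeholder ID
}

def remap_pipeline(pipeline: dict, connection_map: dict) -> dict:
    """Swap linked-service references in a pipeline definition for Fabric
    connection IDs, failing loudly on anything unmapped."""
    unmapped = []
    for activity in pipeline.get("activities", []):
        ref = activity.get("linkedServiceName")
        if ref is None:
            continue  # activity has no external connection (e.g., Wait)
        name = ref["referenceName"]
        if name in connection_map:
            activity["fabricConnectionId"] = connection_map[name]
            del activity["linkedServiceName"]
        else:
            unmapped.append(name)
    if unmapped:
        raise ValueError(f"No Fabric connection mapped for: {unmapped}")
    return pipeline
```

Failing on unmapped references before validation keeps gaps from silently surviving into the upgraded pipelines.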


6. Post-Migration Tasks

  • Update monitoring and alerting.

  • Rotate secrets and apply new naming conventions.

  • Optimize for OneLake integration and leverage Fabric-native features like Copilot.


Best Practices

  • Migrate in phases for large environments.

  • Engage Microsoft partners for complex scenarios.

  • Document all changes for compliance and governance.


Ready to start? Fabric Data Factory simplifies orchestration and unlocks unified analytics with OneLake and Power BI. By following these guidelines, you’ll ensure a smooth migration and position your organization for future innovation.


