• Online, Self-Paced
Course Description

Once you have data in storage, you'll need a mechanism for transforming it into a usable format. Azure Data Factory is a data integration service that lets you create automated pipelines to copy and transform data.

In this course, you'll learn about Azure Data Factory and the Integration Runtime. Next, you'll explore Azure Data Factory features such as linked services and datasets, pipelines and activities, and triggers. Finally, you'll learn how to use the Azure portal to create an Azure Data Factory instance, linked services and datasets, and pipelines and activities, as well as how to trigger a pipeline manually or on a schedule.
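To illustrate how these pieces fit together, the sketch below shows a minimal Azure Data Factory pipeline definition in JSON with a single Copy activity. The dataset names (`BlobDataset`, `SqlDataset`) and the pipeline name are hypothetical placeholders; in practice each would reference datasets and linked services you have already defined in your factory.

```json
{
  "name": "CopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyFromBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "BlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SqlDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

Here the datasets describe the data's shape and location, the linked services behind them hold connection details, and the pipeline groups the activities; a trigger (manual or scheduled) then starts a pipeline run.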

This course is one in a collection that prepares learners for the Designing and Implementing a Data Science Solution on Azure (DP-100) exam.

Learning Objectives

• discover the key concepts covered in this course

Framework Connections

The materials within this course focus on the Knowledge, Skills, and Abilities (KSAs) identified within the Specialty Areas of the National Cybersecurity Workforce Framework.