• Online, Self-Paced
Course Description

Azure Data Factory is a key component of end-to-end cloud analytics solutions. This course covers provisioning the components of an Azure Data Factory and implementing data processing activities in a data-driven workflow.

Learning Objectives

Creating an Azure Data Factory

  • start the course
  • identify key features of Azure Data Factory
  • identify key components and data sources for Azure Data Factory
  • list Azure Data Factory functions, variables, and naming rules
  • recognize the main steps and prerequisites to create and publish a Data Factory with Visual Studio
  • create and publish a Data Factory with Visual Studio
  • recognize the capabilities of Data Factory Datasets
  • identify key features of Data Factory Datasets
  • recognize the structure of Data Factory Datasets (a sketch of a dataset definition follows this list)
  • create a Data Factory Dataset with Visual Studio
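
For orientation, the following is a minimal sketch of the JSON behind a Data Factory Dataset as it might be authored in Visual Studio. It assumes the classic (V1) authoring model this course is based on, and the names (InputDataset, AzureStorageLinkedService), columns, and folder path are hypothetical placeholders:

  {
    "name": "InputDataset",
    "properties": {
      "type": "AzureBlob",
      "linkedServiceName": "AzureStorageLinkedService",
      "structure": [
        { "name": "CustomerId", "type": "String" },
        { "name": "Amount", "type": "Decimal" }
      ],
      "typeProperties": {
        "folderPath": "sampledata/input/",
        "format": { "type": "TextFormat", "columnDelimiter": "," }
      },
      "external": true,
      "availability": { "frequency": "Hour", "interval": 1 }
    }
  }

The availability section drives the dataset's slicing schedule, and "external": true marks data that is produced outside the Data Factory rather than by one of its pipelines.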

Creating Pipelines and Activities

  • recognize key properties and the JSON structure of pipelines and activities in Azure Data Factory (see the sketch after this list)
  • identify the key policies that affect the run-time behavior of an activity in Azure Data Factory
  • create and publish pipelines
  • monitor pipelines with the Azure Portal
  • configure activity and dataset scheduling
  • configure dataset availability
  • configure dataset policies
  • recognize data slicing features and concepts for parallel processing and re-running failed data slices
  • identify how to chain multiple activities
  • model complex dataset schedules
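
The sketch below shows the general shape of a pipeline definition containing a single Copy activity, again assuming the classic (V1) JSON model; the dataset names, dates, and policy values are illustrative placeholders only:

  {
    "name": "CopyCustomerData",
    "properties": {
      "description": "Copy hourly slices from Blob storage to Azure SQL",
      "activities": [
        {
          "name": "BlobToSqlCopy",
          "type": "Copy",
          "inputs": [ { "name": "InputDataset" } ],
          "outputs": [ { "name": "OutputDataset" } ],
          "typeProperties": {
            "source": { "type": "BlobSource" },
            "sink": { "type": "SqlSink" }
          },
          "policy": { "concurrency": 1, "retry": 3, "timeout": "01:00:00" },
          "scheduler": { "frequency": "Hour", "interval": 1 }
        }
      ],
      "start": "2017-01-01T00:00:00Z",
      "end": "2017-01-07T00:00:00Z"
    }
  }

Here the policy block governs run-time behavior such as concurrency and retries, the scheduler block aligns the activity with hourly data slices, and the start and end properties bound the period over which slices are produced.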

Practice: Create a Data Factory

  • create and publish a Data Factory and monitor pipelines with the Azure Portal

Framework Connections

The materials within this course focus on the Knowledge, Skills, and Abilities (KSAs) identified within the Specialty Areas listed below. Specialty Area details are defined within the National Cybersecurity Workforce Framework.