Course Details


Provisioning an Azure Data Factory


Overview/Description
Target Audience
Prerequisites
Expected Duration
Lesson Objectives
Course Number
Expertise Level



Overview/Description
Azure Data Factory is a key component of end-to-end cloud analytics solutions. This course covers provisioning the components of an Azure Data Factory and implementing data processing activities in a data-driven workflow.

Target Audience
Professionals preparing for the 70-475: Designing and Implementing Big Data Analytics Solutions certification exam who are experienced in designing, programming, implementing, automating, and monitoring Microsoft Azure cloud platform solutions. Exam candidates should also be adept with the development tools, techniques, and design methodologies associated with implementing cloud-based big data analytics solutions.

Prerequisites
None

Expected Duration (hours)
1.3

Lesson Objectives

Provisioning an Azure Data Factory

  • start the course
  • identify key features of Azure Data Factory
  • identify key components and data sources for Azure Data Factory
  • list Azure Data Factory functions, variables, and naming rules
  • recognize the main steps and prerequisites to create and publish a Data Factory with Visual Studio
  • create and publish a Data Factory with Visual Studio
  • recognize the capabilities of Data Factory Datasets
  • identify key features of Data Factory Datasets
  • recognize the structure of Data Factory Datasets
  • create a Data Factory Dataset with Visual Studio
  • recognize key properties and the JSON structure of pipelines and activities in Azure Data Factory
  • identify the key policies that affect the run-time behavior of an activity in Azure Data Factory
  • create and publish pipelines
  • monitor pipelines with the Azure Portal
  • configure activity and dataset scheduling
  • configure dataset availability
  • configure dataset policies
  • recognize data slicing features and concepts for parallel processing and re-running failed data slices
  • identify how to chain multiple activities
  • model complex dataset schedules
  • create and publish a Data Factory and monitor pipelines with the Azure Portal
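
Several of the objectives above concern the JSON structure of datasets, pipelines, activities, and the policies that govern run-time behavior. As an illustrative sketch only (the names AzureBlobInput, AzureStorageLinkedService, and the folder path are hypothetical placeholders), a classic (v1) Data Factory dataset definition with availability and policy sections looks roughly like this:

```json
{
  "name": "AzureBlobInput",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "AzureStorageLinkedService",
    "typeProperties": {
      "folderPath": "mycontainer/input",
      "format": { "type": "TextFormat" }
    },
    "external": true,
    "availability": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}
```

A pipeline then chains activities that consume and produce such datasets, with an activity-level policy controlling retry, concurrency, and timeout behavior, again as a rough sketch:

```json
{
  "name": "CopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToBlob",
        "type": "Copy",
        "inputs": [ { "name": "AzureBlobInput" } ],
        "outputs": [ { "name": "AzureBlobOutput" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "BlobSink" }
        },
        "policy": {
          "concurrency": 1,
          "retry": 3,
          "timeout": "01:00:00"
        },
        "scheduler": { "frequency": "Hour", "interval": 1 }
      }
    ],
    "start": "2017-01-01T00:00:00Z",
    "end": "2017-01-02T00:00:00Z"
  }
}
```

The dataset's availability section and the activity's scheduler section together define the data slices that the service produces, which is the basis for the slice-level parallel processing and re-run features listed above.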

Course Number
df_dibd_a05_it_enus

Expertise Level
Intermediate