• Online, Self-Paced
Course Description

Building a data pipeline is essential to any data-related product. AWS Data Pipeline, AWS Batch, and AWS workflow frameworks let you manage ETL (extract, transform, load) processes across a variety of AWS tools and services, making AWS a strong platform for combining data from multiple sources.
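As a small illustration of the kind of ETL orchestration covered here, the sketch below builds the request that boto3's `batch.submit_job` accepts. The job name, queue, and job definition are hypothetical placeholders, and the actual AWS call is shown only in a comment because it requires credentials and existing Batch resources.

```python
def batch_job_request(job_name, job_queue, job_definition, command):
    """Build the keyword arguments for boto3's batch.submit_job.

    job_queue and job_definition must name resources that already exist
    in your AWS account (both are hypothetical in this sketch).
    """
    return {
        "jobName": job_name,
        "jobQueue": job_queue,
        "jobDefinition": job_definition,
        # containerOverrides lets one job definition serve many ETL steps
        "containerOverrides": {"command": command},
    }

request = batch_job_request(
    "nightly-etl",       # hypothetical job name
    "etl-queue",         # hypothetical job queue
    "etl-job-def:1",     # hypothetical job definition (name:revision)
    ["python", "etl.py", "--date", "2024-01-01"],
)
# With credentials configured you would then run:
#   import boto3
#   boto3.client("batch").submit_job(**request)
```

Keeping the request as plain data like this makes it easy to parameterize one job definition for many pipeline steps.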

In this course, you'll learn how to automate data movement and transformation on AWS and how to define data-driven pipelines and workflows. You'll also investigate how data pipelines enable seamless, scalable, and fault-tolerant data transfer between AWS storage and compute services, which illustrates the full potential of AWS for machine learning.
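To make "defining a data-driven pipeline" concrete, here is a minimal sketch of the object list that boto3's `datapipeline.put_pipeline_definition` accepts, describing a shell-command activity that runs once a day. All ids, names, and the S3 path are hypothetical, and the AWS calls themselves appear only in comments since they require credentials.

```python
def daily_shell_pipeline(script_uri):
    """Return pipelineObjects for a pipeline running a script once a day.

    Each object is a dict with an id, a name, and key/value fields;
    a refValue points at another object's id, which is how Data Pipeline
    links schedules and activities together.
    """
    return [
        {"id": "Default", "name": "Default", "fields": [
            {"key": "scheduleType", "stringValue": "cron"},
            {"key": "schedule", "refValue": "DailySchedule"},
        ]},
        {"id": "DailySchedule", "name": "DailySchedule", "fields": [
            {"key": "type", "stringValue": "Schedule"},
            {"key": "period", "stringValue": "1 day"},
            {"key": "startAt", "stringValue": "FIRST_ACTIVATION_DATE_TIME"},
        ]},
        {"id": "RunScript", "name": "RunScript", "fields": [
            {"key": "type", "stringValue": "ShellCommandActivity"},
            {"key": "scriptUri", "stringValue": script_uri},
            {"key": "schedule", "refValue": "DailySchedule"},
        ]},
    ]

objects = daily_shell_pipeline("s3://my-bucket/etl.sh")  # hypothetical path
# With credentials configured you would then run:
#   import boto3
#   dp = boto3.client("datapipeline")
#   pid = dp.create_pipeline(name="daily-etl", uniqueId="daily-etl-1")["pipelineId"]
#   dp.put_pipeline_definition(pipelineId=pid, pipelineObjects=objects)
#   dp.activate_pipeline(pipelineId=pid)
```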

By the end of this course, you'll have a working knowledge of the most common use cases of AWS Data Pipeline, AWS Batch, and AWS Workflow, bringing you closer to being fully prepared for the AWS Certified Machine Learning - Specialty certification exam.

Learning Objectives

  • Discover the key concepts covered in this course

Framework Connections

The materials within this course focus on the NICE Framework Task, Knowledge, and Skill statements identified within the indicated NICE Framework component(s):

Specialty Areas

  • Systems Architecture

Specialty Areas have been removed from the NICE Framework. With the recent release of the new NICE Framework data, updates to courses are underway. Until this course can be updated, this historical information is provided to give better context as to how it can help you with your cybersecurity goals.

Feedback

If you would like to provide feedback for this course, please e-mail the NICCS SO at NICCS@hq.dhs.gov.