Designing and Implementing Big Data Analytics Solutions - Exam 70-475
- Understand how Azure Data Factory fits into the Azure Data Services landscape
- Understand Data Factory architecture, including integration runtimes, pipelines, datasets, linked services and activities
- Learn how the copy process works across multiple regions
- Learn how to connect to on-premises data sources
- Understand the various file formats supported by Azure Data Factory
- Learn how to execute, schedule and trigger pipelines
- Understand management and monitoring of Azure Data Factory pipelines
- Previous experience with enterprise integration systems
This module will investigate the critical concepts required for architecting data workflows with Azure Data Factory. We will start by reviewing where Data Factory fits in a Lambda architecture and define the phases of a data pipeline in Data Factory. Next, we will review the Data Factory architecture, including the various objects that make up a data factory pipeline and how they fit together to form a pipeline. We will then do a deep dive into the copy activity, which is the most commonly used activity. From there, we will look at bring-your-own vs. on-demand compute in a data pipeline, and finally, we will tackle pipeline execution using manual, tumbling window, schedule, and event-based triggers.
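To make the relationship between these objects concrete, the sketch below models the chain of linked service, dataset, pipeline, and activity as plain Python dictionaries, mirroring the JSON shapes Data Factory uses. This is an illustrative sketch only; all resource names (AzureBlobStorageLS, InputDataset, CopyPipeline, and so on) are hypothetical placeholders, not names from this course.

```python
# A linked service stores connection information to a data store.
# The name and connection string here are hypothetical placeholders.
linked_service = {
    "name": "AzureBlobStorageLS",
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {"connectionString": "<connection-string>"},
    },
}

# A dataset describes the data within the store that the linked
# service connects to; it references the linked service by name.
input_dataset = {
    "name": "InputDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference",
        },
    },
}

# A pipeline is a logical grouping of activities. Here a single
# copy activity moves data from a source dataset to a sink dataset.
pipeline = {
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyBlobToBlob",
                "type": "Copy",
                "inputs": [{"referenceName": "InputDataset",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "OutputDataset",
                             "type": "DatasetReference"}],
            }
        ]
    },
}

# Walk the chain: activity -> dataset -> linked service.
activity = pipeline["properties"]["activities"][0]
print(activity["type"])                        # Copy
print(activity["inputs"][0]["referenceName"])  # InputDataset
```

The point of the sketch is the reference chain: an activity names the datasets it reads and writes, and each dataset names the linked service that holds the actual connection details.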
Skill Me Up subscriptions include unlimited access to on-demand courses with live lab environments through our Real Time Labs feature for hands-on lab access.
- Access to Real Time Lab environments and lab guides
- Course Completion Certificates when you pass assessments
- MUCH MORE!