Module 1: Explore Azure Databricks
Azure Databricks is a cloud service that provides a scalable platform for data analytics using Apache Spark.
- Introduction
- Get started with Azure Databricks
- Identify Azure Databricks workloads
- Understand key concepts
- Data governance using Unity Catalog and Microsoft Purview
- Exercise - Explore Azure Databricks
- Module assessment
- Summary
Module 2: Perform Data Analysis with Azure Databricks
Learn how to perform data analysis using Azure Databricks. Explore various data ingestion methods and how to integrate data from sources like Azure Data Lake Storage and Azure SQL Database. This module guides you through using collaborative notebooks to perform exploratory data analysis (EDA), so you can visualize, manipulate, and examine data to uncover patterns, anomalies, and correlations.
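To give a flavor of what this module covers, here is a minimal PySpark sketch of ingesting a file and exploring it with the DataFrame API; the storage path and column names are hypothetical placeholders, and `spark` is the session that Databricks notebooks provide automatically.
```python
from pyspark.sql import functions as F

# Ingest a CSV file from cloud storage into a Spark DataFrame
# (the abfss path below is a hypothetical placeholder)
df = (spark.read.format("csv")
      .option("header", "true")
      .option("inferSchema", "true")
      .load("abfss://data@examplestorage.dfs.core.windows.net/sales/sales.csv"))

# Basic exploratory analysis with the DataFrame API
df.printSchema()
df.describe().show()

# Aggregate to look for patterns, e.g. revenue per product category
(df.groupBy("category")
   .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("orders"))
   .orderBy(F.desc("total_amount"))
   .show())
```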
- Introduction
- Ingest data with Azure Databricks
- Data exploration tools in Azure Databricks
- Data analysis using DataFrame APIs
- Exercise - Explore data with Azure Databricks
- Module assessment
- Summary
Module 3: Use Apache Spark in Azure Databricks
Azure Databricks is built on Apache Spark and enables data engineers and analysts to run Spark jobs that transform, analyze, and visualize data at scale.
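As a minimal illustration of the workflow this module teaches, the sketch below loads a data file with Spark, applies a transformation, and renders the result with the notebook's built-in `display()` function; the file path and column names are hypothetical.
```python
from pyspark.sql.functions import col, year

# Load a Parquet data file into a DataFrame (hypothetical path)
orders = spark.read.parquet("/mnt/data/orders.parquet")

# Transform: filter completed orders and count them per year
yearly = (orders
          .where(col("status") == "COMPLETE")
          .withColumn("order_year", year(col("order_date")))
          .groupBy("order_year")
          .count()
          .orderBy("order_year"))

# In a Databricks notebook, display() renders the result as an
# interactive table that can be switched to a chart
display(yearly)
```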
- Introduction
- Get to know Spark
- Create a Spark cluster
- Use Spark in notebooks
- Use Spark to work with data files
- Visualize data
- Exercise - Use Spark in Azure Databricks
- Module assessment
- Summary
Module 4: Manage data with Delta Lake
Delta Lake is a data management solution in Azure Databricks that provides ACID transactions, schema enforcement, and time travel, ensuring data consistency, integrity, and versioning.
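The sketch below illustrates these features: it creates a Delta table, notes how schema enforcement rejects mismatched appends, and uses time travel to query an earlier version. The table name and data are hypothetical placeholders.
```python
from pyspark.sql import Row

# Create a small DataFrame and save it as a Delta table
# (Delta is the default table format in Databricks)
df = spark.createDataFrame([Row(id=1, amount=10.0), Row(id=2, amount=20.0)])
df.write.format("delta").mode("overwrite").saveAsTable("sales_delta")

# Schema enforcement: an append whose schema doesn't match the table's
# schema raises an error instead of silently corrupting the data, e.g.:
# bad_df.write.format("delta").mode("append").saveAsTable("sales_delta")

# Time travel: inspect the table's history and query an earlier version
spark.sql("DESCRIBE HISTORY sales_delta").show()
spark.sql("SELECT * FROM sales_delta VERSION AS OF 0").show()
```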
- Introduction
- Get started with Delta Lake
- Create Delta tables
- Implement schema enforcement
- Data versioning and time travel in Delta Lake
- Data integrity with Delta Lake
- Exercise - Use Delta Lake in Azure Databricks
- Module assessment
- Summary
Module 5: Build Lakeflow Declarative Pipelines
Lakeflow Declarative Pipelines enable real-time, scalable, and reliable data processing in Azure Databricks, building on Delta Lake's advanced features.
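For a sense of the declarative style, here is a minimal pipeline sketch using the `dlt` Python module (the API inherited from Delta Live Tables); the source path is a hypothetical placeholder, and the code runs as part of a pipeline rather than as an ordinary notebook.
```python
import dlt
from pyspark.sql.functions import col

# Bronze: ingest raw JSON files incrementally with Auto Loader
@dlt.table(comment="Raw events ingested incrementally")
def raw_events():
    return (spark.readStream
            .format("cloudFiles")
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/events/"))  # hypothetical landing path

# Silver: clean the stream, dropping rows that fail the expectation
@dlt.table(comment="Cleaned events")
@dlt.expect_or_drop("valid_id", "id IS NOT NULL")
def clean_events():
    return dlt.read_stream("raw_events").where(col("event_type").isNotNull())
```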
- Introduction
- Explore Lakeflow Declarative Pipelines
- Data ingestion and integration
- Real-time processing
- Exercise - Create a Lakeflow Declarative Pipeline
- Module assessment
- Summary
Module 6: Deploy workloads with Lakeflow Jobs
Deploying workloads with Lakeflow Jobs involves orchestrating and automating complex data processing pipelines, machine learning workflows, and analytics tasks. In this module, you learn how to deploy workloads with Databricks Lakeflow Jobs.
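As one illustration, the sketch below defines a two-task job programmatically with the Databricks SDK for Python (jobs can equally be created through the UI or with asset bundles); the job name and notebook paths are hypothetical placeholders, and workspace authentication is assumed to be configured.
```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up credentials from the environment

# Define a job whose second task runs only after the first succeeds;
# no compute spec is given here, so a real definition would add a
# cluster or rely on serverless compute
job = w.jobs.create(
    name="nightly-etl",
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/etl/ingest"),
        ),
        jobs.Task(
            task_key="transform",
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/etl/transform"),
        ),
    ],
)
print(f"Created job {job.job_id}")
```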
- Introduction
- What are Lakeflow Jobs?
- Understand key components of Lakeflow Jobs
- Explore the benefits of Lakeflow Jobs
- Deploy workloads using Lakeflow Jobs
- Exercise - Create a Lakeflow Job
- Module assessment
- Summary
Exams and assessments
There are no formal examinations in this course. Each module concludes with a review and summary that follows the hands-on lab, quiz, and slide-deck deliveries. These reviews reinforce learning and point to additional resources for continued learning and development.
Hands-on learning
This course includes hands-on labs that give learners practical experience with each module's material.
In addition, each module includes a quiz to reinforce knowledge retention.