Implementing a Data Analytics Solution with Azure Synapse Analytics Training

  • Learn via: Classroom
  • Duration: 1 Day
  • Level: Intermediate
  • Price: From €1,248+VAT
We can host this training at your preferred location. Contact us!

This is a single-day, instructor-led course that introduces learners to dedicated SQL pools, serverless SQL pools, and Spark pools in Azure Synapse Analytics. It also covers data wrangling and the ELT process with Synapse Pipelines, which will feel familiar to anyone who has used Azure Data Factory (ADF), to move data into the Synapse dedicated SQL pool database.

Audience Profile

The audience should be familiar with notebooks that use different languages and a Spark engine, such as Databricks, Jupyter, and Zeppelin notebooks. They should also have some experience with SQL, Python, and Azure tools such as Data Factory.

MODULE 1: Introduction to Azure Synapse Analytics

Learn about the features and capabilities of Azure Synapse Analytics - a cloud-based platform for big data processing and analysis.

  • Introduction
  • What is Azure Synapse Analytics
  • How Azure Synapse Analytics works
  • When to use Azure Synapse Analytics
  • Exercise - Explore Azure Synapse Analytics
  • Knowledge check
  • Summary

MODULE 2: Use Azure Synapse serverless SQL pool to query files in a data lake

With Azure Synapse serverless SQL pool, you can leverage your SQL skills to explore and analyze data in files, without the need to load the data into a relational database.
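
For example, the OPENROWSET function lets a serverless SQL pool read parquet files directly from the data lake. The following is a minimal sketch; the storage account URL and folder path are placeholders, not part of the course materials.

    -- Hypothetical data lake URL; substitute your own storage account and path.
    SELECT TOP 100 *
    FROM OPENROWSET(
        BULK 'https://mydatalake.dfs.core.windows.net/files/sales/*.parquet',
        FORMAT = 'PARQUET'
    ) AS rows;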

  • Introduction
  • Understand Azure Synapse serverless SQL pool capabilities and use cases
  • Query files using a serverless SQL pool
  • Create external database objects
  • Exercise - Query files using a serverless SQL pool
  • Knowledge check
  • Summary

MODULE 3: Analyze data with Apache Spark in Azure Synapse Analytics

Apache Spark is a core technology for large-scale data analytics. Learn how to use Spark in Azure Synapse Analytics to analyze and visualize data in a data lake.
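
As a rough illustration, a Synapse notebook cell can use Spark SQL to define a temporary view over parquet files in the data lake and aggregate it. The path, view name, and column names below are assumptions for illustration only.

    -- Spark SQL in a Synapse notebook; path and column names are hypothetical.
    CREATE OR REPLACE TEMPORARY VIEW sales
    USING PARQUET
    OPTIONS (path 'abfss://files@mydatalake.dfs.core.windows.net/sales/*.parquet');

    SELECT Category, SUM(Quantity) AS TotalQuantity
    FROM sales
    GROUP BY Category
    ORDER BY TotalQuantity DESC;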

  • Introduction
  • Get to know Apache Spark
  • Use Spark in Azure Synapse Analytics
  • Analyze data with Spark
  • Visualize data with Spark
  • Exercise - Analyze data with Spark
  • Knowledge check
  • Summary

MODULE 4: Use Delta Lake in Azure Synapse Analytics

Delta Lake is an open-source relational storage layer for Spark that you can use to implement a data lakehouse architecture in Azure Synapse Analytics.
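
As a minimal sketch (Spark SQL, with illustrative table and column names), a catalog table can be created in Delta format so that inserts and updates are recorded in the Delta transaction log:

    -- Spark SQL in a Synapse notebook; table and column names are illustrative.
    CREATE TABLE IF NOT EXISTS Products (
        ProductID INT,
        ProductName STRING,
        ListPrice DOUBLE
    ) USING DELTA;

    INSERT INTO Products VALUES (1, 'Mountain Bike', 1295.00);

The same Delta data can also be read from a serverless SQL pool by using OPENROWSET with FORMAT = 'DELTA'.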

  • Introduction
  • Understand Delta Lake
  • Create Delta Lake tables
  • Create catalog tables
  • Use Delta Lake with streaming data
  • Use Delta Lake in a SQL pool
  • Exercise - Use Delta Lake in Azure Synapse Analytics
  • Knowledge check
  • Summary

MODULE 5: Analyze data in a relational data warehouse

Relational data warehouses are a core element of most enterprise Business Intelligence (BI) solutions, and are used as the basis for data models, reports, and analysis.
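
A typical analytical query joins a fact table to its dimension tables in a star schema. The sketch below uses hypothetical table and column names.

    -- Star-schema aggregation in a dedicated SQL pool; names are illustrative.
    SELECT d.CalendarYear, p.Category, SUM(f.SalesAmount) AS TotalSales
    FROM FactSales AS f
    JOIN DimDate AS d ON f.OrderDateKey = d.DateKey
    JOIN DimProduct AS p ON f.ProductKey = p.ProductKey
    GROUP BY d.CalendarYear, p.Category
    ORDER BY d.CalendarYear, TotalSales DESC;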

  • Introduction
  • Design a data warehouse schema
  • Create data warehouse tables
  • Load data warehouse tables
  • Query a data warehouse
  • Exercise - Explore a data warehouse
  • Knowledge check
  • Summary


Contact us for more detail about our trainings and for all other enquiries!

Upcoming Trainings

Join our public courses at our Istanbul, London and Ankara facilities. Private classes can be organized at your preferred location, on a schedule that suits you.

  • 07 August 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 11 August 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 22 August 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 26 August 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 11 September 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 18 September 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 24 September 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day
  • 26 October 2024 | Istanbul, Ankara, London | Classroom / Virtual Classroom | 1 Day