Course Overview
In this course, students will implement data platform technologies in solutions that meet business and technical requirements, covering on-premises, cloud, and hybrid data scenarios that incorporate both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.
Students will also explore how to implement data security, including authentication, authorisation, and data policies and standards. They will define and implement monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, including the optimisation and disaster recovery of big data, batch processing and streaming data solutions.
Who's It For?
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about the data platform technologies that exist on Microsoft Azure.
The secondary audience for this course is individuals who develop applications that deliver content from the data platform technologies that exist on Microsoft Azure.
What You'll Learn
After completing this course, students will be able to:
- Know the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model
- Perform data preparation tasks that contribute to data science projects
- Work with NoSQL data using Azure Cosmos DB
- Provision an Azure SQL database to store data
- Provision and load data into Azure SQL Data Warehouse
- Set up a Stream Analytics job to stream data, and query the incoming data to perform analysis
- Use Azure Data Factory to orchestrate data movement and transformation across a wide range of data platform technologies
Study Method
- Virtual delivery - live and interactive classroom-style learning conducted completely online
- Blended delivery - both online course content and partial face-to-face requirements
- In-class delivery - predominantly face-to-face course content conducted at a specific location
- Online delivery - online course content with the exception of assessments and work placement
Duration and Study Load
- 3 days
Entry Requirements
In addition to their professional experience, students who take this training should have technical knowledge equivalent to AZ-900T01 Azure Fundamentals.
Subjects
In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service and how to load and interrogate data in the service using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure availability options so that users can access the data from anywhere in the world.
Lessons
- Create an Azure Cosmos DB database built to scale
- Insert and query data in your Azure Cosmos DB database
- Provision a .NET Core app for Cosmos DB in Visual Studio Code
- Distribute your data globally with Azure Cosmos DB
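As a preview of the provisioning lesson above, a Cosmos DB account, database and container can be created with the Azure CLI. This is only a sketch; the account, resource group, database and partition-key names below are placeholders, not part of the course materials.

```shell
# Create a Cosmos DB account (placeholder names throughout).
az cosmosdb create \
    --name my-cosmos-account \
    --resource-group my-resource-group \
    --default-consistency-level Session

# Create a SQL (Core) API database inside the account.
az cosmosdb sql database create \
    --account-name my-cosmos-account \
    --resource-group my-resource-group \
    --name ProductsDb

# Create a container, choosing a partition key to scale writes.
az cosmosdb sql container create \
    --account-name my-cosmos-account \
    --resource-group my-resource-group \
    --database-name ProductsDb \
    --name Products \
    --partition-key-path /category
```

The course labs then work against the provisioned container from Visual Studio Code and the .NET Core SDK.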
In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. Students will be able to explain why they would choose one service over another, and how to provision, connect to, and manage each of the services.
Lessons
- SQL Database and SQL Data Warehouse
- Provision an Azure SQL database to store data
- Provision and load data into Azure SQL Data Warehouse
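The provisioning lesson above can likewise be previewed with the Azure CLI: first a logical SQL server, then a database on it. The names, region, credentials and service objective below are placeholder values for illustration only.

```shell
# Create a logical SQL server (placeholder names and credentials).
az sql server create \
    --name my-sql-server \
    --resource-group my-resource-group \
    --location australiaeast \
    --admin-user sqladmin \
    --admin-password '<strong-password>'

# Create a database on that server at the S0 service tier.
az sql db create \
    --resource-group my-resource-group \
    --server my-sql-server \
    --name SalesDb \
    --service-objective S0
```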
In this module, students will learn the concepts of event processing and streaming data and how these apply to Event Hubs and Azure Stream Analytics. Students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs.
Lessons
- Explain data streams and event processing
- Query streaming data using Stream Analytics
- Process data with Azure Blob storage and Stream Analytics
- Process data with Event Hubs and Stream Analytics
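To give a flavour of the querying lesson, a Stream Analytics job query aggregates events over time windows using a SQL-like language. The sketch below is illustrative only; the input, output and field names are placeholders.

```sql
-- Average temperature per device over 30-second tumbling windows.
-- InputStream, OutputTable and the field names are placeholders.
SELECT
    DeviceId,
    AVG(Temperature) AS AvgTemperature,
    System.Timestamp() AS WindowEnd
INTO OutputTable
FROM InputStream TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 30)
```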
In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation from a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data.
Lessons
- Explain how Azure Data Factory works
- Create Linked Services and datasets
- Create pipelines and activities
- Azure Data Factory pipeline execution and triggers
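For orientation, a Data Factory pipeline with a single copy activity can be sketched in JSON roughly as follows. This is an assumption-laden outline, not course material: the pipeline, activity and dataset names are placeholders, and in practice the referenced linked services and datasets are defined as separate resources.

```json
{
    "name": "CopySalesDataPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToSql",
                "type": "Copy",
                "inputs": [
                    { "referenceName": "BlobSalesDataset", "type": "DatasetReference" }
                ],
                "outputs": [
                    { "referenceName": "SqlSalesDataset", "type": "DatasetReference" }
                ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "SqlSink" }
                }
            }
        ]
    }
}
```

Pipelines like this are then run on demand or via triggers, which is the subject of the final lesson.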
In this module, students will learn how Azure Storage provides a multi-layered security model to protect data. Students will explore how these measures range from setting up secure networks and access keys, through defining permissions, to monitoring with Advanced Threat Detection.
Lessons
- Configuring Network Security
- Configuring Authentication
- Configuring Authorisation
- Auditing Security
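As a sketch of the network security lesson, the Azure CLI can restrict a storage account so that it denies traffic by default and then allows a specific virtual network subnet. The account, resource group, virtual network and subnet names are placeholders.

```shell
# Deny all network traffic to the storage account by default
# (placeholder names throughout).
az storage account update \
    --name mystorageaccount \
    --resource-group my-resource-group \
    --default-action Deny

# Then allow access from one virtual network subnet.
az storage account network-rule add \
    --account-name mystorageaccount \
    --resource-group my-resource-group \
    --vnet-name my-vnet \
    --subnet my-subnet
```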
About DDLS
DDLS is Australia’s largest provider of corporate IT and process training, with the largest portfolio of strategic partners and courses in Australia. We partner with world-class companies to help organisations and individuals in the IT industry remain up-to-date with new processes, technology and platforms to reduce risk and enable efficient business practices. We have convenient locations in almost every capital in Australia as well as the Philippines, flexible delivery modalities, industry-accredited trainers, and state-of-the-art course material and labs to produce the highest quality learning outcomes for our clients.
DDLS promotes a balanced approach to training with a focus on the key areas of Technology, Process and People. We provide extensive training options tailored to your organisation’s needs – from vendor-certified courses to customised training, including bespoke in-house developed courses.