Course Details | CloudThat

Course Overview

Candidates for this exam must have solid knowledge and experience developing apps for Azure and working with Azure Cosmos DB database technologies. They should be proficient at developing applications by using the Core (SQL) API and SDKs, writing efficient queries and creating appropriate index policies, provisioning and managing resources in Azure, and creating server-side objects with JavaScript. They should be able to interpret JSON, read C# or Java code, and use PowerShell.

Note: The exam is currently available in beta.

After completing this course, students will be able to:

  • Design and implement data models
  • Design and implement data distribution
  • Integrate an Azure Cosmos DB solution
  • Optimize an Azure Cosmos DB solution
  • Maintain an Azure Cosmos DB solution

Upcoming Batches

India (Online)

Start and end dates: To be decided

Key Features

  • Our training modules include 50%-60% hands-on lab sessions.
  • Highly interactive virtual and classroom teaching.
  • Qualified instructor-led training and mentoring sessions.
  • Practice lab and projects aligned to Azure learning modules.
  • Integrated teaching assistance and support.

Who Should Attend

  • Candidates for the Azure Cosmos DB Developer Specialty certification should have subject matter expertise in designing, implementing, and monitoring cloud-native applications that store and manage data. Responsibilities for this role include designing and implementing data models and data distribution, loading data into an Azure Cosmos DB database, and optimizing and maintaining the solution. These professionals integrate the solution with other Azure services, and design, implement, and monitor solutions that meet security, availability, resilience, and performance requirements.

Prerequisites

  • Knowledge of Microsoft Azure and the ability to navigate the Azure portal (equivalent to AZ-900)
  • Intermediate-level experience with an Azure-supported language (Python, Java, JavaScript, or C#)
  • Ability to write code to connect to and perform operations on a NoSQL or SQL database product (Oracle, MongoDB, Cassandra, SQL Server, or similar)

Course Outline

Design and implement a non-relational data model for Azure Cosmos DB Core API

  • Develop a design by storing multiple entity types in the same container
  • Develop a design by storing multiple related entities in the same document
  • Develop a model that denormalizes data across documents
  • Develop a design by referencing between documents
  • Identify primary and unique keys
  • Identify data and associated access patterns
  • Specify a default TTL on a container for a transactional store
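The modeling choices above can be made concrete with a small sketch. The entity names, fields, and the customerId partition key below are illustrative assumptions, not taken from the course material; the point is the shared container, the type discriminator, and the per-item TTL override.

```python
# Sketch: two entity types (customer and salesOrder) stored in one container,
# discriminated by a "type" field and sharing "customerId" as the partition key.
customer = {
    "id": "c-100",
    "type": "customer",
    "customerId": "c-100",   # partition key value
    "name": "Ada",
}

order = {
    "id": "o-501",
    "type": "salesOrder",
    "customerId": "c-100",   # same partition key: co-located with its customer
    "items": [{"sku": "widget", "qty": 2}],
    "ttl": 60 * 60 * 24 * 30,  # per-item TTL override in seconds, honored once
                               # a default TTL is enabled on the container
}

container = [customer, order]  # in-memory stand-in for a Cosmos DB container

def entities_for_customer(container, customer_id):
    """All documents in one logical partition: a single-partition read."""
    return [d for d in container if d["customerId"] == customer_id]

docs = entities_for_customer(container, "c-100")
```

Because both entities share a partition key value, one single-partition query retrieves a customer together with its orders.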

Design a data partitioning strategy for Azure Cosmos DB Core API

  • Choose a partition strategy based on a specific workload
  • Choose a partition key
  • Plan for transactions when choosing a partition key
  • Evaluate the cost of using a cross-partition query
  • Calculate and evaluate data distribution based on partition key selection
  • Calculate and evaluate throughput distribution based on partition key selection
  • Construct and implement a synthetic partition key
  • Design partitioning for workloads that require multiple partition keys
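As an illustration of the synthetic-key bullet, the sketch below combines two properties into a single partition key value and then simulates how key choice spreads data across partitions. The hashing here is a toy stand-in for the service's own hash partitioning, so treat it as a model for evaluating distribution, not the real algorithm.

```python
import hashlib

def synthetic_key(tenant_id: str, month: str) -> str:
    """Combine two properties into one synthetic partition key value."""
    return f"{tenant_id}-{month}"

def physical_partition(key: str, partitions: int = 4) -> int:
    """Toy stand-in for hash partitioning: map a key to a partition index."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % partitions

# One key per tenant-month avoids funneling a whole tenant into one partition.
keys = [synthetic_key("tenant-a", f"2024-{m:02d}") for m in range(1, 13)]

distribution = {}
for k in keys:
    p = physical_partition(k)
    distribution[p] = distribution.get(p, 0) + 1
```

Counting documents per simulated partition in this way is a cheap first check of whether a candidate key produces hot partitions.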

Plan and implement sizing and scaling for a database created with Azure Cosmos DB

  • Evaluate the throughput and data storage requirements for a specific workload
  • Choose between serverless and provisioned models
  • Choose when to use database-level provisioned throughput
  • Design for granular scale units and resource governance
  • Evaluate the cost of the global distribution of data
  • Configure throughput for Azure Cosmos DB by using the Azure portal
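A rough throughput estimate can be sketched as simple arithmetic. The default per-operation RU costs below are commonly cited approximations for 1 KB items and are assumptions; real costs vary with item size, indexing policy, and consistency level, so always measure against the actual workload.

```python
def estimate_rus_per_second(reads_per_s, writes_per_s,
                            read_cost_ru=1.0, write_cost_ru=5.0):
    """Back-of-the-envelope RU/s estimate: weighted sum of operation rates.
    Defaults approximate a ~1 KB item (point read ~1 RU, write ~5 RU)."""
    return reads_per_s * read_cost_ru + writes_per_s * write_cost_ru

# Example workload: 500 point reads/s and 100 writes/s.
needed = estimate_rus_per_second(reads_per_s=500, writes_per_s=100)
# 500 * 1 + 100 * 5 = 1000 RU/s
```

An estimate like this also feeds the serverless-versus-provisioned decision: steady sustained RU/s favors provisioned throughput, while spiky or infrequent traffic may suit serverless.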

Implement client connectivity options in the Azure Cosmos DB SDK

  • Choose a connectivity mode (gateway versus direct)
  • Implement a connectivity mode
  • Create a connection to a database
  • Enable offline development by using the Azure Cosmos DB emulator
  • Handle connection errors
  • Implement a singleton for the client
  • Specify a region for global distribution
  • Configure client-side threading and parallelism options
  • Enable SDK logging
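The singleton-client bullet can be sketched as follows. The client here is a placeholder object rather than a real SDK client, so the pattern runs anywhere; in an application, the construction line would create the Cosmos DB client once and reuse it for the process lifetime, as the SDKs recommend.

```python
import threading

class ClientFactory:
    """Process-wide singleton for an expensive client object."""
    _client = None
    _lock = threading.Lock()

    @classmethod
    def get_client(cls):
        if cls._client is None:
            with cls._lock:            # double-checked locking for thread safety
                if cls._client is None:
                    # In a real app this would construct the SDK client, e.g.
                    # CosmosClient(endpoint, credential); here a stand-in object.
                    cls._client = object()
        return cls._client

a = ClientFactory.get_client()
b = ClientFactory.get_client()  # same instance, not a new connection
```

Creating one client per operation wastes connections and metadata caches; a singleton keeps both warm across requests.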

Implement data access by using the Azure Cosmos DB SQL language

  • Implement queries that use arrays, nested objects, aggregation, and ordering
  • Implement a correlated subquery
  • Implement queries that use array and type-checking functions
  • Implement queries that use mathematical, string, and date functions
  • Implement queries based on variable data
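To make the query bullets concrete, here is an illustrative Core (SQL) API query paired with equivalent in-memory logic. The container name, fields, and data are hypothetical; the Python mirrors what the array filter and ordering do.

```python
# An illustrative Core (SQL) API query over a hypothetical "orders" container:
QUERY = """
SELECT o.id, o.total
FROM orders o
WHERE ARRAY_CONTAINS(o.tags, 'priority')
ORDER BY o.total DESC
"""

# The same filter/projection/ordering expressed over in-memory documents:
orders = [
    {"id": "o1", "total": 40, "tags": ["priority"]},
    {"id": "o2", "total": 90, "tags": []},
    {"id": "o3", "total": 70, "tags": ["priority", "gift"]},
]

result = sorted(
    ({"id": o["id"], "total": o["total"]}
     for o in orders if "priority" in o["tags"]),
    key=lambda r: r["total"],
    reverse=True,
)
```

Only the two tagged orders survive the filter, and they come back highest total first.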

Implement data access by using SQL API SDKs

  • Choose when to use a point operation versus a query operation
  • Implement a point operation that creates, updates, and deletes documents
  • Implement an update by using a patch operation
  • Manage multi-document transactions using SDK Transactional Batch
  • Perform a multi-document load using SDK Bulk
  • Implement optimistic concurrency control using ETags
  • Implement session consistency by using session tokens
  • Implement a query operation that includes pagination
  • Implement a query operation by using a continuation token
  • Handle transient errors and 429s
  • Specify TTL for a document
  • Retrieve and use query metrics
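The optimistic-concurrency bullet can be modeled with a tiny in-memory store. The EtagMismatch exception stands in for the HTTP 412 (precondition failed) a real conditional replace returns when the supplied ETag is stale; IDs and document shapes are illustrative.

```python
import uuid

class EtagMismatch(Exception):
    """Stand-in for HTTP 412 on a conditional (If-Match) replace."""

store = {}  # doc id -> (etag, document)

def upsert(doc_id, doc, if_match=None):
    """Write a document; if an ETag is supplied, it must match the stored one."""
    current = store.get(doc_id)
    if current is not None and if_match is not None and current[0] != if_match:
        raise EtagMismatch(doc_id)
    new_etag = uuid.uuid4().hex  # server assigns a fresh ETag on every write
    store[doc_id] = (new_etag, doc)
    return new_etag

etag1 = upsert("item-1", {"qty": 1})
etag2 = upsert("item-1", {"qty": 2}, if_match=etag1)  # succeeds: ETag current
try:
    upsert("item-1", {"qty": 3}, if_match=etag1)      # fails: ETag now stale
    conflict = False
except EtagMismatch:
    conflict = True  # caller should re-read, merge, and retry
```

The losing writer never silently overwrites the newer state; it must re-read and retry, which is the whole point of optimistic concurrency.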

Implement server-side programming in Azure Cosmos DB Core API by using JavaScript

  • Write, deploy, and call a stored procedure
  • Design stored procedures to work with multiple items transactionally
  • Implement triggers
  • Implement a user-defined function

Design and implement a replication strategy for Azure Cosmos DB

  • Choose when to distribute data
  • Define automatic failover policies for regional failure for Azure Cosmos DB Core API
  • Perform manual failovers to move single master write regions
  • Choose a consistency model
  • Identify use cases for different consistency models
  • Evaluate the impact of consistency model choices on availability and associated RU cost
  • Evaluate the impact of consistency model choices on performance and latency
  • Specify application connections to replicated data

Design and implement multi-region write

  • Choose when to use multi-region write
  • Implement multi-region write
  • Implement a custom conflict resolution policy for Azure Cosmos DB Core API
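Here is a minimal model of the resolution decision in a custom conflict-resolution policy, assuming last-writer-wins on a hypothetical numeric property. In the real service, a custom policy is configured on the container and either runs as a merge stored procedure or surfaces conflicts to the conflict feed; this sketch shows only the comparison itself.

```python
def resolve_conflict(local, remote, resolution_path="opTime"):
    """Pick a winner by comparing a custom numeric property on each version.
    'opTime' is an illustrative application-maintained counter, not a
    built-in Cosmos DB field."""
    return local if local[resolution_path] >= remote[resolution_path] else remote

# Concurrent writes to the same document id from two write regions:
doc_region_a = {"id": "d1", "value": "A", "opTime": 7}
doc_region_b = {"id": "d1", "value": "B", "opTime": 9}

winner = resolve_conflict(doc_region_a, doc_region_b)
```

Region B's version wins because its counter is higher; any deterministic, commutative rule works, so all regions converge on the same winner.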

Enable Azure Cosmos DB analytical workloads

  • Enable Azure Synapse Link
  • Choose between Azure Synapse Link and Spark Connector
  • Enable the analytical store on a container
  • Enable a connection to an analytical store and query from Azure Synapse Spark or Azure Synapse SQL
  • Perform a query against the transactional store from Spark
  • Write data back to the transactional store from Spark

Implement solutions across services

  • Integrate events with other applications by using Azure Functions and Azure Event Hubs
  • Denormalize data by using Change Feed and Azure Functions
  • Enforce referential integrity by using Change Feed and Azure Functions
  • Aggregate data by using Change Feed and Azure Functions, including reporting
  • Archive data by using Change Feed and Azure Functions
  • Implement Azure Cognitive Search for an Azure Cosmos DB solution

Optimize query performance in Azure Cosmos DB Core API

  • Adjust indexes on the database
  • Calculate the cost of the query
  • Retrieve request unit cost of a point operation or query
  • Implement Azure Cosmos DB integrated cache

Design and implement change feeds for an Azure Cosmos DB Core API

  • Develop an Azure Functions trigger to process a change feed
  • Consume a change feed from within an application by using the SDK
  • Manage the number of change feed instances by using the change feed estimator
  • Implement denormalization by using a change feed
  • Implement referential enforcement by using a change feed
  • Implement aggregation persistence by using a change feed
  • Implement data archiving by using a change feed
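The denormalization and aggregation bullets can be sketched by treating the change feed as an ordered list of upserted documents and maintaining a materialized view from it. The document shapes are hypothetical; a real processor would be an Azure Functions trigger or the SDK's change feed processor.

```python
# Simulated change feed: an ordered sequence of upserted documents.
change_feed = [
    {"id": "o1", "customerId": "c1", "total": 20},
    {"id": "o2", "customerId": "c2", "total": 15},
    {"id": "o1", "customerId": "c1", "total": 25},  # o1 was updated
]

# Step 1: the feed delivers the latest version per changed item,
# so later entries for the same id supersede earlier ones.
latest = {}
for change in change_feed:
    latest[change["id"]] = change

# Step 2: persist an aggregate (per-customer order total) as a
# denormalized, query-friendly view.
totals = {}
for doc in latest.values():
    totals[doc["customerId"]] = totals.get(doc["customerId"], 0) + doc["total"]
```

The same loop shape drives the other bullets: instead of summing, the processor could copy documents to an archive container or patch referencing documents to enforce integrity.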

Define and implement an indexing strategy for an Azure Cosmos DB Core API

  • Choose when to use a read-heavy versus write-heavy index strategy
  • Choose an appropriate index type
  • Configure a custom indexing policy by using the Azure portal
  • Implement a composite index
  • Optimize index performance
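An indexing policy with a composite index can look like the following illustrative fragment; the paths and orderings are assumptions chosen to serve a hypothetical `ORDER BY c.category ASC, c.price DESC` query, and the excluded path is a stand-in for a large, never-queried property.

```json
{
  "indexingMode": "consistent",
  "includedPaths": [
    { "path": "/*" }
  ],
  "excludedPaths": [
    { "path": "/largeBlob/?" }
  ],
  "compositeIndexes": [
    [
      { "path": "/category", "order": "ascending" },
      { "path": "/price", "order": "descending" }
    ]
  ]
}
```

Excluding rarely queried paths and adding composite indexes for multi-property ORDER BY clauses are the main levers in the read-heavy versus write-heavy trade-off: more indexed paths make queries cheaper but raise the RU cost of every write.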

Monitor and troubleshoot an Azure Cosmos DB solution

  • Evaluate response status code and failure metrics
  • Monitor metrics for normalized throughput usage by using Azure Monitor
  • Monitor server-side latency metrics by using Azure Monitor
  • Monitor data replication in relation to latency and availability
  • Configure Azure Monitor alerts for Azure Cosmos DB
  • Implement and query Azure Cosmos DB logs
  • Monitor throughput across partitions
  • Monitor distribution of data across partitions
  • Monitor security by using logging and auditing

Implement backup and restore for an Azure Cosmos DB solution

  • Choose between periodic and continuous backup
  • Configure periodic backup
  • Configure continuous backup and recovery
  • Locate a recovery point for a point-in-time recovery
  • Recover a database or container from a recovery point

Implement security for an Azure Cosmos DB solution

  • Choose between service-managed and customer-managed encryption keys
  • Configure network-level access control for Azure Cosmos DB
  • Configure data encryption for Azure Cosmos DB
  • Manage control plane access to Azure Cosmos DB by using Azure role-based access control (RBAC)
  • Manage data plane access to Azure Cosmos DB by using keys
  • Manage data plane access to Azure Cosmos DB by using Azure Active Directory
  • Configure Cross-Origin Resource Sharing (CORS) settings
  • Manage account keys by using Azure Key Vault
  • Implement customer-managed keys for encryption
  • Implement Always Encrypted

Implement data movement for an Azure Cosmos DB solution

  • Choose a data movement strategy
  • Move data by using client SDK bulk operations
  • Move data by using Azure Data Factory and Azure Synapse pipelines
  • Move data by using a Kafka connector
  • Move data by using Azure Stream Analytics
  • Move data by using the Azure Cosmos DB Spark Connector

Implement a DevOps process for an Azure Cosmos DB solution

  • Choose when to use declarative versus imperative operations
  • Provision and manage Azure Cosmos DB resources by using Azure Resource Manager templates (ARM templates)
  • Migrate between standard and autoscale throughput by using PowerShell or Azure CLI
  • Initiate a regional failover by using PowerShell or Azure CLI
  • Maintain index policies in production by using ARM templates
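The declarative approach can be sketched as an ARM template resource for a container. The apiVersion, parameter names, and autoscale figure below are illustrative assumptions; the indexing policy being part of the template is what lets index changes flow through deployments rather than ad hoc edits.

```json
{
  "type": "Microsoft.DocumentDB/databaseAccounts/sqlDatabases/containers",
  "apiVersion": "2023-04-15",
  "name": "[concat(parameters('accountName'), '/', parameters('databaseName'), '/orders')]",
  "properties": {
    "resource": {
      "id": "orders",
      "partitionKey": { "paths": [ "/customerId" ], "kind": "Hash" },
      "indexingPolicy": {
        "indexingMode": "consistent",
        "includedPaths": [ { "path": "/*" } ]
      }
    },
    "options": {
      "autoscaleSettings": { "maxThroughput": 4000 }
    }
  }
}
```

Imperative operations (failovers, throughput migrations) stay with PowerShell or the Azure CLI, while container shape and index policy live in source-controlled templates.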

Certification

    • By earning the DP-420 certification, you become a certified Azure Cosmos DB Developer Specialty professional
    • On successful completion of the Microsoft DP-420 training, aspirants receive a Course Completion Certificate from CloudThat
    • By successfully clearing the DP-420 exam, aspirants earn the Microsoft certification

Our Top Trainers

Pavan Bhawsar

Pavan is a Microsoft Certified Trainer at CloudThat. He is an enthusiastic and passionate trainer and a keen observer of trending technologies, with demonstrated skills in Azure and hybrid cloud administration. He has 6+ years of corporate experience.

Yanish Sahu

Yanish works as a Corporate Trainer with CloudThat Technologies. He is very passionate about learning new technologies and is part of CloudThat's Microsoft team, which helps deploy Microsoft 365 setups.

Sohini Rakshit

Sohini serves as a trainer for Azure cloud at CloudThat Technologies. She is an experienced cloud consultant, architect, and corporate trainer working on Microsoft Azure, with a focus on multi-tier distributed application design, planning, and deployment for several cloud solutions.

Lakhan Kriplani

Lakhan has been involved in various client projects setting up cloud infrastructure for analytics and e-commerce applications and building CI/CD pipelines using AWS services. He has experience in developing highly secure, scalable web applications using MVC architecture.

Jagadesh Gonnagar

Jagadesh has been part of several large and complex software development projects with global clients. He worked in the USA for over 11 years before relocating to India. He has expertise in database design and development and in web development.

Devi Vara Prasad

Devi is a Microsoft Certified Trainer with more than 15 years of corporate, online, and classroom training experience. He is well versed in the AWS and Azure cloud platforms and has been delivering training for more than 5 years.

Anjali Srivastava

Anjali serves as a Research Associate for IoT (including AI/ML) and Azure cloud at CloudThat Technologies. She is an experienced solutions engineer, developer, and corporate trainer working on Microsoft Azure, with a focus on multi-tier distributed application design, planning, and deployment.

Ajay Kumar Lodha

Ajay describes himself as cloud-obsessed. He has been working with all the major cloud computing platforms, including AWS, Azure, and GCP, for more than 5 years.

Prarthit Mehta

Prarthit has been involved in various large and complex projects with global clients. He has experience in Microsoft 365 and AWS infrastructure technologies and Windows servers, including designing Active Directory and managing various domain services.

Priyant Gupta

Priyant has been working in the Microsoft technology space for the last 5 years, sharing his knowledge of Azure Administration, Azure Data Engineering, and Dynamics 365 CE apps. He has trained 1000+ professionals as a corporate trainer.

Course Fee


Course price: ₹ 34,900 + 18% GST
