
A Guide to Understanding Event Handling in Azure Blob Storage

Overview

Data transfers occur constantly in this era of high-performance computing capable of handling vast volumes of data, virtually unlimited cloud storage options, and real-time data processing methods.

These transfers involve actions such as copying a raw data file from an FTP server to a Data Lake, uploading an image to a Blob container, or deleting an existing file from a Data Lake folder. These processes are so commonplace that separate mechanisms have been developed to handle such scenarios.

One common scenario involves handling and reacting to Azure Blob Storage events. Let’s dive deep into ‘events’ in Azure Blob Storage and some of the terminology around them.

Introduction

An ‘Event’ is a notification that something has happened in a system. Every event provides information such as the source of the event, its cause, the time it occurred, and a unique identifier for that particular event.
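To make this concrete, events delivered by Azure Event Grid follow a common schema. The sketch below shows what a blob-created event payload might look like; the field names follow the Event Grid event schema, but the specific values are hypothetical:

```python
# Illustrative Event Grid payload for a Microsoft.Storage.BlobCreated event.
# Field names follow the Event Grid event schema; the values are made up.
sample_event = {
    "topic": "/subscriptions/<sub-id>/resourceGroups/<rg>"
             "/providers/Microsoft.Storage/storageAccounts/<account>",
    "subject": "/blobServices/default/containers/<container>"
               "/blobs/<blob-name>",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2023-01-01T00:00:00.000Z",          # when it took place
    "id": "00000000-0000-0000-0000-000000000000",     # unique event identifier
    "data": {
        "api": "PutBlob",        # the REST API call that caused the event
        "contentType": "image/png",
        "blobType": "BlockBlob",
    },
    "dataVersion": "",
    "metadataVersion": "1",
}

# Every event carries its source (topic), cause (data.api),
# time (eventTime), and unique identifier (id).
print(sample_event["eventType"])
```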


What are the different types of Events available?

Different types of events are associated with Azure Blob Storage, and each has its own set of REST APIs. When a client (either a user or a service) performs an operation in Azure Blob Storage, such as creating or deleting a blob, the action occurs through a call to the corresponding REST API, and an event is triggered based on that API call. The most common types of events in Azure Blob Storage are:

  • Azure Blob Storage events: These events are triggered when something happens to a blob, such as creation, deletion, or replacement, through calls to the Azure Blob REST APIs. Examples of Azure Blob Storage events are Microsoft.Storage.BlobCreated, Microsoft.Storage.BlobDeleted, and Microsoft.Storage.BlobTierChanged.
  • Data Lake Gen2 Storage events: These events are triggered when something happens in a Data Lake through calls to the Data Lake Gen2 REST APIs. Examples include the Azure Blob Storage events above as well as Microsoft.Storage.DirectoryCreated, Microsoft.Storage.DirectoryDeleted, and Microsoft.Storage.DirectoryRenamed.
  • SFTP events: These events are triggered when SFTP APIs are called. Examples of SFTP events are the same as those of the Data Lake Gen2 storage events.
  • Policy-related events: A policy in Azure Blob Storage defines one or more rules that perform a certain action when a given condition is met. Policy-related events are triggered when such actions are performed. Examples of policy-related events are Microsoft.Storage.BlobInventoryPolicyCompleted and Microsoft.Storage.LifecyclePolicyCompleted.

Azure Event Grid

Azure offers a service called ‘Event Grid’ to deliver a message that an event has occurred in a specific source to a defined destination. Event Grid can be described as a serverless, highly scalable event broker that provides reliable message delivery for event notifications.

Terminology in Event Grid:

  • Publishers: An ‘Event Publisher’ is the service or resource that sends events to Event Grid. Event publishers include Azure Blob Storage, Resource Groups, Azure Subscriptions, Event Hubs, Custom Topics, etc.
  • Subscriptions: An ‘Event Subscription’ defines the endpoint to which events are delivered and lets us filter events on a topic. In other words, a subscription tells Event Grid which events on a topic we are interested in.
  • Handlers: An ‘Event Handler’ can be defined as the destination where we want our event to be sent. Once the event transmission from publisher to handler is completed, the handler takes further action to process the received event. Examples of event handlers are Azure Functions, Logic Apps, Azure Automation, WebHooks, etc.
  • Topic: The ‘Event Topic’ describes the full path to the resource that emits the event. This path contains information like subscription, resource group, storage account, etc.
  • Subject: The ‘Event Subject’ is the publisher-defined path to the subject of the event, for example, the path to the blob that was created or deleted.

Fig: Event Model

Step-by-Step Guide

This demo shows how to set up an event handling mechanism to write a message into a queue whenever a new file is uploaded into the Data Lake Gen2 Blob Container.

1. Here, for this demo, I have created a new resource group named ‘event-handling-demo’.

2. Let’s create a Data Lake Gen2 storage account named ‘eventhandlingdemo’.

3. Let’s create a container named ‘demo-container’ in the Data Lake Gen2 storage account and add a folder named ‘demo-directory’ inside it.

4. Let’s create a queue called ‘event-message-queue’ inside that Data Lake Gen2 storage account to store the messages written by the Function App.

5. We will use an Azure Function as an Event Handler to process the event and write the message into a queue. Let’s create a function app named ‘my-event-handler’ with a Python runtime stack.

6. Inside that Function App, let’s create a function called ‘demo-handler-function’ by using the ‘Azure Event Grid trigger’ preexisting template.

7. Now, the function we created will have two files in it. One is ‘__init__.py’, which is called whenever the event is triggered; the other is ‘function.json’, which contains the binding configuration for our event source and the destination where the output needs to be sent.

  • Replace ‘__init__.py’ with the Python code below.
  • Replace ‘function.json’ with the JSON code below.
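
Likewise, the ‘function.json’ below is a hedged reconstruction rather than the original: an Event Grid trigger binding plus a queue output binding pointing at ‘event-message-queue’. The `AzureWebJobsStorage` connection setting is an assumption; use whichever app setting holds your storage connection string.

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "eventGridTrigger",
      "name": "event",
      "direction": "in"
    },
    {
      "type": "queue",
      "name": "msg",
      "direction": "out",
      "queueName": "event-message-queue",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```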
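
The post’s original code blocks did not survive extraction, so here is a minimal, hedged sketch of what ‘__init__.py’ could look like. The message format is an assumption, not the original code; the Azure-specific wiring (`azure.functions`) is shown in comments so the plain-Python message-building logic stays self-contained:

```python
import json

def build_queue_message(event_type: str, subject: str) -> str:
    """Build the JSON message to write into 'event-message-queue'.
    The message shape here is an assumption for this sketch."""
    # A storage event's subject ends with the blob path after '/blobs/'.
    blob_path = subject.split("/blobs/", 1)[-1]
    return json.dumps({"event_type": event_type, "blob": blob_path})

# In the deployed function, the Event Grid trigger binding calls main(),
# and the queue output binding (named 'msg' in function.json) persists
# the message. Sketch of that wiring:
#
#   import azure.functions as func
#
#   def main(event: func.EventGridEvent, msg: func.Out[str]) -> None:
#       msg.set(build_queue_message(event.event_type, event.subject))
```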

8. Let’s set up an event trigger, and for this, we need to create an event subscription.

9. The event type filters we must select for this demo are ‘Blob Created’ and ‘Blob Deleted’.
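Conceptually (a sketch of the behavior, not Event Grid’s implementation), this filter keeps only the events whose type is one of the selected types:

```python
# Event types selected in the subscription filter for this demo.
INCLUDED_EVENT_TYPES = {
    "Microsoft.Storage.BlobCreated",
    "Microsoft.Storage.BlobDeleted",
}

def passes_filter(event: dict) -> bool:
    """Return True if the event would be delivered to our handler."""
    return event.get("eventType") in INCLUDED_EVENT_TYPES
```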

10. For that Event Subscription, we need to set the endpoint to our Event Handler, the Function App we created.

11. We need to integrate the Function App with Event Grid and Queue.

12. Let’s upload a file called ‘CloudThat_Logo.png’ into the container.

13. If we go to the Function App monitor page, we will see a function run for the event triggered by the file we uploaded to our container.

14. If we open the queue now, there will be a message already written by the Function App.

Conclusion

As you can see, without complicated code or expensive, time-consuming polling services, Azure Storage events let you react to different kinds of events, and the best part is that you pay only for what you use.

Drop a query if you have any questions regarding Azure Storage and we will get back to you quickly.


About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, Amazon QuickSight Service Delivery Partner, AWS EKS Service Delivery Partner, and Microsoft Gold Partner, helping people develop knowledge of the cloud and helping businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

To get started, go through our Consultancy page and Managed Services Package, CloudThat’s offerings.

FAQs

1. What are the different Azure Blob REST API calls?

ANS: – The main and most used Azure Blob REST API calls are:

  1. CopyBlob
  2. PutBlob
  3. PutBlockList
  4. FlushWithClose
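
As an illustration of where these API names show up, a delivered event reports the call that caused it in its `data.api` field. The mapping below is an assumption for this sketch, pairing common blob operations with the API names listed above:

```python
# Illustrative mapping (an assumption, for orientation only) of common
# blob operations to the REST API names that appear in an event's data.api.
BLOB_API_FOR_OPERATION = {
    "copy": "CopyBlob",
    "single-shot upload": "PutBlob",
    "block upload commit": "PutBlockList",
    "Data Lake Gen2 upload finalize": "FlushWithClose",
}
```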

2. Which Azure Storage does not support Event Grid?

ANS: – Storage (general purpose v1) does not support integration with Event Grid.

WRITTEN BY Yaswanth Tippa

Yaswanth Tippa is working as a Research Associate - Data and AIoT at CloudThat. He is a highly passionate and self-motivated individual with experience in data engineering and cloud computing with substantial expertise in building solutions for complex business problems involving large-scale data warehousing and reporting.
