A Demonstration of Automating and Orchestrating Data Using the Azure Data Factory Service

Overview

In this lab, the services involved are SQL Server, SQL Database, Storage Account, Data Factory, and SQL Server Management Studio (SSMS).
We will use data created in a SQL database as the source and a Storage Account as the destination, where the data will be stored in a container in JSON format.

Steps to Create SQL Database Server

  1. Open the Azure Portal (Home – Microsoft Azure), search for SQL servers, and click the Create button.
  2. Select your Subscription and Resource Group, and provide the required details as shown in the screenshots below.

[Screenshot: Step 2]

3. Select Use SQL authentication, provide the Server admin login and password, and then click the Review + Create button to create the server, as shown in the screenshot below.

[Screenshot: Step 3]

4. After the server is created successfully, go to the Networking tab under the Security section and click Add your client IP address. Adding your client IP address lets you reach the server publicly when connecting through SQL Server Management Studio.
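
Portal steps 1-4 can also be scripted. Here is a minimal Azure CLI sketch, assuming placeholder names for the resource group, server, region, and credentials:

    # Create the logical SQL server with SQL authentication (placeholder values)
    az sql server create \
      --name demo-sqlserver-01 \
      --resource-group demo-rg \
      --location eastus \
      --admin-user sqladmin \
      --admin-password '<YourStrongPassword>'

    # Allow your client IP so SSMS can reach the server publicly
    az sql server firewall-rule create \
      --resource-group demo-rg \
      --server demo-sqlserver-01 \
      --name AllowMyClientIP \
      --start-ip-address <your-public-ip> \
      --end-ip-address <your-public-ip>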

5. Download the SSMS tool from the Download SQL Server Management Studio (SSMS) page on Microsoft Learn.

[Screenshot: Step 5]

Steps to Create a SQL Database on the Server Created Above

  1. Search for SQL databases, provide the required fields, and select the server you created in the steps above.
  2. Click the Review + Create button (a CLI sketch follows the screenshot below).

[Screenshot: Step 2]
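
This step can be scripted as well; a minimal Azure CLI sketch, reusing the placeholder names from the sketch above (the Basic service tier is an assumption, so pick whatever fits your lab):

    # Create a database on the server created above
    az sql db create \
      --resource-group demo-rg \
      --server demo-sqlserver-01 \
      --name demo-db \
      --service-objective Basic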

Configuration Setup for SSMS

  1. After installing SSMS, open it and provide the Server name and Authentication details you set up in the first step while creating the server.
  2. Now click Connect, and you will be taken into the studio.

[Screenshot: Step 2]

3. Click New Query and execute the two queries, one creating a table and the other inserting some data into it, as shown below; a hypothetical version of these queries follows the screenshot.

[Screenshot: Step 3]
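
The post does not reproduce the two queries themselves, so the following T-SQL is a hypothetical stand-in; the Employees table and its columns are illustrative only:

    -- Query 1: create a sample table
    CREATE TABLE Employees (
        EmpID INT PRIMARY KEY,
        Name  VARCHAR(50),
        City  VARCHAR(50)
    );

    -- Query 2: insert a few sample rows
    INSERT INTO Employees (EmpID, Name, City)
    VALUES (1, 'Ravi',  'Hyderabad'),
           (2, 'Anita', 'Bengaluru'),
           (3, 'John',  'Chennai');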

Steps to Create Storage Account

  1. Create a sample storage account as shown below. The storage account is where the finalized data from the ADF service will be stored automatically.
  2. Also, create a Container in the same storage account (a CLI sketch follows the screenshots).

[Screenshots: Step 2]
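
Both steps can be scripted too; a minimal Azure CLI sketch with placeholder names (depending on your setup, the container command may need --auth-mode login or an account key):

    # Create the storage account that will hold the copied data
    az storage account create \
      --name demostorageacct01 \
      --resource-group demo-rg \
      --location eastus \
      --sku Standard_LRS

    # Create a container inside it for the JSON output
    az storage container create \
      --name output \
      --account-name demostorageacct01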

Steps to Create Azure Data Factory

  1. Search for the Data Factory service and create it (a CLI sketch follows the screenshot below).

[Screenshot: Step 1]
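
If you prefer the CLI, the Data Factory commands live in a separate extension; a minimal sketch, assuming the placeholder names used earlier (the factory name must be globally unique):

    # One-time: install the Data Factory CLI extension
    az extension add --name datafactory

    # Create the data factory
    az datafactory create \
      --resource-group demo-rg \
      --name demo-adf-01 \
      --location eastus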

2. After it is created successfully, go to the resource and click Launch Studio.

[Screenshot: Step 2]

Steps to Configure the Azure Data Factory

  1. After you click Launch Studio, you will be redirected to a different page, as shown below.
  2. Now click the Ingest operation, which is used purely for copying data from one source to another. Follow the screenshots below.

[Screenshot: Step 2]

3. As we want to trigger it only once, select Run once now, as shown below.

[Screenshot: Step 3]

4. Here, select the source type as Azure SQL Database, because we have stored the data in table format.

5. Click New Connection to establish the connection between ADF and the SQL Database, and follow the screenshots below.

[Screenshots: Step 5]

6. Once the connection is established, it will fetch all the table details from the database, as shown below. You can also preview the table.

[Screenshots: Step 6]

7. Select the Destination data store as Blob Storage and establish the connection between the two services by creating a New connection.

[Screenshots: Step 7]

8. After a successful connection, provide the folder path and file name as shown below.

[Screenshot: Step 8]

9. In this step, you must select the file format and pattern that determine how the data is stored at the destination. Select the JSON format, as shown in the screenshots below, and click Next.

[Screenshots: Step 9]

10. In summary, the data will be copied automatically from the source data store, the SQL database, and stored in JSON format in Blob Storage by a trigger that runs only once (a sketch of the pipeline this wizard generates follows the screenshots below).

11. Click on Next.

[Screenshots: Step 11]
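
Under the hood, the Ingest wizard generates a copy pipeline. The sketch below shows roughly the kind of definition it produces; the pipeline, activity, and dataset names are illustrative, and the names the wizard actually generates will differ:

    {
      "name": "CopyFromSqlToBlob",
      "properties": {
        "activities": [
          {
            "name": "CopySqlTableToJson",
            "type": "Copy",
            "inputs":  [ { "referenceName": "SourceSqlTable", "type": "DatasetReference" } ],
            "outputs": [ { "referenceName": "SinkJsonBlob", "type": "DatasetReference" } ],
            "typeProperties": {
              "source": { "type": "AzureSqlSource" },
              "sink":   { "type": "JsonSink" }
            }
          }
        ]
      }
    }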

12. After the deployment completes, go to the storage account -> containers, where a new file will have been created with the name dbstorage, as we gave this name as the file name in the ADF configuration steps.

[Screenshot: Step 12]

13. Download the blob file, and you will see the JSON data shown below, matching the rows we inserted into the database; an illustrative sample follows the screenshot.

[Screenshot: Step 13]
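
With the hypothetical Employees table from the SSMS section and the "set of objects" file pattern, the downloaded file would contain one JSON object per row, along these lines (illustrative values):

    {"EmpID":1,"Name":"Ravi","City":"Hyderabad"}
    {"EmpID":2,"Name":"Anita","City":"Bengaluru"}
    {"EmpID":3,"Name":"John","City":"Chennai"}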

Conclusion

Using the ADF service, we can move data from on-premises to cloud systems by automating and orchestrating data movement and transformation.

We can also schedule the service so that the data is copied to the cloud system on a recurring basis, as sketched below.
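
For example, replacing Run once now with a schedule trigger runs the same pipeline on a recurrence. A minimal sketch of such a trigger definition, with illustrative names and times:

    {
      "name": "DailyCopyTrigger",
      "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
          "recurrence": {
            "frequency": "Day",
            "interval": 1,
            "startTime": "2024-01-01T00:00:00Z",
            "timeZone": "UTC"
          }
        },
        "pipelines": [
          {
            "pipelineReference": {
              "referenceName": "CopyFromSqlToBlob",
              "type": "PipelineReference"
            }
          }
        ]
      }
    }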

About CloudThat

CloudThat is a leading provider of Cloud Training and Consulting services with a global presence in India, the USA, Asia, Europe, and Africa. Specializing in AWS, Microsoft Azure, GCP, VMware, Databricks, and more, the company serves mid-market and enterprise clients, offering comprehensive expertise in Cloud Migration, Data Platforms, DevOps, IoT, AI/ML, and more.

CloudThat is the first Indian Company to win the prestigious Microsoft Partner 2024 Award and is recognized as a top-tier partner with AWS and Microsoft, including the prestigious ‘Think Big’ partner award from AWS and the Microsoft Superstars FY 2023 award in Asia & India. Having trained 850k+ professionals in 600+ cloud certifications and completed 500+ consulting projects globally, CloudThat is an official AWS Advanced Consulting Partner, Microsoft Gold Partner, AWS Training Partner, AWS Migration Partner, AWS Data and Analytics Partner, AWS DevOps Competency Partner, AWS GenAI Competency Partner, Amazon QuickSight Service Delivery Partner, Amazon EKS Service Delivery Partner, AWS Microsoft Workload Partner, Amazon EC2 Service Delivery Partner, Amazon ECS Service Delivery Partner, AWS Glue Service Delivery Partner, Amazon Redshift Service Delivery Partner, AWS Control Tower Service Delivery Partner, AWS WAF Service Delivery Partner, Amazon CloudFront Service Delivery Partner, Amazon OpenSearch Service Delivery Partner, AWS DMS Service Delivery Partner, AWS Systems Manager Service Delivery Partner, Amazon RDS Service Delivery Partner, AWS CloudFormation Service Delivery Partner, AWS Config, Amazon EMR, and many more.

FAQs

1. Can we schedule the Data Factory service?

ANS: – Yes, we can either schedule it or run it only once.

2. Can we select the different services as the destination store?

ANS: – Yes, we get a list of services to choose from while selecting the destination service.

WRITTEN BY Sridhar Andavarapu

Sridhar Andavarapu is a Senior Research Associate at CloudThat, specializing in AWS, Python, SQL, data analytics, and Generative AI. With extensive experience in building scalable data pipelines, interactive dashboards, and AI-driven analytics solutions, he helps businesses transform complex datasets into actionable insights. Passionate about emerging technologies, Sridhar actively researches and shares insights on AI, cloud analytics, and business intelligence. Through his work, he aims to bridge the gap between data and strategy, helping enterprises unlock the full potential of their analytics infrastructure.
