
A Demonstration of Automating and Orchestrating Data Using the Azure Data Factory Service



In this lab, the services involved are SQL Server, SQL Database, Storage Account, Data Factory, and SQL Server Management Studio (SSMS).
Here, we will use data created in the SQL database as the source and a Storage Account as the destination, where the data will be stored in a container in JSON format.

Steps to Create SQL Database Server

  1. Open the Azure Portal (Home – Microsoft Azure), search for SQL Servers, and click the Create button.
  2. Select your Subscription and Resource Group and provide the required details as shown in the below screenshots.


3. Select Use SQL Authentication as the authentication method, provide the Server admin login and password, then click the Review + Create button as shown in the below screenshot to create the server.


4. After successfully creating the server, go to the Networking tab under the Security section and click Add your client IP address. Adding your client IP address allows you to reach the server over its public endpoint when connecting through SQL Server Management Studio.

5. Download the SSMS tool from the Download SQL Server Management Studio (SSMS) page on Microsoft Learn.



Steps to Create a SQL Database on the Server Created Above

  1. Search for SQL Databases, provide the required fields, and select the server you created in the above steps.
  2. Click on the Review + Create button.


Configuration Setup for SSMS

  1. After installing SSMS, open it and provide the Server name and Authentication details you set in the first step while creating the server.
  2. Now click on Connect, and you will be connected to the server in the studio.


3. Click on New Query and execute the two queries that create a table and insert some data into it, as shown below.
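The two queries simply create a table and seed it with rows. A minimal sketch of what they might look like, run here against Python's in-memory sqlite3 purely for illustration (in the lab you would run T-SQL in SSMS; the table name, columns, and sample values are assumptions, not the lab's actual data):

```python
import sqlite3

# In the lab these statements run in SSMS against the Azure SQL database;
# sqlite3 is used here only to illustrate the two queries.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Query 1: create a sample table (hypothetical name and columns).
cur.execute("""
CREATE TABLE Employees (
    Id   INTEGER PRIMARY KEY,
    Name TEXT NOT NULL,
    Role TEXT NOT NULL
)
""")

# Query 2: insert some sample rows.
cur.executemany(
    "INSERT INTO Employees (Id, Name, Role) VALUES (?, ?, ?)",
    [(1, "Alice", "Engineer"), (2, "Bob", "Analyst")],
)
conn.commit()

rows = cur.execute("SELECT Id, Name, Role FROM Employees ORDER BY Id").fetchall()
print(rows)  # [(1, 'Alice', 'Engineer'), (2, 'Bob', 'Analyst')]
```

Whatever table and rows you create here are what ADF will later pick up as the source.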


Steps to Create Storage Account

  1. Create a sample storage account as shown below. The storage account will hold the finalized data written automatically by the ADF service.
  2. Also, create a container in the same storage account.



Steps to Create Azure Data Factory

  1. Search for the Data Factory service and create it.


2. After successfully creating it, go to the resource and click on Launch Studio.


Steps to Configure the Azure Data Factory

  1. After you click Launch Studio, you will be redirected to a different page, as shown below.
  2. Now click on the Ingest operation; it is intended only for copying data from one data store to another. Follow the below screenshots.


3. As we want to trigger the pipeline only once, select Run once now, as shown below.


4. Here, select the source type as Azure SQL Database, because we have stored the data in table format.

5. Click on New Connection to establish the connection between ADF and the SQL database, and follow the screenshots below.
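Behind the New Connection dialog, ADF creates a linked service for the database. A rough sketch of the JSON definition it generates for an Azure SQL Database connection (the linked service name, server, database, and credentials are placeholders, not values from this lab):

```json
{
  "name": "AzureSqlDatabaseLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<your-server>.database.windows.net,1433;Database=<your-db>;User ID=<admin-login>;Password=<password>;Encrypt=True;"
    }
  }
}
```

This is the same server name, admin login, and password you used when connecting through SSMS.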



6. Once the connection is established, ADF will fetch all the table details from the database, as shown below. You can also preview the tables.



7. Select the destination data store as Azure Blob Storage and establish the connection between the two services by creating a New connection.



8. After a successful connection, provide the folder path and file name as shown below.


9. In this step, select the file format and pattern in which the data should be stored in the destination. Select the JSON format as shown in the below screenshots, then click Next.



10. The summary shows that the data will be copied from the source data store (the SQL database) and stored in JSON format in Blob Storage, triggered only once.
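Under the hood, the Ingest wizard generates a pipeline containing a single Copy activity. A trimmed sketch of roughly what that pipeline JSON looks like (the pipeline, activity, and dataset names are assumptions; the wizard generates its own names):

```json
{
  "name": "CopySqlToBlobPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopySqlTableToJson",
        "type": "Copy",
        "inputs": [ { "referenceName": "SqlTableDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "JsonBlobDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "AzureSqlSource" },
          "sink": {
            "type": "JsonSink",
            "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
          }
        }
      }
    ]
  }
}
```

You can inspect the generated pipeline later under the Author tab in ADF Studio.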

11. Click on Next.



12. After the deployment completes, go to the Storage Account -> Containers; a new file will have been created with the name dbstorage, the file name we specified in the ADF configuration steps.


13. Download the blob file, and you will see the JSON data shown below, matching the data we inserted into the database.
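For reference, ADF's JSON sink (with the default "Set of objects" file pattern) writes each table row as one JSON object. A minimal local illustration of that shape; the column names and values here are invented for illustration, not the lab's actual data:

```python
import json

# Rows as they might come back from the SQL source (hypothetical sample data).
columns = ["Id", "Name", "Role"]
rows = [(1, "Alice", "Engineer"), (2, "Bob", "Analyst")]

# One JSON object per row, one object per line, mimicking the shape
# of the blob that the copy activity produces.
lines = [json.dumps(dict(zip(columns, row))) for row in rows]
blob_content = "\n".join(lines)
print(blob_content)
```

Each line of the downloaded blob corresponds to one row of the source table.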



Using this ADF service, we can move data from on-premises to cloud systems while automating and orchestrating data transformation.

We can also schedule this service to move data to the cloud system on a recurring basis.
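Scheduling is done by attaching a trigger to the pipeline instead of choosing Run once now. A rough sketch of a daily Schedule trigger definition (the trigger name, pipeline reference, and start time are placeholders/assumptions):

```json
{
  "name": "DailyCopyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "<your-pipeline-name>",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

Once the trigger is published and started, the copy runs automatically on the defined recurrence.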


About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner and a Microsoft Gold Partner, helping people develop cloud knowledge and helping businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

Drop a query if you have any questions regarding Azure Data Factory, and I will get back to you quickly.

To get started, go through our Consultancy page and Managed Services Package, CloudThat's offerings.


FAQs

1. Can we schedule the Data Factory service?

ANS: – Yes, we can either schedule it on a recurrence or run it only once.

2. Can we select the different services as the destination store?

ANS: – Yes, a list of supported services is shown while selecting the destination data store.

WRITTEN BY Sridhar Andavarapu

Sridhar works as a Research Associate at CloudThat. He is highly skilled in both frontend and backend, with good practical knowledge of Python, Azure services, AWS services, and ReactJS. Sridhar is interested in sharing his knowledge with others to help them improve their skills too.



