Init Scripts in Databricks for Consistent Environments and Error-Free Deployment

Introduction

Init Scripts (a.k.a. Initialization Scripts) are shell scripts that run during a system's boot process to start required processes. In other words, init scripts are a set of instructions a computer follows when it starts up. They perform tasks such as checking hardware components, loading essential software, configuring network settings, and starting important services or programs.

In the context of Databricks, an init script is a shell script that runs during the startup of each cluster node, before the Apache Spark driver or worker JVM starts. When you work with data in Databricks, you often need to set environment variables, install specific libraries, and so on. An init script helps in such cases by executing a series of steps each time you start your Databricks cluster. In short, an init script in Databricks is an automated script that prepares your computing environment before you start your data analysis and machine learning tasks.

Types of Init Script in Databricks

Databricks officially supports two kinds of init scripts:

  1. Cluster-level Init Scripts:
  • These init scripts are called ‘Cluster-scoped Init Scripts’.
  • They run on every cluster configured with the script.
  • This is the approach Databricks recommends for running init scripts.
  • Cluster-level init scripts help you standardize the setup across multiple clusters in the workspace.

  2. Workspace-level Init Scripts:
  • These init scripts are called ‘Global Init Scripts’.
  • They run on every cluster available in the workspace.
  • These init scripts can ensure that a specific cluster configuration is enforced consistently across the workspace.

When you configure both of the above types of init scripts in your workspace, Databricks follows a specific order of execution:

  1. Global init script
  2. Cluster-scoped init script

Remember that each time you create a new init script or modify an existing one, you must restart every cluster configured with that script for the change to take effect.
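For reference, here is a minimal sketch of how a cluster-scoped init script can be attached through the cluster definition using the Databricks CLI. The cluster name, runtime version, node type, and script path are placeholders, and the exact flag for passing the JSON definition may differ between Databricks CLI versions.

# Minimal sketch: attach a cluster-scoped init script stored as a workspace file.
# All names, versions, and paths below are placeholders.
cat > cluster.json <<'EOF'
{
  "cluster_name": "demo-cluster",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "Standard_DS3_v2",
  "num_workers": 2,
  "init_scripts": [
    { "workspace": { "destination": "/Users/user_name/init/setup.sh" } }
  ]
}
EOF

# Create the cluster from the definition (flag name may differ by CLI version).
databricks clusters create --json-file cluster.json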


Environment Variables

Cluster-scoped init scripts and global init scripts support the Databricks environment variables below (a usage sketch follows the list):

  1. DB_CLUSTER_ID: This variable returns the ID of the cluster on which the init script is currently running.
  2. DB_CONTAINER_IP: This variable returns the private IP address of the container in which Spark runs. The init script runs inside this container.
  3. DB_IS_DRIVER: This variable returns a Boolean value based on whether the init script runs on a driver node.
  4. DB_DRIVER_IP: This variable returns the IP address of the driver node.
  5. DB_INSTANCE_TYPE: This variable returns the instance type of the virtual machine hosting the cluster node.
  6. DB_CLUSTER_NAME: This variable returns the cluster name on which the init script executes.
  7. DB_IS_JOB_CLUSTER: This variable returns a Boolean value based on whether the cluster was created to run a job.
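As a rough sketch of how these variables are typically used, the snippet below logs cluster details and runs one step only on the driver node. The echoed messages and the driver-only step are placeholders, and comparing DB_IS_DRIVER against the string 'TRUE' follows the convention commonly seen in Databricks init script examples.

#!/bin/bash
# Sketch: use the Databricks-provided environment variables inside an init script.
echo "Init script running on cluster ${DB_CLUSTER_NAME} (${DB_CLUSTER_ID})"
echo "Instance type: ${DB_INSTANCE_TYPE}, container IP: ${DB_CONTAINER_IP}"

if [[ "${DB_IS_DRIVER}" = "TRUE" ]]; then
  # Placeholder for setup that should run only on the driver node.
  echo "Running driver-only setup (driver IP: ${DB_DRIVER_IP})"
fi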

Use Cases

Init scripts in Databricks offer several use cases that can enhance your workflow and streamline your data analysis processes. Below are some of the most common scenarios where the usage of init scripts can be beneficial:

  1. Library installations: With init scripts, you can install libraries and their dependencies that are not included in the Databricks runtime. This ensures all the required software components are readily available when your cluster starts, saving you the time and effort of installing them manually each time (see the sketch after this list).
  2. Configuring an artifact repository: The required libraries may reside in a private artifact repository such as JFrog Artifactory. To comply with your organization's security policies, you may be allowed to install libraries only from that repository. Init scripts can automate the repository configuration, such as setting the repository URL, passing credentials, and retrieving access tokens.
  3. Data preprocessing: If you need to perform certain data preprocessing tasks before you start your data analysis, init scripts can help. For example, you can use them to download and prepare datasets, clean data, or transform data into a suitable format, ensuring your data is ready for analysis before you start your work.
  4. Configuring a custom SSL certificate authority: To avoid connection errors to your endpoints, you may have to import custom CA certificates, which must be placed in ‘/etc/ssl/certs’ for Databricks to verify them. In this case, you can use an init script to copy them from their source path into the Databricks-recommended path every time you run your cluster.
  5. Configuring third-party observability tools such as Datadog or Amazon CloudWatch.
  6. Configuring third-party governance tools such as Immuta or Protegrity.
  7. Configuring an external Hive Metastore: When you work with large amounts of data in Databricks, you may have multiple datasets or tables stored in various formats like Parquet, CSV, or JSON. An external Hive Metastore is a centralized catalog that stores information about those datasets, including their location, structure, and metadata. Instead of manually specifying the details of each dataset, you can register the datasets with the Hive Metastore with the help of an init script. The registration process involves specifying the location of the datasets, such as Azure Blob Storage or Amazon S3.
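For use cases 1 and 2 above, a cluster-scoped init script might look like the rough sketch below. The repository URL and package names are placeholders for your own artifact repository and libraries, and the pip path shown is the one commonly used on Databricks runtimes.

#!/bin/bash
# Sketch: install extra Python libraries on every node, pulling them from a
# private artifact repository. The URL and packages are placeholders.
set -e

# Point pip at the organization's artifact repository (placeholder URL).
export PIP_INDEX_URL="https://artifactory.example.com/artifactory/api/pypi/pypi-virtual/simple"

# Install libraries that are not included in the Databricks runtime.
/databricks/python/bin/pip install --no-cache-dir unidecode simplejson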

Example

The shell script below is a rough sketch of a global init script that copies a custom CA certificate named ‘MyCA.pem’ from the ‘/dbfs/user/user_name/’ path into the ‘/etc/ssl/certs’ path on each cluster node every time the cluster starts.
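#!/bin/bash
# Rough sketch: copy a custom CA certificate from its DBFS path into the
# system certificate directory on every node at cluster startup.
set -e

cp /dbfs/user/user_name/MyCA.pem /etc/ssl/certs/MyCA.pem
chmod 644 /etc/ssl/certs/MyCA.pem

To apply it, register the script as a global init script in the workspace admin settings (or attach it to specific clusters as a cluster-scoped init script) and restart the affected clusters.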

Conclusion

Init scripts allow you to standardize the setup process across teams and projects by ensuring consistent environments and reducing the chances of discrepancies or errors.

You can easily share and reproduce your workspace or cluster configurations with colleagues or other teams by automating the setup steps.


About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner, as well as a Microsoft Gold Partner, helping people develop knowledge of the cloud and helping businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all stakeholders in the cloud computing sphere.

Drop a query if you have any questions regarding Init Scripts or Databricks, and I will get back to you quickly.

To get started, go through our Consultancy page and Managed Services Package to learn about CloudThat's offerings.

FAQs

1. Are there any other types of Init Scripts that Databricks has stopped supporting?

ANS: – Yes. Databricks has stopped supporting ‘Legacy Global Init Scripts’ and ‘Cluster-named Init Scripts’. Both are deprecated and cannot be used on new workspaces.

2. How can I check if my Databricks workspace still contains ‘Legacy Global Init Scripts’?

ANS: – ‘/databricks/init’ is the reserved DBFS location for legacy global init scripts in every Databricks workspace. If that folder contains any scripts, your workspace still has legacy global init scripts.
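As a rough sketch, you can list that location from a notebook attached to a running cluster via the DBFS FUSE mount, assuming the mount is available at ‘/dbfs’:

%sh
# List any remaining legacy global init scripts (the folder may not exist).
ls -l /dbfs/databricks/init/ 2>/dev/null || echo "No legacy global init scripts found."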

3. Who can create Global Init Scripts?

ANS: – Only ‘Workspace Admins’ can create global init scripts in a Databricks workspace.

WRITTEN BY Yaswanth Tippa

Yaswanth Tippa is working as a Research Associate - Data and AIoT at CloudThat. He is a highly passionate and self-motivated individual with experience in data engineering and cloud computing, and substantial expertise in building solutions for complex business problems involving large-scale data warehousing and reporting.
