OCT. 10, 2025
8 Min Read
The Databricks CLI is particularly useful for operations that aren’t available through the graphical interface, such as creating secret scopes or adjusting access control lists (ACLs), which must be done through the CLI or the REST API. In this hands-on tutorial, I’ll guide you through installing the CLI, setting up secure authentication, and managing secrets effectively. You’ll find code snippets and screenshots along the way to make each step clear and actionable.
Installing Databricks CLI
Before managing secrets, you’ll first need to install the Databricks CLI. Ensure you install the latest release since the new version introduces a revised command structure compared to the legacy one. The full installation guide is available in the official Databricks documentation.
On Windows, I’m using winget to handle the installation. Launch Command Prompt or PowerShell and run the following commands:
winget search databricks
winget install Databricks.DatabricksCLI
Once the setup finishes, verify that the CLI is installed by checking its version:
databricks -v
If the CLI is installed correctly, you’ll see the version number returned in your terminal.

Authenticating with Databricks CLI
Next comes authentication. There are multiple ways to authenticate the Databricks CLI, and one of the simplest is a Personal Access Token (PAT). However, PATs are not the most secure option and come with restrictions; for instance, account admins cannot use them to access the Databricks Account Console.
For stronger security and broader access, I suggest using User-to-Machine (U2M) authentication, which supports both workspace and account-level operations.
If you’re an account administrator looking to access the Databricks Account Console via CLI, follow steps 1–5 from the official U2M authentication documentation. You might also want to check out the related guide: How to Generate a PAT for a Databricks Service Principal (Step-by-Step Guide).
To authenticate using U2M, first locate your Databricks workspace URL. Then, in your terminal, execute the command below:
databricks auth login --host https://dbc-<your_workspace_id>.cloud.databricks.com
The CLI will ask you to confirm a profile name. By default, it uses your workspace ID, but it’s more convenient to rename this to something readable, like dev for your development workspace or a project-specific label.
Here’s an example of setting a custom alias:
# Example of setting a more readable profile name
databricks auth login --host https://dbc-1234567890.cloud.databricks.com
# Then rename the generated profile in the CLI config to 'dev'
This allows you to reference your environment later using --profile dev when running commands.
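For example, once the dev profile exists, you can point any command at it. One quick check is the current-user command, which returns the identity you’re authenticated as:
databricks current-user me --profile dev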

After running the authentication command, your default web browser will open your Databricks workspace for identity verification. If you’re not already logged in, it will prompt you to sign in. Once completed, the CLI generates a .databrickscfg file in your user home directory, containing your authentication credentials and profile configuration.

Setting up the CLI profile
Inside the .databrickscfg file, you’ll find the profile that was just created during authentication. This section includes your workspace URL, token or authentication type, and other relevant configuration data. You can manage several environments (for example, dev, staging, and prod) by defining multiple profiles in this same file and switching between them via the --profile flag when executing CLI commands.

In my example, I have a single profile called dev, which uses databricks-cli as its authentication type. You’ll also notice a DEFAULT profile is automatically included. It’s a good practice to set this DEFAULT profile to your most frequently used workspace so you don’t have to type --profile dev every time you run a command.
To configure it, simply copy the host and auth_type values from your dev profile and paste them under the DEFAULT section within the .databrickscfg file.
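For reference, the resulting file might look like this (the host below is a placeholder for your own workspace URL):
[DEFAULT]
host      = https://dbc-1234567890.cloud.databricks.com
auth_type = databricks-cli

[dev]
host      = https://dbc-1234567890.cloud.databricks.com
auth_type = databricks-cli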

Common Databricks CLI commands
If you haven’t logged in yet, start with:
databricks auth login
This initiates the login sequence based on your configured authentication type. Once authenticated, you can start using the CLI to interact with your Databricks workspace.
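If your session expires later, you can re-run the login for an existing profile by name instead of repeating the --host flag:
databricks auth login --profile dev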

To confirm that everything works properly, try listing your clusters:
databricks clusters list
This displays all available clusters in your workspace, including their names, IDs, and current states, confirming your setup is working.
Creating a secret scope
To see what secret-related options are available, run:
databricks secrets
You’ll get a list of available subcommands, such as creating scopes, adding secrets, or setting access permissions.
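In the new CLI these include, among others, create-scope, delete-scope, list-scopes, put-secret, get-secret, delete-secret, list-secrets, put-acl, get-acl, delete-acl, and list-acls.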

Let’s create a new secret scope. Replace "demo" with your chosen scope name and "dev" with your CLI profile name:
databricks secrets create-scope demo -p dev
This command creates a new secret scope in your Databricks workspace under the specified profile.
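You can verify it was created by listing all scopes in the workspace:
databricks secrets list-scopes -p dev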
Adding a secret to a scope
Now that the scope exists, you can add a secret to it:
databricks secrets put-secret demo my_secret
This command prompts you to securely enter a secret value. The characters you type won’t appear on the screen, ensuring your sensitive information stays hidden.
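To confirm the key was stored, list the keys in the scope; only key names and timestamps are returned, never the values:
databricks secrets list-secrets demo -p dev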

Managing secret ACLs
Next, let’s manage who has access to your new scope. To view its current ACLs (Access Control Lists), use:
databricks secrets list-acls demo
As the scope’s creator, you automatically have MANAGE permission. Other users won’t be able to view, modify, or delete secrets unless you explicitly grant them rights.

To provide another user or group with access, use the put-acl command. For instance, to give the account_admins group READ permissions, run:
databricks secrets put-acl demo account_admins READ
This command grants read-only access to that group for all secrets in the demo scope.
To confirm, list the ACLs again:
databricks secrets list-acls demo
You should now see updated entries reflecting the new permissions.
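If you only want to check a single principal rather than the full list, get-acl returns that one entry:
databricks secrets get-acl demo account_admins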

Storing JSON credentials as secrets
Sometimes you’ll need to store structured data, such as a service account key in JSON format. You can pipe the JSON directly into the CLI using this approach:
(cat << EOF
{
"type": "service_account",
"project_id": "project-123456",
"private_key_id": "1233445",
"private_key": "-----BEGIN PRIVATE KEY-----\nMII12345555...",
"client_id": "123456",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/project-123456.iam.gserviceaccount.com",
"universe_domain": "googleapis.com"
}
EOF
) | databricks secrets put-secret myscope mykey
This stores the JSON as a secret called mykey under the scope myscope.
Note: Running multi-line commands on Windows can be problematic. If you run into issues, try using Git Bash or Windows Subsystem for Linux (WSL), which handles these commands more reliably.
Reading secrets from a notebook or a job
Once your secrets are stored, you can retrieve them from within Databricks notebooks or jobs using:
my_secret = dbutils.secrets.get(scope="demo", key="my_secret")
This command retrieves the actual secret value and assigns it to the variable my_secret, allowing you to use it in your scripts, for instance when authenticating to an external API or database.
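If the secret holds structured data, such as the JSON service-account key stored earlier, you can parse it back into a dictionary. Here’s a minimal sketch, assuming the myscope and mykey names from the previous section (dbutils is available in notebooks without an import):
import json

# Fetch the JSON credentials stored earlier and parse them into a dict.
# "myscope" and "mykey" are the example names used in the previous section.
creds = json.loads(dbutils.secrets.get(scope="myscope", key="mykey"))

# Individual fields are then available as dictionary keys.
project_id = creds["project_id"]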
If you attempt to print the secret directly, Databricks will mask it by displaying [REDACTED] instead of the actual value. This behavior helps prevent accidental exposure of credentials.

But if you’re curious, there’s a little trick:

my_secret = dbutils.secrets.get(scope="demo", key="my_secret")
for char in my_secret:
    print(char)
When you print the secret character by character, Databricks doesn’t apply redaction, and you’ll see the full secret vertically in the output.
This happens because the redaction filter applies only when the full string is printed, not when iterating over its characters.
Use with caution: although this trick works, you should never use it in shared notebooks, logs, or production code. Redaction exists to protect sensitive information from accidental leaks.
Summary
Databricks secrets offer a convenient built-in option for securely managing credentials in your workspace. However, for production workloads or environments requiring stricter governance, it’s better to use a dedicated secret management system such as AWS Secrets Manager, Azure Key Vault, or another cloud-native solution.
Still, when simplicity or accessibility is your goal, this guide gives you a complete, safe, and effective approach to managing secrets directly through the Databricks CLI.

