w.storage_credentials: Storage Credentials

class databricks.sdk.service.catalog.StorageCredentialsAPI

A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant. Each storage credential is subject to Unity Catalog access-control policies that control which users and groups can access the credential. If a user does not have access to a storage credential in Unity Catalog, the request fails and Unity Catalog does not attempt to authenticate to your cloud tenant on the user’s behalf.

Databricks recommends using external locations rather than using storage credentials directly.

To create storage credentials, you must be a Databricks account admin. The account admin who creates the storage credential can delegate ownership to another user or group to manage permissions on it.

create(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityRequest], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], read_only: Optional[bool], skip_validation: Optional[bool]]) StorageCredentialInfo

Usage:

import os
import time

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

created = w.storage_credentials.create(
    name=f'sdk-{time.time_ns()}',
    aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]))

# cleanup
w.storage_credentials.delete(name=created.name)

Create a storage credential.

Creates a new storage credential.

Parameters:
  • name – str The credential name. The name must be unique within the metastore.

  • aws_iam_role – AwsIamRoleRequest (optional) The AWS IAM role configuration.

  • azure_managed_identity – AzureManagedIdentityRequest (optional) The Azure managed identity configuration.

  • azure_service_principal – AzureServicePrincipal (optional) The Azure service principal configuration.

  • cloudflare_api_token – CloudflareApiToken (optional) The Cloudflare API token configuration.

  • comment – str (optional) Comment associated with the credential.

  • databricks_gcp_service_account – DatabricksGcpServiceAccountRequest (optional) The Databricks managed GCP service account configuration.

  • read_only – bool (optional) Whether the storage credential is only usable for read operations.

  • skip_validation – bool (optional) Supplying true to this argument skips validation of the created credential.

Returns:

StorageCredentialInfo

delete(name: str [, force: Optional[bool]])

Delete a credential.

Deletes a storage credential from the metastore. The caller must be an owner of the storage credential.

Parameters:
  • name – str Name of the storage credential.

  • force – bool (optional) Force deletion even if there are dependent external locations or external tables.
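The delete call follows the same pattern as the other examples in this section. A minimal sketch (the credential name here is a placeholder, and force=True is shown only to illustrate the flag):

```python
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

# force=True removes the credential even if dependent external locations
# or external tables still reference it.
w.storage_credentials.delete(name='sdk-credential-name', force=True)
```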

get(name: str) StorageCredentialInfo

Usage:

import os
import time

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

created = w.storage_credentials.create(
    name=f'sdk-{time.time_ns()}',
    aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]))

by_name = w.storage_credentials.get(name=created.name)

# cleanup
w.storage_credentials.delete(name=created.name)

Get a credential.

Gets a storage credential from the metastore. The caller must be a metastore admin, the owner of the storage credential, or have some permission on the storage credential.

Parameters:

  • name – str Name of the storage credential.

Returns:

StorageCredentialInfo

list([max_results: Optional[int], page_token: Optional[str]]) Iterator[StorageCredentialInfo]

Usage:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

all_creds = w.storage_credentials.list()

List credentials.

Gets an array of storage credentials (as __StorageCredentialInfo__ objects). The array is limited to only those storage credentials the caller has permission to access. If the caller is a metastore admin, retrieval of credentials is unrestricted. There is no guarantee of a specific ordering of the elements in the array.

Parameters:
  • max_results – int (optional) Maximum number of storage credentials to return. If not set, all the storage credentials are returned (not recommended). When set to a value greater than 0, the page length is the minimum of this value and a server-configured value; when set to 0, the page length is set to a server-configured value (recommended); when set to a value less than 0, an invalid-parameter error is returned.

  • page_token – str (optional) Opaque pagination token to go to next page based on previous query.

Returns:

Iterator over StorageCredentialInfo
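The max_results page-length rule above can be sketched as follows. SERVER_PAGE_SIZE is a hypothetical stand-in for the server-configured value, which is not specified in the documentation:

```python
# Hypothetical server-configured page length; the real value is server-side.
SERVER_PAGE_SIZE = 100

def page_length(max_results: int) -> int:
    """Effective page length for a given max_results value, per the rule above."""
    if max_results < 0:
        # Values below zero are rejected by the API.
        raise ValueError("invalid parameter: max_results must be >= 0")
    if max_results == 0:
        # Zero defers entirely to the server-configured value (recommended).
        return SERVER_PAGE_SIZE
    # Positive values are capped at the server-configured value.
    return min(max_results, SERVER_PAGE_SIZE)
```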

update(name: str [, aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityResponse], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], comment: Optional[str], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], force: Optional[bool], new_name: Optional[str], owner: Optional[str], read_only: Optional[bool], skip_validation: Optional[bool]]) StorageCredentialInfo

Usage:

import os
import time

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

created = w.storage_credentials.create(
    name=f'sdk-{time.time_ns()}',
    aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]))

_ = w.storage_credentials.update(
    name=created.name,
    comment=f'sdk-{time.time_ns()}',
    aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]))

# cleanup
w.storage_credentials.delete(name=created.name)

Update a credential.

Updates a storage credential on the metastore.

Parameters:
  • name – str Name of the storage credential.

  • aws_iam_role – AwsIamRoleRequest (optional) The AWS IAM role configuration.

  • azure_managed_identity – AzureManagedIdentityResponse (optional) The Azure managed identity configuration.

  • azure_service_principal – AzureServicePrincipal (optional) The Azure service principal configuration.

  • cloudflare_api_token – CloudflareApiToken (optional) The Cloudflare API token configuration.

  • comment – str (optional) Comment associated with the credential.

  • databricks_gcp_service_account – DatabricksGcpServiceAccountRequest (optional) The Databricks managed GCP service account configuration.

  • force – bool (optional) Force update even if there are dependent external locations or external tables.

  • new_name – str (optional) New name for the storage credential.

  • owner – str (optional) Username of current owner of credential.

  • read_only – bool (optional) Whether the storage credential is only usable for read operations.

  • skip_validation – bool (optional) Supplying true to this argument skips validation of the updated credential.

Returns:

StorageCredentialInfo

validate([aws_iam_role: Optional[AwsIamRoleRequest], azure_managed_identity: Optional[AzureManagedIdentityRequest], azure_service_principal: Optional[AzureServicePrincipal], cloudflare_api_token: Optional[CloudflareApiToken], databricks_gcp_service_account: Optional[DatabricksGcpServiceAccountRequest], external_location_name: Optional[str], read_only: Optional[bool], storage_credential_name: Optional[str], url: Optional[str]]) ValidateStorageCredentialResponse

Validate a storage credential.

Validates a storage credential. At least one of __external_location_name__ and __url__ must be provided. If only one is provided, it is used for validation; if both are provided, the __url__ is used for validation and __external_location_name__ is ignored when checking for overlapping URLs.

Either the __storage_credential_name__ or the cloud-specific credential must be provided.

The caller must be a metastore admin or the storage credential owner or have the CREATE_EXTERNAL_LOCATION privilege on the metastore and the storage credential.
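A usage sketch following the pattern of the other examples in this section. TEST_BUCKET_URL is a placeholder environment variable for an external location URL, not part of the API:

```python
import os
import time

from databricks.sdk import WorkspaceClient
from databricks.sdk.service import catalog

w = WorkspaceClient()

created = w.storage_credentials.create(
    name=f'sdk-{time.time_ns()}',
    aws_iam_role=catalog.AwsIamRoleRequest(role_arn=os.environ["TEST_METASTORE_DATA_ACCESS_ARN"]))

# Validate the named credential against an external location URL. When both
# url and external_location_name are supplied, url takes precedence.
result = w.storage_credentials.validate(storage_credential_name=created.name,
                                        url=os.environ["TEST_BUCKET_URL"])

# cleanup
w.storage_credentials.delete(name=created.name)
```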

Parameters:
  • aws_iam_role – AwsIamRoleRequest (optional) The AWS IAM role configuration.

  • azure_managed_identity – AzureManagedIdentityRequest (optional) The Azure managed identity configuration.

  • azure_service_principal – AzureServicePrincipal (optional) The Azure service principal configuration.

  • cloudflare_api_token – CloudflareApiToken (optional) The Cloudflare API token configuration.

  • databricks_gcp_service_account – DatabricksGcpServiceAccountRequest (optional) The Databricks-created GCP service account configuration.

  • external_location_name – str (optional) The name of an existing external location to validate.

  • read_only – bool (optional) Whether the storage credential is only usable for read operations.

  • storage_credential_name – str (optional) The name of the storage credential to validate.

  • url – str (optional) The external location url to validate.

Returns:

ValidateStorageCredentialResponse